AWS Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer with 4+ years of experience, focusing on Snowflake data migration. It offers a hybrid work location in Greater Hartford, CT, with a pay rate based on skills. A Bachelor's degree is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 26, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Greater Hartford
🧠 - Skills detailed
#EC2 #Databricks #Consul #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Data Migration #Airflow #Data Processing #SQS (Simple Queue Service) #Data Science #Scala #Migration #SNS (Simple Notification Service) #Snowflake #Lambda (AWS Lambda) #Data Quality #Data Orchestration #Cloud #Data Pipeline #Data Engineering #Data Integrity
Role description
Hybrid | Greater Hartford, CT

Our client is looking for an AWS Data Engineer to support a Snowflake data migration initiative. We can facilitate W2 and corp-to-corp consultants. For our W2 consultants, we offer a great benefits package that includes Medical, Dental, and Vision benefits, 401k with company matching, and life insurance.

Skills, experience, and other compensable factors will be considered when determining pay rate. The pay range provided in this posting reflects a W2 hourly rate; other employment options may be available that may result in pay outside of the provided range. W2 employees of Eliassen Group who are regularly scheduled to work 30 or more hours per week are eligible for the following benefits: medical (choice of 3 plans), dental, vision, pre-tax accounts, other voluntary benefits including life and disability insurance, 401(k) with match, and sick time if required by law in the worked-in state/locality.

Please be advised: if anyone reaches out to you about an open position connected with Eliassen Group, please confirm that they have an Eliassen.com email address, and never provide personal or financial information to anyone who is not clearly associated with Eliassen Group. If you have any indication of fraudulent activity, please contact InfoSec@eliassen.com.

Responsibilities
• Design, develop, and maintain robust data pipelines and ETL processes using Databricks, Snowflake, and AWS services.
• Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs.
• Optimize and tune data processing workflows for performance and scalability.
• Implement data quality checks and ensure data integrity across all data pipelines.
• Monitor and troubleshoot data pipeline issues, ensuring high availability and reliability.
• Stay up to date with the latest industry trends and best practices in data engineering and cloud technologies.
• Contribute to the design and architecture of our data infrastructure, ensuring it is secure, scalable, and cost-effective.
• Document data engineering processes, workflows, and best practices.

Experience Requirements
• Experience with AWS, Databricks, and Snowflake
• Experience with AWS services such as EC2, SES, Pinpoint, Lambda, SQS, SNS, and Glue
• Data orchestration with Airflow on AWS (e.g., Amazon MWAA) or similar tools
• 4+ years of data engineering experience

Education Requirements
• Bachelor's degree
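For candidates sizing up the "implement data quality checks" responsibility, the pattern can be sketched in plain Python. This is a minimal illustration only; the function and field names are hypothetical and not from the posting, and a real pipeline would run such checks against Snowflake or Databricks tables rather than in-memory rows:

```python
# Minimal sketch of a data-quality gate in an ETL step (illustrative names).
# A production pipeline would apply the same idea via Snowflake/Databricks
# queries; here we validate in-memory records with the standard library only.

def check_quality(rows, required_fields):
    """Split rows into those passing basic integrity rules and a list of errors."""
    passed, errors = [], []
    for i, row in enumerate(rows):
        # Flag any required field that is missing, None, or empty.
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            errors.append(f"row {i}: missing {missing}")
        else:
            passed.append(row)
    return passed, errors

rows = [
    {"id": 1, "amount": 10.5},
    {"id": 2, "amount": None},  # fails the null check
]
passed, errors = check_quality(rows, required_fields=["id", "amount"])
```

In a migration context, a gate like this typically runs after each load step, with failing rows routed to a quarantine table for review instead of being dropped silently.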