Senior Data Engineer - AWS (W2, CTH Role, Need 12+ Years of Experience)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with 12+ years of experience. It offers a 6-month contract-to-hire engagement, fully remote work on CST or EST hours, and a focus on Snowflake, AWS services, advanced SQL, and Python for data ingestion and ETL processes.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
September 26, 2025
πŸ•’ - Project duration
More than 6 months
🏝️ - Location type
Remote
πŸ“„ - Contract type
W2 Contractor
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Charlotte, NC
🧠 - Skills detailed
#Data Quality #dbt (data build tool) #S3 (Amazon Simple Storage Service) #Snowflake #Data Integrity #DevOps #AWS (Amazon Web Services) #"ETL (Extract, Transform, Load)" #Lambda (AWS Lambda) #Data Ingestion #AWS Lambda #Azure #Automation #SQL (Structured Query Language) #Python #Data Extraction #Data Warehouse #Fivetran #Azure DevOps #Data Pipeline #SNS (Simple Notification Service) #Scala #Airflow #Data Engineering
Role description
Position: Sr. Data Engineer
Work Location: Remote (working CST or EST hours), with quarterly travel for PI planning
Duration: 6 months (contract-to-hire)

Manager Notes
• Hands-on experience across several data engineering projects
• Strong Snowflake expertise, including administration
• Advanced SQL and Python skills
• Extracting data from APIs
• ETL processes and data ingestion
• AWS: Lambda, Airflow
• Nice to have: dbt

Job Summary
We are looking for a hands-on Senior Data Engineer with expertise in developing data ingestion pipelines. This role is crucial in designing, building, and maintaining our data infrastructure, with a focus on creating scalable pipelines, ensuring data integrity, and optimizing performance. Key skills include strong Snowflake expertise, advanced SQL proficiency, data extraction from APIs using Python and AWS Lambda, and experience with ETL/ELT processes. Workflow automation using Airflow on AWS is essential, and experience with Fivetran and dbt is a plus. (Illustrative sketches of these patterns follow the qualifications list below.)

Job Responsibilities
• Design, build, test, and implement scalable data pipelines using Python and SQL.
• Maintain and optimize our Snowflake data warehouse's performance, including data ingestion and query optimization.
• Extract data from APIs using Python and AWS Lambda, and automate workflows with Airflow.
• Apply analysis and critical thinking to troubleshoot data-related issues, and implement checks and scripts to enhance data quality.
• Collaborate with other data engineers and architects to develop new pipelines and optimize existing ones.
• Maintain code via CI/CD processes as defined in our Azure DevOps platform.

Job Qualifications
• 7+ years of experience in data engineering roles, with a focus on building and implementing scalable data pipelines for data ingestion.
• Expertise in Snowflake, including data ingestion and performance optimization.
• Strong SQL skills for writing efficient queries and optimizing existing ones.
• Proficiency in Python for data extraction from APIs using AWS Lambda, Glue, etc.
• Experience with AWS services such as Lambda, Airflow, Glue, S3, and SNS.
• Highly self-motivated and detail-oriented, with strong communication skills.
• Familiarity with ETL/ELT processes.
• Experience with Fivetran and dbt is a plus.
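
A minimal sketch of the API-extraction pattern named above: Python running in AWS Lambda, landing raw JSON in S3 for downstream ETL. The endpoint URL, bucket, and key prefix are hypothetical placeholders, not details from the posting.

```python
import json
import urllib.request
from datetime import datetime, timezone

import boto3  # available by default in the AWS Lambda Python runtime

s3 = boto3.client("s3")

API_URL = "https://api.example.com/v1/orders"  # hypothetical endpoint
BUCKET = "raw-ingestion-bucket"                # hypothetical landing bucket


def handler(event, context):
    """Lambda entry point: pull one page of records and land it in S3 as JSON."""
    with urllib.request.urlopen(API_URL, timeout=30) as resp:
        payload = json.load(resp)

    # Partition raw files by load date so downstream jobs can pick up increments.
    key = f"orders/dt={datetime.now(timezone.utc):%Y-%m-%d}/page.json"
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(payload).encode())

    return {"records": len(payload.get("data", [])), "s3_key": key}
```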
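On the Snowflake ingestion side, one common approach is a COPY INTO from an external stage using snowflake-connector-python. The connection parameters, stage, and table below are illustrative assumptions that pair with the Lambda sketch above.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",  # hypothetical account identifier
    user="etl_user",
    password="...",        # in practice, pull credentials from a secrets manager
    warehouse="LOAD_WH",
    database="RAW",
    schema="ORDERS",
)

try:
    cur = conn.cursor()
    # @raw_stage is assumed to be an external stage pointing at the S3 prefix
    # the Lambda above writes to.
    cur.execute("""
        COPY INTO RAW.ORDERS.ORDERS_JSON
        FROM @raw_stage/orders/
        FILE_FORMAT = (TYPE = 'JSON')
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    print(cur.fetchall())  # per-file load results, useful for data-quality checks
finally:
    conn.close()
```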
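And for the workflow-automation requirement, a sketch of a daily Airflow DAG that chains the two steps. The DAG id and task callables are assumptions for illustration; the callable bodies are left as stubs.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def trigger_extract_lambda(**_):
    """Invoke the ingestion Lambda (e.g. via a boto3 lambda client)."""
    ...


def run_snowflake_copy(**_):
    """Run the COPY INTO statement shown in the previous sketch."""
    ...


with DAG(
    dag_id="orders_ingestion",  # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_api", python_callable=trigger_extract_lambda)
    load = PythonOperator(task_id="copy_into_snowflake", python_callable=run_snowflake_copy)

    extract >> load  # raw files must land before the COPY runs
```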