Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month contract-to-hire, remote (CST/EST) with quarterly travel. It requires 7+ years of data engineering experience, with expertise in Snowflake, advanced SQL, Python, AWS Lambda, and workflow automation using Apache Airflow on AWS.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
August 22, 2025
πŸ•’ - Project duration
More than 6 months
🏝️ - Location type
Remote
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Charlotte, NC
🧠 - Skills detailed
#Data Warehouse #dbt (data build tool) #AWS (Amazon Web Services) #Data Extraction #Data Pipeline #AWS Lambda #S3 (Amazon Simple Storage Service) #Fivetran #SQL (Structured Query Language) #SNS (Simple Notification Service) #Data Engineering #Scala #ETL (Extract, Transform, Load) #Data Integrity #Airflow #Data Quality #Automation #Azure DevOps #Lambda (AWS Lambda) #DevOps #Snowflake #Data Ingestion #Azure #Python
Role description
Hello! We are looking for a Sr. Data Engineer for our client (remote). Details are below; please let me know if you or anyone in your network is interested and available. Thank you.

Job Title: Sr. Data Engineer
Assignment Type: 6-month contract-to-hire
Location: Remote (CST or EST) with quarterly travel for PI planning

Job Summary
We are looking for a hands-on Senior Data Engineer with expertise in developing data ingestion pipelines. This role is crucial in designing, building, and maintaining our data infrastructure, with a focus on creating scalable pipelines, ensuring data integrity, and optimizing performance. Key skills include strong Snowflake expertise, advanced SQL proficiency, data extraction from APIs using Python and AWS Lambda, and experience with ETL/ELT processes. Workflow automation using Apache Airflow on AWS (Amazon MWAA) is essential; experience with Fivetran and dbt is a plus. Illustrative sketches of the extraction, orchestration, and load steps appear after the qualifications below.

Job Responsibilities
• Design, build, test, and implement scalable data pipelines using Python and SQL.
• Maintain and optimize our Snowflake data warehouse's performance, including data ingestion and query optimization.
• Extract data from APIs using Python and AWS Lambda, and automate workflows with Airflow.
• Apply analysis and critical thinking to troubleshoot data-related issues, and implement checks and scripts to enhance data quality.
• Collaborate with other data engineers and architects to develop new pipelines and optimize existing ones.
• Maintain code via CI/CD processes as defined in our Azure DevOps platform.

Job Qualifications
• 7+ years of experience in data engineering roles, with a focus on building and implementing scalable data pipelines for data ingestion.
• Expertise in Snowflake, including data ingestion and performance optimization.
• Strong SQL skills for writing efficient queries and optimizing existing ones.
• Proficiency in Python for data extraction from APIs using AWS Lambda, Glue, etc.
• Experience with AWS services such as Lambda, Airflow (MWAA), Glue, S3, and SNS.
• Highly self-motivated and detail-oriented, with strong communication skills.
• Familiarity with ETL/ELT processes.
• Experience with Fivetran and dbt is a plus.
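To give candidates a concrete sense of the Lambda-based API extraction named in the responsibilities, here is a minimal sketch. It is not the client's implementation; the endpoint URL, bucket name, and environment variables (API_URL, RAW_BUCKET) are hypothetical.

```python
# Minimal Lambda sketch: pull one page of records from a REST API and
# land it in S3 as JSON for downstream loading into Snowflake.
# API_URL and RAW_BUCKET are hypothetical names, not from the posting.
import json
import os
import urllib.request

import boto3

s3 = boto3.client("s3")

API_URL = os.environ.get("API_URL", "https://api.example.com/orders")
RAW_BUCKET = os.environ.get("RAW_BUCKET", "raw-landing-bucket")


def handler(event, context):
    """Fetch records from the API and write them to S3."""
    with urllib.request.urlopen(API_URL, timeout=30) as resp:
        records = json.load(resp)

    # Key each invocation's output uniquely so reruns never overwrite data.
    key = f"raw/orders/{context.aws_request_id}.json"
    s3.put_object(
        Bucket=RAW_BUCKET,
        Key=key,
        Body=json.dumps(records).encode("utf-8"),
        ContentType="application/json",
    )
    return {"records": len(records), "s3_key": key}
```

A production extraction would add pagination, retries, and credentials pulled from a secrets manager.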
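The workflow-automation requirement usually means orchestrating steps like the one above in an Airflow DAG. A minimal sketch, assuming Airflow 2.4+ and a hypothetical DAG id and task callables:

```python
# Minimal daily DAG sketch chaining extraction and load; the DAG id,
# callables, and schedule are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Placeholder for the Lambda-backed API extraction step."""


def load_to_snowflake():
    """Placeholder for the Snowflake COPY/MERGE step."""


with DAG(
    dag_id="orders_ingestion",       # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",               # `schedule` requires Airflow 2.4+
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(
        task_id="load_to_snowflake", python_callable=load_to_snowflake
    )

    extract_task >> load_task   # run extraction before the load
```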
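Finally, Snowflake ingestion of the landed files is commonly a COPY INTO from an external stage. A sketch using snowflake-connector-python; the stage, table, and connection parameters are hypothetical, and credentials should come from a secrets manager rather than literals:

```python
# Sketch: bulk-load JSON files from a (hypothetical) external stage into
# a raw table. Connection parameters are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="etl_user",        # placeholder
    password="...",         # use a secrets manager in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()
try:
    # Assumes RAW.ORDERS has a single VARIANT column and @raw_stage
    # points at the S3 prefix the Lambda writes to.
    cur.execute(
        """
        COPY INTO RAW.ORDERS
        FROM @raw_stage/raw/orders/
        FILE_FORMAT = (TYPE = 'JSON')
        """
    )
finally:
    cur.close()
    conn.close()
```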