

Kellton
Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month contract-to-hire, remote (CST or EST) with quarterly travel. Requires 7+ years in data engineering, strong Snowflake and SQL skills, Python proficiency, and experience with AWS services and ETL processes.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: October 22, 2025
Duration: More than 6 months
Location: Remote
Contract: Fixed Term
Security: Unknown
Location detailed: Charlotte, NC
Skills detailed: #Data Ingestion #dbt (data build tool) #Automation #Data Warehouse #SNS (Simple Notification Service) #Data Extraction #Data Integrity #Data Quality #Airflow #Fivetran #Data Pipeline #Azure #Azure DevOps #SQL (Structured Query Language) #Scala #Python #Lambda (AWS Lambda) #Snowflake #DevOps #Data Engineering #AWS (Amazon Web Services) #S3 (Amazon Simple Storage Service) #AWS Lambda #ETL (Extract, Transform, Load)
Role description
Hi,
Greetings!
We are looking for a Sr. Data Engineer for our client; the role is fully remote.
More details are below. Please let me know if you, or anyone in your network, would be interested and available.
Thank you
Job Title: Sr. Data Engineer
Assignment Type: 6-month contract-to-hire
Location: Remote (CST or EST) with quarterly travel for PI planning
Job Summary
We are looking for a hands-on Senior Data Engineer with expertise in developing data ingestion pipelines. This role is crucial in designing, building, and maintaining our data infrastructure, focusing on creating scalable pipelines, ensuring data integrity, and optimizing performance.
Key skills include strong Snowflake expertise, advanced SQL proficiency, data extraction from APIs using Python and AWS Lambda, and experience with ETL/ELT processes. Workflow automation using Airflow on AWS (MWAA) is essential, and experience with Fivetran and dbt is a plus.
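To make the API-extraction requirement concrete, here is a minimal sketch of the kind of AWS Lambda handler this work involves. The API_URL and RAW_BUCKET environment variables, the orders/ key prefix, and the invocation payload carrying a ds run date are all assumptions for illustration; none of these names come from the posting.

import json
import os
import urllib.request

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # Pull one page of records from a REST API and land the raw JSON in S3.
    # API_URL and RAW_BUCKET are hypothetical names set on the Lambda.
    url = os.environ["API_URL"]
    bucket = os.environ["RAW_BUCKET"]

    with urllib.request.urlopen(url, timeout=30) as resp:
        payload = json.loads(resp.read())

    # Partition by run date so downstream loads pick up only new files.
    key = f"orders/dt={event['ds']}/page-1.json"
    s3.put_object(Bucket=bucket, Key=key, Body=json.dumps(payload))
    return {"records": len(payload.get("results", [])), "s3_key": key}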
Job Responsibilities
• Design, build, test, and implement scalable data pipelines using Python and SQL.
• Maintain and optimize our Snowflake data warehouse's performance, including data ingestion and query optimization.
• Extract data from APIs using Python and AWS Lambda, and automate workflows with Airflow on AWS (see the sketch after this list).
• Perform analysis and critical thinking to troubleshoot data-related issues, and implement checks and scripts to enhance data quality.
• Collaborate with other data engineers and architects to develop new pipelines and optimize existing ones.
• Maintain code via CI/CD processes as defined in our Azure DevOps platform.
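As a rough illustration of the orchestration work described above, the following is a minimal Airflow DAG that invokes the extraction Lambda and then loads the landed files into Snowflake. The function name orders-extractor, the table RAW.ORDERS, the stage @raw_stage, and the connection ID snowflake_default are hypothetical, and the SnowflakeOperator import assumes the Snowflake provider package is installed.

import json
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

def extract(ds, **_):
    # Invoke the extraction Lambda for this run's logical date.
    boto3.client("lambda").invoke(
        FunctionName="orders-extractor",  # hypothetical function name
        Payload=json.dumps({"ds": ds}).encode(),
    )

with DAG(
    dag_id="orders_ingest",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)

    # Load the day's raw files from an external S3 stage into Snowflake.
    load_task = SnowflakeOperator(
        task_id="load",
        snowflake_conn_id="snowflake_default",
        sql="""
            COPY INTO RAW.ORDERS
            FROM @raw_stage/orders/dt={{ ds }}/
            FILE_FORMAT = (TYPE = 'JSON');
        """,
    )

    extract_task >> load_task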
Job Qualifications
• 7+ years of experience in Data Engineering roles, with a focus on building and implementing scalable data pipelines for data ingestion.
• Expertise in Snowflake, including data ingestion and performance optimization (a simple validation sketch follows this list).
• Strong SQL skills for writing efficient queries and optimizing existing ones.
• Proficiency in Python for data extraction from APIs using AWS Lambda, Glue, etc.
• Experience with AWS services such as Lambda, Airflow (MWAA), Glue, S3, SNS, etc.
• Highly self-motivated and detail-oriented, with strong communication skills.
• Familiarity with ETL/ELT processes.
• Experience with Fivetran and dbt is a plus.
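To give a flavor of the data-quality side of the role, here is a minimal sketch of post-load checks run against Snowflake with the snowflake-connector-python package. The connection parameters, the ORDERS table, and both check queries are hypothetical; real checks would be driven by the pipeline's actual contracts, or expressed as dbt tests.

import snowflake.connector

# Hypothetical connection details; in practice these come from a secrets store.
conn = snowflake.connector.connect(
    account="xy12345",
    user="ETL_SVC",
    password="***",
    warehouse="LOAD_WH",
    database="RAW",
    schema="ORDERS",
)

# Each check maps a name to a query and a predicate that counts as a pass.
CHECKS = {
    # Freshness: yesterday's partition must have landed.
    "rows_loaded_yesterday": (
        "SELECT COUNT(*) FROM ORDERS "
        "WHERE LOAD_DATE = DATEADD(day, -1, CURRENT_DATE())",
        lambda n: n > 0,
    ),
    # Integrity: the business key must be unique.
    "duplicate_order_ids": (
        "SELECT COUNT(*) - COUNT(DISTINCT ORDER_ID) FROM ORDERS",
        lambda n: n == 0,
    ),
}

def run_checks() -> None:
    with conn.cursor() as cur:
        for name, (sql, passes) in CHECKS.items():
            value = cur.execute(sql).fetchone()[0]
            print(f"{name}: {value} -> {'PASS' if passes(value) else 'FAIL'}")
            if not passes(value):
                raise ValueError(f"Data quality check failed: {name}")

if __name__ == "__main__":
    run_checks()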