

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer contract position in Fort Mill, SC, offering a pay rate of "X" for "Y" duration. Key skills required include AWS Glue, Snowflake, DBT, and Python, with 3+ years of relevant experience.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 21, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Fort Mill, SC
Skills detailed: #Data Quality #Documentation #Data Extraction #Fivetran #Apache Airflow #Data Processing #SQL (Structured Query Language) #Data Integration #Data Engineering #Data Pipeline #Data Transformations #Snowflake #Database Management #Programming #AWS (Amazon Web Services) #Python #dbt (data build tool) #Computer Science #AWS Glue #Scala #AWS Lambda #Airflow #Lambda (AWS Lambda) #Data Science #Automation #Snowpark #ETL (Extract, Transform, Load) #Scripting
Role description
Job Title: Data Engineer
Location: Fort Mill, SC (Hybrid)
Position Type: Contract
Must Have Skills: AWS Glue, Snowflake, DBT, Python
Good to Have: Lambda, Airflow, Fivetran, HVR
Job Description:
The Data Engineer is responsible for designing, developing, and maintaining scalable data pipelines and ETL processes using AWS Glue. This role involves working closely with data engineers, analysts, and other IT professionals to ensure data is efficiently integrated, transformed, and made available for business use.
1. AWS services (Airflow): The candidate should have hands-on experience with AWS services, particularly Apache Airflow for orchestrating complex data workflows and pipelines (a minimal DAG sketch follows this list).
2. AWS services (Lambda): Proficiency in AWS Lambda for serverless computing and event-driven architecture is essential for this role.
3. AWS services (Glue): The candidate should be well-versed in AWS Glue for ETL (Extract, Transform, Load) processes and data integration.
4. Snowflake, Snowpark: The candidate should have a deep understanding of the Snowflake data warehousing platform and be proficient in using Snowpark for data processing and analytics.
5. DBT: Experience with DBT (Data Build Tool) for modeling data and creating data transformation pipelines is a plus.
6. Fivetran (HVR): Working knowledge of and hands-on experience with Fivetran HVR.
7. Python: Strong programming skills in Python are required for developing data pipelines, data transformations, and automation tasks.
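For context on items 1 and 7, the sketch below shows a minimal Airflow DAG that orchestrates a simple extract, transform, and load flow in Python. It assumes Airflow 2.4+ and uses illustrative placeholder names for the DAG, tasks, and schedule; none of these details come from the posting itself.

```python
# Minimal illustrative Airflow DAG; dag_id, task names, and schedule are
# hypothetical placeholders, not details from this posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Pull raw records from the source system (stubbed for illustration).
    print("extracting raw orders")


def transform_orders():
    # Apply cleansing and business rules before loading (stubbed).
    print("transforming orders")


def load_orders():
    # Write the transformed data into the warehouse (stubbed).
    print("loading orders into Snowflake")


with DAG(
    dag_id="orders_etl",              # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform", python_callable=transform_orders)
    load = PythonOperator(task_id="load", python_callable=load_orders)

    # Linear dependency: extract -> transform -> load
    extract >> transform >> load
```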
Responsibilities:
• Design, develop, and maintain scalable data pipelines and ETL processes using AWS Glue, Snowflake, Python, and DBT (see the Glue job sketch after this list).
• Collaborate with data scientists, data engineers, and analysts to understand data requirements and implement solutions.
• Develop and manage data extraction, transformation, and loading processes.
• Optimize new and existing data pipelines and ETL workflows for performance, scalability, and reliability.
• Ensure data quality and integrity during the transformation process.
• Monitor, troubleshoot, and resolve issues with ETL jobs and data in a timely manner.
• Maintain detailed documentation of data workflows and processes.
• Stay updated on the latest AWS services, tools, and data engineering best practices.
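To illustrate the Glue-related responsibilities above, here is a minimal sketch of an AWS Glue PySpark job that reads a table from the Glue Data Catalog, applies a simple transformation, and writes Parquet to S3 for downstream loading into Snowflake. The database, table, column, and S3 path names are hypothetical placeholders; an actual job would use the project's own sources and targets.

```python
# Minimal illustrative AWS Glue job (runs inside the Glue PySpark runtime).
# The catalog database, table, column, and S3 path below are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read a table registered in the Glue Data Catalog.
raw = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db",        # hypothetical database
    table_name="orders",      # hypothetical table
)

# Transform: keep only rows with a primary key and drop an unused column.
cleaned = raw.filter(lambda row: row["order_id"] is not None).drop_fields(["unused_col"])

# Load: write Parquet to S3 for downstream ingestion into Snowflake.
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},  # hypothetical
    format="parquet",
)

job.commit()
```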
Qualifications:
• Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
• 3+ years of experience working with AWS Glue and other AWS data services.
• Proven experience in data engineering roles with a focus on Snowflake, AWS services, Python, and DBT.
• Proven experience in designing and maintaining ETL processes.
• Strong knowledge of SQL and database management.
• Familiarity with data warehousing concepts and tools.
• Experience with Python or other scripting languages.
• Strong analytical and problem-solving skills.
• Excellent communication and teamwork abilities.
• Technical certifications are a plus.
If you believe you are qualified for this position and are currently in the job market or interested in making a change, please email me your resume along with your contact details at roshni@nytpcorp.com.