

AWS Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer on a contract basis, requiring strong skills in shell scripting, Python, SQL (preferably with Snowflake), and AWS S3. Experience with Apache Airflow and data warehouse environments is essential. Contract length and pay rate are unspecified.
Country: United States
Currency: Unknown
Day rate: Unknown
Date discovered: June 27, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: Charlotte, NC
Skills detailed: #SQL (Structured Query Language) #Security #Migration #Data Engineering #AWS (Amazon Web Services) #S3 (Amazon Simple Storage Service) #AWS S3 (Amazon Simple Storage Service) #Data Transformations #Data Pipeline #Scala #Redshift #ETL (Extract, Transform, Load) #Shell Scripting #SQL Queries #Scripting #Airflow #Data Ingestion #Storage #Snowflake #Cloud #Python #DevOps #Data Warehouse #Apache Airflow
Role description
Job Type: Contract
Job Category: IT
Job Description
Job Title: AWS Data Engineer
Job Summary:
We are seeking a highly skilled Data Engineer to support the migration and enhancement of data workflows from Tidal to Apache Airflow, along with optimizing data processes on Snowflake and AWS. The ideal candidate will have strong experience with shell scripting, Python, and SQL, and a solid background in modern data warehouse environments.
Responsibilities:
- Analyze and understand existing Tidal job dependencies, schedules, and scripts.
- Develop shell scripts and Python-based workflows to replicate and enhance current Tidal processes in Airflow (see the sketch after this list).
- Write and optimize SQL queries for Snowflake-based data transformations and validations.
- Collaborate with cross-functional teams to ensure data pipelines are robust, scalable, and well-documented.
- Work with AWS S3 for data ingestion and storage processes.
- Ensure high reliability and performance of data pipelines post-migration.
- Provide support during testing and post-migration validation.
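To make the migration work above concrete, here is a minimal sketch (not part of the posting) of how a nightly Tidal shell-script job might be replicated in Airflow 2.x. The DAG name, schedule, and script path are hypothetical placeholders:

```python
# Minimal sketch: replicate a (hypothetical) nightly Tidal shell-script job
# as an Airflow 2.x DAG. All names, paths, and the schedule are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def validate_row_counts(**context):
    """Placeholder post-load check; real logic would query Snowflake."""
    print("row-count validation would run here")


with DAG(
    dag_id="nightly_sales_load",        # hypothetical DAG name
    start_date=datetime(2025, 6, 1),
    schedule_interval="0 2 * * *",      # mirror the existing Tidal schedule
    catchup=False,
) as dag:
    # During migration, reuse the legacy shell script verbatim so its output
    # can be compared against the Tidal run before any refactoring.
    extract = BashOperator(
        task_id="run_legacy_extract",
        # Trailing space stops Airflow from treating the .sh path as a
        # Jinja template file; the path itself is a placeholder.
        bash_command="/opt/etl/scripts/extract_sales.sh ",
    )

    validate = PythonOperator(
        task_id="validate_row_counts",
        python_callable=validate_row_counts,
    )

    extract >> validate
```

Wrapping the legacy script in a BashOperator first keeps behavior identical to the Tidal job, so parity can be verified before the logic is rewritten as native Airflow tasks.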
Required Skills:
- Shell scripting: strong experience required
- Python: strong experience required
- SQL: strong experience required, preferably with Snowflake
- AWS S3: hands-on experience (a minimal ingestion sketch follows this list)
- Solid data warehouse background and understanding of ETL/ELT best practices
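To illustrate the S3 and Snowflake requirements above, a minimal ingestion sketch follows. The bucket, table, and connection values are hypothetical, it assumes an external stage named @raw_stage already points at the bucket, and real credentials would come from a secrets store:

```python
# Minimal sketch of an S3 -> Snowflake ingestion step. All names below are
# hypothetical placeholders; do not hard-code real credentials like this.
import boto3
import snowflake.connector

# Land the raw file in S3 (the data ingestion / storage step).
s3 = boto3.client("s3")
s3.upload_file("daily_sales.csv", "example-ingest-bucket", "raw/daily_sales.csv")

# Load it into Snowflake, assuming @raw_stage points at the bucket above.
conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    conn.cursor().execute(
        "COPY INTO daily_sales "
        "FROM @raw_stage/raw/daily_sales.csv "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
finally:
    conn.close()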
Preferred Skills (Nice to Have):
- Experience with Apache Airflow
- Prior experience with scheduling tool migrations, especially Tidal
- Familiarity with CI/CD practices and DevOps for data pipelines
#AWSDataEngineer #DataEngineering #CloudComputing #AWS #DataSolutions #DataPipelines #ETL #DataQuality #DataScience #DataAnalytics #AWSGlue #S3 #EMR #Redshift #DataStorage #DataRetrieval #SQL #Python #Scala #DataModeling #DataWarehousing #DataEngineeringJobs #USJobs