
AWS Data Engineer - 12+ Years || F2F
Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer with 12+ years of experience, focusing on ETL, cloud migration, and Python. Contract length exceeds 6 months, with a pay rate of $60-$65/hour. Location is hybrid in Dallas, TX.
Country
United States
Currency
$ USD
Day rate
520
Date discovered
August 22, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
Unknown
Security clearance
Unknown
Location detailed
Dallas, TX 75244
Skills detailed
#Spark (Apache Spark) #AWS (Amazon Web Services) #Agile #Data Pipeline #Programming #S3 (Amazon Simple Storage Service) #SQL (Structured Query Language) #Code Reviews #Security #Data Engineering #Scala #ETL (Extract, Transform, Load) #Unit Testing #Infrastructure as Code (IaC) #Data Processing #Data Integrity #Airflow #Data Quality #Classification #Data Modeling #Automation #PySpark #Apache Airflow #Lambda (AWS Lambda) #DevOps #Data Security #IAM (Identity and Access Management) #Terraform #Documentation #Cloud #Python #Migration
Role description
Job Title: AWS Data Engineer
Location: Dallas, TX (Hybrid - 3 days per week in-office)
Interview Process: In-Person Interview
Profiles Required: 12+ Years of Experience
Locals & Non-Locals Can Apply
Core Skills: Strong AWS Services, Python/PySpark, Advanced SQL
Submission Details: Resume, LinkedIn URL, Work Authorization, Location Details
Job Description
Our esteemed client is seeking an experienced AWS Data Engineer with strong expertise in ETL testing, cloud migration, and Python programming in a production-grade AWS environment. You will design and maintain scalable data pipelines, ensure data quality through rigorous ETL testing, and play a key role in large-scale cloud migration initiatives.
This role is ideal for someone passionate about building efficient, secure, and reliable data solutions, applying functional design principles, and leveraging automation to deliver business impact.
Key Responsibilities
- Develop and maintain scalable ETL pipelines within the AWS ecosystem.
- Conduct ETL testing to ensure data integrity, accuracy, and performance.
- Lead and support large-scale cloud migration projects.
- Use Python (primary language) along with SQL and PySpark to develop data processing and automation solutions.
- Orchestrate data workflows using Apache Airflow (including MWAA).
- Collaborate with cross-functional teams to design data models and implement industry-standard data security and classification methodologies.
- Monitor, troubleshoot, and optimize pipelines for reliability and performance.
- Document design, conduct code reviews, and apply test-driven development (TDD) practices.
- Work in Agile environments, participating in sprint planning and daily stand-ups.
- Ensure a smooth transition of data during cloud migrations, applying DevOps and CI/CD practices where needed.
Qualifications
- 12+ years of experience in Data Engineering, focusing on ETL, Cloud Migration, and Python development.
- 5+ years of hands-on experience with Python (primary), SQL, and PySpark, applying functional design principles and software design patterns.
- 5+ years building and deploying ETL pipelines across on-prem, hybrid, and cloud environments, with orchestration experience using Airflow.
- 5+ years of production-level AWS experience (MWAA, Glue/EMR (Spark), S3, ECS/EKS, IAM, Lambda).
- Solid understanding of core statistical principles, data modeling, and data security best practices.
- 5+ years working in Agile development with unit testing, TDD, and design documentation.
- 2+ years of DevOps and CI/CD experience, including Terraform or other Infrastructure-as-Code (IaC) platforms.
- Excellent problem-solving skills, with the ability to troubleshoot complex data and performance issues.
- Strong communication skills and ability to work effectively in a hybrid model (3 days onsite in Dallas).
Flexible work from home options available.
Job Types: Full-time, Contract
Pay: $60.00 - $65.00 per hour
Ability to Commute:
Dallas, TX 75244 (Required)
Ability to Relocate:
Dallas, TX 75244: Relocate before starting work (Required)
Work Location: In person