Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a 6-12 month contract, offering remote work. Requires 12+ years of experience with Apache Airflow, ETL pipelines, and scalable data processing using Spark or Java. Amazon experience preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 25, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Data Pipeline #Apache Airflow #Data Architecture #Data Quality #Scala #Spark (Apache Spark) #Java #Version Control #Data Engineering #Data Processing
Role description

Position: Data Engineer

Location: Remote

Duration: 6–12 Month Contract

Candidates with USC, GC, H4 EAD, L2 EAD, TN, Refugee, or Asylee status who are willing to work on W2 may apply for this job.

Job Description

   • 12+ years of experience designing and maintaining data workflows using Apache Airflow.

   • Candidate should have worked at Level 5 or 6.

   • Prior Amazon FTE/Contract experience strongly preferred.

   • Build scalable data processing pipelines using Spark or Java.

   • Leverage deep experience with Amazon’s internal data engineering tools and systems.

   • Develop and optimize ETL pipelines in distributed environments.

   • Ensure data quality, reliability, and availability across all pipelines.

   • Collaborate with cross-functional teams on data architecture and design.

   • Apply best practices in version control, CI/CD, and testing for data pipelines.

   • Monitor, troubleshoot, and improve performance of large-scale data jobs.

   • Deliver production-ready solutions in a fast-paced, high-standard engineering culture.