ETL Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an ETL Developer on an 8-month contract in Chicago, IL, offering $65-70/hr. Key skills include Python or PySpark, SQL development, and AWS services. Requires 6+ years of ETL experience and expertise in big data processing.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
560
🗓️ - Date discovered
April 22, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Chicago, IL
🧠 - Skills detailed
#Aurora #Compliance #Apache Iceberg #S3 (Amazon Simple Storage Service) #Big Data #AWS RDS (Amazon Relational Database Service) #Data Profiling #MySQL #Databases #Cloud #PySpark #SQL (Structured Query Language) #Debugging #SNS (Simple Notification Service) #Monitoring #Data Processing #Dynatrace #Programming #Redshift #AWS (Amazon Web Services) #SQS (Simple Queue Service) #Lambda (AWS Lambda) #Data Cleansing #Observability #Spark (Apache Spark) #Datadog #RDS (Amazon Relational Database Service) #ETL (Extract, Transform, Load) #Python #Data Pipeline #Scala #AWS Glue #Automation #Data Engineering
Role description

Swoon is partnering with a leading global airline to find a skilled ETL Developer for an exciting initial 8-month contract opportunity. This hybrid role is based in Chicago, IL and offers the chance to work with a high-performing team modernizing data pipelines and delivering enterprise-scale solutions that support critical business operations. With a strong potential for extension or conversion based on performance, this is a prime opportunity to join a globally recognized organization at the forefront of innovation in data engineering.

As an ETL Developer, you’ll be instrumental in designing, building, and optimizing complex data workflows using Python or PySpark, AWS Glue, and a suite of modern cloud technologies. You’ll collaborate with cross-functional teams to support scalable big data environments, develop efficient ETL solutions, and drive operational excellence through automation and observability.

If you're passionate about data engineering, enjoy working in dynamic environments, and bring expertise in AWS, SQL development, and big data processing, apply today to join a collaborative, future-focused team making a real impact in the aviation industry.

Here are the details:

Location: Chicago, IL (candidates must be local to Chicago; the team currently goes into the office every Tuesday/Wednesday on alternating weeks, though this may change in the future). No travel at this time.

Duration: Initial 8-month contract (through 1/30/2026) and high potential to extend/convert based on performance

Pay Rate: $65-70/hr W2 (W2 only)

Job #: 15271

Top 5 Skill sets

  1. Python or PySpark

  2. Complex SQL development, debugging, and optimization

  3. AWS – Glue, Step Functions

  4. Knowledge of the inner workings of databases, such as AWS RDS MySQL

  5. Big data processing

Nice to have skills or certifications:

   • Experience leading a mid-sized ETL team

   • Experience with Apache Iceberg

   • Observability tools such as Dynatrace or Datadog

Job Summary:

The ETL Developer designs, builds, tests, and maintains systems that extract, transform, and load data from multiple source systems.

Primary Responsibilities:

   • Lead, design, implement, deploy, and optimize backend ETL services.

   • Support a massive-scale enterprise data solution using AWS data and analytics services.

   • Analyze and interpret complex data and related systems, and provide efficient technical solutions.

   • Support the ETL schedule and maintain compliance with it.

   • Develop and maintain standards for ETL code and maintain an effective project life cycle for all ETL processes.

   • Coordinate with cross-functional teams (architects, platform engineers, other developers, and product owners) to build data processing procedures.

   • Perform root-cause analysis on production issues, routinely monitor databases, and support ETL environments.

   • Help create functional specifications and technical designs, working with business process area owners.

   • Implement industry best-practice code and configuration for production and non-production environments in a highly automated environment.

   • Provide technical advice, effort estimates, and impact analysis.

   • Provide timely project status and issue reports to management.

Qualifications:

   • 6+ years of experience using ETL tools for data cleansing, data profiling, transformation, and scheduling of various workflows.

   • Expert-level proficiency in writing, debugging, and optimizing SQL.

   • 3-4 years of programming experience with Python or PySpark/Glue required.

   • Knowledge of common design patterns, models and architecture used in Big Data processing.

   • 3-4 years' experience with AWS services such as Glue, S3, Redshift, Lambda, Step Functions, RDS Aurora/MySQL, Apache Iceberg, CloudWatch, SNS, SQS, EventBridge.

   • Capable of troubleshooting common database issues; familiar with observability tools.

   • Self-starter, responsible, professional and accountable.

   • A ‘finisher’ who sees a project or task through to completion despite challenges.