ETL Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an ETL Developer with a contract length of "unknown," offering a pay rate of "unknown." Requires 8+ years of ETL experience, 3-4 years with AWS services, and proficiency in SQL and Python/PySpark.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 22, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Illinois, United States
🧠 - Skills detailed
#Aurora #Compliance #Apache Iceberg #S3 (Amazon Simple Storage Service) #Big Data #Data Profiling #MySQL #Databases #Cloud #PySpark #SQL (Structured Query Language) #Debugging #SNS (Simple Notification Service) #Monitoring #Data Processing #Programming #Redshift #AWS (Amazon Web Services) #SQS (Simple Queue Service) #Lambda (AWS Lambda) #Data Cleansing #Observability #Spark (Apache Spark) #RDS (Amazon Relational Database Service) #ETL (Extract, Transform, Load) #Python
Role description

Job Description

An ETL developer designs, builds, tests, and maintains systems that extract, transform, and load data from multiple source systems.

Primary Responsibilities:

   • Lead, design, implement, deploy, and optimize backend ETL services.

   • Support a massive-scale enterprise data solution using AWS data and analytics services.

   • Analyze and interpret complex data and related systems, and provide efficient technical solutions.

   • Support the ETL schedule and maintain compliance with it.

   • Develop and maintain standards for ETL code and maintain an effective project life cycle across all ETL processes.

   • Coordinate with cross-functional teams such as architects, platform engineers, other developers, and product owners to build data processing procedures.

   • Perform root cause analysis on production issues, routinely monitor databases, and support ETL environments.

   • Help create functional specifications and technical designs, working with business process area owners.

   • Implement industry best-practice code and configuration for production and non-production environments in a highly automated environment.

   • Provide technical advice, effort estimates, and impact analyses.

   • Provide timely project status and issue reporting to management.

Qualifications:

   • 8+ years' experience using ETL tools to perform data cleansing, data profiling, transformation, and scheduling of various workflows.

   • Expert-level proficiency in writing, debugging, and optimizing SQL.

   • 3-4 years' programming experience with Python or PySpark/Glue required.

   • Knowledge of common design patterns, models, and architectures used in Big Data processing.

   • 3-4 years' experience with AWS services such as Glue, S3, Redshift, Lambda, Step Functions, RDS Aurora/MySQL, Apache Iceberg, CloudWatch, SNS, SQS, EventBridge.

   • Capable of troubleshooting common database issues; familiar with observability tools.

   • Self-starter: responsible, professional, and accountable.

   • A ‘finisher’, seeing a project or task through to completion despite challenges.