Devfi

ETL Developer - Remote

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an ETL Developer, working remotely for 12 months at competitive pay. It requires 8+ years of experience with ETL/data pipelines using Python, PySpark, and AWS Glue. Candidates must reside in the U.S. and hold a Bachelor’s degree in Computer Science; experience with healthcare data is preferred.
🌎 - Country
United States
-
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 31, 2025
-
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Computer Science #AWS (Amazon Web Services) #MySQL #Airflow #Agile #Python #Snowflake #AWS Glue #Data Extraction #SQL (Structured Query Language) #Spark (Apache Spark) #Data Lake #SQL Queries #S3 (Amazon Simple Storage Service) #NiFi (Apache NiFi) #Apache NiFi #PySpark #Teradata #Redshift #"ETL (Extract, Transform, Load)" #Apache Airflow
Role description
Any visa is fine, and candidates must be eligible for a Public Trust clearance.
Title: ETL Developer
Location: Remote (United States)
Duration: 12 Months
Description:
• 8+ years of experience building end-to-end ETL/data preparation pipelines using Python, PySpark, and AWS Glue (a minimal illustrative sketch follows the listing below).
• Ability to obtain a U.S. Federal Position of Trust clearance designation.
• Must reside in and be able to perform work in the United States.
• Must have lived in the United States for 3 of the last 5 years.
• Bachelor’s degree in Computer Science or equivalent.
• Strong critical thinking and problem-solving skills and a desire to take initiative.
• Detail-oriented drive to investigate and dissect information with a strong focus on quality.
• Demonstrated skills in developing and maintaining good interpersonal relationships.
• Experience working with healthcare data, specifically Medicaid/Medicare, is a huge plus.
• Experience working in Agile or SAFe (Scaled Agile) is highly preferred.
Responsibilities:
• Analyze and understand complex business and engineering challenges.
• Build data validation pipelines using Great Expectations or similar.
• Use workflow orchestration tools such as Apache Airflow, Prefect, Apache NiFi, or equivalent.
• Work with data lake architectures in AWS (well versed in S3, Glue, EMR, Lake Formation, etc.).
• Write complex SQL queries for data extraction and manipulation.
• Work with Snowflake, MySQL, Redshift, DB2, Teradata, or any other equivalent DBMS.

Regards,
Raju Chidurala
216-343-3435
rajuc@devfi.com
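For illustration only (the listing itself contains no code): a minimal sketch of the kind of PySpark ETL job the requirements describe, reading raw data from S3, applying a simple transformation, and writing partitioned Parquet to a curated prefix. All bucket paths, file names, and column names are hypothetical, and a real AWS Glue job would typically wrap similar logic in a GlueContext.

```python
# Hypothetical sketch of an end-to-end ETL step; paths and columns are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims-etl-example").getOrCreate()

# Extract: read raw claims data from a landing-zone prefix (hypothetical path).
raw = spark.read.csv(
    "s3://example-data-lake/landing/claims/",
    header=True,
    inferSchema=True,
)

# Transform: basic cleansing and derivation, stand-ins for real business rules.
cleaned = (
    raw.dropDuplicates(["claim_id"])
       .filter(F.col("claim_amount").isNotNull())
       .withColumn("claim_year", F.year(F.col("service_date")))
)

# Load: write partitioned Parquet to the curated zone (hypothetical path).
cleaned.write.mode("overwrite").partitionBy("claim_year").parquet(
    "s3://example-data-lake/curated/claims/"
)

spark.stop()
```

In practice, a script like this would be scheduled by an orchestrator such as Apache Airflow and paired with a validation tool like Great Expectations, as the responsibilities above indicate.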