

Senior ETL Developer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior ETL Developer in Charlotte, NC, on a 6-month+ contract. The pay rate is competitive. It requires 5+ years of ETL experience and proficiency in PySpark, Python, and advanced SQL; cloud ETL architecture experience is preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 7, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Charlotte, NC
Skills detailed
#PostgreSQL #Spark SQL #Apache Airflow #Agile #S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #Code Reviews #Airflow #SQL Server #SQL (Structured Query Language) #Databases #Security #Data Governance #Data Processing #Azure #GCP (Google Cloud Platform) #Scala #AWS S3 (Amazon Simple Storage Service) #Oracle #Python #Spark (Apache Spark) #Deployment #Scrum #PySpark #Data Transformations #ETL (Extract, Transform, Load) #Cloud
Role description
Role: Senior ETL Developer - PySpark/Python
Location: Charlotte, NC | On-site (4 days/week)
Duration: 6 months+
Key Responsibilities
• Design and maintain scalable ETL pipelines using PySpark and Python (see the sketch after this list)
• Implement validation logic and manage data transformations
• Extract and load data across Oracle, PostgreSQL, and SQL Server
• Optimize Spark-based workflows on AWS cloud infrastructure
• Collaborate with cross-functional teams to define and deliver data requirements
• Perform code reviews and support testing, QA, and deployment
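The responsibilities above describe a classic extract-validate-load flow. Below is a minimal PySpark sketch of that shape; the table name, JDBC URL, credentials, and S3 bucket are hypothetical stand-ins for illustration, not details from this posting.

```python
# Minimal extract-validate-load sketch, assuming a hypothetical "sales.orders"
# table in Oracle and a hypothetical S3 bucket. All names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read a source table over JDBC (Oracle shown; PostgreSQL and
# SQL Server differ only in the driver and URL).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//db-host:1521/ORCL")  # hypothetical host
    .option("dbtable", "sales.orders")                       # hypothetical table
    .option("user", "etl_user")
    .option("password", "...")  # use a secrets manager in practice
    .load()
)

# Validate and transform: drop rows that fail basic checks, derive a column.
valid = (
    orders
    .filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))
    .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet to S3 for downstream consumers.
valid.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/orders/"  # hypothetical bucket
)
```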
________________________________________
Minimum Qualifications
• 5+ years of ETL development across large data environments
• Proficiency with PySpark and Python for distributed data processing
• Advanced SQL and hands-on experience with relational databases
• Experience with cloud ETL architecture (AWS, Azure, or GCP)
• Must be authorized to work in the U.S. and able to work on-site 4 days/week
________________________________________
Core Tech Environment
Airflow, AWS (S3, Glue, EMR), DataFrames, Oracle, PostgreSQL, PySpark (3.x), Python (3.x), SQL Server, Spark SQL
________________________________________
Preferred Skills
• Familiarity with Apache Airflow or equivalent orchestration tools (a minimal DAG sketch follows this list)
• Understanding of data governance and security protocols
• Background in Agile or Scrum project delivery
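For the Airflow familiarity mentioned above, here is a minimal sketch of how a pipeline like the one earlier might be scheduled; the DAG id, schedule, and script path are assumptions for illustration only.

```python
# Minimal Apache Airflow DAG sketch, assuming the ETL job is packaged as a
# spark-submit-able script at etl/orders_etl.py (hypothetical path).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="orders_etl_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    run_etl = BashOperator(
        task_id="run_orders_etl",
        bash_command="spark-submit etl/orders_etl.py",  # hypothetical path
    )
```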