

ETL Developer
Featured Role | Apply directly with Data Freelance Hub
This role is for an ETL Developer in Charlotte, NC, for 24 months at $70.00 per hour. Key skills include PySpark, ETL development, and DW/BI experience. Responsibilities involve data ingestion, transformation, and ensuring data quality across pipelines.
Country: United States
Currency: $ USD
Day rate: 560
Date discovered: August 13, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Charlotte, NC 28270
Skills detailed: #Data Pipeline #ETL (Extract, Transform, Load) #Data Ingestion #Data Aggregation #Data Analysis #BI (Business Intelligence) #Data Warehouse #Code Reviews #Dremio #Data Lake #Deployment #PySpark #Data Quality #S3 (Amazon Simple Storage Service) #Spark (Apache Spark)
Role description
ETL Developer
Charlotte, NC | 24 months | $70.00 per hour
Engineer with strong expertise in PySpark, ETL development, and Data Warehousing/Business Intelligence (DW/BI) projects. The resource will be responsible for end-to-end development covering Financial Attribution, slowly changing dimensions (SCD), Booking and Referring Agreements, Data Aggregations, and system-of-record (SOR) onboarding.
The resource will design, develop, and optimize ETL pipelines using PySpark, S3, and Dremio.
Work on ProfitView Modernization
Work with large-scale structured and unstructured data from various sources.
Implement data ingestion, transformation, and loading processes into data lakes and data warehouses.
Collaborate with BI developers, data analysts, and business stakeholders to understand data requirements.
Ensure data quality, integrity, and governance across all data pipelines.
Monitor and troubleshoot performance issues.
Participate in code reviews, testing, and deployment processes.
Document technical solutions, data flows, and architecture.
Job Types: Full-time, Contract
Pay: $70.00 per hour
Work Location: In person
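The data-quality responsibility above (validating rows during ingestion before loading to the warehouse) can be sketched as follows. This is a minimal plain-Python stand-in for the equivalent PySpark DataFrame operations (filter, dropDuplicates); all function and field names here are hypothetical illustrations, not taken from the posting:

```python
# Minimal sketch of a data-quality gate in an ETL pipeline.
# In PySpark this logic would typically use DataFrame.filter and
# dropDuplicates; a plain-Python version keeps the example self-contained.

def quality_check(rows, required_fields, key_field):
    """Split rows into (valid, rejected) before the load step.

    A row is rejected if any required field is missing/None, or if its
    key duplicates an earlier row (first occurrence wins).
    """
    seen_keys = set()
    valid, rejected = [], []
    for row in rows:
        if any(row.get(f) is None for f in required_fields):
            rejected.append((row, "missing required field"))
        elif row[key_field] in seen_keys:
            rejected.append((row, "duplicate key"))
        else:
            seen_keys.add(row[key_field])
            valid.append(row)
    return valid, rejected

# Hypothetical sample data: one duplicate key, one null field.
rows = [
    {"id": 1, "amount": 100.0},
    {"id": 1, "amount": 250.0},   # duplicate key -> rejected
    {"id": 2, "amount": None},    # missing amount -> rejected
    {"id": 3, "amount": 75.5},
]
valid, rejected = quality_check(
    rows, required_fields=["id", "amount"], key_field="id"
)
```

Rejected rows would normally be routed to a quarantine table for review rather than silently dropped, so data-quality issues stay visible across the pipeline.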