

CTG
ETL/Data Warehouse Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL/Data Warehouse Engineer, a 15-month remote contract position, offering $85,000 - $110,000 annually. Requires 5+ years in ETL development, strong SQL skills, and experience with Apache Spark. US Citizenship or Green Card required.
Country: United States
Currency: $ USD
Day rate: 500
Date: October 2, 2025
Duration: More than 6 months
Location: Remote
Contract: Unknown
Security: Unknown
Location detailed: Remote
Skills detailed: #Scala #Data Warehouse #SQL (Structured Query Language) #Data Governance #Data Architecture #Data Engineering #AWS (Amazon Web Services) #Linux #SQL Queries #Documentation #Data Pipeline #Azure #Data Science #Dimensional Data Models #Database Design #Programming #Business Analysis #Data Processing #Spark (Apache Spark) #Cloud #Java #Computer Science #Data Quality #EDW (Enterprise Data Warehouse) #GCP (Google Cloud Platform) #Datasets #Data Integration #Snowflake #ETL (Extract, Transform, Load) #Apache Spark
Role description
Location: Remote
Duration: 15 months
We are seeking a detail-oriented and technically proficient ETL/Data Warehouse Engineer to design, develop, and maintain robust ETL processes that support our enterprise data warehouse. This role is critical to ensuring the integrity, performance, and scalability of data pipelines that enable analytics, reporting, and decision-making across the organization.
Duties:
Design, develop, and optimize ETL workflows to extract, transform, and load data from multiple sources into the enterprise data warehouse (a brief illustrative sketch follows this list).
Collaborate with data architects and analysts to design and maintain dimensional data models (star/snowflake schemas).
Monitor and tune the performance of ETL jobs and database queries to ensure efficient and timely data processing.
Implement robust data quality, validation, cleansing, and reconciliation processes to ensure accuracy and reliability.
Develop and maintain technical documentation, adhering to established coding standards and best practices.
Partner with business analysts, data scientists, and application teams to understand requirements and deliver scalable solutions.
Diagnose and resolve ETL/data issues, including job failures, anomalies, and performance bottlenecks.
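To make the ETL and dimensional-modeling duties above more concrete, the sketch below shows a minimal Apache Spark job in Scala: it extracts a raw orders file, applies basic cleansing, looks up a surrogate key from a customer dimension, and appends the result into a fact table. This is an illustrative sketch only; the paths, table names, and columns (orders, dim_customer, customer_sk, fact_orders) are hypothetical assumptions, not details taken from this posting.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object OrdersFactLoad {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("orders-fact-load")
      .getOrCreate()

    // Extract: raw source file (hypothetical path and format).
    val rawOrders = spark.read
      .option("header", "true")
      .csv("/data/staging/orders/")

    // Hypothetical customer dimension already maintained in the warehouse.
    val dimCustomer = spark.read.parquet("/warehouse/dim_customer/")

    // Transform: basic cleansing plus a surrogate-key lookup against the dimension.
    val factOrders = rawOrders
      .filter(col("order_id").isNotNull)
      .withColumn("order_amount", col("order_amount").cast("decimal(18,2)"))
      .join(dimCustomer.select("customer_id", "customer_sk"), Seq("customer_id"), "left")
      .select("order_id", "customer_sk", "order_date", "order_amount")

    // Load: append into the fact table, partitioned by order date (hypothetical target).
    factOrders.write
      .mode("append")
      .partitionBy("order_date")
      .parquet("/warehouse/fact_orders/")

    spark.stop()
  }
}

In practice, a job like this would typically be parameterized and scheduled through the team's orchestration tooling rather than relying on hard-coded paths.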
Skills:
Proficiency in ETL development tools and data integration platforms.
Strong SQL development and query optimization skills.
Solid understanding of data warehousing concepts, dimensional modeling, and relational database design.
Experience implementing Apache Spark on Linux environments.
Strong programming experience with Java and the ability to apply software development lifecycle (SDLC) best practices.
Knowledge of data governance and data quality frameworks (see the validation sketch after this list).
Familiarity with cloud-based data platforms (AWS, Azure, GCP) is a plus.
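As a similarly hedged illustration of the data quality and reconciliation skills listed above, the sketch below computes a few basic metrics over the hypothetical fact_orders table from the previous example and fails the run if null or duplicate keys are found. The columns and checks are assumptions chosen for illustration, not requirements stated in this posting.

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object FactOrdersQualityCheck {
  // One-row summary of basic quality metrics for the loaded fact table.
  def summarize(fact: DataFrame): DataFrame =
    fact.agg(
      count(lit(1)).as("row_count"),
      countDistinct(col("order_id")).as("distinct_order_ids"),
      sum(when(col("order_id").isNull, 1).otherwise(0)).as("null_order_ids"),
      sum(when(col("order_amount") < 0, 1).otherwise(0)).as("negative_amounts")
    )

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("fact-orders-quality-check").getOrCreate()

    val fact = spark.read.parquet("/warehouse/fact_orders/") // hypothetical path
    val row = summarize(fact).collect().head

    val rowCount = row.getAs[Long]("row_count")
    val distinctIds = row.getAs[Long]("distinct_order_ids")
    val nullKeys = row.getAs[Long]("null_order_ids")

    // Fail the run on null or duplicate keys so the load can be reconciled before downstream use.
    require(nullKeys == 0L, s"$nullKeys rows have a null order_id")
    require(rowCount == distinctIds, s"${rowCount - distinctIds} duplicate order_id values found")

    spark.stop()
  }
}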
Experience:
Some of the work is subject to export control, which requires candidates to be a US Citizen or Green Card holder.
5+ years of professional experience in ETL development and data warehousing.
Demonstrated experience building scalable ETL pipelines and working with large datasets.
Hands-on experience with performance tuning of ETL processes and SQL queries.
Prior experience collaborating with cross-functional teams in data-driven projects.
Education:
Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field.
An equivalent combination of education and experience will be considered.
Excellent verbal and written English communication skills and the ability to interact professionally with a diverse group are required.
Job Types: Full-time, Contract
Pay: $85,000.00 - $110,000.00 per year
Benefits:
Dental insurance
Health insurance
Paid time off
Vision insurance
Work Location: Remote