Quantum World Technologies Inc.

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior ETL Developer on a contract basis, 100% remote, lasting over 6 months. Requires 7–10+ years of experience, expertise in AWS Glue, Airflow, SQL, and cloud data warehousing, with strong skills in data integration and automation.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 7, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Airflow #Scala #Logical Data Model #Data Quality #Azure Data Factory #ETL (Extract, Transform, Load) #Data Modeling #ADF (Azure Data Factory) #Data Integration #AWS (Amazon Web Services) #Data Architecture #BI (Business Intelligence) #SQL (Structured Query Language) #AWS Glue #Data Warehouse #Data Integrity #Business Analysis #SQL Queries #Data Pipeline #Automation #Cloud #Snowflake #Azure #Data Engineering
Role description
Job Title: Senior ETL Developer (Remote)
Location: 100% Remote
Employment Type: Contract / Full-Time
Experience Level: 7–10+ Years

Job Summary:
We are seeking a highly analytical and detail-oriented Senior ETL Developer to design, develop, and maintain robust data integration solutions. You will be responsible for building scalable ETL/ELT pipelines that transform raw data into actionable insights. The ideal candidate will have extensive experience in cloud data warehousing, performance tuning, and automating complex data workflows.

Key Responsibilities:
• Pipeline Development: Design and implement complex ETL/ELT processes to ingest data from various sources (structured, semi-structured, and unstructured) into cloud data warehouses.
• Data Modeling: Create and optimize physical and logical data models (star schema, snowflake schema) to support business intelligence requirements.
• Performance Tuning: Identify and resolve bottlenecks in data pipelines and SQL queries to ensure optimal processing speeds and cost-efficiency.
• Automation & Orchestration: Build and manage automated workflows using tools like Airflow, AWS Glue, or Azure Data Factory.
• Collaboration: Work closely with Data Architects and Business Analysts to translate functional requirements into technical specifications.
• Data Quality: Implement rigorous data validation and cleansing rules to maintain high standards of data integrity and accuracy.
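For candidates gauging fit, the pipeline-development and data-quality responsibilities above boil down to the extract-transform-load pattern. The following is a minimal, self-contained sketch of that pattern; the table name, columns, and validation rules are illustrative assumptions, and `sqlite3` stands in for a cloud warehouse (in practice this logic would live in an AWS Glue job or an Airflow task).

```python
import sqlite3

def extract(raw_rows):
    """Extract step: yield source records (here, an in-memory list)."""
    yield from raw_rows

def transform(rows):
    """Transform step: apply cleansing and data-quality rules."""
    for name, amount in rows:
        # Validation rule (illustrative): drop blank names and negative amounts.
        if name and name.strip() and amount is not None and amount >= 0:
            yield name.strip().title(), round(float(amount), 2)

def load(conn, rows):
    """Load step: write cleansed rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    raw = [(" alice ", 10.5), ("", 3.0), ("bob", -1), ("carol", 7.25)]
    conn = sqlite3.connect(":memory:")
    load(conn, transform(extract(raw)))
    print(conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone())
```

Only the rows passing the quality rules reach the target table; in an orchestrated setup each step would typically be a separate task so failures can be retried independently.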