

CodeGeniusRecruit
Data Engineer | Remote
⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a part-time, remote Data Engineer position paying $90 to $125 per hour. Key skills include strong experience in data engineering, dbt, ETL/ELT pipelines, data quality, and performance optimization. Clear written communication is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
1000
-
🗓️ - Date
April 26, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Schema Design #Data Quality #Data Pipeline #Data Engineering #Documentation #dbt (data build tool) #ETL (Extract, Transform, Load)
Role description
Work Snapshot
Type: Part-time position
Location: Remote
Commitment: Flexible (self-paced schedule)
Compensation: $90 to $125 per hour
What You Will Be Doing
• Design end-to-end data pipeline workflows with clearly defined, verifiable outputs
• Build scenarios across ETL/ELT pipelines, dbt models, and warehouse schema design
• Develop orchestration workflows and data quality tests with deterministic pass/fail criteria
• Create and review artifacts such as DAGs, schema documentation, and data contracts
• Evaluate outputs based on correctness, performance, and adherence to defined specifications
What We Are Looking For
• Strong experience in data engineering or analytics engineering
• Strong experience in dbt, data pipelines, warehouse design, or orchestration tools
• Strong experience in data quality, testing frameworks, and performance optimization
• Strong ability to create and interpret data engineering artifacts (DAGs, schemas, test suites)
• Clear written communication with structured, step-by-step reasoning
How To Apply
• Upload your resume
• Complete an interview
• Submit the application form