

CodeGeniusRecruit
Data Engineer | Remote
⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a part-time, remote Data Engineer role paying $70–$120 per hour. Key skills include SQL, Python, dbt, and data warehousing. Strong experience in data science, analytics engineering, and analytical problem-solving is required.
🌎 - Country
United Kingdom
💱 - Currency
$ USD
-
💰 - Day rate
960
-
🗓️ - Date
April 25, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United Kingdom
-
🧠 - Skills detailed
#Data Pipeline #BI (Business Intelligence) #Data Engineering #SQL (Structured Query Language) #Python #Data Science #dbt (data build tool)
Role description
Work Snapshot
Type: Part-time position
Location: Remote
Commitment: Flexible (self-paced schedule)
Rate: $70–$120 per hour
What You’ll Be Doing
• Design end-to-end data science and analytics workflows to evaluate model performance
• Build scenarios across SQL analysis, dashboards, experimentation, and business intelligence use cases
• Develop and assess data pipelines, warehouse schemas, and orchestration workflows
• Create and review data artifacts including queries, experiment readouts, and dashboard specifications
• Evaluate outputs based on analytical reasoning, accuracy, and execution across complex tasks
What We’re Looking For
• Strong experience in data science, analytics engineering, business intelligence, or data engineering
• Strong experience in SQL, Python, dbt, data warehousing, or pipeline orchestration tools
• Strong experience in experimentation, metric modeling, and analytical problem-solving
• Strong ability to produce and interpret data artifacts such as queries, schemas, and dashboards
• Clear written communication with structured, step-by-step reasoning
How To Apply
• Upload resume
• Interview
• Submit form
