

Haystack
Data Scientist
⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a 6-month remote contract for a Data Scientist, offering a pay rate of "$X/hour". Key skills include 5+ years in analytics, SQL, dbt, Python, and experience with data pipelines and ML/AI metrics.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 11, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United Kingdom
-
🧠 - Skills detailed
#Airflow #Tableau #Looker #AI (Artificial Intelligence) #Data Engineering #Data Modeling #Automation #Data Warehouse #dbt (data build tool) #SQL (Structured Query Language) #Data Quality #Data Science #Data Pipeline #Python #ML (Machine Learning) #Cloud #BigQuery #Data Analysis
Role description
We are working with a leading global recruitment and HR services provider that connects talent with opportunities across various industries. This is an exciting opportunity to join a highly skilled team and contribute to impactful data initiatives.
The Role
• Drive data quality and empower cross-functional decision-making
• Build robust data pipelines and self-serve data products
• Define metrics for platform health, developer productivity, and ML/AI adoption
• Own end-to-end analytical data modeling in BigQuery using dbt
• Develop clear, story-driven dashboards using tools like Looker or Tableau
What You'll Need
• 5+ years of experience in analytics or data engineering, with deep expertise in SQL
• Extensive experience with dbt, a cloud data warehouse, and workflow orchestrators (Airflow, Dagster, Prefect, or Flyte)
• Proficiency in Python for data analysis and automation
• Experience with experimentation, ML/AI metrics, or platform productivity is a plus
• Ability to champion data quality and implement CI/CD for dbt models
What's On Offer
• Opportunity to work on critical data initiatives
• Contribute to defining key metrics for technology platforms
• Be part of a dynamic and collaborative team
• Professional growth and mentoring opportunities
Apply via Haystack today!





