Robert Half

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Contract Data Engineer position in Salt Lake City, requiring 3+ years of experience, strong SQL skills, proficiency in Python, and familiarity with cloud platforms. Pay is listed at $440 per day; contract length is unspecified. Hybrid work in Salt Lake City is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
440
-
πŸ—“οΈ - Date
January 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Salt Lake City, UT
-
🧠 - Skills detailed
#Spark (Apache Spark) #Python #Infrastructure as Code (IaC) #Java #Visualization #Scala #Data Lake #Data Pipeline #Cloud #AWS (Amazon Web Services) #ML (Machine Learning) #ETL (Extract, Transform, Load) #Looker #Hadoop #SQL (Structured Query Language) #dbt (data build tool) #Terraform #BI (Business Intelligence) #GCP (Google Cloud Platform) #Data Quality #Data Science #Databases #Data Engineering #Airflow #Tableau #Microsoft Power BI #Azure #Big Data #Kafka (Apache Kafka) #Data Warehouse
Role description
We are seeking a skilled Contract Data Engineer to design, build, and maintain scalable data pipelines and infrastructure that support analytics, reporting, and data-driven decision-making across our client's organization. This role partners closely with analytics, product, and engineering teams to ensure reliable, high-quality data is available when and where it's needed. You must be able to work on a hybrid schedule in Salt Lake City.

What You'll Do
• Design, build, and optimize scalable ETL/ELT data pipelines (see the Airflow sketch below)
• Develop and maintain data warehouses and data lakes
• Ensure data quality, integrity, and reliability across systems (see the quality-check sketch below)
• Integrate data from multiple sources such as APIs, databases, and third-party tools (see the API-extraction sketch below)
• Collaborate with analytics, data science, and business stakeholders
• Monitor, troubleshoot, and improve data pipeline performance
• Document data models, pipelines, and best practices

Required Qualifications
• 3+ years of experience as a Data Engineer or in a similar role
• Strong SQL skills and experience with relational databases
• Proficiency in Python, Scala, or Java
• Experience with modern data tools (e.g., Airflow, dbt, Spark)
• Hands-on experience with cloud platforms (AWS, GCP, or Azure)
• Understanding of data warehousing concepts and dimensional modeling (see the star-schema sketch below)

Preferred Qualifications
• Experience with big data technologies (Spark, Kafka, Hadoop)
• Familiarity with data visualization or BI tools (Looker, Tableau, Power BI)
• Knowledge of CI/CD and infrastructure as code (Terraform, CloudFormation)
• Experience supporting analytics or machine learning workflows
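For concreteness, here is a minimal sketch of the kind of ETL/ELT pipeline the first duty describes, written with the Airflow 2.x TaskFlow API (Airflow is named in the required qualifications). The DAG name, schedule, and sample data are hypothetical illustrations, not details from the posting.

```python
# A minimal ETL sketch using the Airflow 2.x TaskFlow API.
# The DAG name, schedule, and data below are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2026, 1, 1), catchup=False)
def daily_orders_etl():
    @task
    def extract() -> list[dict]:
        # A real task would pull from a source API or database.
        return [{"order_id": 1, "amount": 125.50}, {"order_id": 2, "amount": 89.99}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Normalize or derive fields; here, a trivial dollars-to-cents conversion.
        return [{**r, "amount_cents": int(r["amount"] * 100)} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # A real task would write to the warehouse (e.g., via a database hook).
        print(f"loading {len(rows)} rows")

    # Chaining the calls is what wires up the task dependencies.
    load(transform(extract()))


daily_orders_etl()
```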
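The data-quality duty typically reduces to assertions run after each load: non-empty tables, no NULL keys, no duplicate keys. A minimal sketch, assuming a hypothetical orders table and using sqlite3 as a stand-in for the real warehouse:

```python
# SQL-based data-quality checks; sqlite3 stands in for the warehouse,
# and the table name and key column are hypothetical.
import sqlite3


def run_quality_checks(conn: sqlite3.Connection, table: str) -> list[str]:
    failures = []
    cur = conn.cursor()

    # Check 1: the table must not be empty after a load.
    (row_count,) = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    if row_count == 0:
        failures.append(f"{table}: no rows loaded")

    # Check 2: the key column must not contain NULLs.
    (null_ids,) = cur.execute(
        f"SELECT COUNT(*) FROM {table} WHERE order_id IS NULL"
    ).fetchone()
    if null_ids:
        failures.append(f"{table}: {null_ids} NULL order_id values")

    # Check 3: the key column must be unique.
    dupes = cur.execute(
        f"SELECT order_id FROM {table} GROUP BY order_id HAVING COUNT(*) > 1"
    ).fetchall()
    if dupes:
        failures.append(f"{table}: duplicate order_id values {dupes}")

    return failures


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
    print(run_quality_checks(conn, "orders") or "all checks passed")
```

In practice checks like these would run as a pipeline step (or be expressed as dbt tests) so that a failing load blocks downstream consumers.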
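Integrating third-party sources usually means paginated extraction from a REST API. A sketch under assumed details: the endpoint URL, bearer-token auth, and the page/per_page pagination scheme are all hypothetical.

```python
# Paginated REST extraction into load-ready records.
# The endpoint, auth scheme, and pagination parameters are hypothetical.
import requests

API_URL = "https://api.example.com/v1/orders"  # hypothetical endpoint


def fetch_all_orders(token: str, page_size: int = 100) -> list[dict]:
    rows, page = [], 1
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {token}"
    while True:
        resp = session.get(
            API_URL, params={"page": page, "per_page": page_size}, timeout=30
        )
        resp.raise_for_status()  # fail loudly rather than load partial data
        batch = resp.json()
        if not batch:  # an empty page signals the end of the data
            break
        rows.extend(batch)
        page += 1
    return rows
```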
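On the dimensional-modeling qualification: a star schema pairs an additive fact table with descriptive dimension tables. A minimal sketch with hypothetical table names, again using sqlite3 as a stand-in for the warehouse:

```python
# A minimal star schema: one fact table keyed to two dimensions.
# Table and column names are hypothetical illustrations.
import sqlite3

DDL = """
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20260106
    full_date  TEXT NOT NULL,
    month      INTEGER NOT NULL,
    year       INTEGER NOT NULL
);
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    name         TEXT NOT NULL,
    region       TEXT
);
CREATE TABLE fact_orders (
    order_id     INTEGER PRIMARY KEY,
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    amount_cents INTEGER NOT NULL   -- additive measure
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
```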