Harnham

Consulting Data Engineer (Remote)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Consulting Data Engineer (Remote), offering an initial six-month contract at a competitive pay rate. Key requirements include 6+ years of data engineering experience, proficiency in Python or Scala, and expertise in cloud platforms (AWS, Azure, GCP) and Databricks.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
760
-
🗓️ - Date
December 3, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Engineering #Big Data #ETL (Extract, Transform, Load) #Consulting #Azure #GCP (Google Cloud Platform) #Databricks #Deployment #Apache Spark #Distributed Computing #Data Architecture #Python #Migration #ML (Machine Learning) #Spark (Apache Spark) #Scala #AWS (Amazon Web Services) #AI (Artificial Intelligence) #Cloud
Role description
Consulting Data Engineer - Professional Services

As a Consulting Data Engineer, you'll be at the forefront of helping enterprises transform how they manage and leverage big data and AI. In this role, you'll partner with clients to design modern data architectures, guide migrations, and enable advanced analytics solutions that drive real business outcomes. You won't just deliver technology; you'll deliver innovation. You'll help organizations turn data into actionable insights. If you thrive in dynamic environments, love solving complex challenges, and want to make a measurable impact, this is the role for you.

What You'll Do
• Lead high-impact projects that enable clients to modernize their data platforms and accelerate AI adoption.
• Design and implement scalable, cloud-based architectures that power advanced analytics and machine learning.
• Collaborate with technical teams and business stakeholders to ensure seamless delivery and measurable success.
• Act as a trusted advisor, guiding clients through migrations, deployments, and best practices for big data and AI.
• Share insights and feedback to continuously improve solutions and shape the future of data innovation.

What We're Looking For
• 6+ years of experience in data engineering, analytics, or cloud architecture.
• Strong coding skills in Python or Scala.
• Expertise in at least one major cloud platform (AWS, Azure, or GCP) and familiarity with others.
• Deep knowledge of distributed computing frameworks (e.g., Apache Spark).
• Experience with CI/CD, MLOps, and designing end-to-end data solutions.
• Databricks experience is a MUST.
• Exceptional communication and problem-solving skills.

This is a contract role. The initial term will be six months, with a strong possibility of extension, and the workload will be 40 hours per week.

Why Join Us?
You'll work on projects that shape the future of data and AI.
You'll collaborate with passionate experts, learn continuously, and make an impact that matters. If you're ready to help organizations harness the power of data, apply here.

WE WILL NOT BE UTILIZING THIRD-PARTY AGENCIES FOR THIS ROLE.