Harnham

Data Engineer (Consultant)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer (Consultant) position on an initial six-month contract, full-time at 40 hours per week, with a day rate of $960. Key skills include Python or Scala, Databricks, and expertise in at least one of AWS, Azure, or GCP.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
960
🗓️ - Date
December 9, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Apache Spark #Scala #Deployment #Spark (Apache Spark) #Data Architecture #Python #Big Data #Distributed Computing #Data Migration #Databricks #Cloud #ETL (Extract, Transform, Load) #ML (Machine Learning) #Azure #AWS (Amazon Web Services) #Consulting #Data Engineering #AI (Artificial Intelligence) #Migration #GCP (Google Cloud Platform)
Role description
Data Engineer (Consultant) - Professional Services

Role Overview
As a Consulting Data Engineer, you'll play a pivotal role in helping enterprises modernize how they manage and use big data and AI. You'll partner closely with clients to design modern data architectures, guide cloud and platform migrations, and build advanced analytics solutions that generate real business impact. This role goes beyond technical delivery: you'll help organizations turn their data into actionable insights and drive meaningful innovation. If you thrive in dynamic environments, enjoy solving complex challenges, and want to make a measurable impact, this position offers an exciting opportunity.

What You'll Do
In this role, you'll lead high-impact projects that enable clients to modernize their data platforms and accelerate AI adoption. You'll design and implement scalable, cloud-based architectures that support advanced analytics and machine learning initiatives, while collaborating with technical teams and business stakeholders to ensure seamless delivery and tangible results. You'll act as a trusted advisor, guiding clients through data migrations, deployments, and best practices, and you'll contribute insights and feedback that improve solution quality and support ongoing innovation.

What We're Looking For
We're seeking someone with at least six years of experience in data engineering, analytics, or cloud architecture, along with strong coding skills in Python or Scala. You should have expertise in at least one major cloud platform (AWS, Azure, or GCP) and familiarity with the others. Deep knowledge of distributed computing frameworks such as Apache Spark is essential, as is experience with CI/CD, MLOps, and architecting end-to-end data solutions. Databricks experience is a must. Exceptional communication and problem-solving abilities are also critical for success in this role.

Contract Details
This is a contract role with an initial six-month term, full-time at 40 hours per week, and a strong possibility of extension.

Why Join Us?
You'll work on projects that shape the future of data and AI while collaborating with passionate experts who prioritize continuous learning and innovation. Your work will deliver measurable impact for organizations looking to harness the full potential of their data. If this excites you, we encourage you to apply.

Please Note
We will not be utilizing third-party agencies for this role.