Harnham

Consulting Data Engineer - Scala Focus

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Consulting Data Engineer (Scala focus) on an initial 6-month contract with strong extension potential, at 40 hours per week. Key skills include Scala, cloud platforms (AWS, Azure, GCP), and Apache Spark; hands-on Databricks experience is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
800
🗓️ - Date
February 3, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Strategy #Data Engineering #Spark (Apache Spark) #Datasets #Migration #AWS (Amazon Web Services) #GCP (Google Cloud Platform) #Deployment #Apache Spark #Databricks #Data Processing #Leadership #Cloud #Scala #Data Architecture #Azure #Consulting #AI (Artificial Intelligence)
Role description
Consulting Data Engineer (Scala focus)

Overview
We're seeking an experienced Consulting Data Engineer to work directly with enterprise clients on modernizing data platforms and enabling advanced analytics and AI-driven use cases. This role is highly hands-on and client-facing, focused on building scalable data solutions that translate complex data into real business value. You'll play a key role in shaping data architectures, supporting cloud migrations, and helping teams unlock insights from large, distributed datasets. If you enjoy variety, technical depth, and working at the intersection of data and strategy, this role offers meaningful impact.

Responsibilities
• Drive end-to-end delivery of data engineering initiatives across analytics, AI, and platform modernization efforts
• Architect and build cloud-native data platforms designed for scale, performance, and reliability
• Partner closely with client stakeholders and internal teams to align technical solutions with business objectives
• Provide technical leadership and guidance on best practices for distributed data processing, migrations, and deployments
• Contribute ideas, patterns, and feedback to continuously improve consulting approaches and data solutions

Qualifications
• 6+ years of experience in data engineering, analytics engineering, or cloud-based data solutions
• Strong development skills in Scala, especially in recent roles
• Proven experience with at least one major cloud provider (AWS, Azure, or GCP), with exposure to multi-cloud environments
• Advanced knowledge of distributed data processing frameworks such as Apache Spark
• Experience designing and deploying end-to-end data solutions, including CI/CD and MLOps workflows
• Hands-on Databricks experience is required
• Excellent communication skills with the ability to explain complex concepts to both technical and non-technical audiences

Engagement Details
• Contract position (initial 6-month term, with strong extension potential)
• 40 hours per week

Why This Role
You'll work alongside experienced data practitioners on initiatives that influence how organizations use data and AI. This is an opportunity to continuously learn, solve meaningful problems, and deliver solutions that have real-world impact.

Note: We are not engaging third-party staffing agencies for this position.
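For candidates gauging the level of Scala and Apache Spark work the qualifications above imply, here is a minimal sketch of a typical distributed aggregation job. The dataset, storage paths, and column names (events, event_ts, event_type) are illustrative assumptions, not details from this posting.

```scala
// Hypothetical sketch: a daily event-count aggregation in Scala with Apache Spark.
// All paths and column names are assumed for illustration only.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyEventCounts {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-event-counts")
      .getOrCreate()

    // Assumed input: a Parquet dataset of events with `event_type` and `event_ts` columns.
    val events = spark.read.parquet("s3://example-bucket/events/") // hypothetical path

    // Count events per type per day -- a typical distributed aggregation.
    val daily = events
      .withColumn("event_date", to_date(col("event_ts")))
      .groupBy("event_date", "event_type")
      .agg(count(lit(1)).as("event_count"))

    // Write results partitioned by date for downstream analytics.
    daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-bucket/derived/daily_event_counts/") // hypothetical path

    spark.stop()
  }
}
```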