

SBS Creatix
Data Engineer (Databricks / Scala / Spark)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Databricks / Scala / Spark) with a contract length of "unknown" and a pay rate of "unknown." It requires 3-10+ years of experience and strong skills in Scala, Spark, Databricks, and cloud platforms (AWS, Azure, or GCP).
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 17, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Big Data #Cloud #AI (Artificial Intelligence) #AWS (Amazon Web Services) #Databricks #Spark (Apache Spark) #Consulting #Data Science #Scala #Azure #ETL (Extract, Transform, Load) #Leadership #Documentation #Computer Science #DevOps #DataOps #Batch #GCP (Google Cloud Platform) #Data Engineering
Role description
US Citizen or Green Card holder is required for eligibility
No C2C requests, please
Join a growing team as a Senior Data Engineer leveraging advanced engineering techniques and analytics to support business decisions. Play a critical role in building and optimizing data platforms on Databricks and cloud environments, working with data scientists, architects, and client stakeholders. The role requires strong hands-on expertise in Scala, Spark, Databricks, and ETL, along with modern data engineering practices spanning both batch and streaming architectures.
Requirements
• 3-10+ years of experience as a Data Engineer or Big Data Engineer
• Strong experience across at least two major cloud platforms: AWS, Azure, or GCP
• Proven experience building production-grade data platforms on Databricks
• Experience designing both batch and streaming pipelines
• Experience implementing DataOps, CI/CD, and DevOps practices for data platforms
• Experience migrating data platforms from on-prem to cloud
• Bachelor's degree in Computer Science, Engineering or related field (or equivalent practical experience)
The Impact You Will Have
• Lead Big Data & AI Transformations: Design and implement end-to-end data platforms, including large-scale big data and AI-enabled analytics solutions.
• Champion Best Practices: Ensure Databricks, Spark, and cloud best practices are applied across all engagements.
• Support Delivery & Estimation: Partner with Professional Services leadership to estimate work, manage technical risk, and support statements of work.
• Architect Complex Solutions: Design, develop, deploy, and document complex customer solutions, serving as a technical lead.
• Enable Knowledge Transfer: Produce reusable assets, documentation, and deliver training to clients and internal teams.
• Advance Consulting Excellence: Share expertise and implementation patterns to improve delivery quality across the consulting organization.