

Insight Global
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown," offering a day rate of $520 USD, and is located in the San Francisco Bay Area. Key skills include strong Python, Databricks, Java, Scala, and experience with AI tools. A MarTech background is a plus.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
🗓️ - Date
February 26, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
San Francisco Bay Area
-
🧠 - Skills detailed
#Spark SQL #Data Engineering #Spark (Apache Spark) #AI (Artificial Intelligence) #Cloud #ETL (Extract, Transform, Load) #Scala #Databricks #Java #Python #Data Pipeline #SQL (Structured Query Language) #ADF (Azure Data Factory) #Azure #Data Processing
Role description
Must Haves:
• At least 8 years of strong Python skills
• 5+ years with Databricks or comparable big‑data platforms
• Hands‑on experience building services using Java
• Strong proficiency in big‑data technologies: Scala, Spark SQL, Spark Streaming
• Experience with AI tools, preferably Claude
Plusses:
• Familiarity with Adobe Campaign
• Background in MarTech platforms and data workflows
Day-To-Day:
• Lead end‑to‑end MarTech engineering initiatives across orchestration, data processing, and activation pipelines.
• Architect scalable, event‑driven systems that power real‑time marketing experiences and automated customer journeys.
• Design and implement orchestration workflows using Adobe Campaign or equivalent enterprise‑grade tools.
• Develop high‑performance big‑data applications using Scala, Databricks, Spark SQL, Spark Streaming, and Python.
• Build and optimize cloud‑native data pipelines on Azure, including ADF‑based ingestion, transformation and orchestration.
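The Spark Streaming and activation-pipeline work described above typically follows a micro‑batch filter‑and‑aggregate pattern. A minimal plain‑Python sketch of that pattern is below; it has no Spark dependency, and all names (`Event`, `process_batch`, the `purchase` action) are illustrative, not part of this role's actual stack.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One marketing event in a micro-batch (illustrative schema)."""
    user_id: str
    action: str
    amount: float

def process_batch(events):
    """Filter purchase events and sum spend per user -- the shape of a
    windowed groupBy/agg a Spark Streaming job would express."""
    totals = {}
    for e in events:
        if e.action == "purchase":  # filter step
            totals[e.user_id] = totals.get(e.user_id, 0.0) + e.amount
    return totals

batch = [
    Event("u1", "purchase", 20.0),
    Event("u2", "click", 0.0),
    Event("u1", "purchase", 5.0),
]
print(process_batch(batch))  # {'u1': 25.0}
```

In a real Databricks job the same logic would run per trigger interval over a streaming DataFrame rather than an in‑memory list.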
