

Golden Technology
Data Engineer 3
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer 3; the contract length and hourly pay rate are not specified. The position requires 7+ years of experience in data engineering, expertise in Azure, Databricks, and Synapse, and relevant certifications.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 8, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Blue Ash, OH
🧠 - Skills detailed
#Azure Databricks #Data Management #Delta Lake #Microsoft Power BI #Cloud #Spark (Apache Spark) #Semantic Models #SQL (Structured Query Language) #Data Ingestion #ML (Machine Learning) #Data Mart #Datasets #Jira #Data Engineering #Data Transformations #ETL (Extract, Transform, Load) #Leadership #BI (Business Intelligence) #Data Pipeline #Databricks #Azure #Data Science #Data Architecture #Compliance #Synapse #AI (Artificial Intelligence) #Documentation #Scala #Data Processing #Metadata #Agile #Data Catalog #Alation #Data Governance
Role description
Position Overview
• We are looking for a highly skilled, hands-on Senior Data Engineer to join our Data & Analytics team.
• In this role, you will play a key part in building and scaling a behavioral and Ecommerce data platform, enabling trusted analytics and making AI-ready data available across the organization.
• You will be responsible for designing and implementing robust, scalable data pipelines, modeling complex, high-volume datasets, and delivering high-quality, well-structured data products to power business insights and future AI capabilities.
• This is a delivery-focused engineering position demanding deep technical expertise in Azure, Databricks, and Synapse, combined with experience managing large-scale behavioral data in enterprise environments.
• Translating business needs into technical specifications and platform capabilities is a critical part of this role.
Required Qualifications
• 7+ years of experience in data engineering or similar roles, with hands-on delivery of cloud-based data solutions.
• Preferred certifications: Microsoft Certified: Azure Data Engineer Associate and Databricks Certified Data Engineer Professional.
• Strong expertise in Databricks and Azure Synapse, with practical experience in Spark-based data processing (a brief illustrative sketch follows this list).
• Proficient in modern data architectures (Lakehouse, ELT/ETL pipelines, real-time data processing).
• Advanced SQL skills for data transformation and performance optimization.
• Proven ability to model and manage large-scale, complex behavioral and Ecommerce datasets.
• Expertise in BI and in best practices for data enablement and self-service analytics.
• Hands-on experience with Unity Catalog and data cataloging tools (e.g., Alation) for governance and metadata management.
• Working knowledge of behavioral analytics platforms (Adobe Analytics, Adobe Customer Journey Analytics).
• Excellent communication skills with a talent for translating technical concepts into business value.
• Experience operating in agile delivery environments, balancing speed, scalability, and solution quality.
• Proven leadership experience as a technical lead.
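As a brief, non-authoritative illustration of the Spark-based processing and advanced SQL skills named above, here is a minimal PySpark sketch of an event-level cleanup followed by a SQL rollup. The table and column names (raw.behavioral_events, event_id, and so on) are hypothetical placeholders, not details from this posting.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession is provided as `spark`; creating one
# explicitly keeps the sketch self-contained elsewhere.
spark = SparkSession.builder.appName("behavioral-transform").getOrCreate()

# Hypothetical raw behavioral events table (name is a placeholder).
events = spark.table("raw.behavioral_events")

# Typical event-level cleanup: parse timestamps, deduplicate, derive a date.
clean = (
    events
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .dropDuplicates(["event_id"])
    .withColumn("event_date", F.to_date("event_ts"))
)

clean.createOrReplaceTempView("clean_events")

# The kind of SQL transformation the role calls for: a daily per-user rollup.
daily = spark.sql("""
    SELECT user_id,
           event_date,
           COUNT(*)                   AS events,
           COUNT(DISTINCT session_id) AS sessions
    FROM clean_events
    GROUP BY user_id, event_date
""")

daily.write.mode("overwrite").saveAsTable("analytics.daily_user_activity")
```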
Key Responsibilities
• Design, build, and maintain robust, scalable ELT/ETL pipelines and data transformations using Databricks, Spark, and Synapse (see the pipeline sketch after this list).
• Model high-volume, complex event-level datasets (digital behavior, Ecommerce transactions, marketing interactions) to support dashboards, experimentation, ML models, and marketing activation.
• Enforce data governance, discoverability, and stewardship using Unity Catalog and Alation, ensuring compliance and lineage tracking (see the governance sketch after this list).
• Validate and reconcile data pipelines against established behavioral datasets such as Adobe Customer Journey Analytics (CJA) and Adobe Analytics.
• Partner with data architects, analysts, data scientists, and marketing teams to deliver trusted, reusable, and well-structured datasets that power BI dashboards and decision-making.
• Mature the data ingestion, processing, orchestration, and curation capabilities, leveraging Delta Lake optimization, Databricks Workflows, and Synapse for analytical consumption.
• Support and optimize semantic models and data marts that enable self-service analytics through AI/BI Dashboards and Power BI.
• Participate in agile delivery processes (sprint planning, backlog refinement, documentation), collaborating through Jira and Confluence.
• Document data assets, transformations, and pipelines for discoverability, transparency, and long-term maintainability.
• Facilitate clear and continuous communication between business and engineering teams.
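To make the pipeline and Delta Lake responsibilities above concrete, here is a minimal sketch of one plausible incremental upsert step on Databricks. The bronze.orders and silver.orders tables and the order_id merge key are assumptions for illustration, not details from the role description.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ecommerce-elt").getOrCreate()

# Hypothetical bronze table of raw eCommerce orders.
updates = (
    spark.table("bronze.orders")
    .withColumn("order_ts", F.to_timestamp("order_ts"))
)

# Upsert into the curated silver Delta table, keyed on order_id, so that
# reprocessed source rows stay idempotent.
silver = DeltaTable.forName(spark, "silver.orders")
(
    silver.alias("t")
    .merge(updates.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

# Routine Delta maintenance: compact small files and cluster by a column
# commonly used in filters.
spark.sql("OPTIMIZE silver.orders ZORDER BY (customer_id)")
```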
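Likewise, a minimal sketch of the Unity Catalog governance work described above, expressed as Databricks SQL issued from a notebook where `spark` is predefined. The catalog, schema, group, and tag names are all hypothetical.

```python
# Governance steps of the kind Unity Catalog supports; all names are placeholders.

# Document the asset so it is discoverable in the catalog.
spark.sql("""
    COMMENT ON TABLE main.analytics.daily_user_activity IS
    'Daily per-user behavioral rollup; source: clean_events'
""")

# Grant read access to an analyst group rather than to individual users.
spark.sql(
    "GRANT SELECT ON TABLE main.analytics.daily_user_activity TO `data-analysts`"
)

# Tag the table for stewardship and compliance review.
spark.sql(
    "ALTER TABLE main.analytics.daily_user_activity SET TAGS ('domain' = 'behavioral')"
)
```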