Optomi

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 3–6+ years of experience, focusing on Dataiku DSS and Azure data services. Contract length is unspecified, with a day rate of $640 USD. Key skills include SQL, Python, ELT/ETL design, and data modeling.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date
March 10, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Quality #Dataiku #Datasets #ADF (Azure Data Factory) #Monitoring #Data Governance #Data Modeling #Python #SQL (Structured Query Language) #Azure SQL #Scala #Data Pipeline #Version Control #Azure #Synapse #Data Engineering #Azure Data Factory #ADLS (Azure Data Lake Storage) #ETL (Extract, Transform, Load) #Data Layers #Documentation #Metadata #Automation
Role description
Optomi, in partnership with a leading utilities company, is seeking a Data Engineer with strong experience in Dataiku DSS and Azure data services to design, build, and optimize enterprise data pipelines and curated data layers.

Qualifications:
• 3–6+ years of experience in data engineering, analytics engineering, or a related field.
• Strong hands-on experience with Dataiku DSS, including flows, recipes, scenarios, automation, and governance features.
• Proficiency in SQL and Python for data transformation, processing, and workflow automation.
• Experience with Azure data services such as ADLS Gen2, Azure SQL, Azure Data Factory, Synapse, or similar.
• Solid understanding of ELT/ETL design, data modeling, and data quality principles.
• Experience integrating and transforming data from a variety of enterprise source systems.

Responsibilities:
• Design, build, and maintain scalable, production-grade pipelines using Dataiku DSS.
• Develop and optimize data layers within Azure, including ADLS Gen2, Azure SQL, and Synapse.
• Ingest, transform, and curate data from enterprise systems, operational sources, APIs, and third-party providers.
• Implement robust data validation, workflow automation, monitoring, and exception handling in Dataiku.
• Collaborate with business units to understand data needs for financial, capital project, and safety analytics use cases.
• Develop well-structured datasets and curated layers that support analytical, operational, and reporting workflows.
• Establish best practices for pipeline development, ELT architecture, performance optimization, and reusable components.
• Ensure alignment with enterprise data governance, metadata standards, and secure access frameworks.
• Build and maintain documentation, lineage tracking, and dataset definitions within Dataiku and Azure.
• Participate in version control workflows, CI/CD, and environment management as part of the development cycle.
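To illustrate the kind of validation and curation work the responsibilities describe, here is a minimal standard-library Python sketch of a single pipeline step. In Dataiku DSS this logic would typically live inside a Python recipe reading from and writing to managed datasets; the field names (meter_id, reading_kwh, read_at) are hypothetical stand-ins for whatever the utility's source systems actually provide, not part of the role description.

```python
from datetime import datetime

def validate_and_curate(rows):
    """Validate raw readings (list of dicts) and return curated rows.

    Rows missing a required field or carrying a non-positive reading
    are rejected; surviving rows are coerced to a stable schema so
    downstream (curated) layers see consistent types.
    """
    required = ("meter_id", "reading_kwh", "read_at")
    curated = []
    for row in rows:
        # Exception handling as filtering: drop incomplete records
        if any(row.get(key) is None for key in required):
            continue
        # Basic data-quality rule: readings must be positive
        if row["reading_kwh"] <= 0:
            continue
        # Normalize types for the curated layer
        curated.append({
            "meter_id": str(row["meter_id"]),
            "reading_kwh": float(row["reading_kwh"]),
            "read_at": datetime.fromisoformat(row["read_at"]),
        })
    return curated

raw = [
    {"meter_id": "A101", "reading_kwh": 5.2, "read_at": "2026-03-01"},
    {"meter_id": "A102", "reading_kwh": -1.0, "read_at": "2026-03-01"},
    {"meter_id": None, "reading_kwh": 3.3, "read_at": "2026-03-02"},
]
curated = validate_and_curate(raw)  # only the first row survives
```

In a production Dataiku flow, the rejected rows would usually be routed to a separate exceptions dataset and surfaced through scenario-based monitoring rather than silently dropped.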