Motion Recruitment

Principal Data Engineers/Architects - Energy, Oil & Gas, or Trading Domains

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Principal Data Engineer/Architect in Houston, TX (Hybrid) on a 12+ month contract at $70/hr. It requires 10+ years in data engineering, 3+ years in a lead role, expertise in Azure and Databricks, and energy domain experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
🗓️ - Date
October 11, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Houston, TX
-
🧠 - Skills detailed
#Data Modeling #AWS (Amazon Web Services) #Azure DevOps #Cloud #Data Quality #Data Engineering #DevOps #GitHub #Azure #Documentation #AI (Artificial Intelligence) #SAP #Azure Data Factory #SonarQube #Security #Python #Automation #Databricks #IoT (Internet of Things) #Scala #ADF (Azure Data Factory) #Synapse #Data Pipeline #ETL (Extract, Transform, Load) #Leadership #Forecasting #Pytest #ML (Machine Learning) #Monitoring #Data Governance
Role description
Principal Data Engineers/Architects – Energy, Oil & Gas, or Trading Domains

Hello,

We are looking for Principal Data Engineers/Architects who have led Azure + Databricks solutions enabling global data modernization, real-time analytics, and AI readiness, with deep domain alignment within Energy, Oil & Gas, or Trading environments.

Location: Houston, TX (Hybrid)
Duration: 12+ month contract (with possibility of longer-term extensions)
Pay Rate: $70/hr on W2

If interested, please email your resume to grace.johnson@motionrecruitment.com

Please Note: Client is not open to C2C, H1B, TN Visa, 1099, or F1-CPT at this time.

Key Responsibilities
- Architect and deliver scalable, cloud-based data platforms and pipelines supporting analytics, AI/ML, and reporting workloads.
- Design and implement end-to-end ELT frameworks using Databricks Delta Live Tables (DLT), Azure Data Factory, and Synapse.
- Translate complex business and operational requirements into production-grade data solutions.
- Lead data modeling, ingestion, and transformation efforts, ensuring high data quality, performance, and reliability.
- Drive best practices in DevOps, CI/CD automation, testing, and documentation using GitHub Actions, Azure DevOps, and PyTest.
- Mentor engineers, set technical standards, and ensure alignment across global data modernization initiatives.
- Collaborate across business units to unify operational, trading, and production data into a consistent global reporting layer.

Domain Alignment (Required)
Resume must show experience and understanding in one or more of the following areas:
- Production & Operations Data: wells, drilling, exploration, pipeline performance.
- Trading & Commodities Data: pricing, risk, P&L, and market position analytics.
- Refinery, Logistics, & Distribution: throughput optimization, supply chain visibility, or product movement tracking.
Awareness of end-to-end energy data flow, from field sensors (IoT) to enterprise analytics platforms, including:
- Forecasting production or demand.
- Optimizing energy trading and pricing.
- Monitoring asset, equipment, or pipeline performance.
- Emissions tracking and sustainability analytics.

Required Skills & Experience
- Experience from Energy, Oil & Gas, or Trading domains.
- 10+ years overall experience in data engineering.
- 3+ years in a lead or architect capacity (solution ownership, mentoring, and best-practice leadership).
- 6+ years in cloud data engineering (Azure, AWS, or SAP) in enterprise environments.
- Deep expertise in Databricks, Azure Data Factory, Synapse, and ELT development.
- Strong proficiency in Python for data pipelines, automation, and testing.
- Proven experience with CI/CD tools such as GitHub Actions, Azure DevOps, and SonarQube.
- Familiarity with Databricks Delta Live Tables (DLT) for pipeline automation and data quality enforcement.
- Experience with enterprise data governance, lineage, auditability, and security.
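The role asks for Python pipelines with data-quality enforcement and PyTest-based testing. As a minimal, hypothetical sketch of that pattern (all names, records, and quality rules here are invented for illustration, not taken from the client's stack):

```python
# Hypothetical sketch: a small ELT-style transform with data-quality
# filtering and a PyTest-style test. Illustrative only.
from dataclasses import dataclass

@dataclass
class WellReading:
    well_id: str
    barrels_per_day: float

def clean_readings(raw):
    """Drop records failing basic quality rules: non-empty id, non-negative volume."""
    return [r for r in raw if r.well_id and r.barrels_per_day >= 0]

def daily_total(readings):
    """Aggregate cleaned readings into a single daily production figure."""
    return sum(r.barrels_per_day for r in readings)

# PyTest-style test (a real suite would live in tests/test_pipeline.py)
def test_pipeline():
    raw = [
        WellReading("W-001", 120.0),
        WellReading("", 50.0),        # rejected: missing id
        WellReading("W-002", -5.0),   # rejected: negative volume
    ]
    cleaned = clean_readings(raw)
    assert len(cleaned) == 1
    assert daily_total(cleaned) == 120.0
```

In a Databricks Delta Live Tables pipeline, the quality rules in `clean_readings` would typically be expressed as DLT expectations instead of a hand-written filter; the plain-Python form above is just the testable core of the idea.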