Principal Data Engineer - AWS, SAP, Azure

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Principal Data Engineer in Houston, TX, offering $68/hr on a 12+ month contract. Requires 10+ years of IT/data engineering experience, including 3+ years in a lead capacity, with expertise in AWS, Azure, SAP, and Python.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
544
-
🗓️ - Date discovered
September 30, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Houston, TX
-
🧠 - Skills detailed
#ADF (Azure Data Factory) #SonarQube #DevOps #Redshift #AWS (Amazon Web Services) #Airflow #GitHub #Azure DevOps #Pytest #Scala #Azure Data Factory #Synapse #Databricks #Data Engineering #Python #SQL (Structured Query Language) #Documentation #Scrum #Data Integration #Azure #ML (Machine Learning) #AI (Artificial Intelligence) #SAP #Data Modeling #Cloud
Role description
MUST HAVE 10+ YEARS OF EXPERIENCE AS A DATA ENGINEER, WITH 3+ YEARS IN AN ARCHITECT/LEAD CAPACITY (OWNING SOLUTIONS, MENTORING, LEADING SMALL TEAMS, SETTING BEST PRACTICES).

Principal Data Engineer – Houston, TX (Contract)
Location: Hybrid role in Houston, TX
Pay Rate: $68/hr on W2
Contract Duration: 12+ months with possibility of longer-term extensions

If interested, please email your resume to grace.johnson@motionrecruitment.com
Please Note: Client is not open to C2C, H1B, TN Visa, 1099, or F1 (CPT & OPT) at this time.

Our client, a leading energy organization, is seeking a Principal Data Engineer to join their Products Data Exchange (PDE) initiative. This strategic program focuses on modernizing the data landscape to enable global reporting, AI/ML capabilities, and real-time analytics.

What You'll Do:
• Architect and deliver scalable cloud-based data platforms and pipelines.
• Translate business requirements into production-ready data solutions.
• Lead ELT development, data modeling, and ingestion.
• Mentor engineers and drive technical excellence across the team.
• Ensure adherence to DevOps, change management, and documentation best practices.

Required Skills & Experience:
• 10–12+ years total IT/data engineering experience.
• 3–5+ years in an architect/lead capacity (owning solutions, mentoring, leading small teams, setting best practices).
• 6–8+ years specifically in cloud data engineering (Azure/AWS/SAP at enterprise scale).
• Deep knowledge of ELT development, data modeling, and data integration.
• Hands-on experience with Databricks, Azure Data Factory, Synapse, SQL DB, Redshift, Glue, Stream Analytics, Airflow, and Kinesis.
• Expert Python development skills.
• Experience with GitHub, GitHub Actions, Azure DevOps, SonarQube, and PyTest.

Preferred Skills:
• Experience leading Scrum teams or managing small technical teams.
• Exposure to planning or documentation tools (e.g., BPC, MkDocs).
• Domain experience in energy, oil & gas, or trading.
• Familiarity with scientific computing, seismic, or subsurface data.