Data Engineer - AWS, SAP, Azure

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 7+ years of experience, specializing in Azure, AWS, SAP, Python, and ELT development. It offers a hybrid work location in Houston, TX, a pay rate of $65/hr, and a contract duration of 12+ months.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
πŸ—“οΈ - Date discovered
September 26, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Houston, TX
-
🧠 - Skills detailed
#Azure Data Factory #ADF (Azure Data Factory) #Synapse #DevOps #SonarQube #AWS (Amazon Web Services) #Databricks #Data Integration #Cloud #SQL (Structured Query Language) #Azure #Python #SAP #Azure DevOps #Pytest #Scala #Airflow #Strategy #Redshift #Documentation #Data Engineering #Data Modeling #GitHub
Role description
HYBRID ROLES IN HOUSTON, TX

Hello, we are looking for Data Engineers with expert-level experience in Azure, AWS, SAP, Python, and ELT development.

Location: Hybrid roles in Houston, TX
Pay Rate: $65/hr on W2
Contract Duration: 12+ months with the possibility of longer-term extensions

If interested, please email your resume to grace.johnson@motionrecruitment.com

Please Note: The client is not open to C2C, H1B, TN Visa, 1099, F1 – CPT & OPT at this time.

Mandatory Skills (MUST HAVE):
- 7+ years of experience as a Data Engineer – an architect who defines strategy and enterprise-level design
- Expert-level experience with Azure, AWS, SAP, Python, and ELT development, data modeling, and data integration
- Experience with Databricks, Azure Data Factory, Synapse, SQL DB, Redshift, Glue, Stream Analytics, Airflow, and Kinesis
- Experience with GitHub, GitHub Actions, Azure DevOps, SonarQube, and PyTest
- Experience working in a centralized cloud repository
- Experience with global intra-day reporting and analytics

Key Responsibilities:
- Architect and deliver scalable data platforms and pipelines
- Translate business needs into production-ready data solutions
- Lead ELT development, data modeling, and ingestion
- Drive stakeholder engagement and provide technical mentorship
- Ensure best practices in DevOps, change management, and documentation
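As a rough illustration of the ELT and intra-day reporting work described above, here is a minimal sketch assuming Apache Airflow 2.4+; the DAG id, task names, and callables are hypothetical placeholders and are not part of the client's specification.

```python
# Minimal ELT orchestration sketch (hypothetical, for illustration only).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_sap_orders():
    # Hypothetical placeholder: pull intra-day order extracts from SAP.
    print("extracting SAP order data")


def load_raw_to_warehouse():
    # Hypothetical placeholder: land raw files in the cloud warehouse (e.g. Synapse or Redshift).
    print("loading raw data")


def transform_reporting_model():
    # Hypothetical placeholder: build the model behind global intra-day reporting.
    print("transforming reporting model")


with DAG(
    dag_id="intraday_reporting_elt",   # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@hourly",                # intra-day cadence; `schedule` requires Airflow 2.4+
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_sap_orders)
    load = PythonOperator(task_id="load", python_callable=load_raw_to_warehouse)
    transform = PythonOperator(task_id="transform", python_callable=transform_reporting_model)

    # ELT ordering: extract, load raw, then transform inside the warehouse.
    extract >> load >> transform
```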