

Open Systems Technologies
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of "X months" and a pay rate of "$X/hour". Key skills include advanced Python, AWS CloudFormation, Terraform, and experience with ETL processes. Requires 5+ years in data engineering and cloud architecture.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 3, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Plano, TX
-
🧠 - Skills detailed
#"ETL (Extract, Transform, Load)" #Automation #Data Management #Big Data #Terraform #API (Application Programming Interface) #NoSQL #SQL (Structured Query Language) #Data Lake #Deployment #RDBMS (Relational Database Management System) #Data Engineering #GitLab #Lambda (AWS Lambda) #Python #Data Ingestion #AWS (Amazon Web Services) #Metadata #Cloud #Data Warehouse #Data Modeling
Role description
Develop scripts and tools for automation of system provisioning, deployment, upgrade, and scaling (Python)
Create ETL pipelines to ingest data from various operational systems.
Develop solutions on cloud-based architecture.
Evangelize key strategic technologies in the areas of public APIs, Big Data, and Analytics.
5+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, NoSQL data platform technologies, and ETL and data ingestion protocols).
Experience with data warehouse, data lake, and enterprise big data platforms.
Good knowledge of metadata management, data modeling, and related tools is required.
Experience in team management and communication, plus advanced SQL skills.
AWS CloudFormation and Lambda experience
Terraform and GitLab are must-haves due to the client ecosystem.
Advanced knowledge in Python development
Good technical acumen in the AWS environment for data-driven projects.
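As a hedged illustration of the ETL-style Python scripting this role calls for, the sketch below pulls rows from an operational store, aggregates them, and serializes the result. All table and column names are hypothetical; a real pipeline would target the client's RDBMS and write to a data lake rather than an in-memory SQLite fixture.

```python
import json
import sqlite3


def extract(conn):
    """Pull raw rows from a hypothetical operational 'orders' table."""
    return conn.execute("SELECT id, amount, region FROM orders").fetchall()


def transform(rows):
    """Aggregate order amounts per region (a simple dimensional rollup)."""
    totals = {}
    for _order_id, amount, region in rows:
        totals[region] = totals.get(region, 0.0) + amount
    return totals


def load(totals):
    """Serialize to JSON, standing in for a write to a data-lake object store."""
    return json.dumps(totals, sort_keys=True)


# In-memory fixture standing in for an operational system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, region TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 10.0, "east"), (2, 5.0, "west"), (3, 2.5, "east")],
)

result = load(transform(extract(conn)))
print(result)  # {"east": 12.5, "west": 5.0}
```

In practice the same extract/transform/load shape drops neatly into an AWS Lambda handler, with CloudFormation or Terraform provisioning the surrounding resources.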






