

Unisys
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a contract basis, requiring 5+ years of experience in data platforms and ETL processes. Key skills include Python, AWS (CloudFormation, Lambda), Terraform, and GitLab. Location is "remote," with a pay rate of "$X/hour."
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
April 3, 2026
Duration
Unknown
-
Location
Unknown
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Plano, TX
-
Skills detailed
#ETL (Extract, Transform, Load) #Automation #Data Management #Big Data #Terraform #API (Application Programming Interface) #NoSQL #SQL (Structured Query Language) #Data Lake #Deployment #RDBMS (Relational Database Management System) #Data Engineering #GitLab #Lambda (AWS Lambda) #Python #Data Ingestion #AWS (Amazon Web Services) #Metadata #Cloud #Data Warehouse #Data Modeling
Role description
• Develop scripts and tools to automate system provisioning, deployment, upgrades, and scaling (Python).
• Create ETLs to ingest data from various operational systems.
• Develop solutions on cloud-based architectures.
• Evangelize key strategic technologies in the areas of public APIs, Big Data, and Analytics.
• 5+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, plus ETL and data ingestion protocols).
• Experience with data warehouse, data lake, and enterprise big data platforms.
• Good knowledge of metadata management, data modeling, and related tools is required.
• Experience in team management and communication, plus advanced SQL skills.
• AWS CloudFormation and Lambda experience.
• Terraform and GitLab are must-haves due to the client ecosystem.
• Advanced knowledge of Python development.
• Good technical acumen in AWS environments for data-driven projects.
#LI-CGTS
#TS-3142
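To illustrate the kind of work the first two responsibilities describe (Python automation and ETL on AWS Lambda), here is a minimal sketch. The event shape, field names, and transform logic are hypothetical examples, not taken from the posting; a real deployment would read from and write to actual systems (e.g., S3 via boto3) rather than returning the rows.

```python
import json


def transform(record: dict) -> dict:
    """Normalize one operational-system record for loading.

    Field names ("id", "amount", "source") are hypothetical,
    chosen only to show a typical cleanup step.
    """
    return {
        "id": str(record["id"]),
        "amount_usd": round(float(record["amount"]), 2),
        "source": record.get("source", "unknown").lower(),
    }


def handler(event, context=None):
    """AWS Lambda entry point: transform a batch of records.

    In production this would load the results into a warehouse
    or data lake; here it simply returns them as JSON.
    """
    rows = [transform(r) for r in event.get("records", [])]
    return {"statusCode": 200, "body": json.dumps(rows)}
```

Invoked with `{"records": [{"id": 1, "amount": "19.999", "source": "CRM"}]}`, the handler returns the normalized rows as a JSON string in `body`.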






