

Unisys
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown" and a pay rate of "unknown." It requires 5+ years of experience in data engineering, strong Python skills, AWS CloudFormation, Terraform, and GitLab expertise.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 3, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Plano, TX
-
🧠 - Skills detailed
#Data Ingestion #Deployment #Data Warehouse #Data Lake #"ETL (Extract, Transform, Load)" #Data Engineering #GitLab #Data Modeling #Metadata #SQL (Structured Query Language) #Lambda (AWS Lambda) #Automation #Terraform #Python #RDBMS (Relational Database Management System) #Data Management #Big Data #AWS (Amazon Web Services) #NoSQL #Cloud #API (Application Programming Interface)
Role description
Description:
Develop scripts and tools to automate system provisioning, deployment, upgrades, and scaling (Python)
Create ETLs to ingest data from various operational systems.
Develop solutions on cloud-based architecture.
Evangelize key strategic technologies in the areas of public APIs, Big Data, and analytics
5+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, plus ETL and data ingestion protocols).
Experience with data warehouses, data lakes, and enterprise big data platforms.
Good knowledge of metadata management, data modeling, and related tools required.
Experience in team management and communication, and advanced SQL skills
AWS CloudFormation and Lambda experience
Terraform and GitLab are must-haves due to the Fannie Mae ecosystem
Advanced knowledge of Python development
Good technical acumen around the AWS environment for data-driven projects.
# LI-CGTS
# TS-2505
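To illustrate the kind of work the description lists (Python ETL scripting deployed on AWS Lambda), here is a minimal, hypothetical sketch — the function names, event shape, and field names are assumptions for illustration, not part of the job requirements:

```python
import json

def transform_records(raw_records):
    """A hypothetical ETL 'transform' step: take rows pulled from an
    operational system, drop malformed ones, and standardize field
    names and types for loading into a warehouse."""
    out = []
    for rec in raw_records:
        # Skip malformed rows rather than failing the whole batch.
        if "id" not in rec or "amount" not in rec:
            continue
        out.append({
            "record_id": str(rec["id"]),
            "amount_usd": round(float(rec["amount"]), 2),
            "source": rec.get("source", "unknown"),
        })
    return out

def lambda_handler(event, context):
    """AWS Lambda entry point (standard event/context signature);
    the JSON body and response shape here are illustrative."""
    records = json.loads(event.get("body", "[]"))
    return {"statusCode": 200, "body": json.dumps(transform_records(records))}
```

In a real deployment this handler would be packaged and provisioned via CloudFormation or Terraform, with GitLab CI driving the pipeline — matching the toolchain the posting names.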






