Unisys

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer, remote, with a contract length and pay rate unspecified. Requires 5+ years in data engineering, expertise in AWS, SQL, Python/Scala, and utility industry experience. Bachelor's degree in a related field is mandatory.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 28, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Seattle, WA
-
🧠 - Skills detailed
#Data Warehouse #Databases #ML (Machine Learning) #Data Lake #Indexing #AWS (Amazon Web Services) #Data Engineering #Data Pipeline #Data Lifecycle #Cloud #Data Integration #Data Cleansing #Consulting #Databricks #S3 (Amazon Simple Storage Service) #Redshift #Schema Design #Lambda (AWS Lambda) #DevOps #Batch #IoT (Internet of Things) #Data Science #Computer Science #BI (Business Intelligence) #Python #SQL (Structured Query Language) #Web Services #Infrastructure as Code (IaC) #ETL (Extract, Transform, Load) #Data Storage #Terraform #Storage #SQL Queries #Scala #Data Architecture
Role description
Unisys is an Information Technology and Business Consulting firm providing project-based solutions, software solutions, and professional staffing services. "Those authorized to work in the United States are encouraged to apply. We are unable to sponsor at this time."

Job Description

Position: AWS Data Engineer
Location: Remote
Client: Federal (Confidential)

The client is seeking an AWS Data Engineer to design, build, and optimize large-scale data pipelines and analytics solutions on Amazon Web Services (AWS). The role involves designing and building the infrastructure and pipelines that enable organizations to collect, store, process, and analyze large volumes of structured and unstructured data efficiently and securely. The Data Engineer owns the end-to-end data lifecycle, from ingestion and transformation to storage and delivery for analytics, machine learning, and operational systems, and ensures data is reliable, high-quality, scalable, and accessible to business and technical stakeholders. The ideal candidate will have strong expertise in cloud-based data engineering, hands-on experience with AWS-native services, and a solid understanding of data lake, data warehouse, and real-time streaming architectures.

Responsibilities:
• Design, build, and optimize ETL/ELT workflows to ingest data from multiple sources (e.g., S3, Redshift, Lake Formation, Glue, Lambda).
• Implement data cleansing, enrichment, and standardization processes.
• Automate batch and streaming data pipelines for real-time analytics.
• Build solutions for both streaming (Kinesis, MSK, Lambda) and batch processing (Glue, EMR, Step Functions).
• Ensure pipelines are optimized for scalability, performance, and fault tolerance.
• Optimize SQL queries, data models, and pipeline performance.
• Ensure efficient use of cloud-native resources (compute, storage, networking).
• Design and implement data architecture across data lakes, data warehouses, and lakehouses.
• Optimize data storage strategies (partitioning, indexing, schema design).
• Implement data integration from diverse sources (databases, APIs, IoT, third-party systems).
• Work with Data Scientists, Analysts, and BI developers to deliver clean, well-structured data.
• Document data assets and processes for discoverability.
• Train the existing core staff who will maintain the infrastructure and pipelines.

Required Education & Experience:
• Bachelor's degree in Computer Science, Data Engineering, or a related field.
• 5+ years of experience in data engineering roles.
• Proficiency in SQL, Python, or Scala for data transformation and processing.
• Experience with utility industry data, including meter data, customer data, grid/asset data, work management, and outage data.
• Familiarity with IEC CIM standards and utility integration frameworks.
• Working knowledge of Databricks on AWS.
• Working knowledge of DevOps and CI/CD.

Preferred Qualifications:
• Experience with IaC tools (e.g., Terraform) is a plus.

Thank you!!
Poonam Mittal
Senior Talent Acquisition Specialist
Direct line: (703) 454-0366
Extension: 8970
Company number: (703) 454-0677
Email: poonam.mittal@Unisys.com