Infojini Inc

Lead AWS Data Engineer (W2)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead AWS Data Engineer (W2) in Chicago, IL, on a contract basis. Requires 12+ years of data engineering experience, strong AWS, Python, and SQL skills, and expertise in data architecture and governance. Hybrid work model.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 17, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#IAM (Identity and Access Management) #Data Security #IoT (Internet of Things) #PostgreSQL #Datasets #Compliance #ETL (Extract, Transform, Load) #Programming #GraphQL #Data Engineering #MySQL #React #API (Application Programming Interface) #AWS IAM (AWS Identity and Access Management) #Cloud #Aurora PostgreSQL #Scala #Lambda (AWS Lambda) #Athena #DevOps #Security #S3 (Amazon Simple Storage Service) #Data Pipeline #Data Catalog #Visualization #Data Warehouse #SQL (Structured Query Language) #AWS Glue #Redshift #AWS (Amazon Web Services) #Data Architecture #Batch #Data Lake #Data Quality #Aurora #Python #Monitoring #Data Modeling #Data Governance #Kafka (Apache Kafka) #Terraform
Role description
Role: Lead AWS Data Engineer
Location: Chicago, IL (Hybrid)
Hire Type: Contract (W2)
Note: Candidates must be available for an in-person interview in Chicago, IL.

Job Description

About the Role
We are seeking an experienced Lead Data Engineer to design and build scalable, cloud-native data platforms supporting IoT, operational systems, and enterprise analytics. This role requires a hands-on technical leader who can architect modern data pipelines, enable real-time and batch ingestion, and design performant serving layers for APIs and applications. The ideal candidate blends deep AWS expertise, strong data modeling skills, and the ability to translate business requirements into scalable technical solutions.

Key Responsibilities

Data Engineering & Architecture
• Architect ingestion pipelines for streaming and event-based data (IoT telemetry).
• Implement time-window logic and event-to-state transformations.
• Design, build, and optimize data pipelines using AWS Glue, Lambda, and Step Functions for ingestion, transformation, and curation.
• Develop and maintain data lakes (S3 + Lake Formation) and data warehouses (Redshift) to support analytics and visualization.
• Integrate Aurora (PostgreSQL/MySQL) as a serving layer for APIs and dashboards.
• Design APIs or direct query interfaces that expose curated datasets for React-based dashboards and web applications.
• Define and implement data models, partitioning strategies, and schema-evolution best practices for performance and scalability.

Security, Governance & Compliance
• Implement data security policies, encryption, and role-based access controls using AWS IAM, KMS, Lake Formation, and Secrets Manager.
• Ensure compliance with organizational data governance and privacy standards.
• Maintain data cataloging, lineage, and access tracking within the AWS Glue Data Catalog and Lake Formation.
• Implement CI/CD pipelines for data workflows in collaboration with DevOps teams.
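The responsibilities above mention time-window logic and event-to-state transformations for IoT telemetry. As a rough illustration of what that means in practice, here is a minimal, self-contained Python sketch (all names and the event format are hypothetical, not part of this posting; in the role this logic would typically run inside AWS Lambda or a Glue job):

```python
from collections import defaultdict

def window_events(events, window_sec=60):
    """Reduce a stream of (device_id, ts, value) telemetry events to
    per-device state within fixed time windows, keeping only the
    latest reading per device in each window.

    Illustrative only: a production pipeline would read from a stream
    (e.g. Kinesis) and write curated state to S3 or Aurora.
    """
    # window_start -> {device_id: (ts, value)}
    state = defaultdict(dict)
    for device_id, ts, value in events:
        window_start = (ts // window_sec) * window_sec
        prev = state[window_start].get(device_id)
        if prev is None or ts > prev[0]:
            state[window_start][device_id] = (ts, value)
    return dict(state)

# Hypothetical telemetry: (device_id, epoch seconds, temperature)
events = [
    ("sensor-1", 10, 21.5),
    ("sensor-1", 55, 22.0),  # later reading in the same 60s window
    ("sensor-2", 30, 18.3),
    ("sensor-1", 70, 22.4),  # falls into the next window
]
print(window_events(events))
```

The idea is that raw append-only events are collapsed into queryable "current state per window" records, which is what a serving layer (Aurora tables behind an API) would store.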
Required Skills & Qualifications
• 12+ years of experience in data engineering and AWS data architecture.
• Strong expertise with AWS Glue, Redshift, S3, Lake Formation, and Aurora (PostgreSQL/MySQL).
• Proven experience building APIs or exposing curated datasets to web applications (preferably React or similar front-end frameworks).
• Strong programming skills in Python and SQL.
• Experience with data modeling, ETL/ELT frameworks, and performance optimization.
• Working knowledge of Terraform/CloudFormation and CI/CD practices.
• Understanding of data security, IAM policies, encryption, and governance frameworks.

Preferred Skills
• Experience with Athena for ad-hoc analytics.
• Familiarity with API Gateway, AppSync, or GraphQL APIs for serving data to front-end applications.
• Knowledge of event-driven pipelines (Kinesis, EventBridge, Kafka) and data-quality monitoring.