

PEOPLE FORCE CONSULTING INC
Data Engineer - Hybrid in Chicago
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in Chicago (hybrid, 3 days/week) for 6-12 months, offering a competitive pay rate. Key skills include Python, PySpark, SQL, AWS, Terraform, and Angular; 8+ years of experience and an engineering degree are required.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
February 21, 2026
Duration
More than 6 months
Location
Hybrid
Contract
Unknown
Security
Unknown
Location detailed
Chicago, IL
Skills detailed
#Scala #IAM (Identity and Access Management) #Cloud #Azure DevOps #Spark (Apache Spark) #Lambda (AWS Lambda) #Angular #Terraform #Databricks #PySpark #S3 (Amazon Simple Storage Service) #Python #SQL (Structured Query Language) #Version Control #Azure #DevOps #Git #SNS (Simple Notification Service) #Deployment #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Data Processing #Data Engineering
Role description
Strong Data Engineer with expertise in ETL, Python, and SQL, plus hands-on experience in PySpark, Databricks, AWS, Terraform, Angular, and CI/CD tools.
Responsibilities:
• Develop and manage ETL processes using Python, PySpark, and SQL.
• Work with Databricks for data processing.
• Utilize AWS services such as S3, CloudWatch, IAM, SNS, and Lambda.
• Use Terraform for infrastructure management.
• Work with Angular for front-end integration.
• Collaborate on version control using Git and deployment using Octopus and Azure DevOps.
• Basic knowledge of Scala and HL7 is a plus.
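The core responsibility above, an ETL process driven by Python and SQL, can be sketched minimally as follows. This is an illustrative example only (the data, table name, and column names are made up for the sketch); a real pipeline for this role would extract from AWS sources and run the transform in PySpark on Databricks rather than in-process.

```python
import csv
import io
import sqlite3

# Hypothetical raw extract; in practice this would come from S3 or another source.
RAW_CSV = """id,amount,currency
1,100.50,USD
2,80.00,USD
3,,USD
"""

def extract(raw: str) -> list:
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> list:
    """Transform: drop rows with missing amounts and cast to typed tuples."""
    return [
        (int(r["id"]), float(r["amount"]), r["currency"])
        for r in rows
        if r["amount"]
    ]

def load(rows: list, conn: sqlite3.Connection) -> None:
    """Load: write the transformed rows into a SQL table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone()
print(total)  # the row with a missing amount is filtered out by the transform
```

The extract/transform/load split shown here is the pattern the role asks for; each stage maps onto a distinct tool in the stack (S3 for extract, PySpark/Databricks for transform, a warehouse table for load).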
Experience:
• 8+ years
Location:
• Chicago, IL (hybrid, 3 days/week on-site)
Tentative Duration: 6-12 months
Educational Qualifications:
• Engineering degree (BE/ME/BTech/MTech/BSc/MSc).
• Technical certification in multiple technologies is desirable.
Mandatory Skills
• Strong experience in Python, PySpark, and SQL
• Angular for front-end development
• Experience with AWS services (S3, CloudWatch, IAM, SNS, Lambda)
• Terraform knowledge
• Experience with Git, Octopus, and Azure DevOps for CI/CD
Good-to-have skills:
• Experience with PySpark, Databricks, AWS, Terraform, and Angular





