Randstad Digital Americas

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in Chicago, IL, on a contract-to-perm basis, offering $50-$60 per hour. It requires 3-4 years of data engineering experience; proficiency in GCP, SQL, and Python; and experience with relational and NoSQL databases.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
October 21, 2025
-
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Data Integrity #Apache Beam #Data Mart #AWS (Amazon Web Services) #Pytest #BI (Business Intelligence) #Dataflow #Deployment #Data Warehouse #Cloud #Data Lake #Compliance #GCP (Google Cloud Platform) #MongoDB #Security #Data Engineering #Oracle #Infrastructure as Code (IaC) #Monitoring #Databases #Data Processing #Data Analysis #Data Lakehouse #Python #Data Governance #NoSQL #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Azure #BigQuery #Data Pipeline #Docker #Logging #Terraform
Role description
Job Summary: Randstad Digital is seeking an experienced Data Engineer for an exciting opportunity in Chicago, IL.

General Description: The Data Engineer will play a key role in accomplishing the team's goals of Data Centralization, Data Governance, and Data Integrity. They will focus on building data pipelines, expanding Data Lakehouse functionality, and maintaining the on-prem data warehouse. They must be well organized, flexible, and able to communicate effectively with all levels of the organization.

Essential Duties and Responsibilities:
• Build and maintain data pipelines using GCP tools such as Dataflow, Apache Beam, BigQuery, Cloud Composer, and Pub/Sub (a minimal illustrative sketch follows the role details below)
• Ingest, transform, and store structured and unstructured data from various sources
• Deploy workloads as Docker images and test with pytest
• Write efficient, well-documented SQL and Python code for data processing and analysis
• Design performance-focused tables and databases, and perform volume testing and query tuning to ensure efficiency
• Build data service solutions such as data marts, and design complex models for specific business use cases
• Manage infrastructure as code using Terraform
• Monitor pipeline performance and resolve issues using Cloud Monitoring and logging
• Take technical direction from senior engineers and architects, and collaborate with data analysts, scientists, and engineers to understand data requirements and deliver solutions
• Apply security best practices and ensure data is handled securely and in compliance with internal policies and procedures
• Participate in the on-call rotation and perform operational support tasks

Additional Desired Duties/Responsibilities:
• Exposure to or experience with creating and enhancing a GCP data lakehouse preferred

Minimum Required Experience:
• 3-4 years of experience as a data engineer or in a comparable Business Intelligence role
• 3-4 years of experience with relational databases such as MS SQL, Postgres, or Oracle
• 2-3 years of experience with NoSQL databases such as Capella or MongoDB
• 1-2 years of Python or similar experience
• Proficiency with cloud offerings, primarily GCP (Azure or AWS is acceptable)

Desirable Education/Experience:
• 1-3 years of logistics or transportation experience

location: Chicago, Illinois
job type: Contract to Perm
salary: $50 - 60 per hour
work hours: 8am to 5pm
education: No Degree Required
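For illustration only, here is a minimal sketch of the kind of streaming pipeline the duties above describe: reading events from Pub/Sub and writing rows to BigQuery with Apache Beam in Python. The project, topic, table, and field names are hypothetical assumptions, not details from this posting, and any real pipeline for this role would differ.

```python
# Minimal illustrative sketch; all resource and field names are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a JSON Pub/Sub payload into a flat row for BigQuery."""
    record = json.loads(message.decode("utf-8"))
    return {
        "event_id": record["id"],              # hypothetical field names
        "event_type": record["type"],
        "payload": json.dumps(record.get("data", {})),
    }


def run() -> None:
    # streaming=True because Pub/Sub is an unbounded source; DataflowRunner
    # flags (project, region, etc.) would be supplied on the command line.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")  # hypothetical topic
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",              # hypothetical table
                schema="event_id:STRING,event_type:STRING,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

A pipeline like this would typically run on the DataflowRunner and be orchestrated via Cloud Composer, consistent with the tooling named in the duties above.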
Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.

At Randstad Digital, we welcome people of all abilities and want to ensure that our hiring and interview process meets the needs of all applicants. If you require a reasonable accommodation to make your application or interview experience a great one, please contact HRsupport@randstadusa.com.

Pay offered to a successful candidate will be based on several factors, including the candidate's education, work experience, work location, specific job duties, certifications, etc. In addition, Randstad Digital offers a comprehensive benefits package, including medical, prescription, dental, vision, AD&D, and life insurance offerings, short-term disability, and a 401K plan (all benefits are based on eligibility).

This posting is open for thirty (30) days.