

Auralis India Pvt Ltd
Senior Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in Chicago, IL, on a contract lasting more than 6 months, with an unspecified pay rate. It requires 5-6 years of data engineering experience, proficiency in GCP, and expertise in building automated data pipelines.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: January 20, 2026
Duration: More than 6 months
Location: On-site
Contract: Unknown
Security: Unknown
Location detailed: Chicago, IL
Skills detailed: #Data Governance #Data Modeling #Data Engineering #Storage #Terraform #Dataflow #Complex Queries #Scala #Cloud #MongoDB #Monitoring #SQL (Structured Query Language) #NoSQL #Oracle #Computer Science #BI (Business Intelligence) #BigQuery #Data Pipeline #Python #GCP (Google Cloud Platform) #Data Catalog #Databases #Data Framework
Role description
Senior Data Engineer
Chicago, IL
Contract
We are seeking a Senior Data Engineer to lead the design, development, and maintenance of scalable, high-performance data solutions focused on Data Centralization, Governance, and Integrity. This is a highly technical role requiring expertise in Google Cloud Platform (GCP) native services (BigQuery, Dataflow, Dataproc) to build automated, cost-efficient, and complex data pipelines. The ideal candidate will be a technical leader, making key architectural decisions, implementing robust data frameworks, and mentoring junior engineers. This position requires a strong commitment to working full-time onsite (with potential remote flexibility on Fridays) and leveraging data engineering expertise to drive business insights and analytics.
Key Requirements
Experience & Education:
5-6 years of experience as a Data Engineer or in a comparable Business Intelligence role.
Bachelor of Science in Information Technology, Computer Science, or a related technical field.
Cloud & Pipeline Expertise (Required):
Proficiency with GCP (Google Cloud Platform) offerings is required.
Expertise in designing and building large-scale, automated data pipelines using GCP native services (e.g., BigQuery, Dataflow, Dataproc); a minimal pipeline sketch follows this list.
Experience with Terraform for creating and maintaining intermediate-to-advanced infrastructure-as-code scripts is a plus.
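For illustration only, here is a minimal sketch of the kind of GCP-native batch pipeline described above, assuming Apache Beam running on Dataflow; the project, bucket, table names, and CSV layout are hypothetical placeholders, not details from this posting.

```python
# A minimal sketch (not the employer's actual code) of a batch pipeline that
# could run on Dataflow: read CSV lines from Cloud Storage, parse them, and
# append rows to BigQuery. All names below are hypothetical placeholders.
import csv

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    """Turn one CSV line (shipment_id,status) into a BigQuery row dict."""
    shipment_id, status = next(csv.reader([line]))
    return {"shipment_id": shipment_id, "status": status}


options = PipelineOptions(
    runner="DataflowRunner",        # use "DirectRunner" to test locally first
    project="example-project",      # hypothetical project id
    region="us-central1",
    temp_location="gs://example-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText(
            "gs://example-bucket/shipments.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:logistics.shipments",
            schema="shipment_id:STRING,status:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```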
Database & Coding Skills:
A minimum of 4-5 years of experience with relational databases (MS-SQL, Postgres, or Oracle).
3-4 years of experience with NoSQL databases (Capella or MongoDB).
2-3 years of experience with Python or a similar language.
Ability to write complex, multi-statement queries and to tune their performance and troubleshoot issues (a short script sketch follows this list).
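As a rough illustration of the multi-statement query work listed above, here is a hedged sketch using BigQuery scripting through the Python client; the dataset, table, and 30-day window are hypothetical, and reading bytes processed off the finished job is one simple way to watch cost while troubleshooting.

```python
# A hedged sketch of running a multi-statement BigQuery script from Python
# and checking how much data the job scanned. The dataset and table names
# and the 30-day window are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

sql = """
DECLARE cutoff DATE DEFAULT DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY);

CREATE TEMP TABLE recent AS
SELECT shipment_id, status
FROM `example-project.logistics.shipments`
WHERE ship_date >= cutoff;

SELECT status, COUNT(*) AS n FROM recent GROUP BY status;
"""

job = client.query(sql)      # the whole script runs as one job
for row in job.result():     # rows come from the script's final SELECT
    print(row.status, row.n)

print("bytes processed:", job.total_bytes_processed)
```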
Architecture & Governance:
Proven ability to make key architectural decisions related to data modeling, storage, and real-time processing.
Experience assisting with data governance, quality, and lineage practices using tools like Data Catalog, DLP, or Cloud Audit Logs (a basic quality-check sketch follows this list).
Experience translating business requirements into technical design documents.
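The posting does not name a quality tooling stack, so the following is a minimal, hypothetical data-quality gate assuming BigQuery as the warehouse: it computes a null rate and fails the run when the rate crosses an arbitrary 1% threshold.

```python
# A minimal data-quality gate sketch, assuming BigQuery; the table name and
# the 1% null-rate threshold are hypothetical choices, not posting details.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  COUNT(*) AS total_rows,
  COUNTIF(shipment_id IS NULL) AS null_ids
FROM `example-project.logistics.shipments`
"""

row = next(iter(client.query(sql).result()))
null_rate = row.null_ids / max(row.total_rows, 1)

# Fail loudly so an orchestrator (e.g., Cloud Composer) marks the run unhealthy.
if null_rate >= 0.01:
    raise ValueError(f"shipment_id null rate too high: {null_rate:.2%}")
print(f"quality check passed ({null_rate:.2%} nulls)")
```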
Work Schedule & Location:
Full-time onsite commitment required (Monday-Friday, 8 a.m.-5 p.m.), with remote flexibility on Fridays pending manager approval.
Desired Experience (Bonus):
Experience in the logistics or transportation industry (1-3 years).
Proven ability to mentor junior engineers and provide technical guidance.
Experience with monitoring and optimizing GCP resources for cost management (a dry-run cost-check sketch follows).
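As a hedged illustration of cost-aware GCP usage, the sketch below uses BigQuery's dry-run mode to estimate bytes scanned before a query executes; the 100 GiB budget and the query itself are made-up examples.

```python
# A hedged sketch of pre-flight cost control: a BigQuery dry run estimates
# bytes scanned without executing the query. The budget and query text are
# hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

job = client.query(
    "SELECT status FROM `example-project.logistics.shipments`",
    job_config=config,
)  # dry-run jobs finish immediately and scan nothing

budget_bytes = 100 * 1024**3  # 100 GiB
print(f"would scan {job.total_bytes_processed / 1024**3:.2f} GiB")
if job.total_bytes_processed > budget_bytes:
    raise RuntimeError("query exceeds scan budget; add filters or partitioning")
```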