Jobs via Dice

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a contract-to-perm basis, located in Chicago, Illinois, with a pay rate of $60-$70 per hour. Requires 5-6 years of data engineering experience, proficiency in Google Cloud Platform, and expertise in building data pipelines.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
560
-
πŸ—“οΈ - Date
October 16, 2025
-
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
πŸ“„ - Contract
Unknown
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Complex Queries #Dataflow #GCP (Google Cloud Platform) #MongoDB #Monitoring #Code Reviews #Data Governance #Scala #Data Modeling #Oracle #BigQuery #Cloud #Data Pipeline #Data Quality #Databases #Python #BI (Business Intelligence) #Data Loss Prevention #Terraform #Data Framework #Data Engineering #Storage #SQL (Structured Query Language) #Computer Science #Data Catalog #NoSQL #Data Integrity
Role description
job summary: Senior Data Engineer

We are seeking a Senior Data Engineer to lead the design, development, and maintenance of scalable, high-performance data solutions focused on data centralization, governance, and integrity. This is a highly technical role requiring expertise in Google Cloud Platform (GCP) native services (BigQuery, Dataflow, Dataproc) to build automated, cost-efficient, complex data pipelines. The ideal candidate will be a technical leader who makes key architectural decisions, implements robust data frameworks, and mentors junior engineers. The position requires a strong commitment to working full-time onsite (with potential remote flexibility on Fridays) and to leveraging data engineering expertise to drive business insights and analytics.

location: Chicago, Illinois
job type: Contract to Perm
salary: $60 - $70 per hour
work hours: 8am to 5pm
education: No Degree Required

responsibilities:
- Design and build large-scale, automated, cost-efficient data pipelines integrated across multiple platforms and disparate data sources using GCP native services (e.g., BigQuery, Dataflow, Dataproc); a minimal illustrative sketch of this kind of pipeline appears at the end of this posting
- Make key architectural decisions related to data modeling, storage, real-time processing, and orchestration
- Work with complex multi-statement queries, enhance their performance, and troubleshoot performance issues (see the second sketch at the end of this posting)
- Implement scalable, reusable, efficient data frameworks that consistently ensure data quality, data integrity, and performance
- Assist in establishing data governance, quality, and lineage practices using tools like Data Catalog, Data Loss Prevention (DLP), or Cloud Audit Logs
- Create and maintain intermediate-to-advanced Terraform scripts
- Mentor junior and mid-level engineers, conduct code reviews, and provide technical guidance on best practices
- Translate business requirements into technical design documents and implementation plans
- Participate in the support rotation, proactively identify issues, and implement resolutions as appropriate

additional desired duties/responsibilities:
- Monitor and optimize GCP resources to manage costs effectively
- Stay current with the latest GCP services and features
- Evaluate and recommend new tools and technologies to improve data engineering practices

qualifications:

Experience & Education:
- 5-6 years of experience as a Data Engineer or in a comparable Business Intelligence role
- Bachelor of Science in Information Technology, Computer Science, or a related technical field preferred but not required

Cloud & Pipeline Expertise (Required):
- Proficiency with Google Cloud Platform (GCP) offerings is required
- Expertise in designing and building large-scale, automated data pipelines using GCP native services (e.g., BigQuery, Dataflow, Dataproc)
- Experience with Terraform, creating and maintaining intermediate-to-advanced infrastructure-as-code scripts

Database & Coding Skills:
- 4-5 years minimum of experience with relational databases (MS SQL, Postgres, or Oracle)
- 3-4 years of experience with NoSQL databases (Capella or MongoDB)
- 2-3 years of experience with Python or a similar language
- Ability to work with complex multi-statement queries, enhance their performance, and troubleshoot issues

Architecture & Governance:
- Proven ability to make key architectural decisions related to data modeling, storage, and real-time processing
- Experience assisting with data governance, quality, and lineage practices using tools like Data Catalog, DLP, or Cloud Audit Logs
- Experience translating business requirements into technical design documents

Work Schedule & Location:
- Full-time onsite commitment required (Monday-Friday, 8am-5pm), with remote flexibility on Fridays pending manager approval

Desired Experience (Bonus):
- 1-3 years of experience in the logistics or transportation industry
- Proven ability to mentor junior engineers and provide technical guidance
- Experience monitoring and optimizing GCP resources for cost management

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.

At Randstad Digital, we welcome people of all abilities and want to ensure that our hiring and interview process meets the needs of all applicants. If you require a reasonable accommodation to make your application or interview experience a great one, please contact us.

Pay offered to a successful candidate will be based on several factors, including the candidate's education, work experience, work location, specific job duties, certifications, etc. In addition, Randstad Digital offers a comprehensive benefits package including medical, prescription, dental, vision, AD&D, and life insurance offerings, short-term disability, and a 401K plan (all benefits are based on eligibility).

This posting is open for thirty (30) days.
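For context on the pipeline responsibilities above, here is a minimal sketch of an Apache Beam pipeline of the kind that could run on Dataflow and load results into BigQuery. This is an illustration only, not the employer's actual pipeline; all names (example-project, example-bucket, the analytics.orders table, and the event fields) are hypothetical.

    # Minimal Beam pipeline sketch: read JSON events from Cloud Storage,
    # drop malformed records, and append the rest to a BigQuery table.
    import json
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        # "DataflowRunner" submits the job to the managed Dataflow service;
        # the default DirectRunner is handy for local testing.
        options = PipelineOptions(
            runner="DataflowRunner",
            project="example-project",                # hypothetical project id
            region="us-central1",
            temp_location="gs://example-bucket/tmp",  # hypothetical bucket
        )
        with beam.Pipeline(options=options) as p:
            (
                p
                | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
                | "Parse" >> beam.Map(json.loads)
                | "KeepValid" >> beam.Filter(lambda e: e.get("order_id") is not None)
                | "Write" >> beam.io.WriteToBigQuery(
                    "example-project:analytics.orders",  # hypothetical table
                    schema="order_id:STRING,amount:FLOAT,ts:TIMESTAMP",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )

    if __name__ == "__main__":
        run()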
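The "complex multi-statement queries" requirement likely refers to BigQuery scripting. Below is a minimal sketch, again assuming the same hypothetical project and table: it declares a script variable, stages a temp table, and aggregates from it in one multi-statement query.

    # Minimal BigQuery scripting sketch. For a multi-statement query,
    # result() yields the rows produced by the final SELECT statement.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # hypothetical project id

    sql = """
    DECLARE cutoff DATE DEFAULT DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY);

    CREATE TEMP TABLE recent AS
    SELECT order_id, amount
    FROM `example-project.analytics.orders`  -- hypothetical table
    WHERE DATE(ts) >= cutoff;

    SELECT COUNT(*) AS recent_orders, SUM(amount) AS revenue FROM recent;
    """

    for row in client.query(sql).result():
        print(dict(row))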