

GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with 10+ years of experience in Data Engineering, focusing on cloud solutions, ETL design, and data architecture. The contract is for 12+ months, located in Dearborn, MI, with a pay rate of $55.00 - $65.00 hourly.
Country
United States
Currency
$ USD
Day rate
520
Date discovered
July 3, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
W2 Contractor
Security clearance
Yes
Location detailed
Dearborn, MI
Skills detailed
#Datasets #Data Lake #Data Architecture #Security #Data Science #Cloud #Bash #Data Pipeline #GCP (Google Cloud Platform) #Python #Database Administration #Data Engineering #API (Application Programming Interface) #Spark (Apache Spark) #Data Warehouse #Visualization #ETL (Extract, Transform, Load) #Unix #AI (Artificial Intelligence) #BI (Business Intelligence) #Groovy #Databases #Computer Science #Monitoring #PySpark #REST (Representational State Transfer) #REST API #PostgreSQL #ML (Machine Learning)
Role description
Akkodis is seeking multiple GCP Data Engineers for a 12+ month contract position with one of our leading Automotive Industry Direct Clients in Dearborn, MI (hybrid schedule). The ideal candidate will have a minimum of 10 years of overall Data Engineering experience with GCP and Python.
Pay Range: $55.00 - $65.00 hourly on W2 (all inclusive); the pay range may be negotiable based on experience, education, geographic location, and other factors.
Position Description:
• Design data solutions in the cloud or on premises, using the latest data services, products, technologies, and industry best practices.
• Experience migrating legacy data environments with a focus on performance and reliability.
• Data architecture contributions include assessing and understanding data sources, data models and schemas, and data workflows.
• Ability to assess, understand, and design ETL jobs, data pipelines, and workflows.
• BI and data visualization work includes assessing, understanding, and designing reports, creating dynamic dashboards, and setting up data pipelines in support of dashboards and reports.
• Data science focus on designing machine learning and AI applications and MLOps pipelines.
• Address technical inquiries concerning customization, integration, enterprise architecture, and general features/functionality of data products.
• Experience crafting data lakehouse solutions in GCP, including relational and vector databases, data warehouses, data lakes, and distributed data systems.
• Must have PySpark API processing knowledge using resilient distributed datasets (RDDs) and DataFrames (see the sketch after this list).
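To illustrate the last point, here is a minimal PySpark sketch showing the same aggregation expressed once with the DataFrame API and once with RDD transformations. The GCS path and the column names (customer_id, amount) are hypothetical placeholders, not details from the role description.

```python
# Minimal sketch: one aggregation written two ways, as a DataFrame and as an RDD.
# The bucket path and column names below are illustrative assumptions only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-example").getOrCreate()

# DataFrame API: read a CSV, cast the amount column, group, and aggregate.
orders = (
    spark.read.option("header", True)
    .csv("gs://example-bucket/orders/*.csv")  # hypothetical GCS path
    .withColumn("amount", F.col("amount").cast("double"))
)
totals_df = orders.groupBy("customer_id").agg(F.sum("amount").alias("total"))

# Equivalent aggregation using the underlying RDD of Row objects.
totals_rdd = (
    orders.rdd
    .map(lambda row: (row["customer_id"], row["amount"] or 0.0))
    .reduceByKey(lambda a, b: a + b)
)

totals_df.show(5)
print(totals_rdd.take(5))
```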
Skills Preferred:
• Ability to write Bash, Python, and Groovy scripts to help configure and administer tools.
• Experience installing applications on VMs, monitoring performance, and tailing logs on Unix.
• PostgreSQL database administration skills are preferred.
• Python experience and experience developing REST APIs (a minimal sketch follows this list).
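As a rough illustration of the last two preferred skills, here is a minimal Python sketch of a REST endpoint (using Flask) that reads an aggregate from PostgreSQL via psycopg2. The connection settings, table, and column names are hypothetical placeholders, not requirements from the posting.

```python
# Minimal Flask sketch: one REST endpoint backed by PostgreSQL via psycopg2.
# Connection parameters and the orders table are illustrative assumptions only.
from flask import Flask, jsonify
import psycopg2

app = Flask(__name__)

def get_conn():
    # Hypothetical connection settings; in practice these come from config/secrets.
    return psycopg2.connect(host="localhost", dbname="analytics",
                            user="etl_user", password="change-me")

@app.route("/customers/<int:customer_id>/total", methods=["GET"])
def customer_total(customer_id):
    # Sum the customer's order amounts in a single parameterized query.
    with get_conn() as conn, conn.cursor() as cur:
        cur.execute(
            "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE customer_id = %s",
            (customer_id,),
        )
        (total,) = cur.fetchone()
    return jsonify({"customer_id": customer_id, "total": float(total)})

if __name__ == "__main__":
    app.run(port=8080)
```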
Experience Required:
• 10+ years
Education Required:
• Bachelor's degree in Computer Science, Computer Information Systems, or equivalent experience.
Education Preferred:
• Master's degree in Data Science.
Equal Opportunity Employer/Veterans/Disabled:
Benefit offerings available for our associates include medical, dental, vision, life insurance, short-term disability, additional voluntary benefits, an EAP program, commuter benefits, and a 401K plan. Our benefit offerings provide employees the flexibility to choose the type of coverage that meets their individual needs. In addition, our associates may be eligible for paid leave including Paid Sick Leave or any other paid leave required by Federal, State, or local law, as well as Holiday pay where applicable. Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs that are direct hires to a client.
To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit https://www.akkodis.com/en/privacy-policy.
The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:
• The California Fair Chance Act
• Los Angeles City Fair Chance Ordinance
• Los Angeles County Fair Chance Ordinance for Employers
• San Francisco Fair Chance Ordinance
Thanks & Regards
Aditya Agnihotri
Sr. Resource Development Manager
Email: aditya.agnihotri@akkodisgroup.com
Direct: +1(610)-472-0979
LinkedIn: https://www.linkedin.com/in/aditya-agnihotri-7300722061/
(An Adecco Group Company)
World Leader in IT and Engineering Workforce Solutions
www.akkodis.com
"Believe you can and you're halfway there." – Theodore Roosevelt