

Senior GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior GCP Data Engineer on a contract exceeding 6 months, offering $84 to $90 per hour. Key skills include GCP tools, Java, Python, and data engineering. A GCP Data Engineer Certification is required; automotive industry experience is preferred.
Country: United States
Currency: $ USD
Day rate: $720
Date discovered: June 24, 2025
Project duration: More than 6 months
Location type: Unknown
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Allen Park, MI
Skills detailed:
#REST API #Data Engineering #AI (Artificial Intelligence) #Databases #Java #SonarQube #REST (Representational State Transfer) #NoSQL #Data Pipeline #Migration #Spark (Apache Spark) #Airflow #Microservices #Python #BigQuery #Cloud #PostgreSQL #SAP #Security #Computer Science #Data Migration #Agile #GitHub #GCP (Google Cloud Platform) #Data Processing #ETL (Extract, Transform, Load) #Observability #Storage #SQL (Structured Query Language) #Dataflow #Scala #Kafka (Apache Kafka) #Terraform #MySQL #DevOps
Role description
CLIENT WORKS WITH W2s ONLY. NO SUBCONTRACTING OR C2C.
About the Role
Join a multi-year transformation initiative focused on revolutionizing Material Requirements Planning (MRP) and Inventory Management. As a GCP Data Engineer, you'll play a critical role in designing and deploying a data-centric architecture on Google Cloud Platform (GCP) to support the Materials Management Platform (MMP). This platform integrates data across Product Development, Manufacturing, Finance, Purchasing, and Supply Chain systems, both modern and legacy.
Responsibilities
• Architect and implement scalable data solutions using GCP tools such as BigQuery, Dataflow, Dataproc, Cloud SQL, Pub/Sub, and Vertex AI.
• Build and maintain ETL pipelines to ingest data from diverse sources.
• Develop data processing workflows using Java and Python.
• Design and optimize data models for efficient storage and retrieval.
• Manage SQL and NoSQL databases (e.g., Bigtable, Firestore).
• Apply CI/CD practices using tools like GitHub Actions, Tekton, and Terraform.
• Monitor and troubleshoot data pipelines using GCP's observability tools.
• Enforce code quality and security using SonarQube, Checkmarx, Fossa, and Cycode.
• Collaborate with cross-functional teams to define data requirements.
• Mentor junior engineers and contribute to a culture of knowledge sharing.
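To give candidates a feel for the extract-transform-load work described above, here is a deliberately minimal, stdlib-only Python sketch of an ETL flow (the sample data, table name, and field names are hypothetical; production pipelines for this role would use Dataflow, Pub/Sub, and BigQuery rather than in-memory SQLite):

```python
# Illustrative ETL sketch: extract CSV records, transform them, load into SQLite.
# All data and identifiers are made up for demonstration purposes only.
import csv
import io
import sqlite3

RAW_CSV = """part_id,plant,qty
P-100,Allen Park,40
P-101,Allen Park,0
P-102,Dearborn,15
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: cast quantities to int and drop zero-quantity rows."""
    return [
        {"part_id": r["part_id"], "plant": r["plant"], "qty": int(r["qty"])}
        for r in rows
        if int(r["qty"]) > 0
    ]

def load(rows: list[dict], conn: sqlite3.Connection) -> None:
    """Load: write transformed rows into an inventory table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS inventory (part_id TEXT, plant TEXT, qty INTEGER)"
    )
    conn.executemany("INSERT INTO inventory VALUES (:part_id, :plant, :qty)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(qty) FROM inventory").fetchone()[0]
```

The same extract/transform/load separation carries over to cloud-scale tooling; only the I/O layers change.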
Qualifications
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 8+ years of experience in data engineering and software product development.
• 4+ years of hands-on experience with GCP and cloud-native data engineering.
• Proficiency in at least three of the following: Java, Python, Spark, Scala, SQL.
• Experience with Airflow, BigQuery, MySQL/PostgreSQL, and Kafka or Pub/Sub.
• Strong understanding of microservices and REST APIs.
• Familiarity with DevOps and agile delivery methodologies.
Required Skills
• GCP Data Engineer Certification (Required)
Preferred Skills
• Master's degree in a related field.
• Experience with SAP S/4HANA, IDoc processing, and SAP data migration.
• Automotive industry experience.
• Prior work in an onshore/offshore support model.
• GCP Cloud Professional Certification (Preferred)
Pay range and compensation package
Pay Range: $84 to $90 an hour
Benefits: Health, Vision, Dental, Paid time off, 401(k), Flexible schedule