

Stefanini North America and APAC
GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Dearborn, MI, offering a contract length of "Unknown" and a pay rate of "Unknown." Requires 7+ years in data engineering, proficiency in Python, and experience with GCP services, AI/ML, and data warehousing.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 15, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dearborn, MI
-
🧠 - Skills detailed
#Data Warehouse #GIT #Data Lineage #Data Processing #Observability #FastAPI #Kubernetes #Snowflake #Dataflow #GCP (Google Cloud Platform) #Microservices #Data Engineering #API (Application Programming Interface) #Apache Spark #Security #Storage #Airflow #Classification #Data Governance #Databases #Cloud #Big Data #REST (Representational State Transfer) #REST API #Spark (Apache Spark) #AI (Artificial Intelligence) #Programming #Redshift #Consulting #Version Control #Python #Terraform #Data Manipulation #Docker #ETL (Extract, Transform, Load) #ML (Machine Learning) #SQL (Structured Query Language) #IAM (Identity and Access Management) #Data Science #BigQuery #Data Pipeline
Role description
Stefanini Group is hiring!
Note: Only W2 applicants.
Stefanini is looking for a GCP Data Engineer (Dearborn, MI)
For quick apply, please reach out to Adil Khan at 248-728-6424 / adil.khan@stefanini.com
We are looking for a candidate responsible for designing, building, and maintaining data solutions, including data infrastructure and pipelines, for collecting, storing, processing, and analyzing large volumes of data efficiently and accurately.
Responsibilities
• Architect and scale end-to-end data pipelines on GCP, transforming complex telemetry and enterprise data into high-quality, analytics-ready assets using Medallion architectures.
• Lead the implementation of robust CI/CD workflows, rigorous data governance, and security controls while mentoring junior talent and driving engineering best practices. Collaborate with cross-functional stakeholders and optimize cloud performance to keep the data platform secure, cost-effective, and highly available for critical business insights.
• Use Terraform, Git, and Airflow to ensure reproducible, secure, and cost-optimized cloud infrastructure.
• Prioritize data lineage, PII protection, and observability to maintain high trust in data assets.
• Act as a bridge between technical teams (Data Science, Security) and business stakeholders to deliver self-service analytics.
• Demonstrate a strong understanding of Generative AI principles and architectures, including Large Language Models (LLMs) and Retrieval-Augmented Generation (RAG) systems.
• Proven experience in building and deploying RAG systems, including the use of Vector Databases.
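The RAG responsibility above centers on retrieving relevant context for an LLM. As a minimal sketch of the retrieval step, here is a toy in-memory vector store ranked by cosine similarity; the document names and embedding values are invented for illustration, and a real system would use an embedding model and a managed vector database:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical pre-computed embeddings keyed by document name.
DOCS = {
    "warranty_policy": [0.9, 0.1, 0.0],
    "telemetry_spec":  [0.1, 0.8, 0.3],
    "hr_handbook":     [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=2):
    # Rank documents by similarity and return the top-k names; these
    # would then be passed to an LLM as grounding context.
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)
    return ranked[:k]

print(retrieve([0.85, 0.15, 0.05], k=1))  # nearest to warranty_policy
```

In production the same shape holds, but the vectors come from an embedding API and the similarity search runs inside the vector database rather than in application code.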
Skills Required
• GCP, Big Data, Data Warehousing, Artificial Intelligence & Expert Systems, API
Skills Preferred
• Google Cloud Platform, Familiarity with advanced GCP services beyond core compute and storage, such as Vertex AI, Dataflow, Cloud Composer (Airflow), and BigQuery ML. For example, using Cloud Composer to orchestrate scheduled data pipelines that feed into a BigQuery data warehouse.
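At its core, the Cloud Composer example above is dependency-ordered task execution. A real DAG would be declared with Airflow operators; as a language-level sketch using only the standard library (the task names and dependencies here are made up), the scheduling idea reduces to a topological ordering:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline: each task maps to the set of tasks it depends
# on, mirroring how a Cloud Composer (Airflow) DAG wires operators.
dag = {
    "extract_telemetry": set(),
    "clean_telemetry": {"extract_telemetry"},
    "load_bigquery": {"clean_telemetry"},
    "refresh_dashboard": {"load_bigquery"},
}

# static_order yields tasks so every dependency runs before its dependents.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Cloud Composer adds scheduling, retries, and monitoring on top of this ordering, but the dependency graph is the contract an engineer actually authors.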
Experience Required
• Senior Data Engineer with 7+ years in data engineering and 10+ years in software development, including AI/ML experience.
• Proficiency in Python programming.
• Experience deploying and managing services on Google Cloud Platform, including Compute Engine, Cloud Storage, IAM, and Cloud Functions. For example, designing and implementing a cloud-native application architecture using GKE (Google Kubernetes Engine) with Cloud SQL and Pub/Sub.
• Experience working with large-scale data processing frameworks such as Apache Spark, Dataflow, or BigQuery.
• Experience designing and maintaining data warehouse solutions (e.g., BigQuery, Snowflake, Redshift). For example, modeling a star schema for a retail analytics platform that supports reporting on sales, inventory, and customer behavior.
• Experience developing or integrating AI/ML models and rule-based expert systems. For example, building a classification model using Vertex AI to predict customer churn, or implementing a rule engine that automates underwriting decisions.
• Experience designing, building, and consuming RESTful or gRPC APIs. For example, developing a versioned REST API with OAuth 2.0 authentication that serves as the integration layer between a mobile application and backend microservices.
• Solid experience with SQL for data manipulation and querying.
• Hands-on experience with Google Cloud Platform (GCP) services relevant to AI/ML.
• Basic understanding and practical experience with Machine Learning model fine-tuning.
• Familiarity with data engineering concepts and practices.
• Expertise in prompt engineering techniques for interacting with LLMs.
• Experience with the OpenAI SDK.
• Experience developing robust APIs, preferably with FastAPI.
• Proficiency with version control systems (e.g., Git).
• Experience with containerization technologies (e.g., Docker).
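The star-schema experience bullet above can be sketched with SQLite from the Python standard library; the table names, columns, and rows are invented for illustration, standing in for a warehouse like BigQuery:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables describe products and customers; the fact table
# records sales events keyed to both dimensions (a minimal star schema).
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    customer_id INTEGER REFERENCES dim_customer(customer_id),
    amount REAL
);
INSERT INTO dim_product VALUES (1, 'brake pad'), (2, 'oil filter');
INSERT INTO dim_customer VALUES (1, 'Midwest'), (2, 'Southeast');
INSERT INTO fact_sales VALUES (1, 1, 1, 120.0), (2, 2, 1, 35.0), (3, 1, 2, 240.0);
""")

# Typical reporting query: revenue per product across all regions.
rows = cur.execute("""
    SELECT p.name, SUM(f.amount) AS revenue
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY revenue DESC
""").fetchall()
print(rows)
```

The same fact/dimension split scales to the retail analytics scenario described above: new dimensions (store, date) join to the fact table without reshaping existing data.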
Education Required
• Bachelor's Degree
Education Preferred
• Certification Program
About Stefanini Group
The Stefanini Group is a global provider of offshore, onshore, and nearshore outsourcing, IT digital consulting, systems integration, application, and strategic staffing services to Fortune 1000 enterprises around the world. We operate across the Americas, Europe, Africa, and Asia, serving more than four hundred clients across a broad spectrum of markets, including financial services, manufacturing, telecommunications, chemical services, technology, public sector, and utilities. Stefanini is a CMM Level 5 IT consulting company with a global presence.






