

GCP Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer in Dearborn, MI (Hybrid); the contract length and pay rate are unknown. It requires 5+ years in analytics application development and SQL, and 3+ years in GCP. GCP Professional Data Engineer certification is preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 27, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Dearborn, MI
Skills detailed: #SQL (Structured Query Language) #Apache Beam #Computer Science #Batch #Data Engineering #Teradata #AI (Artificial Intelligence) #Google Cloud Storage #Consulting #Dataflow #Data Lineage #Java #Lean #Automation #ML (Machine Learning) #Sybase #Data Integrity #Informatica #"ETL (Extract, Transform, Load)" #Data Processing #GCP (Google Cloud Platform) #dbt (data build tool) #Data Lake #Data Science #Airflow #Leadership #Terraform #Deployment #Data Ingestion #Storage #Big Data #Data Architecture #Cloud #Python #TensorFlow #Git #Agile #Data Warehouse #Datasets #Jenkins #Data Management #BigQuery
Role description
Job Description
Stefanini Group is hiring!
Stefanini is looking for a GCP Data Engineer, Dearborn, MI (Hybrid)
For quick apply, please reach out to Saurabh Kapoor at 248-582-6559 or saurabh.kapoor@stefanini.com
We're seeking an experienced GCP Data Engineer who can build cloud analytics platforms to meet ever-expanding business requirements with speed and quality, using lean Agile practices. You will analyze and manipulate large datasets for the enterprise, activating data assets to support Enabling Platforms and Analytics in the Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solutions and the operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates with a broad set of technology skills across these areas who can demonstrate the ability to design the right solutions, with the appropriate combination of GCP and third-party technologies, for deployment on Google Cloud Platform.
Job Requirements
Responsibilities
• Work in a collaborative environment, including pairing and mobbing with other cross-functional engineers
• Work on a small agile team to deliver working, tested software
• Work effectively with fellow data engineers, product owners, data champions, and other technical experts
• Demonstrate technical knowledge and leadership skills and advocate for technical excellence
• Develop exceptional analytics data products using streaming and batch ingestion patterns in the Google Cloud Platform, with solid data warehouse principles
• Be the subject matter expert in data engineering and GCP tool technologies
• Implement methods for automation of all parts of the pipeline to minimize labor in development and production
• Ensure key business drivers are captured in collaboration with product management; this includes designing and deploying pipelines with automated data lineage
• Identify, develop, evaluate, and summarize proofs of concept to prove out solutions
• Test and compare competing solutions and report a point of view on the best solution
• Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine (a minimal sketch of one such pipeline follows below)
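As an illustration of the streaming ingestion patterns named above, here is a minimal sketch of a Pub/Sub-to-BigQuery pipeline in Apache Beam (Python), runnable on Dataflow. The project, topic, bucket, table, and schema names are hypothetical placeholders, not details from this posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical project, bucket, topic, and table names, for illustration only.
options = PipelineOptions(
    streaming=True,
    project="my-project",
    runner="DataflowRunner",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        # Read raw messages from a Pub/Sub topic as bytes.
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")
        # Decode and parse each message into a dict matching the table schema.
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        # Append rows to a BigQuery table, creating it if it does not exist.
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )

The same pipeline runs in batch mode by swapping the Pub/Sub source for a bounded source such as beam.io.ReadFromText and dropping streaming=True.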
Experience Required
• In-depth understanding of Google's product technology (or other cloud platforms) and the underlying architectures
• 5+ years of analytics application development experience required
• 5+ years of SQL development experience
• 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale
• Experience working in GCP-based big data deployments (batch/real-time) leveraging Terraform, BigQuery, Google Cloud Storage, Pub/Sub, Dataflow, Dataproc, Airflow, etc. (see the orchestration sketch after this list)
• 2+ years of professional development experience in Java or Python, and Apache Beam
• Extracting, loading, transforming, cleaning, and validating data
• Designing pipelines and architectures for data processing
• 1+ year of designing and building CI/CD pipelines
• Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment
• Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build subject areas and reusable data products
• Experience working with architects to evaluate and productionalize appropriate GCP tools for data ingestion, integration, presentation, and reporting
• Experience working with all stakeholders to formulate business problems as technical data requirements, and to identify and implement technical solutions
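For the orchestration side of such deployments, here is a minimal sketch of a daily batch load in Airflow (Python), moving newline-delimited JSON files from Google Cloud Storage into BigQuery. It assumes Airflow 2.4+ with the Google provider package installed; the bucket, dataset, and table names are hypothetical.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_gcs_to_bigquery",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # one run per execution date
    catchup=False,
) as dag:
    # Load the day's files from a landing bucket into a BigQuery table.
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="my-landing-bucket",                      # hypothetical bucket
        source_objects=["events/{{ ds }}/*.json"],       # templated by run date
        destination_project_dataset_table="my-project.analytics.events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )

In Cloud Composer (managed Airflow on GCP), the same DAG file is simply dropped into the environment's dags/ bucket.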
Experience Preferred
• Experience building machine learning solutions using TensorFlow, BigQuery ML, AutoML, and Vertex AI (see the sketch after this list)
• Experience building solution architecture, provisioning infrastructure, and delivering secure, reliable data-centric services and applications in GCP
• Experience with Dataplex or Informatica EDC preferred
• Experience with development ecosystems such as Git, Jenkins, and CI/CD
• Exceptional problem-solving and communication skills
• Experience working with dbt/Dataform
• Experience working with Agile and Lean methodologies
• Team player with attention to detail
• Performance tuning experience
• 2+ years mentoring engineers
• In-depth software engineering knowledge
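On the machine learning side, here is a minimal sketch of training a BigQuery ML model from Python with the google-cloud-bigquery client; training runs entirely inside BigQuery. The project, dataset, table, and column names are hypothetical placeholders.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project id

# CREATE MODEL executes inside BigQuery; no data leaves the warehouse.
query = """
CREATE OR REPLACE MODEL `analytics.churn_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
SELECT tenure_months, monthly_spend, support_tickets, churned
FROM `analytics.customer_features`
"""
client.query(query).result()  # blocks until training completes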
Education Required
• Bachelor's degree in computer science, IT, or a related scientific field
Education Preferred
• GCP Professional Data Engineer certification
• Master's degree in computer science or a related field
• Listed salary ranges may vary based on experience, qualifications, and local market; some positions may include bonuses or other incentives
Stefanini takes pride in hiring top talent and developing relationships with our future employees. Our talent acquisition teams will never make an offer of employment without first having a phone conversation with you. Those conversations will include a description of the job for which you have applied, and we will also speak with you about the process, including interviews and job offers.
About Stefanini Group
The Stefanini Group is a global provider of offshore, onshore, and nearshore outsourcing, IT digital consulting, systems integration, application, and strategic staffing services to Fortune 1000 enterprises around the world. We operate across the Americas, Europe, Africa, and Asia, serving more than four hundred clients across a broad spectrum of markets, including financial services, manufacturing, telecommunications, chemical services, technology, the public sector, and utilities. Stefanini is a CMM Level 5 IT consulting company with a global presence.