

EPITEC
Data Engineering Engineer 3
Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineering Engineer 3 position based in Dearborn, Michigan, offering a contract for 40 hours per week at $72-76 per hour. Key skills include GCP, Python, and SQL, with 5-7 years of relevant experience required.
Country
United States
Currency
$ USD
Day rate
$608
Date
March 6, 2026
Duration
Unknown
Location
Hybrid
Contract
Unknown
Security
Unknown
Location detailed
Dearborn, MI
Skills detailed
#Databases #Data Architecture #Documentation #Datasets #Data Pipeline #SQL (Structured Query Language) #Cloud #Microservices #Security #Airflow #Dataflow #Data Governance #Scala #Data Engineering #Java #BigQuery #Computer Science #Terraform #Infrastructure as Code (IaC) #Data Ingestion #GCP (Google Cloud Platform) #Automation #Python #NoSQL
Role description
Job Title: Data Engineering Engineer 3
Location: Dearborn, Michigan
Job Type: Contract
Expected Hours Per Week: 40
Schedule: Monday–Friday, 8–5, hybrid
Pay Range: $72–76 per hour
We're seeking an experienced Data Pipeline Architect & Builder to design, build, and scale high-quality data pipelines on Google Cloud Platform (GCP). In this role, you'll lead end-to-end data engineering initiatives, ensuring reliable, secure, and performant data solutions that drive real business impact.
What You'll Do
• Design, build, and maintain scalable data ingestion and curation pipelines from diverse data sources
• Develop standardized, high-quality datasets optimized for analytics and insights
• Build and manage cloud data platforms using GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, DataProc)
• Implement data governance, security, and access controls using GCP-native capabilities
• Orchestrate workflows and infrastructure using Astronomer (Airflow) and Terraform (IaC)
• Monitor, optimize, and scale pipelines for performance, reliability, and cost efficiency
• Collaborate with architects, engineers, and stakeholders to define best practices and design patterns
• Automate data platform processes to improve reliability and reduce manual effort
• Translate business requirements into efficient data solutions and reusable assets
• Create and maintain clear technical documentation
Required Skills
• Google Cloud Platform (GCP)
• Cloud Architecture
• Python
Experience & Qualifications
• Bachelor's degree in Computer Science, IT, Data Analytics, or a related field (or equivalent experience)
• 5–7 years of Data Engineering or Software Engineering experience
• 2+ years building and deploying cloud-based data platforms (GCP preferred)
• Strong proficiency in SQL, Python, and Java
• Hands-on experience with BigQuery, Dataflow, DataProc, and relational/NoSQL databases
• Knowledge of microservices, SOA, and cloud data architectures
• Experience with CI/CD, Terraform, and automation frameworks
• Familiarity with data governance, encryption, and data masking
• Proven ability to optimize cloud cost and compute performance
Why Join Us
• Work on modern, scalable cloud data platforms
• Influence architecture, standards, and best practices
• Collaborate with highly skilled, cross-functional teams
• Make a direct impact on business outcomes through data
