

EPITEC
Data Engineering Senior
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in Dearborn, Michigan, on a W2 contract for 40 hours per week, paying $75-80 per hour. Key skills include GCP, Python, SQL, and experience with data pipelines and AI use cases.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date
April 17, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dearborn, MI
-
🧠 - Skills detailed
#Data Processing #ETL (Extract, Transform, Load) #BigQuery #Data Engineering #Cloud #Scala #GCP (Google Cloud Platform) #ML (Machine Learning) #Data Science #Data Pipeline #Data Warehouse #SQL (Structured Query Language) #Databases #Data Lake #Python #Monitoring #AI (Artificial Intelligence) #Airflow #Terraform
Role description
Job Title: Senior Data Engineer
Location: Dearborn, Michigan
Job Type: W2 Contract
Expected hours per week: 40 hours
Schedule: Hybrid - 4 days onsite, 1 day remote
Pay Range: $75-80 an hour
Job Description
This Senior Data Engineer designs and runs large-scale data systems on Google Cloud so teams can reliably use data for reporting, analytics, and AI. They build secure, scalable pipelines that move raw data into clean, analytics-ready formats and help power advanced use cases like machine learning and generative AI. They also set technical standards, mentor others, and ensure the data platform is reliable, cost-effective, and well governed.
• Meet with business partners, data scientists, and engineers to understand what data is needed and how it will be used
• Build and maintain data pipelines on GCP that ingest, clean, transform, and store large volumes of data
• Design and manage data platforms like data warehouses and data lakes (primarily BigQuery-based)
• Write Python and SQL to automate data processing, monitoring, and quality checks
• Develop and support APIs that allow applications and AI systems to access data
• Implement CI/CD, Terraform, and Airflow to keep data systems reliable, secure, and repeatable
• Ensure data is well governed by tracking lineage, protecting sensitive data, and monitoring performance
• Support AI and generative AI use cases, including RAG systems, vector databases, and LLM integrations
• Review work from junior engineers, mentor the team, and promote data engineering best practices
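To give a flavor of the "automate data processing, monitoring, and quality checks" responsibility above, here is a minimal, illustrative Python sketch of a row-level quality gate of the kind a pipeline might run before loading data into a warehouse such as BigQuery. The column names and rules are hypothetical, not taken from this posting.

```python
# Illustrative only: a basic completeness/validity check a pipeline might
# apply before loading rows into an analytics table. REQUIRED_COLUMNS and
# the rules below are hypothetical examples, not requirements of this role.

REQUIRED_COLUMNS = {"order_id", "amount", "created_at"}

def validate_rows(rows):
    """Split rows into (valid, errors) using simple quality rules."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = REQUIRED_COLUMNS - row.keys()
        if missing:
            errors.append(f"row {i}: missing columns {sorted(missing)}")
        elif row["amount"] is None or row["amount"] < 0:
            errors.append(f"row {i}: invalid amount {row['amount']}")
        else:
            valid.append(row)
    return valid, errors

rows = [
    {"order_id": 1, "amount": 19.99, "created_at": "2026-04-17"},
    {"order_id": 2, "amount": -5.00, "created_at": "2026-04-17"},
    {"order_id": 3, "created_at": "2026-04-17"},
]
valid, errors = validate_rows(rows)
print(len(valid), len(errors))  # → 1 2
```

In practice a check like this would typically run as a task in an orchestrator such as Airflow, with failures reported to monitoring rather than printed.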
Benefits: Medical, Dental, Vision, PTO & 401K
#INDOEM






