

YDC Pro
Data Engineers
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 7+ years of experience, focusing on Google Cloud Platform, BigQuery, and Apache Airflow. It offers a 12-month contract at a hybrid location in Detroit, requiring strong ETL/data integration skills and Agile experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 6, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Detroit, MI
-
🧠 - Skills detailed
#Microsoft Power BI #Jira #AI (Artificial Intelligence) #Data Lake #Agile #BI (Business Intelligence) #Data Mining #Stories #Scala #GCP (Google Cloud Platform) #Apache Airflow #Big Data #Migration #Airflow #Storage #Alteryx #Data Engineering #ETL (Extract, Transform, Load) #BigQuery #Data Pipeline #Data Warehouse #Data Integration #Automation #Programming #Cloud
Role description
We are hiring!
Data Engineers
Hybrid – 2 days a week on site
Detroit, Michigan, United States
Job Summary:
We are seeking an experienced Data Engineer with 7+ years of hands-on experience in designing, developing, and maintaining scalable data engineering solutions. The ideal candidate will have strong experience with Google Cloud Platform (GCP), BigQuery, Apache Airflow, ETL/data integration, and modern cloud-based data pipelines. Experience supporting migration initiatives from Alteryx to cloud-native platforms will be considered a strong advantage.
Location: Dearborn, MI (Detroit area)
Contract Duration: 12 Months
Key Responsibilities:
• Design, build, and maintain scalable data pipelines and data engineering solutions
• Develop and optimize data infrastructure for data collection, storage, transformation, and analytics
• Build and support data platforms including Data Warehouses, Data Lakes, and Lakehouse environments
• Design and maintain ETL/data integration workflows and automation processes
• Work closely with product managers, product owners, and cross-functional engineering teams
• Support performance tuning, troubleshooting, and continuous improvement initiatives
• Participate in Agile development processes including Epics, User Stories, and JIRA activities
• Develop production-grade software/services with focus on reliability and scalability
Required Skills & Experience:
• 7+ years of experience in Data Engineering
• Strong hands-on experience with Google Cloud Platform (GCP)
• Experience building and maintaining data pipelines using BigQuery and Apache Airflow
• Strong ETL/Data Integration experience
• Experience with Big Data and Analytics solutions
• Experience with Software/Application Development
• Strong understanding of Data Mining and Business Intelligence concepts
• Experience working in Agile environments
Preferred Qualifications:
• Experience migrating from Alteryx to cloud-based data platforms
• Experience with QlikSense and/or Power BI
• Experience with production support and platform operations
• Exposure to Test-Driven Development (TDD) and Extreme Programming (XP)
• Knowledge or exposure to AI/GenAI concepts is a plus
• Strong analytical, communication, and problem-solving skills
• Willingness to learn and adapt to new technologies
Please email your resume to mycareer@ydcpro.com




