

MethodHub
Senior Data Engineer
⭐ Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (SDET Lead) based in Auburn Hills, MI, lasting 1+ year, with a pay rate of "TBD." Required skills include Data Engineering, Python, PySpark, CI/CD, and Airflow, with 8+ years of relevant experience.
Country: United States
Currency: $ USD
Day rate: 520
Date: December 10, 2025
Duration: More than 6 months
Location: Remote
Contract: Unknown
Security: Unknown
Location detailed: Auburn Hills, MI
Skills detailed: #Data Engineering #Cloud #Docker #Spark (Apache Spark) #GitHub #Batch #Airflow #Agile #Data Architecture #Deployment #Documentation #Data Pipeline #Python #Scala #PySpark
Role description
Senior Data Engineer (SDET Lead)
Location: Auburn Hills, MI
Remote work: No
Duration: 1+ year
Mandatory Skills: Data Engineering, Python, PySpark, CI/CD, Airflow, Workflow Orchestration
Overall Experience: 8+ years of relevant experience
The Senior Data Engineer (SDET Lead) will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.
Key Responsibilities
1. Data Engineering: Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads (illustrative sketch below).
2. Workflow Orchestration: Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability (illustrative sketch below).
3. CI/CD Pipeline Development: Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions.
4. Testing & Quality: Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines.
5. Secure Development: Implement secure coding best practices and design patterns throughout the development lifecycle.
6. Collaboration: Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions.
7. Documentation: Create and maintain technical documentation, including process/data flow diagrams and system design artifacts.
8. Mentorship: Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices.
9. Troubleshooting: Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks.
10. Cross-Team Knowledge Sharing: Cross-train team members outside the project team (e.g., operations support) for full knowledge coverage.
In addition to all of the above skills, the role requires the following:
· A minimum of 7+ years of overall IT experience
· Experience with waterfall, iterative, and agile methodologies
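
For context, a minimal sketch of the kind of PySpark batch pipeline described in the responsibilities above. All paths, table names, and columns here are illustrative assumptions, not details from the posting.

```python
# Illustrative PySpark batch job: read raw orders, keep completed ones,
# aggregate daily revenue, and write the result as Parquet.
# Paths and column names are hypothetical examples.
from pyspark.sql import SparkSession, functions as F


def build_daily_revenue(spark, input_path, output_path):
    orders = spark.read.parquet(input_path)

    daily_revenue = (
        orders
        .filter(F.col("status") == "completed")            # drop cancelled/pending orders
        .withColumn("order_date", F.to_date("order_ts"))   # normalize timestamp to a date
        .groupBy("order_date")
        .agg(
            F.sum("amount").alias("total_revenue"),
            F.countDistinct("order_id").alias("order_count"),
        )
    )

    daily_revenue.write.mode("overwrite").parquet(output_path)
    return daily_revenue


if __name__ == "__main__":
    spark = SparkSession.builder.appName("daily_revenue").getOrCreate()
    build_daily_revenue(
        spark,
        "s3://example-bucket/raw/orders/",
        "s3://example-bucket/curated/daily_revenue/",
    )
    spark.stop()
```

Keeping the transform in a plain function such as build_daily_revenue also makes it straightforward to unit test against a local SparkSession, which is the kind of TDD practice the Testing & Quality responsibility calls for.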
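
Similarly, a minimal Airflow sketch of how such a job might be scheduled and monitored. The DAG id, schedule, and task bodies are hypothetical; a real deployment would typically submit the PySpark job through an operator or a cluster API rather than run it in-process.

```python
# Illustrative Airflow 2.x DAG (TaskFlow API): run a daily batch with retries.
# DAG id, schedule, and task bodies are examples only.
from datetime import datetime, timedelta

from airflow.decorators import dag, task


@dag(
    dag_id="daily_revenue_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
)
def daily_revenue_pipeline():
    @task
    def extract() -> str:
        # A real task might land raw files and return a batch/partition key.
        return "2025-01-01"

    @task
    def transform(batch_id: str) -> None:
        # A real task would trigger the PySpark job sketched above,
        # e.g. via a Spark submit operator or a container/cluster API.
        print(f"transforming batch {batch_id}")

    transform(extract())


daily_revenue_pipeline()
```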






