Sibitalent Corp

Lead ETL Test

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead ETL Test position in Dearborn, MI (hybrid), lasting 12 months on a contract basis; the pay rate is not disclosed. Key skills include Azure Databricks, PySpark, advanced SQL, and ETL testing experience. Local candidates are preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
March 7, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Corp-to-Corp (C2C)
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Dearborn, MI
-
🧠 - Skills detailed
#Datasets #Jira #Azure Data Factory #Databricks #Data Ingestion #Pytest #PySpark #Azure Databricks #"ETL (Extract, Transform, Load)" #Azure cloud #ADF (Azure Data Factory) #Data Reconciliation #Spark (Apache Spark) #SQL (Structured Query Language) #Delta Lake #Python #SQL Queries #API (Application Programming Interface) #Data Pipeline #Automation #Agile #Cloud #Azure #AI (Artificial Intelligence) #Data Engineering
Role description
Hi, hope you are doing well. IMMEDIATE INTERVIEW: Onshore Test Lead – Data / ETL Testing (Azure Databricks, PySpark) in Dearborn, MI – Hybrid (local candidates only). Please find the job details below and reply if you are interested in learning more about this opportunity.

Job Title: Onshore Test Lead – Data / ETL Testing (Azure Databricks, PySpark)
Location: Dearborn, MI – Hybrid (local candidates only)
Duration: 12 Months + Possible Extension
Start: Immediate
Engagement: Contract (C2C)

Job Overview
The client is seeking an experienced Onshore Test Lead with strong Data/ETL testing expertise to support analytics and AI-driven platforms. The role focuses on validating complex data pipelines, ETL processes, dashboards, and AI prompt-based tools within a modern Azure cloud data platform environment. This position requires hands-on experience with Azure Databricks, PySpark, and advanced SQL for data validation. Candidates with primarily UI automation experience (Selenium-only) will not be suitable.

Key Responsibilities
• Validate analytics dashboards and reporting metrics against underlying datasets.
• Perform ETL and data pipeline testing using Azure Databricks and Azure Data Factory.
• Conduct source-to-target data reconciliation across multiple data sources.
• Validate Delta Lake tables, schema changes, partitions, and incremental loads in Databricks.
• Write complex SQL queries and PySpark validations for ETL transformation testing.
• Test APIs supporting data ingestion and reporting workflows.
• Validate outputs of AI prompt-based tools against backend datasets.
• Develop or enhance test automation using Python / PyTest.
• Manage test cases, defects, and reporting using TestRail and Jira.
• Collaborate with developers, product teams, and offshore QA teams in Agile environments.
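As a rough illustration of the source-to-target reconciliation work these responsibilities describe, the sketch below compares row counts and per-key content hashes between a "source" and a "target" dataset. It uses plain Python dictionaries as stand-ins for Spark DataFrames so it runs anywhere; in Databricks the equivalent checks would typically be expressed in PySpark (e.g. `df.count()`, `exceptAll`, or aggregated hash columns). All names and sample data here are hypothetical, not taken from the client's environment.

```python
import hashlib

def row_hash(row: dict) -> str:
    """Deterministic hash of a row's values, canonicalized by sorted column name."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode()).hexdigest()

def reconcile(source: list[dict], target: list[dict], key: str) -> dict:
    """Source-to-target reconciliation: row counts plus per-key content comparison."""
    src = {r[key]: row_hash(r) for r in source}
    tgt = {r[key]: row_hash(r) for r in target}
    return {
        "count_match": len(source) == len(target),
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "extra_in_target": sorted(tgt.keys() - src.keys()),
        "mismatched": sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k]),
    }

# Hypothetical scenario: one row dropped and one value altered during the ETL load.
source = [
    {"id": 1, "amount": 100},
    {"id": 2, "amount": 200},
    {"id": 3, "amount": 300},
]
target = [
    {"id": 1, "amount": 100},
    {"id": 2, "amount": 250},
]
report = reconcile(source, target, key="id")
# report flags the dropped id 3 and the changed value for id 2.
```

In a PyTest suite, each field of `report` would become an assertion, so a dropped row or silently transformed value fails the pipeline test with a specific key rather than a vague count mismatch.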
Required Skills
• Azure Databricks
• Delta Lake
• PySpark (DataFrame validation)
• Advanced SQL
• ETL / Data Pipeline Testing
• Azure Data Factory (ADF)
• Data reconciliation & transformation testing
• API testing
• Python / PyTest automation
• Jira / Agile methodologies

Preferred Skills
• Testing AI prompt-based applications
• Validation of analytics dashboards and reporting pipelines
• Experience with large-scale cloud data platforms
• CI/CD test automation experience

Ideal Candidate
• Strong data engineering testing background
• Hands-on experience with Databricks notebooks and PySpark
• Comfortable validating complex ETL transformations
• Able to independently test large data pipelines
• Experience working in Agile, fast-paced environments

Submission Requirements
Please include:
• Updated Resume
• Current Location
• Visa Status
• Availability to Start
• Confirmation of hands-on Databricks + PySpark experience