Arkhya Tech. Inc.

Data Quality Lead

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Quality Lead on a contract basis, requiring 8–15 years of experience. Key skills include Azure, Python, PySpark, and SQL. Work is onsite in Minneapolis, Dallas, or Atlanta, with a focus on automated data quality frameworks.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
April 4, 2026
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Atlanta, GA
🧠 - Skills detailed
#ADLS (Azure Data Lake Storage) #Synapse #PySpark #ETL (Extract, Transform, Load) #Python #Cloud #Monitoring #Data Engineering #Databricks #Azure #DevOps #Programming #Data Automation #Spark (Apache Spark) #SQL (Structured Query Language) #Data Lake #Data Pipeline #Observability #ADF (Azure Data Factory) #Automation #Data Quality #Data Processing
Role description
Title: Data Quality Engineer – Automation
Location: Minneapolis, Dallas, or Atlanta (onsite from day one)
Job Type: Contract
Experience: 8–15 years
Skills: Data Quality Frameworks, Python, PySpark, and SQL

Key Responsibilities:
• Design, build, and maintain automated data quality frameworks to validate the accuracy, completeness, consistency, and timeliness of data.
• Develop automation scripts in Python/SQL to test data pipelines, ETL/ELT processes, and analytics workflows.
• Implement data quality checks and monitoring within Azure-based data platforms.
• Work extensively with Azure services (ADF, ADLS, Synapse) and Databricks for large-scale data processing.
• Integrate data quality validations into CI/CD pipelines and support proactive issue detection.
• Perform root cause analysis for data issues and collaborate with data engineering, analytics, and business teams to resolve them.
• Define and enforce data quality standards, metrics, and SLAs.

Required Skills & Qualifications:
• Strong experience (8–15 years) in data engineering, data quality, or data automation roles.
• Hands-on expertise with the Azure data ecosystem and Databricks.
• Strong programming skills in Python and SQL.
• Experience building automated data validation and reconciliation frameworks.
• Solid understanding of data warehousing, data lakes, and distributed data processing.
• Familiarity with DevOps/CI-CD practices for data platforms.

Preferred Skills:
• Experience with data observability or data quality tools.
• Exposure to cloud-scale analytics and performance optimization.
• Strong communication and stakeholder management skills.
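To give a flavor of the "automated data quality framework" work described above, here is a minimal sketch of completeness and consistency checks in plain Python. All names (`check_completeness`, `check_consistency`, the sample records) are hypothetical illustrations, not from the posting; in the actual role, rules like these would typically run over PySpark DataFrames inside an ADF/Databricks pipeline.

```python
def check_completeness(rows, required_fields):
    """Return rows missing any required field (absent or None)."""
    return [r for r in rows if any(r.get(f) is None for f in required_fields)]

def check_consistency(rows, field, allowed):
    """Return rows whose value for `field` falls outside the allowed set."""
    return [r for r in rows if r.get(field) not in allowed]

# Hypothetical sample data for illustration only.
records = [
    {"id": 1, "status": "active", "amount": 10.0},
    {"id": 2, "status": None,     "amount": 5.5},   # fails completeness
    {"id": 3, "status": "weird",  "amount": 7.2},   # fails consistency
]

incomplete = check_completeness(records, ["id", "status", "amount"])
inconsistent = check_consistency(records, "status", {"active", "inactive"})
```

In a production framework, the failing rows would feed metrics and alerts rather than being returned as lists, and the checks would be registered in a CI/CD pipeline so they run on every data load.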