Data Quality Expert - IDQ/Databuck -- Long Term Contract -- Multiple Locations

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Quality Expert - IDQ/Databuck on a long-term contract, requiring 8+ years of experience. Key skills include Informatica IDQ, DataBuck, data profiling, and ETL integration. Location is hybrid across multiple cities in the U.S.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
June 4, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Irving, TX
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Documentation #IICS (Informatica Intelligent Cloud Services) #Anomaly Detection #Informatica PowerCenter #Datasets #Compliance #dbt (data build tool) #Data Pipeline #Libraries #Informatica IDQ (Informatica Data Quality) #Data Quality #Data Profiling #Data Engineering #Scala #Automation #Informatica #Airflow
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Lorven Technologies, Inc., is seeking the following. Apply via Dice today!

Job Title: Data Quality Expert - IDQ/Databuck
Location: Irving, TX; Basking Ridge, NJ; Tampa, FL; Atlanta, GA (Hybrid)
Duration: Long Term Contract
Experience: 8+ years required

We are seeking a highly technical, hands-on Data Quality Engineer with deep experience in Informatica Data Quality (IDQ) or FirstEigen's DataBuck platform. The ideal candidate will build and maintain data quality rules and validation thresholds, and will develop and integrate APIs that operationalize data quality at scale. This role demands strong proficiency in enterprise-grade data quality tooling, rule design, automation, and integration across data pipelines.

Key Responsibilities:
• Design, implement, and manage data quality frameworks and reusable rule libraries in Informatica IDQ or DataBuck.
• Develop DQ rules and scorecards to validate the accuracy, completeness, consistency, uniqueness, and integrity of critical data assets.
• Define and automate threshold-based alerts, data health scoring, and anomaly detection using profiling results.
• Develop and integrate RESTful APIs for DQ rule invocation, validation feedback, exception management, and orchestration.
• Perform data profiling, cleansing, parsing, matching, and standardization across diverse datasets.
• Integrate DQ processes into ETL/ELT pipelines and orchestration tools (e.g., Informatica PowerCenter, IICS, Airflow, dbt).
• Work with Data Engineers and Architects to ensure data quality rules are embedded across the ingestion, transformation, and consumption layers.
• Create DQ dashboards, audit trails, and documentation to support governance, compliance, and reporting needs.
• Troubleshoot, optimize, and enhance DQ jobs, mappings, and workflows for performance and scalability.
• Participate in data issue triage and root cause analysis, and drive long-term remediation strategies.
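To illustrate the kind of threshold-based rule and scorecard the responsibilities above describe, here is a minimal generic sketch in plain Python. This is not Informatica IDQ or DataBuck code; the function names (`completeness`, `evaluate_rule`) and the sample records are invented for illustration only.

```python
# Generic sketch of a threshold-based completeness rule, of the kind a DQ
# platform like IDQ or DataBuck would manage. All names here are illustrative.

def completeness(rows, field):
    """Fraction of rows where `field` is present and non-empty."""
    if not rows:
        return 0.0
    ok = sum(1 for r in rows if r.get(field) not in (None, ""))
    return ok / len(rows)

def evaluate_rule(rows, field, threshold):
    """Return a scorecard entry: the measured score and pass/fail
    against the configured threshold."""
    score = completeness(rows, field)
    return {"field": field, "score": score, "passed": score >= threshold}

# Hypothetical sample data: 2 of 3 rows have a non-empty email.
rows = [
    {"customer_id": "C1", "email": "a@example.com"},
    {"customer_id": "C2", "email": ""},
    {"customer_id": "C3", "email": "c@example.com"},
]
result = evaluate_rule(rows, "email", threshold=0.9)
# score ~0.67 is below the 0.9 threshold, so this rule would raise an alert
```

In a real deployment this logic lives inside the DQ platform's rule engine and is invoked from the pipeline or via an API, with failures feeding alerting and exception-management workflows rather than a simple dictionary.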