ProSearch

Quality Assurance (QA) Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Quality Assurance (QA) Data Engineer on a 6+ month remote contract, paying competitive rates. Candidates must have GCP Data Testing experience, strong SQL skills, and familiarity with ETL/ELT processes, BigQuery, and data validation tools.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 11, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#GCP (Google Cloud Platform) #Data Modeling #Dataflow #Java #Cloud #Datasets #Data Quality #Data Accuracy #Data Engineering #Documentation #Storage #Python #Scripting #Jira #Quality Assurance #Data Transformations #Programming #Schema Design #Data Pipeline #Data Ingestion #BigQuery #Regression #dbt (data build tool) #SharePoint #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Data Governance
Role description
ProSearch is seeking an experienced QA Data Engineer for a remote contract engagement with a globally respected enterprise client. This opportunity supports the Enterprise Data Platform (EDP) team in ensuring the quality, accuracy, and performance of data pipelines and applications in a Google Cloud Platform (GCP) environment. This is a 6+ month contract, operating on Eastern Time hours, with the potential for extension based on performance. To be considered for hire, candidates MUST HAVE current GCP data testing experience designing, validating, and optimizing ETL/ELT data pipelines across GCP.

Key Responsibilities
• Design, develop, and execute automated and manual test cases for ETL/ELT pipelines and data ingestion processes
• Perform data validation, reconciliation, and regression testing for high-quality data assurance
• Validate data transformations and support end-to-end testing across BigQuery, Dataflow, and Cloud Storage
• Identify, document, and track data quality issues and provide detailed reports for resolution
• Monitor pipeline failures, troubleshoot data inconsistencies, and assist with root cause analysis
• Create and maintain data validation rules, test cases, and best practices for data governance
• Collaborate cross-functionally to define quality benchmarks for data accuracy and application performance
• Support post-release validation of data in production environments
• Leverage tools like Jira, Confluence, and SharePoint for documentation and collaboration

Required Experience & Skills
• Must have current experience as a GCP data tester designing, validating, and optimizing ETL/ELT data pipelines across GCP
• Must have experience with GCP Dataform
• 5+ years in data engineering, data testing, or quality assurance
• Strong hands-on SQL skills, with a deep understanding of data validation and testing frameworks
• Familiarity with GCP services: BigQuery, Dataflow, Dataproc, and Cloud Storage
• Experience with automated data testing tools (e.g., Great Expectations, dbt tests)
• Experience working with retail data and large-scale datasets
• Able to write test cases based on business and technical requirements
• Understanding of ETL/ELT processes, data modeling, and schema design
• Familiarity with programming languages like Python or Java for scripting automated tests
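To give a concrete flavor of the validation and reconciliation work described above, here is a minimal sketch of two automated data-quality checks. This is illustrative only, not part of the posting: the table names, functions, and in-memory sqlite3 database are hypothetical stand-ins; in a GCP environment the same queries would run against BigQuery via its client library or as dbt/Great Expectations tests.

```python
import sqlite3

def reconcile_row_counts(conn, source_table, target_table):
    """Source-to-target reconciliation: compare row counts after a load."""
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return src, tgt, src == tgt

def check_not_null(conn, table, column):
    """A simple data-quality rule: a key column must contain no NULLs."""
    cur = conn.cursor()
    nulls = cur.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    return nulls == 0

# Demo with in-memory tables simulating a raw source and a loaded target.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO raw_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

src, tgt, ok = reconcile_row_counts(conn, "raw_orders", "dw_orders")
print(f"source={src} target={tgt} matched={ok}")      # source=3 target=3 matched=True
print(check_not_null(conn, "dw_orders", "order_id"))  # True
```

In practice checks like these would be parameterized per table, run after each pipeline execution, and fail the build (or open a Jira ticket) when a rule is violated.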