

Jobs via Dice
Data Governance Analyst (Data Contracts & BigQuery)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Governance Analyst focused on data contracts and BigQuery, offering a 6-month contract at a competitive pay rate. Key skills include advanced SQL, Google BigQuery, Python, and experience with data ingestion pipelines and contracts.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 15, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boston, MA
-
🧠 - Skills detailed
#GIT #Data Lineage #Observability #Monitoring #YAML (YAML Ain't Markup Language) #JSON (JavaScript Object Notation) #Data Quality #Scala #Data Management #Data Governance #SQL Queries #Automation #Data Ingestion #Version Control #Python #ETL (Extract, Transform, Load) #Datasets #SQL (Structured Query Language) #BigQuery #Data Pipeline
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, RIIASH LLC, is seeking the following. Apply via Dice today!
Role Overview
This role is ideal for a technically strong data professional who understands how data moves across systems, from live operational sites to software testing environments, and how validated data supports mission-critical decision-making.
You will help define and enforce data contracts, manage ingestion pipelines, ensure data quality, and bring clarity to what it means when data is production-ready and available for internal teams.
This is a 6-month contract with structured milestone reviews at 30, 60, and 90 days.
What You’ll Do
Data Contracts & Governance
• Define and implement data contracts using JSON, AsyncAPI, and YAML
• Clarify expectations around what it means when data lands in a dataset
• Establish validation standards and acceptance criteria for internal consumption
• Document schema definitions, data lineage, and governance processes
• Partner cross-functionally to align data expectations across engineering, site, product, and testing teams
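For a sense of what the data contracts described above typically look like, here is a sketch of one expressed as a versioned YAML document. The contract name, field names, owner, and thresholds are illustrative assumptions, not the client's actual schema.

```yaml
# Hypothetical data contract for one ingestion pipeline (illustrative only)
contract: site_telemetry_events
version: 1.2.0
owner: data-platform-team          # assumed team name
schema:
  fields:
    - name: event_id
      type: STRING
      required: true
    - name: site_id
      type: STRING
      required: true
    - name: recorded_at
      type: TIMESTAMP
      required: true
    - name: reading_value
      type: FLOAT64
      required: false
quality:
  freshness_sla_minutes: 60        # data must land within 1 hour
  null_rate_max: 0.01              # at most 1% nulls in required fields
  acceptance: "row count within 5% of 7-day average"
```

A document like this is what gives "data landed in the dataset" a testable meaning: the contract states which fields must be present, how fresh the data must be, and when a batch counts as accepted for internal consumption.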
Data Pipeline Management
• Support and monitor 40–50 active data pipelines across environments
• Work with:
  • Data pipelines originating from live site operations
  • Data pipelines used to test and validate software systems
• Improve ingestion reliability and pipeline observability
• Ensure stable, trusted datasets in Google BigQuery
Data Quality & Validation
• Implement data quality control frameworks
• Define validation checkpoints and monitoring processes
• Translate technical validation status into clear business meaning
• Support teams that rely on accurate data for operational and critical decision-making
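A minimal sketch of the kind of validation checkpoint described above, in plain Python. The field names and the 1% null-rate threshold are assumptions for illustration, not the client's actual rules.

```python
# Minimal data-quality checkpoint: validate a batch of rows against a
# simple contract (required fields, expected types, null-rate threshold).
# Field names and the 1% threshold are illustrative assumptions.

REQUIRED_FIELDS = {"event_id": str, "site_id": str, "reading_value": float}
MAX_NULL_RATE = 0.01  # at most 1% missing values per required field

def validate_batch(rows):
    """Return (passed, report) for a batch of dict rows."""
    report = {}
    for field, expected_type in REQUIRED_FIELDS.items():
        nulls = sum(1 for r in rows if r.get(field) is None)
        bad_type = sum(
            1 for r in rows
            if r.get(field) is not None
            and not isinstance(r[field], expected_type)
        )
        report[field] = {
            "null_rate": nulls / len(rows) if rows else 0.0,
            "type_errors": bad_type,
        }
    passed = all(
        stats["null_rate"] <= MAX_NULL_RATE and stats["type_errors"] == 0
        for stats in report.values()
    )
    return passed, report

# Example: one row is missing site_id, so the batch fails the checkpoint.
batch = [
    {"event_id": "e1", "site_id": "s1", "reading_value": 1.5},
    {"event_id": "e2", "site_id": None, "reading_value": 2.0},
]
passed, report = validate_batch(batch)
```

Turning a report like this into a plain statement of whether a dataset is production-ready is exactly the "technical validation status into clear business meaning" translation the role calls for.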
Technical Execution
• Develop and optimize advanced SQL queries
• Work extensively with Google BigQuery (BQ)
• Use Python for automation, validation, and transformation
• Maintain version control standards using Git
• Define and document scalable workflows for structured data management
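As a flavor of the "advanced SQL" the role expects (a join plus aggregation over pipeline metadata), here is a small runnable sketch. It uses SQLite purely so the example is self-contained; the table and column names are invented, and BigQuery's SQL dialect differs in details.

```python
# Join pipeline runs to their validation results and compute a
# per-pipeline failure rate -- the shape of query this role works with.
# SQLite stands in for BigQuery here; all names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE pipeline_runs (run_id INTEGER, pipeline TEXT);
    CREATE TABLE validations  (run_id INTEGER, passed INTEGER);
    INSERT INTO pipeline_runs VALUES (1,'site_ops'),(2,'site_ops'),(3,'testing');
    INSERT INTO validations  VALUES (1,1),(2,0),(3,1);
""")
rows = conn.execute("""
    SELECT r.pipeline,
           COUNT(*)            AS runs,
           1.0 - AVG(v.passed) AS failure_rate
    FROM pipeline_runs r
    JOIN validations v ON v.run_id = r.run_id
    GROUP BY r.pipeline
    ORDER BY failure_rate DESC
""").fetchall()
```

With 40–50 concurrent pipelines, a rollup like this is the natural starting point for the monitoring and observability work described earlier.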
Required Experience
• Advanced SQL skills (joins, aggregations, optimization, performance tuning)
• Hands-on experience with Google BigQuery
• Technical background in data contracts (JSON, AsyncAPI, YAML)
• Experience defining data contracts
• Experience building and maintaining data ingestion pipelines
• Strong understanding of data validation, schema enforcement, and quality control
• Experience managing multiple concurrent data pipelines (40+ preferred)
• Proficiency with Python and Git
• Experience defining and operationalizing data workflows
Skills
Soft Skills
• Strong cross-functional communication skills
• Ability to translate technical concepts into business-relevant insights
• Comfortable working in fast-paced, complex environments