

Verita AI
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 3+ years of experience, focusing on dbt and Airflow. The contract runs 2–3 weeks at 20–40 hours per week and pays $90–$125/hour. It is fully remote and requires strong communication skills and familiarity with cloud warehouses.
🌎 Country: United States
💱 Currency: $ USD
💰 Day rate: $1,000
🗓️ Date: May 13, 2026
🕒 Duration: 1 to 3 months
🏝️ Location: Remote
📄 Contract: Unknown
🔒 Security: Unknown
📍 Location detailed: United States
🧠 Skills detailed: #dbt (data build tool) #BigQuery #Debugging #Data Quality #Snowflake #Databricks #Cloud #Monitoring #Redshift #Data Architecture #AI (Artificial Intelligence) #ETL (Extract, Transform, Load) #Data Pipeline #Airflow #Data Engineering #BI (Business Intelligence)
Role description
About Verita AI
Verita AI builds high-trust data pipelines that enable AI systems to understand real-world workflows across finance, analytics, and operations.
We work with domain experts to help train and evaluate next-generation AI systems on how modern data infrastructure and analytics engineering function in practice.
Our founding team includes alumni of Mercor, Hudson River Trading, Citadel, IDEO, Stanford, and Yale. We partner with world-class researchers and engineers at leading AI labs to advance the state of the art. Verita AI is a seed-stage company valued at $25 million, having raised $6 million in a round led by Kindred Ventures.
About the Role
We are hiring experienced Data Engineering Experts to help train and evaluate AI systems on real-world analytics engineering and data infrastructure workflows.
This work focuses heavily on modern data stack tooling, particularly dbt and Airflow, and requires individuals who can reason through complex data engineering scenarios with precision and clarity.
You will help create, review, and evaluate realistic workflows spanning data transformation, orchestration, warehouse design, testing, and analytics engineering best practices.
This is a high-focus, project-based engagement best suited for experienced practitioners who are comfortable working independently and communicating technical reasoning clearly.
What You’ll Work On
You may be asked to build, review, or evaluate scenarios involving:
Pipelines & Transformations
• ETL/ELT workflows
• dbt model development
• Incremental model logic and watermark handling (see the sketch after this list)
• Structured output table generation
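To give a flavor of the incremental work above, here is a minimal Python sketch of watermark-based loading. The `conn` object (DB-API style), table names, and column names are illustrative assumptions; a dbt incremental model would express the same pattern declaratively with `is_incremental()` and `{{ this }}`.

```python
# A minimal sketch of watermark-based incremental loading. The `conn`
# object is an assumed DB-API-style connection; table and column names
# are illustrative, not from this posting.
def incremental_load(conn, source: str, target: str, watermark_col: str) -> None:
    cur = conn.cursor()

    # Read the current high watermark from the target table.
    cur.execute(
        f"SELECT COALESCE(MAX({watermark_col}), TIMESTAMP '1970-01-01') FROM {target}"
    )
    watermark = cur.fetchone()[0]

    # Append only rows newer than the watermark. Late-arriving rows older
    # than the watermark would need a lookback window in production.
    cur.execute(
        f"INSERT INTO {target} SELECT * FROM {source} WHERE {watermark_col} > %s",
        (watermark,),
    )
    conn.commit()
```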
Orchestration & Reliability
• Airflow or Dagster DAG design (a DAG sketch follows this list)
• Workflow orchestration logic
• Data quality monitoring
• Test suite validation and debugging
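To ground the orchestration bullets, here is a minimal Airflow DAG sketch using the TaskFlow API (assuming Airflow 2.4+ for the `schedule` parameter). It chains extract, transform, and a data-quality gate; all task names and values are illustrative.

```python
# A minimal Airflow 2.x TaskFlow DAG sketch: extract -> transform ->
# quality gate. Names and values are illustrative assumptions.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2026, 1, 1), catchup=False)
def orders_pipeline():
    @task
    def extract() -> int:
        # Pretend to land raw rows; return the row count for the gate below.
        return 1000

    @task
    def transform(row_count: int) -> int:
        # Placeholder transformation step.
        return row_count

    @task
    def quality_check(row_count: int) -> None:
        # Fail the run if the load looks empty, so monitoring can alert.
        if row_count == 0:
            raise ValueError("orders_pipeline loaded zero rows")

    quality_check(transform(extract()))

orders_pipeline()
```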
Warehouse & Analytics Engineering
• Schema and data contract design (a contract-check sketch follows this list)
• Query optimization and performance tradeoffs
• Warehouse modeling across Snowflake, BigQuery, Redshift, or Databricks
• Analytics-focused data architecture decisions
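As one hedged illustration of contract design, the sketch below compares a warehouse table's `information_schema` columns against an expected contract. The `conn` object is an assumed DB-API connection, and the Snowflake-style type names are illustrative.

```python
# A lightweight data-contract check: compare a table's columns in
# information_schema (standard SQL, with per-warehouse dialect differences)
# against an expected contract. EXPECTED and `conn` are assumptions.
EXPECTED = {"order_id": "NUMBER", "ordered_at": "TIMESTAMP_NTZ", "amount": "NUMBER"}

def check_contract(conn, schema: str, table: str) -> list[str]:
    cur = conn.cursor()
    cur.execute(
        "SELECT column_name, data_type FROM information_schema.columns "
        "WHERE table_schema = %s AND table_name = %s",
        (schema, table),
    )
    actual = {name.lower(): dtype for name, dtype in cur.fetchall()}

    problems = []
    for col, dtype in EXPECTED.items():
        if col not in actual:
            problems.append(f"missing column: {col}")
        elif actual[col] != dtype:
            problems.append(f"type drift on {col}: {actual[col]} != {dtype}")
    return problems
```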
AI Evaluation & Reasoning
• Reviewing AI-generated technical outputs for correctness
• Explaining engineering reasoning step-by-step
• Converting workflows into structured evaluation tasks (a task sketch follows this list)
• Providing detailed feedback to improve model performance
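The posting does not specify Verita AI's actual task schema, so the following is a purely hypothetical Python sketch of what converting a workflow into a structured evaluation task could look like.

```python
# A hypothetical evaluation-task structure; every field name here is an
# assumption for illustration, not Verita AI's real schema.
from dataclasses import dataclass, field

@dataclass
class EvalTask:
    prompt: str                                       # scenario shown to the model
    reference_answer: str                             # expert-written reasoning
    rubric: list[str] = field(default_factory=list)   # criteria for graders

task = EvalTask(
    prompt="This dbt incremental model double-counts late-arriving rows. Why?",
    reference_answer="The watermark predicate has no lookback window, so ...",
    rubric=["identifies the watermark issue", "proposes a lookback or merge fix"],
)
```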
Requirements
• 3+ years of professional experience in data engineering or analytics engineering
• Strong experience with dbt and Airflow
• Experience working with modern cloud warehouses such as Snowflake, BigQuery, Redshift, or Databricks
• Familiarity with data quality testing and validation workflows
• Comfortable reading and producing technical artifacts including DAGs, dbt models, schema docs, and test suites
• Strong written communication skills and attention to detail
• Able to work independently and maintain high-quality output
Preferred backgrounds include:
• Analytics Engineering
• Data Infrastructure
• Platform/Data Tooling
• Business Intelligence Engineering
• Data Platform teams at high-scale technology companies
Engagement Details
• Expected commitment: 20–40 hours per week
• Engagement duration: approximately 2–3 weeks initially, with potential extensions based on project needs and performance
• Immediate onboarding available for qualified candidates
• Fully remote and asynchronous
Compensation
Compensation ranges from $90 to $125 per hour, depending on experience, technical depth, and prior domain expertise.
Strong contributors may receive expanded scope and longer-term opportunities based on quality and throughput.