

CriticalRiver Inc.
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a full-time contract role for a Senior Data Engineer in Pleasanton, California, requiring 10+ years of data engineering experience, including 3+ years with Snowflake and dbt. Proficiency in Postgres, SQL, Python, and AWS is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 17, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Pleasanton, CA
-
🧠 - Skills detailed
#dbt (data build tool) #Schema Design #Strategy #Observability #Scala #Python #SQL (Structured Query Language) #GCP (Google Cloud Platform) #Cloud #Data Engineering #AWS (Amazon Web Services) #Data Modeling #Automation #Data Quality #Azure #Snowflake #API (Application Programming Interface)
Role description
Title: Senior Data Engineer (must have dbt experience)
Location: Pleasanton, California (hybrid work)
Job type: Full-time/Contract
Responsibilities:
• We are seeking a Senior Data Engineer to own and architect core data infrastructure. In this strategic role, you will design and implement scalable ELT pipelines using Postgres, dbt, and Snowflake, enabling data products that power product strategy and business operations.
• You’ll collaborate across Finance, Product, and Marketing teams to ensure high-quality, trusted data flows through robust and secure systems.
• You’ll optimize data models across transactional and cloud environments, implement advanced Snowflake features, and build hybrid pipelines from Postgres to Snowflake.
• You'll also lead the development of CI/CD workflows, data quality frameworks, and observability systems.
Requirements:
• 10+ years in data engineering, including 3+ years with Snowflake and dbt.
• Strong expertise in Postgres (schema design, optimization, stored procedures, large-scale workloads).
• Advanced knowledge of Snowflake (data modeling, performance tuning, governance).
• Proficient in SQL and Python, including API integrations and automation.
• Strong understanding of data warehousing, dimensional modeling, and system design principles.
• Experience with AWS (mandatory); GCP or Azure is a plus.