Bespoke Labs

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a contract role for a Senior Data Engineer requiring 3+ years of dbt experience and strong Snowflake expertise. The work is fully remote and focuses on AI/ML capabilities, CI/CD processes, and collaboration with data teams. The pay rate is not disclosed.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 9, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#AI (Artificial Intelligence) #Data Modeling #GitHub #Monitoring #Data Science #Deployment #dbt (data build tool) #Data Quality #Snowflake #SQL (Structured Query Language) #DataOps #Cloud #Python #Azure #DevOps #ML (Machine Learning) #Documentation #CLI (Command-Line Interface) #Scala #GitLab #Macros #Data Pipeline #Airflow #Scripting #ETL (Extract, Transform, Load) #Automation #Azure DevOps #Data Engineering #Snowpark
Role description
Job Posting: Senior Data Engineer / Analytics Engineer (dbt + Snowflake Cortex CLI)
Location: Remote
Type: Contract
Experience Level: Mid–Senior

About the Role

We are seeking a skilled Data/Analytics Engineer with hands-on experience using dbt in conjunction with Snowflake's Cortex CLI. This role involves designing, developing, and optimizing data workflows that leverage Snowflake's new AI/ML and feature engineering capabilities via Cortex, while maintaining production-grade dbt transformations and CI/CD processes. You will collaborate with data engineering, analytics, and ML teams to prototype and productionize Cortex-driven workloads, ensure scalable model development, and define best practices for using dbt in a modern Snowflake-native stack.

Responsibilities

- Design and build dbt models, macros, and tests aligned with modern data modeling practices (e.g., modular design, source freshness, semantic layers); a brief model/test sketch appears after this posting.
- Integrate dbt workflows with the Snowflake Cortex CLI (an illustrative Cortex query also appears below), including:
  - Feature engineering pipelines
  - Model training and inference tasks
  - Pipeline orchestration and automation
  - Evaluation and monitoring of Cortex models
- Define and document best practices for dbt–Cortex usage patterns.
- Collaborate with data scientists and ML engineers to operationalize Cortex workloads in Snowflake.
- Implement CI/CD pipelines for dbt projects (GitHub Actions / GitLab / Azure DevOps).
- Optimize queries and Snowflake compute usage for cost and performance efficiency.
- Troubleshoot and debug dbt artifacts, Snowflake objects, lineage, and data quality issues.
- Provide guidance on dbt project structure, governance, and testing frameworks.

Required Qualifications

- 3+ years of experience with dbt Core or dbt Cloud, including macros, packages, testing, documentation, and deployments.
- Strong expertise with Snowflake (warehouses, tasks, streams, materialized views, performance tuning).
- Hands-on experience with the Snowflake Cortex CLI, or the willingness and ability to ramp up quickly on Cortex features.
- Proficiency in SQL and familiarity with Python as used in dbt and scripting.
- Experience integrating dbt with orchestration tools (Airflow, Dagster, Prefect, etc.).
- Strong understanding of modern data engineering workflows, ELT patterns, and version-controlled analytics development.

Nice-to-Have Skills

- Prior experience operationalizing ML workflows inside Snowflake.
- Familiarity with Snowpark and Python UDFs/UDTFs (a minimal UDF sketch appears below).
- Experience building semantic layers using dbt metrics.
- Knowledge of MLOps or DataOps best practices.
- Exposure to LLM use cases, vector search, and unstructured data pipelines.
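For orientation, here is a minimal sketch of the kind of modular dbt model and singular test the responsibilities describe. All names (the shop source, the stg_orders model, and the columns) are hypothetical, not taken from the posting.

```sql
-- models/staging/stg_orders.sql: a hypothetical staging model
{{ config(materialized='view') }}

select
    order_id,
    customer_id,
    order_date,
    amount_usd
from {{ source('shop', 'raw_orders') }}  -- hypothetical source definition
where order_id is not null
```

```sql
-- tests/assert_no_negative_amounts.sql: a dbt singular test;
-- dbt marks the test as failed if this query returns any rows
select *
from {{ ref('stg_orders') }}
where amount_usd < 0
```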
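Much of Cortex is exposed through documented SQL functions such as SNOWFLAKE.CORTEX.SENTIMENT and SNOWFLAKE.CORTEX.COMPLETE, which the Snowflake CLI's cortex subcommands build on. A hedged sketch, assuming a hypothetical analytics.product_reviews table:

```sql
-- The table and columns are hypothetical; SENTIMENT and COMPLETE are
-- documented Cortex SQL functions. Model availability (e.g., 'mistral-large')
-- varies by Snowflake region and account.
select
    review_id,
    snowflake.cortex.sentiment(review_text) as sentiment_score,
    snowflake.cortex.complete(
        'mistral-large',
        'Summarize this review in one sentence: ' || review_text
    ) as one_line_summary
from analytics.product_reviews
limit 10;
```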
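For the Snowpark nice-to-have, here is a minimal sketch of a scalar Python UDF registered through standard Snowflake DDL; the function name and logic are illustrative only, not part of the role description.

```sql
-- Illustrative only: a scalar Python UDF created from SQL
create or replace function clean_sku(sku_text string)
returns string
language python
runtime_version = '3.10'
handler = 'clean'
as
$$
def clean(sku_text):
    # Trim whitespace and uppercase the code; pass NULLs through unchanged
    return sku_text.strip().upper() if sku_text is not None else None
$$;

select clean_sku('  ab-123 ');  -- returns 'AB-123'
```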