Lumicity

Senior Snowflake Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Snowflake Data Engineer (dbt / ELT + Cortex AI) offered as a contract or contract-to-hire engagement; the pay rate is not listed. It requires 5+ years in data engineering, 3+ years of hands-on Snowflake experience, and advanced SQL skills. The position is remote/hybrid within the US.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
February 5, 2026
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Version Control #Documentation #Data Pipeline #Matillion #Scala #Macros #Data Modeling #Azure DevOps #GitHub #Snowpark #Snowflake #Data Engineering #ETL (Extract, Transform, Load) #GitLab #ADF (Azure Data Factory) #Data Mart #Data Quality #Monitoring #AI (Artificial Intelligence) #GIT #Python #Azure #Cloud #Deployment #dbt (data build tool) #SQL (Structured Query Language) #Classification #Datasets #SnowPipe #Azure Data Factory #Airflow #DevOps #Fivetran
Role description
Senior Snowflake Data Engineer (dbt / ELT + Cortex AI)
📍 Remote / Hybrid (US) | 💼 Contract / Contract-to-Hire | GC or US Citizens only

We are seeking a Senior Snowflake Data Engineer with deep expertise in dbt-driven ELT frameworks and advanced Snowflake capabilities, including Snowflake Cortex AI. This role will focus on building scalable, production-grade data pipelines and analytics models while supporting emerging AI-enabled data initiatives within Snowflake.

Key Responsibilities
• Design, build, and maintain scalable ELT pipelines leveraging Snowflake + dbt (see the dbt model sketch below)
• Develop dbt projects including models, macros, snapshots, seeds, tests, and documentation
• Implement modern dbt architecture best practices (staging → intermediate → marts)
• Build analytics-ready data marts using strong dimensional modeling (Kimball) principles
• Optimize Snowflake performance through query tuning, warehouse sizing, workload isolation, and cost controls
• Implement data quality validation using dbt testing frameworks (schema tests, custom tests, freshness monitoring)
• Integrate ingestion tools such as Fivetran, Airbyte, Stitch, Matillion, or ADF
• Build and maintain CI/CD deployment workflows for dbt and Snowflake code using Git-based version control
• Partner with analytics and business stakeholders to translate requirements into scalable datasets and reporting models

Niche Responsibilities (High-Value / Hard-to-Find Skill Set)
• Implement AI-enabled workflows using Snowflake Cortex for summarization, classification, extraction, and text analytics (see the Cortex sketch below)
• Support vector embedding pipelines and enable semantic search / retrieval-ready datasets inside Snowflake (see the embedding sketch below)
• Build Snowflake-native transformations and processing using Snowpark (Python)
• Design governed datasets to support GenAI-ready data products and enterprise AI adoption
• Contribute to modern Snowflake feature enablement, including Dynamic Tables, Streams/Tasks, Snowpipe, and Secure Data Sharing

Required Qualifications
• 5+ years of experience in Data Engineering / Analytics Engineering
• 3+ years of hands-on experience with Snowflake in production environments
• 2+ years of strong experience with dbt (Core or Cloud) in production
• Advanced SQL expertise (complex transformations, performance tuning, optimization)
• Strong experience building ELT pipelines and layered transformation frameworks
• Experience with orchestration tools such as Airflow, Prefect, Dagster, or Azure Data Factory
• Experience implementing CI/CD pipelines and version control best practices (GitHub Actions, GitLab CI, Azure DevOps)
• Strong data modeling background (facts/dimensions, marts, analytics-ready design)
• Experience supporting enterprise-grade reliability, monitoring, and data quality standards

No third parties or vendors, please.
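To ground the dbt-on-Snowflake responsibilities above, here is a minimal sketch of a dbt Python model, which dbt executes on Snowflake through Snowpark. The model names (stg_orders, stg_payments) and columns (ORDER_ID, STATUS, AMOUNT) are hypothetical placeholders, not details from this posting.

```python
# models/marts/fct_orders.py -- hypothetical dbt Python model; on Snowflake,
# dbt runs these via Snowpark, and dbt.ref() resolves to a Snowpark DataFrame.
from snowflake.snowpark.functions import col, sum as sum_


def model(dbt, session):
    dbt.config(materialized="table")  # marts layer: materialize as a table

    orders = dbt.ref("stg_orders")      # hypothetical staging model
    payments = dbt.ref("stg_payments")  # hypothetical staging model

    # Aggregate successful payments per order ...
    paid = (
        payments.filter(col("STATUS") == "success")
        .group_by("ORDER_ID")
        .agg(sum_(col("AMOUNT")).alias("AMOUNT_PAID"))
    )

    # ... and join back to orders; joining on the column name keeps one copy.
    return orders.join(paid, "ORDER_ID", "left")
```

In a layered staging → intermediate → marts project, most models would be plain SQL with Jinja ref() calls; the Python form is shown here only because the role also calls for Snowpark.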
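The Cortex responsibilities map onto Snowflake's built-in AI SQL functions, which can be invoked from a Snowpark session. A minimal sketch, assuming a hypothetical SUPPORT_TICKETS table with TICKET_ID and BODY columns and placeholder connection details:

```python
from snowflake.snowpark import Session

# Placeholder credentials -- supply your own account details.
session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<db>", "schema": "<schema>",
}).create()

# SUMMARIZE and CLASSIFY_TEXT are built-in Cortex functions; the table,
# columns, and category labels are illustrative assumptions. CLASSIFY_TEXT
# returns an object whose "label" field holds the chosen category.
session.sql("""
    CREATE OR REPLACE TABLE TICKETS_ENRICHED AS
    SELECT
        TICKET_ID,
        SNOWFLAKE.CORTEX.SUMMARIZE(BODY) AS SUMMARY,
        SNOWFLAKE.CORTEX.CLASSIFY_TEXT(BODY, ['billing', 'bug', 'how-to']) AS CATEGORY
    FROM SUPPORT_TICKETS
""").collect()
```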
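For the embedding and semantic-search work, one pattern is to persist Cortex embeddings and rank rows with Snowflake's vector similarity function. EMBED_TEXT_768 and VECTOR_COSINE_SIMILARITY are built into Snowflake; everything else below (the DOCUMENTS table, its columns, the query string) is an illustrative assumption.

```python
from snowflake.snowpark import Session

# Same placeholder connection as the previous sketch.
session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<db>", "schema": "<schema>",
}).create()

# Embed each document once and store the vectors alongside the text.
session.sql("""
    CREATE OR REPLACE TABLE DOC_EMBEDDINGS AS
    SELECT
        DOC_ID,
        CONTENT,
        SNOWFLAKE.CORTEX.EMBED_TEXT_768('snowflake-arctic-embed-m', CONTENT) AS EMBEDDING
    FROM DOCUMENTS
""").collect()

# Rank documents against an ad-hoc query by cosine similarity.
hits = session.sql("""
    SELECT DOC_ID, CONTENT,
           VECTOR_COSINE_SIMILARITY(
               EMBEDDING,
               SNOWFLAKE.CORTEX.EMBED_TEXT_768('snowflake-arctic-embed-m',
                                               'warehouse cost controls')
           ) AS SCORE
    FROM DOC_EMBEDDINGS
    ORDER BY SCORE DESC
    LIMIT 5
""").collect()
```

Keeping the embeddings in a VECTOR column means the retrieval step runs entirely inside Snowflake, which is what "retrieval-ready datasets inside Snowflake" implies in the responsibilities above.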