Snowflake Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Data Engineer based in Dallas, TX, on a long-term contract with an unspecified pay rate. Candidates should have 10+ years of data engineering experience, including 5+ years with Snowflake, and expertise in SQL, Python, and cloud integrations.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 5, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Dallas, TX
🧠 - Skills detailed
#dbt (data build tool) #Computer Science #ADLS (Azure Data Lake Storage) #Python #Metadata #Azure #Kafka (Apache Kafka) #DevOps #Vault #GitHub #Clustering #Storage #Snowflake #Data Engineering #SQL (Structured Query Language) #Cloud #ETL (Extract, Transform, Load) #Azure ADLS (Azure Data Lake Storage) #S3 (Amazon Simple Storage Service) #Scala #Fivetran #JSON (JavaScript Object Notation) #GIT #Batch #Collibra #Data Catalog #Data Security #Microsoft Power BI #Data Vault #Snowpark #Java #BI (Business Intelligence) #Data Marketplace #Automation #SnowPipe #AWS (Amazon Web Services) #API (Application Programming Interface) #Deployment #XML (eXtensible Markup Language) #Security #Alation #Data Processing #Tableau #Azure DevOps #Matillion #Informatica
Role description
Job Title: Snowflake Data Engineer
Experience: 10+ Years
Location: Dallas, TX, US
Contract Duration: Long Term
Work Time: US Time Zone

Job Description
We are seeking an experienced Snowflake Data Engineer to design and optimize Snowflake-based data solutions, build real-time and batch ingestion pipelines, implement advanced features such as Streams, Tasks, and Snowpark, and ensure secure, scalable, and cost-effective data platforms. The role involves close collaboration with architects and domain teams to deliver data products and to enable integration with BI, governance, and cloud platforms.

Key Responsibilities:
• Design and implement Snowflake schemas (star, snowflake, data vault) optimized with micro-partitioning, clustering keys, materialized views, and the search optimization service.
• Build real-time and batch ingestion pipelines into Snowflake using Snowpipe, Kafka Connect, Fivetran, Matillion, Informatica, or dbt.
• Automate incremental data processing with Streams & Tasks to support CDC (Change Data Capture); see the first sketch following this description.
• Use Zero-Copy Cloning for environment management, testing, and sandboxing.
• Apply Time Travel and Fail-safe features for data recovery and auditing.
• Develop data transformation logic in Snowpark for Python/SQL/Scala to push compute directly into Snowflake (see the Snowpark sketch below).
• Design integrations with cloud storage (S3, Azure ADLS, GCS) for staging and external tables.
• Implement data sharing and data marketplace solutions via Snowflake Secure Data Sharing and Snowflake Marketplace.
• Enable semi-structured data handling (JSON, Avro, Parquet, ORC, XML) using VARIANT columns and lateral flattening (see the flattening sketch below).
• Integrate Snowflake with BI tools (Power BI, Tableau) via live connections and semantic layers.
• Implement RBAC (Role-Based Access Control), Row Access Policies, and Dynamic Data Masking for data security.
• Optimize compute usage with multi-cluster warehouses, resource monitors, and query performance tuning.
• Manage cost optimization strategies (warehouse auto-suspend, query profiling, storage/compute separation).
• Integrate with data catalog and governance platforms (Collibra, Alation, Informatica CDGC) using Snowflake metadata and APIs.
• Work with domain teams to deliver data products leveraging Snowflake's data mesh-friendly features.
• Collaborate with architects to design a Snowflake-centric data fabric integrated with ETL/ELT and API layers.
• Support CI/CD automation for Snowflake code deployment using GitHub Actions, Azure DevOps, or dbt Cloud.

Qualifications:
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
• 10+ years of data engineering experience, with 5+ years on the Snowflake Data Cloud.
• Expertise in SQL optimization and Snowflake performance tuning.
• Hands-on experience with Snowpipe, Streams & Tasks, Snowpark, Zero-Copy Cloning, and Secure Data Sharing.
• Proficiency in Python, Scala, or Java for Snowpark development.
• Experience integrating with cloud platforms such as AWS.
• Exposure to ETL/ELT tools (Informatica, Matillion, Fivetran).
• Familiarity with CI/CD, Git, and DevOps practices for data operations.

Preferred Certifications:
• SnowPro Core Certification
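A minimal sketch of the Streams & Tasks CDC pattern referenced in the responsibilities above. The landing table RAW_ORDERS, target table ORDERS_CURRENT, stream, task, warehouse, and connection details are all hypothetical placeholders; the Snowflake SQL is issued through a Snowpark session.

```python
# CDC sketch: a stream captures changes on a landing table and a scheduled
# task merges them into a current-state table. All object names are assumptions.
from snowflake.snowpark import Session

# Placeholder connection parameters; replace with real account details.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "ETL_WH",
    "database": "ANALYTICS",
    "schema": "STAGING",
}).create()

# Stream records inserts/updates/deletes on the landing table since its last consumption.
session.sql("""
    CREATE OR REPLACE STREAM RAW_ORDERS_STREAM ON TABLE RAW_ORDERS
""").collect()

# Task runs on a 5-minute schedule, but only when the stream actually has new rows.
session.sql("""
    CREATE OR REPLACE TASK LOAD_ORDERS_TASK
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      MERGE INTO ORDERS_CURRENT t
      USING RAW_ORDERS_STREAM s
        ON t.ORDER_ID = s.ORDER_ID
      WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS, t.UPDATED_AT = s.UPDATED_AT
      WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS, UPDATED_AT)
        VALUES (s.ORDER_ID, s.STATUS, s.UPDATED_AT)
""").collect()

# Tasks are created suspended; resume to start the schedule.
session.sql("ALTER TASK LOAD_ORDERS_TASK RESUME").collect()
```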
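A minimal Snowpark for Python sketch of the in-database transformation work described above. A configured Snowpark Session is assumed, and the table and column names (STAGING.RAW_ORDERS, MARTS.DAILY_ORDER_SUMMARY, STATUS, ORDER_TS, REGION, AMOUNT) are illustrative only.

```python
# Snowpark DataFrame operations compile to SQL and execute inside Snowflake,
# so the transformation pushes compute to the warehouse rather than the client.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_, to_date

def build_daily_summary(session: Session) -> None:
    orders = session.table("STAGING.RAW_ORDERS")

    summary = (
        orders
        .filter(col("STATUS") == "COMPLETED")                 # keep completed orders only
        .with_column("ORDER_DATE", to_date(col("ORDER_TS")))  # truncate timestamp to a date
        .group_by("ORDER_DATE", "REGION")
        .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
    )

    # Materialize the aggregate as a table in the marts schema, overwriting each run.
    summary.write.mode("overwrite").save_as_table("MARTS.DAILY_ORDER_SUMMARY")
```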
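A short sketch of the semi-structured handling mentioned above: querying a VARIANT column and flattening a nested JSON array into rows. The RAW_EVENTS table, its PAYLOAD column, and the JSON shape are assumptions for illustration.

```python
# Flatten a JSON array stored in a VARIANT column into one row per line item.
# Table name, column name, and JSON structure are hypothetical.
from snowflake.snowpark import Session

def explode_line_items(session: Session):
    return session.sql("""
        SELECT
            e.PAYLOAD:order_id::STRING       AS ORDER_ID,
            e.PAYLOAD:customer.name::STRING  AS CUSTOMER_NAME,
            li.value:sku::STRING             AS SKU,
            li.value:quantity::NUMBER        AS QUANTITY
        FROM RAW_EVENTS e,
             LATERAL FLATTEN(INPUT => e.PAYLOAD:line_items) li
    """).collect()
```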