

Artmac
Senior Snowflake Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Snowflake Data Engineer requiring 8-15 years of experience, including 5+ years with Snowflake. The contract type is W2/C2C, and the role is on-site in Dallas, Texas. Key skills include SQL optimization, Python/Scala, and cloud integration. SnowPro Core certification is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Data Processing #S3 (Amazon Simple Storage Service) #GIT #Scala #Python #Data Engineering #Snowpark #Automation #Collibra #Tableau #Data Vault #XML (eXtensible Markup Language) #Security #Data Marketplace #DevOps #Snowflake #ADLS (Azure Data Lake Storage) #Java #SQL (Structured Query Language) #AWS S3 (Amazon Simple Storage Service) #Informatica #Azure DevOps #dbt (data build tool) #Consulting #Fivetran #Cloud #Data Ingestion #BI (Business Intelligence) #AWS (Amazon Web Services) #GitHub #Vault #Matillion #ETL (Extract, Transform, Load) #Clustering #Azure ADLS (Azure Data Lake Storage) #JSON (JavaScript Object Notation) #Microsoft Power BI #GCP (Google Cloud Platform) #SnowPipe #Batch #Data Catalog #Alation #Data Governance #Storage #Data Security #Deployment #Kafka (Apache Kafka) #Azure #Metadata
Role description
Who We Are
Artmac Soft is a technology consulting and IT services company dedicated to delivering innovative technology solutions and services to its customers.
Job Description
Job Title : Senior Snowflake Data Engineer
Job Type : W2/C2C
Experience : 8-15 Years
Location : Dallas, Texas (On-Site)
Requirements
• 10+ years of data engineering experience, with 5+ years in Snowflake Data Cloud.
• Expertise in SQL optimization and Snowflake performance tuning.
• Hands-on with Snowpipe, Streams & Tasks, Snowpark, Zero-Copy Cloning, and Secure Data Sharing.
• Proficiency in Python, Scala, or Java for Snowpark development.
• Experience integrating with cloud platforms like AWS.
• Exposure to ETL/ELT tools (Informatica, Matillion, Fivetran).
• Familiarity with CI/CD, Git, and DevOps practices for data operations.
• Preferred Certifications: SnowPro Core
Responsibilities
• Design and implement Snowflake schemas (star, snowflake, data vault) optimized with micro-partitioning, clustering keys, materialized views, and the Search Optimization Service; a schema-tuning sketch follows this list.
• Build real-time and batch ingestion pipelines into Snowflake using Snowpipe, Kafka Connect, Fivetran, Matillion, Informatica, or dbt.
• Automate incremental data processing with Streams & Tasks to support CDC (Change Data Capture); an ingestion and CDC sketch follows this list.
• Use Zero-Copy Cloning for environment management, testing, and sandboxing.
• Apply Time Travel and Fail-safe features for data recovery and auditing; a cloning and Time Travel sketch follows this list.
• Develop data transformation logic with Snowpark (Python, Scala, or Java) to push compute directly into Snowflake.
• Design integrations with cloud storage (S3, Azure ADLS, GCS) for staging and external tables.
• Implement data sharing and data marketplace solutions via Snowflake Secure Data Sharing and Snowflake Marketplace.
• Enable semi-structured data handling (JSON, Avro, Parquet, ORC, XML) using VARIANT columns and lateral flattening; a FLATTEN sketch follows this list.
• Integrate Snowflake with BI tools (Power BI, Tableau) via live connections and semantic layers.
• Implement RBAC (Role-Based Access Control), Row Access Policies, and Dynamic Data Masking for data security; a governance sketch follows this list.
• Integrate with data catalog and governance platforms (Collibra, Alation, Informatica CDGC) using Snowflake metadata and APIs; a metadata-extraction sketch follows this list.
• Support CI/CD automation for Snowflake code deployment using GitHub Actions, Azure DevOps, or dbt Cloud.
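As a rough illustration of the schema-tuning responsibility above, the sketch below applies a clustering key, the Search Optimization Service, and a materialized view to a hypothetical fact table; every object name is illustrative, not part of the role description.

-- Hypothetical fact table; adjust names to your environment.
ALTER TABLE analytics.sales.fact_orders CLUSTER BY (order_date, region);               -- clustering key for partition pruning
ALTER TABLE analytics.sales.fact_orders ADD SEARCH OPTIMIZATION ON EQUALITY(order_id);  -- accelerates point lookups
CREATE OR REPLACE MATERIALIZED VIEW analytics.sales.mv_daily_revenue AS
  SELECT order_date, region, SUM(amount) AS revenue
  FROM analytics.sales.fact_orders
  GROUP BY order_date, region;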
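The ingestion and CDC bullets could look roughly like the following Snowflake SQL sketch, which wires a hypothetical S3 stage into Snowpipe and applies newly landed rows downstream with a Stream and a Task; the bucket, storage integration, warehouse, and table names are all assumptions.

-- Landing table and external stage (assumed JSON files in S3).
CREATE TABLE IF NOT EXISTS raw.events_landing (payload VARIANT);
CREATE OR REPLACE STAGE raw.events_stage
  URL = 's3://example-bucket/events/'
  STORAGE_INTEGRATION = s3_int
  FILE_FORMAT = (TYPE = JSON);

-- Snowpipe: continuous, event-driven loading from the stage.
CREATE OR REPLACE PIPE raw.events_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw.events_landing
  FROM @raw.events_stage;

-- Stream + Task: incremental (CDC-style) processing of new rows only.
CREATE OR REPLACE STREAM raw.events_stream ON TABLE raw.events_landing;
CREATE OR REPLACE TASK raw.apply_events
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw.events_stream')
AS
  INSERT INTO analytics.events_curated (event_id, event_ts, payload)
  SELECT payload:id::STRING, payload:ts::TIMESTAMP_NTZ, payload
  FROM raw.events_stream;
ALTER TASK raw.apply_events RESUME;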
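Zero-Copy Cloning and Time Travel might be exercised along these lines; the database, table, offset, and query ID are placeholders.

-- Zero-Copy Clone: instant, metadata-only copy for a dev or test environment.
CREATE DATABASE analytics_dev CLONE analytics;

-- Time Travel: read the table as it was one hour ago (offset in seconds).
SELECT COUNT(*) FROM analytics.sales.fact_orders AT (OFFSET => -3600);

-- Recover a pre-incident copy by cloning the table as it was before a given statement.
CREATE TABLE analytics.sales.fact_orders_restored
  CLONE analytics.sales.fact_orders
  BEFORE (STATEMENT => '<query_id_of_bad_update>');   -- placeholder query ID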
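A minimal sketch of the semi-structured handling bullet, assuming a hypothetical VARIANT column holding order documents with nested line items:

CREATE OR REPLACE TABLE raw.orders_json (doc VARIANT);

-- LATERAL FLATTEN turns each element of the nested array into its own row.
SELECT
  doc:order_id::STRING      AS order_id,
  doc:customer.name::STRING AS customer_name,
  item.value:sku::STRING    AS sku,
  item.value:qty::NUMBER    AS quantity
FROM raw.orders_json,
     LATERAL FLATTEN(INPUT => doc:line_items) item;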
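The governance bullet could translate into objects like the sketch below; the roles, mapping table, and column names are assumptions rather than requirements from the posting.

-- Row Access Policy: rows are visible only to roles mapped to the row's region.
CREATE OR REPLACE ROW ACCESS POLICY governance.region_policy
  AS (region_arg STRING) RETURNS BOOLEAN ->
    CURRENT_ROLE() IN ('SYSADMIN', 'ANALYST_GLOBAL')
    OR EXISTS (SELECT 1 FROM governance.role_region_map m
               WHERE m.role_name = CURRENT_ROLE() AND m.region = region_arg);
ALTER TABLE analytics.sales.fact_orders ADD ROW ACCESS POLICY governance.region_policy ON (region);

-- Dynamic Data Masking: only a designated role sees raw email addresses.
CREATE OR REPLACE MASKING POLICY governance.mask_email AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val ELSE '***MASKED***' END;
ALTER TABLE analytics.sales.dim_customer MODIFY COLUMN email SET MASKING POLICY governance.mask_email;

-- RBAC: grant read access through a functional role.
GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics.sales TO ROLE analyst_role;
GRANT SELECT ON TABLE analytics.sales.fact_orders TO ROLE analyst_role;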
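For the catalog and governance integration bullet, one common metadata source is the SNOWFLAKE.ACCOUNT_USAGE share; the query below is a sketch of the column inventory a catalog sync might pull, subject to the usual ACCOUNT_USAGE latency and access privileges.

-- Column-level inventory for a data catalog; ACCESS_HISTORY can similarly feed lineage.
SELECT table_catalog, table_schema, table_name, column_name, data_type, comment
FROM snowflake.account_usage.columns
WHERE deleted IS NULL
ORDER BY table_catalog, table_schema, table_name, ordinal_position;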
Preferred Key Skills
• Snowflake-native feature design and implementation (Snowpark, Streams, Time Travel, Secure Data Sharing)
• Data ingestion (Snowpipe, CDC, Kafka, Fivetran)
• Semi-structured data handling (VARIANT, JSON, Avro, Parquet)
• Advanced SQL and performance tuning
• Data governance (RBAC, masking, lineage, catalogs)
• Cloud data platform integrations (AWS S3, Azure ADLS, GCP GCS)
• BI and analytics tool integration
• Cost optimization and warehouse orchestration (a warehouse-sizing sketch follows this list)
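For the cost-optimization skill above, a sketch of basic warehouse guardrails using auto-suspend and a resource monitor; the size, quota, and names are illustrative only.

-- Right-size compute and stop paying for idle time.
CREATE OR REPLACE WAREHOUSE transform_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE
  INITIALLY_SUSPENDED = TRUE;

-- Cap monthly credit consumption and suspend at the limit.
CREATE OR REPLACE RESOURCE MONITOR monthly_cap
  WITH CREDIT_QUOTA = 500
       FREQUENCY = MONTHLY
       START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;
ALTER WAREHOUSE transform_wh SET RESOURCE_MONITOR = monthly_cap;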
Qualification
• Bachelor's degree or equivalent combination of education and experience.