Danta Technologies

Snowflake Data Engineer | Dallas TX

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Data Engineer in Dallas, TX, with a contract length of unspecified duration and a competitive pay rate. Requires 10+ years of data engineering experience, 5+ years in Snowflake, and expertise in SQL, Python, and cloud integrations. SnowPro Core Certification preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 15, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Informatica #AWS S3 (Amazon Simple Storage Service) #Metadata #Batch #JSON (JavaScript Object Notation) #Security #SQL (Structured Query Language) #Fivetran #Matillion #Data Catalog #Alation #API (Application Programming Interface) #Data Engineering #Clustering #Data Processing #Automation #Deployment #Scala #BI (Business Intelligence) #dbt (data build tool) #Storage #Azure DevOps #Vault #Data Vault #Snowflake #Data Governance #Data Ingestion #Azure #XML (eXtensible Markup Language) #Data Security #Data Marketplace #DevOps #Microsoft Power BI #Python #Kafka (Apache Kafka) #GCP (Google Cloud Platform) #Cloud #ETL (Extract, Transform, Load) #ADLS (Azure Data Lake Storage) #Azure ADLS (Azure Data Lake Storage) #Collibra #Tableau #Java #SnowPipe #GitHub #S3 (Amazon Simple Storage Service) #GIT #AWS (Amazon Web Services) #Computer Science #Snowpark
Role description
Title: Snowflake Data Engineer
Location: Dallas, TX

Key Responsibilities
• Design and implement Snowflake schemas (star, snowflake, data vault) optimized with micro-partitioning, clustering keys, materialized views, and search optimization services.
• Build real-time and batch ingestion pipelines into Snowflake using Snowpipe, Kafka Connect, Fivetran, Matillion, Informatica, or dbt.
• Automate incremental data processing with Streams & Tasks to support CDC (Change Data Capture).
• Use Zero-Copy Cloning for environment management, testing, and sandboxing.
• Apply Time Travel and Fail-safe features for data recovery and auditing.
• Develop data transformation logic in Snowpark for Python/SQL/Scala to push compute directly into Snowflake.
• Design integrations with cloud storage (S3, Azure ADLS, GCS) for staging and external tables.
• Implement data sharing and data marketplace solutions via Snowflake Secure Data Sharing and Snowflake Marketplace.
• Enable semi-structured data handling (JSON, Avro, Parquet, ORC, XML) using VARIANT columns and lateral flattening.
• Integrate Snowflake with BI tools (Power BI, Tableau) via live connections and semantic layers.
• Implement RBAC (Role-Based Access Control), Row Access Policies, and Dynamic Data Masking for data security.
• Optimize compute usage with multi-cluster warehouses, resource monitors, and query performance tuning.
• Manage cost optimization strategies (warehouse auto-suspend, query profiling, storage/compute separation).
• Integrate with data catalog & governance platforms (Collibra, Alation, Informatica CDGC) using Snowflake metadata and APIs.
• Work with domain teams to deliver data products leveraging Snowflake's data mesh-friendly features.
• Collaborate with architects to design a Snowflake-centric data fabric integrated with ETL/ELT and API layers.
• Support CI/CD automation for Snowflake code deployment using GitHub Actions, Azure DevOps, or dbt Cloud.

Qualifications
Education: Bachelor's or Master's in Computer Science, Data Engineering, or a related field.

Experience
• 10+ years of data engineering experience, with 5+ years in Snowflake Data Cloud.
• Expertise in SQL optimization and Snowflake performance tuning.
• Hands-on with Snowpipe, Streams & Tasks, Snowpark, Zero-Copy Cloning, and Secure Data Sharing.
• Proficiency in Python, Scala, or Java for Snowpark development.
• Experience integrating with cloud platforms such as AWS.
• Exposure to ETL/ELT tools (Informatica, Matillion, Fivetran).
• Familiarity with CI/CD, Git, and DevOps practices for data operations.

Preferred Certifications
• SnowPro Core Certification

Key Skills
• Snowflake-native feature design and implementation (Snowpark, Streams, Time Travel, Secure Data Sharing)
• Data ingestion (Snowpipe, CDC, Kafka, Fivetran)
• Semi-structured data handling (VARIANT, JSON, Avro, Parquet)
• Advanced SQL and performance tuning
• Data governance (RBAC, masking, lineage, catalogs)
• Cloud data platform integrations (AWS S3, Azure ADLS, GCP GCS)
• BI and analytics tool integration
• Cost optimization and warehouse orchestration

Notes: All qualified applicants will receive consideration for employment without regard to race, color, religion, religious creed, sex, national origin, ancestry, age, physical or mental disability, medical condition, genetic information, military and veteran status, marital status, pregnancy, gender, gender expression, gender identity, sexual orientation, or any other characteristic protected by local law, regulation, or ordinance.
Benefits: Danta offers all W2 employees a compensation package that is competitive in the industry. It consists of competitive pay, the option to elect healthcare insurance (Dental, Medical, Vision), major holidays, and paid sick leave as required by state law. The rate/salary range depends on numerous factors, including qualifications, experience, and location.