JAKALA

Snowflake Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Developer with a contract length of "unknown", offering a pay rate of "$/hour". Key skills include advanced SQL, Snowflake Data Cloud expertise, and cloud ETL/ELT experience. Pharmaceutical industry experience is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 25, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#GIT #Clustering #SQL Queries #dbt (data build tool) #SnowPipe #Indexing #AWS Glue #Matillion #Talend #SQL (Structured Query Language) #Azure #AWS (Amazon Web Services) #Data Quality #ADF (Azure Data Factory) #Cloud #Data Lake #Snowflake #Data Pipeline #Security #Scala #DevOps #Data Architecture #Data Security #Data Warehouse #Informatica Cloud #Data Modeling #ETL (Extract, Transform, Load) #Airflow #Data Engineering #GCP (Google Cloud Platform) #Informatica #Databases
Role description
We are looking to hire a Data Engineer - Salesforce Data Cloud for one of our major global pharmaceutical clients.

Key Responsibilities
• Design, develop, and maintain Snowflake objects such as warehouses, databases, schemas, tables, views, stages, file formats, tasks, and streams.
• Build and manage Snowflake pipelines, including Snowpipe, ingestion processes, and continuous data flows.
• Implement role-based access control (RBAC) and ensure data security best practices.
• SQL Development & Performance Optimization: Write, optimize, and troubleshoot complex SQL queries. Improve query performance through indexing strategies, clustering, caching, pruning, and warehouse optimizations. Conduct performance tuning of ETL/ELT jobs and Snowflake compute resources.
• Data Modeling & Validation: Develop scalable conceptual, logical, and physical data models. Apply dimensional modeling techniques (star/snowflake schemas). Ensure data quality through comprehensive validation, profiling, and reconciliation.
• Cloud ETL/ELT Integration: Use at least one cloud ETL/ELT platform (e.g., Informatica Cloud, Matillion, ADF, Talend, dbt, AWS Glue) to ingest and transform data from diverse sources. Manage automated workflows, data pipelines, and schedulers.
• Architecture & Best Practices: Understand data architecture concepts including data lakes, data warehouses, ingestion patterns, and transformation frameworks. Contribute to architectural discussions and recommend Snowflake best practices, new features, and process improvements. Collaborate with data engineers, analysts, and architects to build end-to-end data solutions.

Required Skills
• Strong experience in Snowflake Data Cloud development.
• Advanced SQL development skills.
• Deep knowledge of performance tuning and optimization.
• Hands-on experience in data modeling and validation.
• Experience working with one or more cloud ETL/ELT tools.
• Familiarity with Snowflake features such as Time Travel, Cloning, Streams, Tasks, Snowpipe, and Resource Monitors.
• Knowledge of cloud environments (AWS/Azure/GCP).

Good-to-Have Skills
• Experience with data architecture design principles.
• Exposure to dbt or similar transformation frameworks.
• Knowledge of DevOps practices (CI/CD pipelines, Git).
• Understanding of orchestration tools such as Airflow or ADF pipelines.
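To give candidates a concrete sense of the first responsibility area (Snowflake object development, Snowpipe ingestion, and RBAC), a minimal Snowflake SQL sketch follows. All object names (RAW_DB, INGEST_WH, the S3 bucket, etc.) are illustrative, not part of the client's actual environment, and a real stage would also need a storage integration or credentials:

```sql
-- Illustrative only: warehouse, database, schema, stage, table, pipe, and RBAC grants.
CREATE WAREHOUSE IF NOT EXISTS INGEST_WH
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND   = 60      -- suspend after 60 s idle to control credit spend
  AUTO_RESUME    = TRUE;

CREATE DATABASE IF NOT EXISTS RAW_DB;
CREATE SCHEMA   IF NOT EXISTS RAW_DB.SALES;

CREATE FILE FORMAT IF NOT EXISTS RAW_DB.SALES.CSV_FMT
  TYPE = 'CSV' SKIP_HEADER = 1;

CREATE STAGE IF NOT EXISTS RAW_DB.SALES.ORDERS_STAGE
  URL = 's3://example-bucket/orders/'   -- hypothetical bucket
  FILE_FORMAT = RAW_DB.SALES.CSV_FMT;

CREATE TABLE IF NOT EXISTS RAW_DB.SALES.ORDERS_RAW (
  order_id    NUMBER,
  customer_id NUMBER,
  amount      NUMBER(10,2),
  order_ts    TIMESTAMP_NTZ
);

-- Snowpipe: continuous ingestion triggered by cloud storage events.
CREATE PIPE IF NOT EXISTS RAW_DB.SALES.ORDERS_PIPE
  AUTO_INGEST = TRUE
  AS COPY INTO RAW_DB.SALES.ORDERS_RAW
     FROM @RAW_DB.SALES.ORDERS_STAGE;

-- RBAC: grant access through roles rather than directly to users.
CREATE ROLE IF NOT EXISTS ANALYST_ROLE;
GRANT USAGE  ON DATABASE RAW_DB                TO ROLE ANALYST_ROLE;
GRANT USAGE  ON SCHEMA   RAW_DB.SALES          TO ROLE ANALYST_ROLE;
GRANT SELECT ON TABLE    RAW_DB.SALES.ORDERS_RAW TO ROLE ANALYST_ROLE;
```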
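The Streams, Tasks, and clustering skills listed above can likewise be sketched in Snowflake SQL. This is a hedged example, not the client's pipeline: the stream/task names, the ANALYTICS_DB.MART.FACT_ORDERS target, and the 5-minute schedule are all assumptions for illustration:

```sql
-- Clustering key on a large table to improve partition pruning on time filters.
ALTER TABLE RAW_DB.SALES.ORDERS_RAW CLUSTER BY (order_ts);

-- Stream: captures row-level changes (CDC) on the raw table.
CREATE STREAM IF NOT EXISTS RAW_DB.SALES.ORDERS_STREAM
  ON TABLE RAW_DB.SALES.ORDERS_RAW;

-- Task: runs on a schedule, but only when the stream actually has new data.
CREATE TASK IF NOT EXISTS RAW_DB.SALES.LOAD_FACT_ORDERS
  WAREHOUSE = INGEST_WH
  SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_DB.SALES.ORDERS_STREAM')
AS
  INSERT INTO ANALYTICS_DB.MART.FACT_ORDERS   -- hypothetical curated fact table
  SELECT order_id, customer_id, amount, order_ts
  FROM RAW_DB.SALES.ORDERS_STREAM
  WHERE METADATA$ACTION = 'INSERT';           -- consume only inserted rows

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK RAW_DB.SALES.LOAD_FACT_ORDERS RESUME;
```

Pairing a stream with a scheduled task is a common Snowflake pattern for incremental ELT: the `WHEN` clause skips warehouse spin-up entirely when no new rows have arrived.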