

Kavaliro
Snowflake Implementation Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a fully remote, 6-month contract role for a Snowflake Implementation Data Engineer, paying $50.00 per hour. Key requirements include 5+ years in data engineering, experience with Snowflake migrations, and proficiency in SQL, Python, and ETL processes.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
400
-
🗓️ - Date
February 17, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Denver, CO
-
🧠 - Skills detailed
#Clustering #Data Quality #Data Warehouse #Snowflake #Data Storage #Data Architecture #Data Engineering #Automation #Python #Data Reconciliation #SQL (Structured Query Language) #Storage #Migration #ETL (Extract, Transform, Load) #dbt (data build tool) #Cloud #Data Ingestion #Data Pipeline
Role description
Job Description
Position Title: Snowflake Implementation Data Engineer
Status: 6-month contract
Pay: $50.00 per hour
Location: Fully Remote
Open to C2C candidates
Objective
• Engineer end-to-end Snowflake migration efforts: Migrate existing cloud-based data warehouses, ETL pipelines, and data assets into Snowflake, ensuring accuracy, performance, and minimal downtime.
• Build and optimize ELT/ETL pipelines: Develop data ingestion and transformation pipelines using tools such as dbt, Python, or similar technologies.
• Optimize data storage and compute performance: Tune Snowflake queries, warehouse sizing, clustering keys, and micro-partitioning for cost efficiency and high performance (see the tuning sketch after this list).
• Develop migration scripts and automation: Write SQL, Python, and automation scripts to streamline migration tasks, data validation, and environment orchestration.
• Ensure high data quality and integrity: Perform data reconciliation, profiling, validation, and testing to guarantee accurate migration outcomes and reliable analytics (see the reconciliation sketch after this list).
• Collaborate with cross-functional teams: Work closely with data architects, analysts, engineers, and business stakeholders to understand data requirements and translate them into Snowflake solutions.
• Monitor and troubleshoot Snowflake environments: Proactively identify performance bottlenecks, resolve pipeline issues, and monitor workloads using built-in Snowflake tools and logs.
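To illustrate the kind of storage and compute tuning named above, here is a minimal sketch using snowflake-connector-python; the warehouse (ANALYTICS_WH), table (ORDERS), columns, and credentials are placeholders, not details from this posting:
```python
# Minimal sketch: warehouse sizing and clustering-key tuning on Snowflake.
# All names and credentials below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # placeholder account identifier
    user="my_user",
    password="my_password",
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)

cur = conn.cursor()
try:
    # Right-size compute and let the warehouse auto-suspend quickly so
    # idle credits are not burned during migration testing.
    cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET WAREHOUSE_SIZE = 'SMALL' AUTO_SUSPEND = 60")

    # Define a clustering key on the columns most queries filter on,
    # so Snowflake co-locates related rows in its micro-partitions.
    cur.execute("ALTER TABLE ORDERS CLUSTER BY (ORDER_DATE, REGION)")

    # Inspect clustering health; a high average depth suggests the chosen
    # key is not helping partition pruning.
    cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('ORDERS', '(ORDER_DATE, REGION)')")
    print(cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```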
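The data reconciliation and validation work can be sketched along similar lines. The source_counts values, MIGRATION_DB, and connection details below are placeholders; in practice the source counts would be captured from the legacy warehouse before cutover:
```python
# Minimal sketch: row-count reconciliation after a table migration.
# All names, credentials, and counts below are hypothetical placeholders.
import snowflake.connector

# Row counts captured from the legacy warehouse before cutover.
source_counts = {
    "ORDERS": 1_250_000,
    "CUSTOMERS": 84_000,
}

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="ANALYTICS_WH",
    database="MIGRATION_DB",
    schema="PUBLIC",
)

cur = conn.cursor()
try:
    mismatches = []
    for table, expected in source_counts.items():
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        actual = cur.fetchone()[0]
        if actual != expected:
            mismatches.append((table, expected, actual))

    if mismatches:
        for table, expected, actual in mismatches:
            print(f"MISMATCH {table}: source={expected} snowflake={actual}")
        # Fail the migration step so the discrepancy is investigated.
        raise SystemExit(1)
    print("All row counts reconciled.")
finally:
    cur.close()
    conn.close()
```
Row counts are only a first gate; fuller validation would also compare checksums or column-level profiles between source and target.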
Qualifications
• Strong communication skills
• Experience: 5+ years of experience in data engineering, including at least one completed enterprise Snowflake migration. Proven track record of successfully leading data engineering teams and delivering complex projects.
• Technical Expertise: Deep knowledge of data engineering, preferably with experience in Snowflake. Strong understanding of data pipelines, data warehousing, and ETL processes.
• Problem-Solving & Communication: Strong analytical thinking with excellent decision-making abilities. Ability to manage multiple priorities, meet deadlines, and communicate effectively with technical and non-technical stakeholders.
Job Requirements
Remote