

Nexwave
Snowflake Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This remote W2 contract role is for a Snowflake Data Engineer with 6-10 years of experience. Key skills include Snowflake architecture, Azure services, SQL proficiency, and experience in healthcare analytics. Azure or Snowflake certifications are preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 18, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Agile #Azure #Security #ADF (Azure Data Factory) #Azure ADLS (Azure Data Lake Storage) #Data Integrity #Mathematics #Data Science #Logging #Azure SQL Database #Scrum #SQL (Structured Query Language) #Data Lake #Data Modeling #Azure SQL #"ETL (Extract #Transform #Load)" #Cloud #NoSQL #Database Management #Compliance #Data Privacy #Databricks #SnowPipe #Complex Queries #Azure Databricks #Scala #Storage #Synapse #Monitoring #Automation #Data Security #Stories #Data Strategy #Data Architecture #Strategy #Data Engineering #Data Warehouse #Azure Data Factory #Computer Science #Schema Design #BI (Business Intelligence) #ADLS (Azure Data Lake Storage) #Clustering #Code Reviews #DevOps #Azure cloud #Data Pipeline #Microsoft Azure #Data Analysis #Datasets #Snowflake
Role description
Role: Snowflake Data Engineer
Location: Remote
Contract: W2
Job Summary:
As a Cloud Data Engineer, you will design, build, and maintain scalable data pipelines, systems, and platforms, with a primary focus on Snowflake, leveraging Microsoft Azure services to deliver integrated, high-performing data solutions. You will collaborate with business data analysts, data architects, data scientists, project managers, and GEHA's business stakeholders to build solutions that ensure seamless data flow and enable actionable insights. This role supports the development and maintenance of Data & Analytics activities and initiatives in alignment with the Enterprise Data strategy, organizational goals, and governance policies. You'll work in a cloud-native environment to design and optimize data solutions that improve performance, scalability, and efficiency.
Key Responsibilities:
· Apply proven experience as a Cloud Data Engineer or in a similar role, with a strong focus on Snowflake, including architecture, performance optimization, and data modeling, as well as hands-on experience with Azure cloud services.
· Advanced Snowflake ETL/ELT Development: Design and develop complex ETL/ELT pipelines natively in Snowflake using Snowflake Streams, Tasks, Snowpipe, and other features to automate data transformation and orchestration, leveraging Azure data services and other tools where appropriate, ensuring the efficient flow of data to multiple destinations for processing, storage, analysis, and delivery to downstream external consumers.
· Cloud Data Architecture: Design and implement Snowflake-based data solutions to support enterprise data platforms, integrating with Azure Data Lake Storage (ADLS), Azure SQL Data Warehouse, and Azure Databricks to support data extracts, data sharing, business intelligence, analytics, and data science initiatives.
· Snowflake Data Architecture: Oversee the design and optimization of Snowflake architecture,
including data models, schema design, partitioning, clustering, and query optimization to ensure high
performance and low cost.
· Work independently on assigned stories/tasks, and collaborate with other engineers and analysts on the team to take code all the way to production through the Change Management process.
· Participate in architectural discussions; understand the bigger picture and provide recommendations that lead to better architectural decisions for specific domains.
· Perform regular code reviews and ensure best practices are followed.
· Collaborate with Stakeholders: Work closely with cross-functional teams, including business data
analysts, data architects, data scientists, project managers and business units, to define data
requirements and implement data solutions that support business goals.
· Data Security, Governance & Compliance: Implement best practices to ensure data privacy, security,
and compliance with relevant governance in the cloud environment.
· Automation and Monitoring: Implement and manage automated monitoring and logging for all data
workflows and pipelines to ensure data integrity and operational reliability.
· Troubleshooting and Support: Troubleshoot issues with data pipelines, data integrity, and platform
performance, providing timely resolutions to business-critical data needs.
· Ensure that all items follow company Change Management process.
· Participate in scrum, backlog grooming, refinement, planning and review sessions.
· Other duties as assigned.
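The Streams/Tasks/Snowpipe pattern described in the responsibilities above can be sketched in Snowflake SQL. All object names here (stage, tables, warehouse) are hypothetical and illustrative only, not taken from the posting:

```sql
-- Illustrative sketch; stage, table, and warehouse names are hypothetical.
-- 1. Snowpipe continuously loads files landed in an external stage (e.g. ADLS).
CREATE PIPE raw.claims_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw.claims
  FROM @raw.adls_stage/claims/
  FILE_FORMAT = (TYPE = 'PARQUET');

-- 2. A stream tracks incremental changes (CDC) on the raw table.
CREATE STREAM raw.claims_stream ON TABLE raw.claims;

-- 3. A scheduled task runs only when new data has arrived, merging
--    changes into the curated layer for downstream consumers.
CREATE TASK curate_claims
  WAREHOUSE = transform_wh
  SCHEDULE = '15 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw.claims_stream')
AS
  MERGE INTO curated.claims c
  USING raw.claims_stream s
    ON c.claim_id = s.claim_id
  WHEN MATCHED THEN UPDATE SET c.amount = s.amount, c.updated_at = CURRENT_TIMESTAMP()
  WHEN NOT MATCHED THEN INSERT (claim_id, amount, updated_at)
    VALUES (s.claim_id, s.amount, CURRENT_TIMESTAMP());

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK curate_claims RESUME;
```

Consuming the stream inside the task's MERGE advances the stream offset automatically, so each change is processed once rather than reprocessed on every run.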
Required Qualifications:
· At least 6-10 years of relevant technical or business work experience, with demonstrated experience in the healthcare industry or in data and analytics.
· Bachelor’s degree in Computer Science, Information Technology, Data Engineering, Mathematics, or a related field, or comparable experience.
· Expertise in designing and implementing data architectures on Snowflake, including Snowpipe, Snowflake Tasks, and Snowflake Streams, and on Azure, including Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure SQL Database, Azure Synapse, and Azure Databricks.
· Advanced experience with the Snowflake data platform, including schema design, data sharing, internal and external marketplaces, performance tuning, and cost optimization.
· Proficiency in SQL, with solid experience writing complex queries and stored procedures and managing large datasets.
· SME-level knowledge of data warehouse concepts, data modeling, database management systems (SQL and NoSQL), and ETL/ELT frameworks, ideally in Azure Data Factory or Azure Synapse.
· Experience with DevOps practices, repositories, Snowflake, and CI/CD pipelines for data solutions.
· Ability to work in a collaborative, cross-functional team environment and communicate complex
technical concepts to non-technical stakeholders.
· Strong understanding of Agile and Scrum processes.
· Experience with, and eagerness for, problem solving and root cause analysis.
· Passion for learning new technologies and techniques, and eagerness to understand data and business processes in order to contribute and make a difference.
· Ability to work collaboratively in a team environment.
· Prior knowledge of, or willingness to learn, the healthcare and analytics industries.
· A fundamentals certification in Azure, Snowflake, Databricks, or a comparable cloud platform or technology is preferred.
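As a hedged illustration of the schema clustering, performance tuning, and cost optimization work the qualifications call out, a sketch in Snowflake SQL (table and warehouse names are hypothetical, not from the posting):

```sql
-- Illustrative sketch; table and warehouse names are hypothetical.
-- Cluster a large fact table on columns commonly used in filters and joins,
-- then inspect clustering quality.
ALTER TABLE curated.claims CLUSTER BY (service_date, member_id);
SELECT SYSTEM$CLUSTERING_INFORMATION('curated.claims', '(service_date, member_id)');

-- Control cost: right-size the warehouse, suspend it quickly when idle,
-- and cap long-running queries.
ALTER WAREHOUSE transform_wh SET
  WAREHOUSE_SIZE = 'SMALL'
  AUTO_SUSPEND = 60
  STATEMENT_TIMEOUT_IN_SECONDS = 3600;
```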





