

Nesco Resource
Snowflake Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is a contract-to-hire Snowflake Data Engineer position requiring US citizenship or GC status. Key skills include 5+ years in data warehousing, 3+ years SQL development, strong Snowflake expertise, and familiarity with ETL tools and data governance practices.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: March 27, 2026
Duration: Unknown
Location: Unknown
Contract: Fixed Term
Security: Unknown
Location detailed: New Haven County, CT
Skills detailed: #Data Pipeline #Data Architecture #Scripting #Jira #Computer Science #Monitoring #Slowly Changing Dimensions #Vault #Data Quality #Airflow #Scala #Data Governance #Storage #Data Vault #Metadata #Code Reviews #Leadership #Cloud #Data Management #BI (Business Intelligence) #Security #Data Ingestion #Talend #Documentation #GIT #Snowflake #Data Modeling #Version Control #Oracle #Data Engineering #SQL Server #Data Integrity #Python #ETL (Extract, Transform, Load) #SQL (Structured Query Language)
Role description
Data Movement Engineer (Snowflake)
This is a contract-to-hire position; you must be a US citizen or green card (GC) holder.
Our client is undertaking a major modernization of its core data platforms and is seeking a Data Movement Engineer to support and evolve its ETL ecosystem. This role will play a key part in designing, optimizing, and supporting Snowflake-based data solutions while ensuring reliable, high-performing data pipelines. The ideal candidate brings deep Snowflake expertise, strong data warehousing knowledge (including Data Vault 2.0), and the ability to lead initiatives and mentor team members.
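Since the role calls out Data Vault 2.0 specifically: a core habit of that methodology is deriving hub and link keys as hashes of normalized business keys. A minimal sketch of that idea in Python (function name and normalization rules are illustrative, not the client's standard):

```python
import hashlib

def hub_hash_key(*business_keys: str) -> str:
    """Build a deterministic hash key from one or more business keys,
    in the style commonly used for Data Vault 2.0 hubs and links."""
    # Normalize first (trim whitespace, uppercase), then join with a
    # delimiter so multi-part keys can't collide ("A","B" vs "AB").
    # MD5 is the conventional choice in many Data Vault shops;
    # SHA-256 is a common alternative where collision risk matters.
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Padding and case differences produce the same key:
assert hub_hash_key("  cust-001 ") == hub_hash_key("CUST-001")
```

The point of the normalization step is that the same business entity arriving from two source systems (with different casing or padding) still lands on the same hub row.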
Key Responsibilities
• Own the end-to-end lifecycle of Snowflake data operations, including monitoring, troubleshooting, and performance optimization of data pipelines.
• Proactively identify and resolve issues related to data integrity, latency, and pipeline failures.
• Translate business and stakeholder requirements into scalable Snowflake-based data movement solutions.
• Design, develop, and optimize data pipelines, models, and warehouse configurations for performance, reliability, and scalability.
• Implement best practices for data ingestion, transformation, storage, security, and modeling.
• Support advanced data techniques such as Change Data Capture (CDC) and Slowly Changing Dimensions (SCD Type 2).
• Apply and support data modeling frameworks, including Data Vault and dimensional modeling.
• Monitor ETL and Snowflake environments, troubleshoot failures, and conduct root cause analysis of data issues.
• Provide production support for Snowflake, IBM DB2, and orchestration tools such as Airflow and ESP.
• Ensure data quality through validation, reconciliation, and consistency checks across systems.
• Contribute to data governance initiatives, including metadata management and lineage tracking.
• Lead the design and ongoing maintenance of Snowflake processes and solutions.
• Provide technical leadership through code reviews, mentoring, and best practice guidance.
• Collaborate with stakeholders, architects, and engineering teams to deliver effective data solutions.
• Communicate issue status, impact, and resolution timelines clearly and proactively.
• Maintain thorough, version-controlled documentation for solutions, issues, and fixes.
• Develop and maintain production support playbooks and incident response procedures.
• Support estimation, planning, and architectural decision-making.
• Deliver regular reporting on system health, issue trends, and improvement initiatives.
• Participate in on-call rotations and provide after-hours support as needed.
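The CDC and SCD Type 2 responsibilities above reduce to one core operation: when a tracked attribute changes, close the current dimension row and insert a new version. A minimal, database-free Python sketch of that logic (row shape and field names are hypothetical, not the client's schema; in Snowflake this would typically be a MERGE):

```python
from datetime import date

def apply_scd2(dimension: list[dict], incoming: dict, today: date) -> None:
    """Apply one incoming record to an SCD Type 2 dimension held in memory.

    Each row: {"key": ..., "attrs": {...}, "valid_from": date,
               "valid_to": date | None}   # valid_to=None marks the current row
    """
    current = next(
        (r for r in dimension
         if r["key"] == incoming["key"] and r["valid_to"] is None),
        None,
    )
    if current is not None and current["attrs"] == incoming["attrs"]:
        return  # no change detected: keep the current version
    if current is not None:
        current["valid_to"] = today  # close out the old version
    dimension.append(
        {"key": incoming["key"], "attrs": incoming["attrs"],
         "valid_from": today, "valid_to": None}
    )

dim: list[dict] = []
apply_scd2(dim, {"key": 1, "attrs": {"city": "New Haven"}}, date(2026, 1, 1))
apply_scd2(dim, {"key": 1, "attrs": {"city": "Hartford"}}, date(2026, 2, 1))
# dim now holds two versions: the closed row and the current one
```

The change detection step is where CDC feeds in: rather than comparing full attribute sets, a CDC stream would tell you directly which keys changed.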
Qualifications & Skills
• 5+ years of experience in data warehousing, business intelligence, or data engineering.
• 3+ years of hands-on SQL development across platforms such as Snowflake, Oracle, SQL Server, or DB2.
• Strong expertise in Snowflake, including development, data modeling, and security (certification preferred).
• Experience with ETL tools such as Talend.
• Proficiency in SQL and scripting languages (e.g., Python, Shell).
• Experience working with cloud-based data platforms.
• Familiarity with orchestration tools (Airflow, ESP) and version control systems (Git).
• Experience with incident management tools (ServiceNow, Jira) and CI/CD practices.
• Strong understanding of CDC and SCD Type 2 implementations.
• Solid knowledge of data architecture, dimensional modeling, and frameworks such as Data Vault.
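The scripting proficiency listed above typically shows up in small validation utilities, e.g. the reconciliation and consistency checks mentioned in the responsibilities. A hedged sketch of a key-based source-to-target comparison (all names hypothetical):

```python
def reconcile(source_rows: list[dict], target_rows: list[dict], key: str) -> dict:
    """Compare two row sets by primary key and report discrepancies:
    keys missing from the target, keys only in the target, and keys
    present in both whose rows differ."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    return {
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "extra_in_target": sorted(tgt.keys() - src.keys()),
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),
    }

src = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}, {"id": 3, "amt": 30}]
tgt = [{"id": 1, "amt": 10}, {"id": 2, "amt": 99}, {"id": 4, "amt": 40}]
report = reconcile(src, tgt, "id")
# → {'missing_in_target': [3], 'extra_in_target': [4], 'mismatched': [2]}
```

In practice the row sets would come from queries against the source system and Snowflake; for large tables the same idea is usually pushed down to SQL (count and checksum comparisons) rather than materialized in Python.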
Education
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 5+ years of experience in Snowflake development and data warehousing.