Sr Data Engineer (Snowflake, SnowPark, SnowPipe)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr Data Engineer (Snowflake, SnowPark, SnowPipe) with a contract length of "unknown" and a pay rate of "unknown." Key skills include Snowflake proficiency, data engineering experience, and strong communication. Cloud experience in Azure, Google Cloud, or AWS is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
August 30, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
St Louis, MO
-
🧠 - Skills detailed
#Batch #Data Architecture #ETL (Extract, Transform, Load) #Data Quality #SnowPipe #Snowpark #Azure #Data Science #Cloud #Snowflake #Data Ingestion #AWS (Amazon Web Services) #Databricks #Apache Iceberg #Datasets #Data Integrity #Scala #Security #Data Lake #Data Engineering #Leadership #ML (Machine Learning) #AI (Artificial Intelligence) #Data Lakehouse #Compliance
Role description
We are seeking a highly skilled and experienced Senior Data Engineer to design and build a robust set of data ingestion and processing pipelines while maintaining best practices for a data lakehouse architecture. This is a highly visible, client-facing role that requires a blend of technical expertise, architectural leadership, and strong communication skills. You'll be the technical lead, working closely with team members to deliver innovative data solutions that support analytics, machine learning, and real-time decision-making.
Key Responsibilities
• Lead the design, implementation, and maintenance of scalable data lakehouse platforms using modern tools such as Databricks, Snowflake, and Apache Iceberg.
• Develop and optimize high-performance batch and streaming ETL/ELT pipelines, with a strong focus on Snowflake, Snowpipe, and Snowpark.
• Act as a technical leader, driving architecture discussions with both internal teams and external clients.
• Implement and enforce data quality, governance, and security best practices to ensure data integrity and compliance.
• Identify opportunities to integrate platform-level AI tools (such as those in Snowflake, Databricks, and Fabric) to outpace traditional data science efforts and deliver faster, more impactful insights.
• Collaborate with cross-functional teams, including data scientists and business stakeholders, to deliver high-quality, business-critical datasets.
Qualifications
• Snowflake experience/proficiency is critical.
• Azure experience is preferred; Google Cloud or AWS experience is also acceptable.
• 5+ years of professional experience in data engineering.
• Strong technical leadership and excellent communication skills, with proven experience in a client-facing role.
• Deep expertise in cloud data platforms, with significant hands-on experience in Snowflake.
• Demonstrated experience with data lakehouse design patterns and modern data architectures.