Sr Data Engineer (Snowflake, SnowPark, SnowPipe) : W2 Role

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr Data Engineer (Snowflake, SnowPark, SnowPipe) in St. Louis, MO, with a contract starting October 1st. Compensation is on a W2 basis. Requires 5+ years of data engineering experience, strong Snowflake expertise, and executive-level communication skills.
🌎 - Country
United States
💱 - Currency
Unknown
💰 - Day rate
-
🗓️ - Date discovered
August 31, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
St. Louis, MO 63101
🧠 - Skills detailed
#Data Ingestion #Azure #Data Science #Security #AI (Artificial Intelligence) #ETL (Extract, Transform, Load) #Snowpark #Snowflake #PySpark #ML (Machine Learning) #Scala #Data Architecture #Data Modeling #Data Quality #Python #Data Pipeline #Data Lakehouse #Spark (Apache Spark) #Azure Data Factory #GCP (Google Cloud Platform) #REST (Representational State Transfer) #Programming #SnowPipe #Data Engineering #Data Lake #Databricks #Batch #Compliance #Datasets #Cloud #Leadership #AWS (Amazon Web Services) #Apache Iceberg #Data Integrity #ADF (Azure Data Factory)
Role description
Sr Data Engineer (Snowflake, SnowPark, SnowPipe) :: W2 role
Location: St. Louis, MO (Hybrid)

This job will be mostly remote, but the team meets onsite in St. Louis each week. Notice to come into the office may arrive the day before; the rest of the week, or the following two weeks, may then be off site. THE PERSON MUST RELOCATE TO ST. LOUIS. No relocation expenses are covered.

Requirements:
- Executive-level communication skills: work with executive leadership to communicate ideas, build trust through technical depth, and collaborate.
- Technical: this is a most-senior Data Engineer / Principal Architect level role.
- 3+ years of Snowflake experience.
- Expert in Snowflake, SnowPark (data science), and data ingestion (SnowPipe, Azure Data Factory).
- Programming: Scala (building custom data pipelines), Python (data modeling), PySpark.
- Azure preferred; AWS or GCP is acceptable.
- Expert at building data pipelines.
- Must start October 1st.

About the Job
Safety National is owned by Tokyo Marine (a large equity holding fund). The project involves extracting value from third-party administrator data using the Snowflake platform. This is a high-priority, customer-facing role: handling the architecture, working through challenges, and leading the conversation. The right candidate is ready to take a step toward a principal role soon, has the visible leadership to run this large project and interact with the executive team, and will carry that leadership torch for Safety National while joining our leadership pipeline.

JOB DETAIL
We are seeking a highly skilled and experienced Senior Data Engineer to design and build a robust set of data ingestion and processing pipelines while maintaining best practices for a data lakehouse architecture. This is a highly visible, client-facing role that requires a blend of technical expertise, architectural leadership, and strong communication skills. You'll be the technical lead, working closely with team members to deliver innovative data solutions that support analytics, machine learning, and real-time decision-making.
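To give candidates a sense of the SnowPipe ingestion work described above, here is a minimal sketch of the kind of auto-ingest setup involved, expressed as the Snowflake SQL a pipeline would issue. The pipe, table, and stage names are hypothetical placeholders, not details from this role.

```python
# Hypothetical sketch: build the CREATE PIPE statement for Snowpipe
# auto-ingest, which continuously loads files landing in an external
# stage (e.g. Azure Blob via Azure Data Factory drops) into a table.

def snowpipe_ddl(pipe: str, table: str, stage: str,
                 file_format: str = "(TYPE = 'JSON')") -> str:
    """Return the CREATE PIPE DDL for a simple auto-ingest pipe."""
    return (
        f"CREATE PIPE IF NOT EXISTS {pipe}\n"
        f"  AUTO_INGEST = TRUE\n"
        f"AS\n"
        f"  COPY INTO {table}\n"
        f"  FROM @{stage}\n"
        f"  FILE_FORMAT = {file_format};"
    )

# Illustrative names only (third-party administrator claims data).
ddl = snowpipe_ddl("tpa_claims_pipe", "raw.tpa_claims", "tpa_landing_stage")
print(ddl)
```

In practice this DDL would be executed once via a Snowflake session, after which Snowpipe picks up new stage files without manual COPY commands.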
Key Responsibilities
- Lead the design, implementation, and maintenance of scalable data lakehouse platforms using modern tools like Databricks, Snowflake, and Apache Iceberg.
- Develop and optimize high-performance batch and streaming ETL/ELT pipelines, with a strong focus on Snowflake, Snowpipe, and Snowpark.
- Act as a technical leader, managing architecture discussions and leading conversations with both internal teams and external clients.
- Implement and enforce data quality, governance, and security best practices to ensure data integrity and compliance.
- Identify opportunities to integrate platform-level AI tools (like those in Snowflake, Databricks, and Fabric) to outpace traditional data science efforts and deliver faster, more impactful insights.
- Collaborate with cross-functional teams, including data scientists and business stakeholders, to deliver high-quality, business-critical datasets.

Qualifications
- Snowflake experience/proficiency is critical.
- Azure experience is preferred, but Google Cloud and AWS are acceptable.
- 5+ years of professional experience in data engineering.
- Strong technical leadership and excellent communication skills, with proven experience in a client-facing role.
- Deep expertise in cloud data platforms, with significant hands-on experience in Snowflake.
- Demonstrated experience with data lakehouse design patterns and modern data architectures.

Flexible work from home options are available.
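The data quality and integrity responsibility above can be pictured as a pre-load validation gate in a batch pipeline. This is an illustrative sketch only; the field names, rules, and record shape are hypothetical, not taken from the posting.

```python
# Hypothetical pre-load data quality gate: quarantine records that fail
# basic integrity rules before they enter the lakehouse. Field names
# (claim_id, administrator, paid_amount) are illustrative placeholders.
from typing import Iterable

REQUIRED_FIELDS = ("claim_id", "administrator", "paid_amount")

def validate(record: dict) -> list:
    """Return a list of rule violations for one record (empty = clean)."""
    errors = [f"missing:{f}" for f in REQUIRED_FIELDS if record.get(f) is None]
    amount = record.get("paid_amount")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("negative:paid_amount")
    return errors

def split_batch(records: Iterable) -> tuple:
    """Partition a batch into loadable rows and quarantined (row, reasons) pairs."""
    clean, quarantined = [], []
    for rec in records:
        errs = validate(rec)
        if errs:
            quarantined.append((rec, errs))
        else:
            clean.append(rec)
    return clean, quarantined

clean, bad = split_batch([
    {"claim_id": "C1", "administrator": "TPA-A", "paid_amount": 120.0},
    {"claim_id": None, "administrator": "TPA-B", "paid_amount": -5.0},
])
```

A gate like this would typically run inside the pipeline (e.g. as a Snowpark or PySpark transformation) so that bad rows land in a quarantine table with their failure reasons instead of polluting downstream datasets.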