

Senior Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer based in Kirkwood, MO, on a long-term contract. It requires 5+ years of experience; expertise in Snowflake, SnowPark, Azure, Python, and Scala; and strong leadership and communication skills.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
September 3, 2025
Project duration
More than 6 months
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Kirkwood, MO
Skills detailed
#GCP (Google Cloud Platform) #Data Lake #AWS (Amazon Web Services) #Data Modeling #Spark (Apache Spark) #Snowpark #ETL (Extract, Transform, Load) #Leadership #PySpark #Data Science #Data Lakehouse #Scala #Programming #Azure #Data Architecture #SnowPipe #Data Ingestion #Compliance #AI (Artificial Intelligence) #Data Integrity #Datasets #Batch #Data Engineering #Data Pipeline #Security #Apache Iceberg #ADF (Azure Data Factory) #Snowflake #Data Quality #Databricks #Azure Data Factory #Cloud #Python
Role description
Sr. Data Engineer
Day 1 Onsite in Kirkwood, MO
Long Term
Must Have:
Snowflake
SnowPark
SnowPipe
Azure
Python
Scala
Requirements:
• Director-level presence and personality
• Executive-level communication skills required
• Work with executive leadership to communicate ideas
• Build trust through technical depth and collaborate
• Technical: most senior Data Engineer / Principal Architect-level role
• 3+ years of Snowflake experience
• Expert in Snowflake, SnowPark (data science), and data ingestion (SnowPipe, Azure Data Factory); a minimal Snowpark/SnowPipe sketch follows this list
• Programming: Scala (building custom data pipelines), Python (data modeling), PySpark
• Azure preferred; AWS or GCP is acceptable
• Expert at building data pipelines
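As an illustration of the SnowPark and SnowPipe expertise listed above, here is a minimal Snowpark-for-Python sketch. The connection parameters, the RAW.PUBLIC.ORDERS source table, the ANALYTICS.PUBLIC.DAILY_REVENUE target, and the ORDERS_STAGE external stage are hypothetical placeholders, not details of this engagement.

# Minimal sketch: a Snowpark (Python) transform plus a Snowpipe definition.
# All object names (database, schema, tables, stage) are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "RAW",
    "schema": "PUBLIC",
}
session = Session.builder.configs(connection_parameters).create()

# Snowpark transform: aggregate completed orders into a modeled table.
orders = session.table("RAW.PUBLIC.ORDERS")
daily_revenue = (
    orders.filter(col("STATUS") == "COMPLETED")
          .group_by("ORDER_DATE")
          .agg(sum_(col("AMOUNT")).alias("REVENUE"))
)
daily_revenue.write.save_as_table("ANALYTICS.PUBLIC.DAILY_REVENUE", mode="overwrite")

# Continuous ingestion: define a Snowpipe over a (hypothetical) external stage
# so newly landed files are auto-copied into the raw table.
session.sql("""
    CREATE PIPE IF NOT EXISTS RAW.PUBLIC.ORDERS_PIPE
      AUTO_INGEST = TRUE
      AS COPY INTO RAW.PUBLIC.ORDERS
         FROM @RAW.PUBLIC.ORDERS_STAGE
         FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""").collect()

Note that AUTO_INGEST also requires cloud storage event notifications to be wired to the stage, which is outside the scope of this sketch; in practice, credentials would come from a secrets manager rather than a literal dictionary.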
Key Responsibilities
• Lead the design, implementation, and maintenance of scalable data lakehouse platforms using modern tools like Databricks, Snowflake, and Apache Iceberg (a minimal PySpark-to-Iceberg sketch follows this list).
• Develop and optimize high-performance batch and streaming ETL/ELT pipelines, with a strong focus on Snowflake, Snowpipe, and Snowpark.
• Act as a technical leader, managing architecture discussions and leading conversations with both internal teams and external clients.
• Implement and enforce data quality, governance, and security best practices to ensure data integrity and compliance.
• Identify opportunities to integrate platform-level AI tools (like those in Snowflake, Databricks, and Fabric) to outpace traditional data science efforts and deliver faster, more impactful insights.
• Collaborate with cross-functional teams, including data scientists and business stakeholders, to deliver high-quality, business-critical datasets.
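For a concrete flavor of the lakehouse and batch-pipeline responsibilities above, here is a minimal PySpark sketch that lands raw files in an Apache Iceberg table. The catalog name (lake), warehouse path, source path, and column names are illustrative assumptions, and it presumes the Iceberg Spark runtime is on the classpath; a Databricks or Snowflake target would use its own connectors instead.

# Minimal sketch of a batch ELT step into an Iceberg-backed lakehouse table.
# Catalog name, warehouse path, source path, and schema are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders_batch_elt")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "/tmp/lakehouse")  # placeholder
    .getOrCreate()
)

# Extract: read one day's raw CSV drop (path and columns are illustrative).
raw = spark.read.option("header", True).csv("/data/raw/orders/2025-09-03/")

# Transform: light typing and de-duplication before the curated layer.
curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date"))
       .dropDuplicates(["order_id"])
)

# Load: create the Iceberg table on first run, then append on later runs.
spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.db.orders (
        order_id STRING,
        order_date DATE,
        amount DOUBLE
    ) USING iceberg
    PARTITIONED BY (order_date)
""")
curated.select("order_id", "order_date", "amount").writeTo("lake.db.orders").append()

The streaming half of such pipelines typically follows the same DataFrame logic, swapping spark.read for spark.readStream and writing through a checkpointed streaming sink.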
Qualifications
• Snowflake experience/proficiency is critical.
• Azure experience is preferred, but Google Cloud or AWS is acceptable.
• 5+ years of professional experience in data engineering.
• Strong technical leadership and excellent communication skills, with proven experience in a client-facing role.
• Deep expertise in cloud data platforms, with significant hands-on experience in Snowflake.
• Demonstrated experience with data lakehouse design patterns and modern data architectures.