

Snowflake SME
Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake SME with a contract duration of more than 6 months, offering a remote work location and requiring expertise in Snowflake, ETL/ELT workflows, SQL, Python, and data architecture.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 25, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Azure #Kafka (Apache Kafka) #Version Control #Data Modeling #SnowPipe #AWS (Amazon Web Services) #GCP (Google Cloud Platform) #Data Architecture #Data Pipeline #Scala #Monitoring #Database Schema #BI (Business Intelligence) #Python #SQL (Structured Query Language) #Migration #ETL (Extract, Transform, Load) #Schema Design #Cloud #Data Engineering #Snowflake
Role description
Snowflake SME
Location: Remote
Employment Type: Contract or Full-Time
Position Overview
We are looking for a highly skilled and experienced Snowflake Data Engineer to join our growing data engineering team. The ideal candidate will possess deep hands-on expertise in Snowflake and a strong background in designing, building, and optimizing scalable data pipelines. This role plays a critical part in enabling enterprise-wide analytics by ensuring robust, high-performance data architecture.
Key Responsibilities
• Design and build scalable data pipelines leveraging SnowPipe, External Stages, and Kafka Connectors for Snowflake
• Manage DDL/DML operations, execute database schema migrations, and maintain schema integrity
• Develop and optimize ETL/ELT workflows using SQL and Python
• Architect and maintain efficient data models to support business intelligence and operational workloads
• Leverage Snowflake Time Travel and other advanced features for auditing, version control, and data recovery
• Collaborate with stakeholders to gather data requirements and translate them into effective solutions
• Monitor and enhance data pipeline performance and reliability
• Implement and maintain Application Performance Monitoring (APM) to ensure smooth data operations
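As a rough illustration of the Time Travel responsibility above, the sketch below parameterizes a historical-read and restore statement in Python. The table name and offset are hypothetical examples, not part of the posting; the SQL follows Snowflake's documented `AT(OFFSET => ...)` Time Travel clause.

```python
# Sketch: building Snowflake Time Travel statements for auditing/recovery.
# Table name and offset values are illustrative assumptions.

def time_travel_query(table: str, offset_seconds: int) -> str:
    """SELECT the table as it existed `offset_seconds` ago via Time Travel."""
    return f"SELECT * FROM {table} AT(OFFSET => -{offset_seconds})"

def restore_statement(table: str, offset_seconds: int) -> str:
    """Recreate a table from its historical state with a CTAS statement."""
    return (f"CREATE OR REPLACE TABLE {table}_restored AS "
            f"{time_travel_query(table, offset_seconds)}")

print(restore_statement("orders", 3600))
```

In practice these strings would be executed through a Snowflake session (e.g. a connector cursor); the point here is only the shape of the Time Travel clause.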
Required Skills & Experience
• Proven, hands-on expertise with the Snowflake Data Cloud
• Strong proficiency with:
  • SnowPipe
  • External Stages
  • Kafka Connector for Snowflake
  • DDL/DML operations
  • Database schema design and migrations
  • Data modeling for analytics and operations
  • ETL/ELT design using SQL and Python
  • Snowflake Time Travel and advanced Snowflake features
• Solid experience with APM tools for monitoring data pipelines and system performance
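The ETL/ELT and monitoring skills listed above might look, in miniature, like the dependency-free Python sketch below: a small transform step wrapped with basic timing instrumentation. The record shape is a hypothetical example, not from the posting.

```python
import time

# Sketch: a tiny ELT-style transform with basic timing "monitoring".
# Record fields and thresholds are illustrative assumptions.

def transform(records):
    """Normalize raw records: lowercase keys, drop rows missing an id."""
    out = []
    for row in records:
        row = {k.lower(): v for k, v in row.items()}
        if row.get("id") is not None:
            out.append(row)
    return out

def run_with_timing(records):
    """Run the transform and report elapsed seconds for pipeline metrics."""
    start = time.perf_counter()
    result = transform(records)
    elapsed = time.perf_counter() - start
    return result, elapsed

raw = [{"ID": 1, "Name": "a"}, {"Name": "b"}]  # second row lacks an id
clean, secs = run_with_timing(raw)
print(clean)  # [{'id': 1, 'name': 'a'}]
```

A production pipeline would emit the elapsed time to an APM tool rather than returning it, but the wrap-and-measure pattern is the same.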
Preferred Qualifications
• Familiarity with CI/CD practices in a data engineering context
• Experience working in cloud environments such as AWS, Azure, or Google Cloud Platform (GCP)