

Snowflake Data Engineer (Exp 14-15 Years)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Data Engineer with 14-15 years of experience, focusing on data pipeline design, data warehouse management, and data quality. Contract length is unspecified, with a competitive pay rate. Key skills include Snowflake, ETL tools, and CI/CD automation.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
July 31, 2025
Project duration
Unknown
Location type
Unknown
Contract type
Unknown
Security clearance
Unknown
Location detailed
Houston, TX
Skills detailed
#Data Warehouse #Data Storage #Snowflake #ETL (Extract, Transform, Load) #GitHub #Scala #Data Engineering #Data Lake #Automation #Jenkins #Data Science #Data Quality #ML (Machine Learning) #Storage #Data Management #Data Pipeline
Role description
1. Design and implement scalable data pipelines.
2. Build and manage data warehouses and data lakes.
3. Ensure data quality and implement data management best practices.
4. Optimize data storage and retrieval processes.
5. Collaborate closely with data scientists, analysts, and product teams to support analytics and machine learning initiatives.
6. Use CI/CD orchestration and automation tools such as Jenkins and GitHub.
7. Monitor and tune Snowflake query performance, warehouse usage, and credit consumption.
8. Design and enforce row-level access policies and dynamic masking in Snowflake for sensitive data fields (PII, financials).
9. Enable data sharing with external teams using secure shares and reader accounts while maintaining strict RBAC controls.
10. Bring experience with ETL and scheduler tools.
11. Communicate effectively across teams and stakeholders, with strong interpersonal, written, and verbal skills.
12. Design semantic layers, aggregate tables, and data models (star/snowflake schemas) to support a scalable, governed, business-friendly analytics architecture.
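The Snowflake governance responsibilities above (dynamic masking, row-level access policies) map to standard Snowflake SQL objects. The sketch below illustrates the pattern; all table, schema, role, and policy names are hypothetical placeholders, not taken from the posting:

```sql
-- Dynamic masking policy: only a privileged role sees raw PII.
-- (Names below are illustrative, not from the role description.)
CREATE OR REPLACE MASKING POLICY mask_email AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
    ELSE '***MASKED***'
  END;

ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY mask_email;

-- Row access policy: analysts see only rows for regions they are
-- entitled to, via a (hypothetical) entitlements mapping table.
CREATE OR REPLACE ROW ACCESS POLICY region_rap AS (region STRING) RETURNS BOOLEAN ->
  CURRENT_ROLE() = 'DATA_ADMIN'
  OR EXISTS (
    SELECT 1
    FROM security.region_entitlements e
    WHERE e.role_name = CURRENT_ROLE()
      AND e.region   = region
  );

ALTER TABLE sales ADD ROW ACCESS POLICY region_rap ON (region);
```

Attaching policies at the column and table level, rather than duplicating filtered views per role, is what keeps RBAC controls enforceable when the same tables are later exposed through secure shares and reader accounts.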