TechnoGen, Inc.

Data Engineer (Snowflake)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (Snowflake) in Oaks, PA, with a contract length of "unknown" and a pay rate of "unknown." Requires 5+ years in Snowflake and SQL, 5+ years in DBT, and 3+ years in Airflow. Hybrid work model.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
April 29, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Oaks, PA
-
🧠 - Skills detailed
#dbt (data build tool) #Security #Macros #GCP (Google Cloud Platform) #Version Control #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Scala #Data Engineering #Data Warehouse #Data Security #Clustering #Snowflake #Python #Azure #Airflow #Computer Science #Data Quality #Programming #AWS (Amazon Web Services) #Anomaly Detection #Scripting #Cloud #Automation
Role description
Job Title: Data Engineer
Location: Oaks, PA (Hybrid: 3 days onsite per week)
Local candidates only; an in-person interview is required for the final round.
Job Description:
• Develop and maintain DBT models, macros, and SQL scripts to transform data within Snowflake.
• Optimize data models, design star/snowflake schemas, manage warehouse performance, and implement clustering/materialized views.
• Create scalable ELT/ETL pipelines to ingest and transform data from diverse sources.
• Write modular, testable SQL code using version control and manage DBT project structures.
• Implement data quality checks, automated tests, anomaly detection, and data security, including RBAC, masking, and row-level access in Snowflake.
Basic Qualifications:
• 5+ years of experience with Snowflake and strong SQL proficiency.
• 5+ years of hands-on development experience with DBT.
• 3+ years of experience with Airflow development.
• Solid grasp of data warehouse concepts.
• Education: Bachelor's degree in Computer Science, Data Engineering, or a related field.
Nice to Have (not required):
• Programming: proficiency in Python for scripting and automation.
• Cloud Platforms: experience with AWS, GCP, or Azure environments.
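The anomaly-detection and data-quality duties listed above can be sketched with a simple z-score check in Python; the row-count series and the threshold below are illustrative assumptions, not requirements from the posting:

```python
# Minimal sketch of a z-score anomaly check, the kind of data-quality
# test a pipeline might run on daily load metrics. Column values and
# threshold are hypothetical.
import statistics

def find_anomalies(values, z_threshold=3.0):
    """Return the values whose absolute z-score exceeds the threshold."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []  # constant series: nothing can be an outlier
    return [v for v in values if abs((v - mean) / stdev) > z_threshold]

# Hypothetical daily row counts; the last day is an obvious spike.
daily_row_counts = [1000, 1020, 980, 1010, 995, 5000]
print(find_anomalies(daily_row_counts, z_threshold=2.0))  # → [5000]
```

In practice a check like this would run as an automated test after each load (e.g. scheduled from Airflow) and alert when a day's volume deviates sharply from recent history.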