

Test Technogen, Inc.
Data Engineer (Snowflake)
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (Snowflake) in Oaks, PA, with a contract length of "unknown" and a pay rate of "unknown." Requires 5+ years in Snowflake and SQL, 5+ years in DBT, and 3+ years in Airflow. Hybrid work model.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
April 29, 2026
Duration
Unknown
-
Location
Hybrid
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Oaks, PA
-
Skills detailed
#dbt (data build tool) #Security #Macros #GCP (Google Cloud Platform) #Version Control #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Scala #Data Engineering #Data Warehouse #Data Security #Clustering #Snowflake #Python #Azure #Airflow #Computer Science #Data Quality #Programming #AWS (Amazon Web Services) #Anomaly Detection #Scripting #Cloud #Automation
Role description
Job Title: Data Engineer
Location: Oaks, PA (Hybrid; 3 days onsite per week)
Local candidates only; an in-person final interview is required.
Job Description:
• Develop and maintain DBT models, macros, and SQL scripts to transform data within Snowflake.
• Optimize data models, design star/snowflake schemas, manage warehouse performance, and implement clustering/materialized views.
• Create scalable ELT/ETL pipelines to ingest and transform data from diverse sources.
• Write modular, testable SQL code using version control and manage DBT project structures.
• Implement data quality checks, automated tests, anomaly detection, and data security, including RBAC, masking, and row-level access in Snowflake.
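To illustrate the day-to-day work the bullets above describe, here is a minimal sketch combining a dbt staging model with a Snowflake column masking policy. All table, column, role, and policy names are hypothetical, chosen only for the example.

```sql
-- models/staging/stg_orders.sql  (hypothetical dbt model file)
-- dbt resolves {{ ref(...) }} to the upstream relation at compile time.
select
    order_id,
    customer_id,
    cast(order_total as number(12, 2)) as order_total,
    order_date
from {{ ref('raw_orders') }}
where order_id is not null

-- Separately, run as Snowflake DDL (not inside a dbt model file):
-- a masking policy that hides email addresses from non-privileged roles.
create masking policy mask_email as (val string) returns string ->
    case
        when current_role() in ('ANALYST_PII') then val
        else '***MASKED***'
    end;

alter table customers modify column email
    set masking policy mask_email;
```

In a real project the masking DDL would typically be managed outside dbt (or via a post-hook), and data quality checks would be declared as dbt tests (`unique`, `not_null`) in the model's YAML file.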
Basic Qualifications (minimum years of experience for each):
• 5+ years of experience with Snowflake and strong SQL proficiency.
• 5+ years of hands-on development experience with DBT.
• 3+ years of experience with Airflow development.
• Solid grasp of data warehouse concepts.
• Education: Bachelor's degree in Computer Science, Data Engineering, or a related field.
Nice to Have (not required):
• Programming: Proficiency in Python for scripting and automation.
• Cloud Platforms: Experience with AWS, GCP, or Azure environments.






