Snowflake Data Engineer - Locals to PA Only

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Data Engineer in PA, requiring 8-10+ years of IT experience, 5+ years with Snowflake, SQL, and DBT, plus 3+ years with Apache Airflow. On-site work and in-person interviews are mandatory.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
April 21, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
King of Prussia, PA
-
🧠 - Skills detailed
#GCP (Google Cloud Platform) #Cloud #Data Pipeline #Python #Data Modeling #ETL (Extract, Transform, Load) #Data Engineering #dbt (data build tool) #Scripting #Datasets #Apache Airflow #Snowflake #AWS (Amazon Web Services) #Computer Science #Azure #SQL (Structured Query Language) #Automation #Scala #Airflow
Role description
P.S.: Candidates must be local to PA, open to an in-person interview, and willing to work in the office without relocating to accept this project.

Job Summary
We are seeking an experienced Snowflake Data Engineer with strong expertise in DBT, SQL, and Airflow to design, build, and optimize scalable data pipelines. The ideal candidate will have a solid foundation in data warehousing concepts and hands-on experience transforming and managing large datasets in modern cloud environments.

Required Qualifications
• IT experience: 8-10+ years
• 5+ years of experience working with Snowflake
• 5+ years of strong SQL development experience
• 5+ years of hands-on experience with DBT (Data Build Tool)
• 3+ years of experience with Apache Airflow
• Strong understanding of data warehousing concepts and data modeling techniques
• Bachelor's degree in Computer Science, Data Engineering, or a related field

Preferred Qualifications (Nice to Have)
• Proficiency in Python for scripting and automation
• Experience working with cloud platforms such as AWS, GCP, or Azure