

ETL Snowflake Developer
Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Snowflake Developer in Reading, PA; the contract length and pay rate are unspecified. It requires a Bachelor's degree, 3-5 years of ETL development experience, and 1-2 years of hands-on work with Snowflake and dbt.
Country
United States
Currency
$ USD
Day rate
Unknown
Date discovered
September 23, 2025
Project duration
Unknown
Location type
Hybrid
Contract type
Unknown
Security clearance
Unknown
Location detailed
Reading, PA
Skills detailed
#SSIS (SQL Server Integration Services) #Informatica #Snowflake #SQL (Structured Query Language) #Data Pipeline #Computer Science #Data Warehouse #Scala #ADF (Azure Data Factory) #Python #Data Quality #ETL (Extract, Transform, Load) #dbt (data build tool) #Data Engineering
Role description
Note: Location: Reading, PA (hybrid, with 3 days per week on site). Local candidates preferred.
We're looking for a Data Engineer with hands-on experience in Snowflake and dbt to design, build, and optimize scalable data pipelines. You will transform raw data into actionable insights while ensuring data quality, governance, and performance.
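As a rough sketch of the kind of pipeline work described above, here is a minimal dbt model targeting Snowflake. Every source, table, and column name (raw_orders, order_id, and so on) is an illustrative assumption, not something specified in this posting.

```sql
-- models/staging/stg_orders.sql
-- Hypothetical example: an incremental dbt model on Snowflake.
-- All table and column names are illustrative, not from the posting.

{{ config(
    materialized='incremental',
    unique_key='order_id'
) }}

select
    order_id,
    customer_id,
    cast(order_total as number(12, 2)) as order_total_usd,
    ordered_at
from {{ source('raw', 'raw_orders') }}

{% if is_incremental() %}
  -- On incremental runs, only process rows newer than what is already loaded
  where ordered_at > (select max(ordered_at) from {{ this }})
{% endif %}
```

An incremental materialization like this is a common dbt pattern for keeping Snowflake compute costs manageable as source tables grow.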
Requirements
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 3–5 years of experience in data warehouse/ETL development (dbt, SSIS, Informatica, ADF, etc.).
• 1–2 years of hands-on experience with dbt and Snowflake.
• Strong SQL/PL-SQL skills for querying and optimization (see the sketch after this list).
• Familiarity with Python for dbt pipeline enhancements.
• Excellent problem-solving and communication skills.
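To illustrate the SQL-for-querying-and-optimization requirement above, the sketch below deduplicates a hypothetical events table using Snowflake's QUALIFY clause, a common pattern in warehouse pipeline work. The table and column names are invented for the example.

```sql
-- Hypothetical Snowflake query: keep only the latest row per event_id.
-- QUALIFY filters on window-function results without a nested subquery,
-- which keeps the query plan simple and easy to optimize.
select
    event_id,
    payload,
    loaded_at
from raw_events
qualify row_number() over (
    partition by event_id
    order by loaded_at desc
) = 1;
```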