DBT Snowflake Lead Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a DBT Snowflake Lead Data Engineer in Jersey City/Pennington, offering a contract length of "unknown" and a pay rate of "unknown." It requires 8+ years of data engineering experience, proficiency in DBT, Snowflake, and PL/SQL, and familiarity with CI/CD practices.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
August 20, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Jersey City, NJ
-
🧠 - Skills detailed
#dbt (data build tool) #Documentation #Normalization #Data Processing #Data Engineering #Clustering #GIT #Monitoring #Cloud #Security #SQL (Structured Query Language) #Data Science #Data Modeling #BI (Business Intelligence) #Airflow #Code Reviews #Data Quality #ETL (Extract, Transform, Load) #Version Control #Snowflake #Data Analysis #Scala
Role description
Role Name: DBT Snowflake Lead Data Engineer
Location: Jersey City/Pennington

Job Description
We are seeking a Data Engineer with deep expertise in DBT (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. This person will be responsible for designing, developing, and maintaining robust data transformation pipelines that support business intelligence, analytics, and data science initiatives.

Key Responsibilities
• Design and implement scalable data models and transformation pipelines using DBT on Snowflake.
• Write efficient and maintainable PL/SQL code for complex data processing and transformation tasks.
• Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions.
• Optimize Snowflake performance through query tuning, clustering, and resource management.
• Ensure data quality, integrity, and governance through testing, documentation, and monitoring.
• Participate in code reviews, architecture discussions, and continuous improvement initiatives.
• Maintain and enhance CI/CD pipelines for DBT projects.

Required Qualifications
• 3+ years (India) or 8+ years (US) of experience in data engineering or a related field.
• Strong hands-on experience with DBT (modular SQL development, testing, documentation); see the sketch after this list.
• Proficiency in Snowflake (data warehousing, performance tuning, security).
• Advanced knowledge of PL/SQL and experience with stored procedures, functions, and packages.
• Solid understanding of data modeling concepts (star/snowflake schemas, normalization).
• Experience with version control systems (e.g., Git) and CI/CD practices.
• Familiarity with orchestration tools (e.g., Airflow, dbt Cloud, Prefect) is a plus.
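As a rough illustration of the dbt workflow referenced in the qualifications, a modular staging model paired with a singular test might look like the minimal sketch below. All file, source, model, and column names are hypothetical and are shown only to indicate the style of work; they are not part of the job posting.

```sql
-- models/staging/stg_orders.sql  (hypothetical file and column names)
-- A modular dbt staging model: pull from a declared source, rename and cast
-- columns once, and let downstream models build on the cleaned output.
with source as (

    select * from {{ source('raw', 'orders') }}

),

renamed as (

    select
        order_id,
        customer_id,
        cast(order_ts as timestamp_ntz) as ordered_at,  -- Snowflake timestamp without time zone
        order_total
    from source

)

select * from renamed
```

```sql
-- tests/assert_no_negative_order_totals.sql  (hypothetical singular test)
-- dbt runs this query during `dbt test`; any rows returned are reported as failures.
select *
from {{ ref('stg_orders') }}
where order_total < 0
```

Column descriptions and generic tests (e.g., unique, not_null) would normally live in an accompanying YAML properties file, and the CI/CD pipeline mentioned in the responsibilities would typically run `dbt build` against Snowflake on each change.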