

Senior Data Engineer (DBT, Snowflake)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with 3–6 years of Snowflake experience and 2 years with DBT. Contract length is unspecified, the pay rate is unknown, and remote work is allowed. Finance SAP experience is preferred.
Country: United States
Currency: $ USD
Day rate: Unknown
Date discovered: June 25, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed:
#Azure #Data Governance #SnowSQL #Version Control #SnowPipe #AWS (Amazon Web Services) #GCP (Google Cloud Platform) #dbt (data build tool) #S3 (Amazon Simple Storage Service) #Data Quality #Airflow #ADF (Azure Data Factory) #Normalization #Leadership #Metadata #Scala #YAML (YAML Ain't Markup Language) #Azure Data Factory #Macros #Data Analysis #AWS S3 (Amazon Simple Storage Service) #Compliance #SAP #Databases #SQL (Structured Query Language) #Documentation #GIT #Data Warehouse #ETL (Extract, Transform, Load) #Jenkins #BitBucket #Cloud #Clustering #Data Engineering #Snowflake #Snowpark #Storage
Role description
We are looking for a highly skilled Senior Data Engineer with deep expertise in Snowflake and DBT to join our data engineering team. A strong background in Finance SAP is preferred, along with leadership experience; some experience with HR systems would be a plus. The ideal candidate will be responsible for designing, building, and optimizing scalable data transformation pipelines and ensuring data quality and performance across our analytics platforms.
Key Responsibilities
Technical
• Design and implement modular, reusable DBT models for data transformation in Snowflake (see the sketch after this list)
• Optimize Snowflake performance through clustering, partitioning, caching, and query tuning
• Define and manage schema objects, including databases, schemas, tables, views, and stages
• Build and maintain ELT pipelines using Snowflake-native features such as Snowpipe, Streams, and Tasks
• Integrate Snowflake with external data sources and cloud storage (e.g., AWS S3, Azure Blob, GCP)
• Optimize query performance using clustering keys, result caching, and materialized views
• Monitor and tune warehouse performance and cost efficiency
• Leverage advanced Snowflake features such as Time Travel, Zero-Copy Cloning, and Data Sharing
• Explore and implement UDFs, external functions, and Snowpark where applicable
• Ensure compliance with data governance and privacy standards
• Automate workflows using orchestration tools (e.g., Airflow, Azure Data Factory)
• Schedule and monitor data jobs using Snowflake Tasks and external schedulers
• Collaborate with data analysts, architects, and business stakeholders to translate requirements into scalable data solutions
• Design and implement DBT projects from scratch, including folder structure, model layers (staging, intermediate, marts), and naming conventions
• Use Git for version control of DBT projects
• Design, build, and maintain modular DBT models for data transformation
• Implement staging, intermediate, and mart layers following best practices
• Use Jinja templating and macros to create reusable logic
• Define and manage tests (e.g., uniqueness, not null, accepted values) within DBT
• Monitor test results and resolve data quality issues proactively
• Implement CI/CD pipelines for DBT projects using Git (Bitbucket) and Jenkins
• Ensure data governance, lineage, and documentation using tools like dbt docs and metadata tagging
• Integrate Snowflake with cloud storage (e.g., GCP, Azure Blob, AWS S3) and orchestration tools (e.g., Airflow, Azure Data Factory)
• Troubleshoot and resolve data quality issues and performance bottlenecks
• Implement role-based access controls and data masking where required
• Ensure compliance with data governance and privacy policies
• Integrate DBT with orchestration tools (e.g., Airflow, Prefect)
• Schedule and monitor DBT runs in production environments
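As a minimal sketch of the modular DBT modeling and incremental materialization described above, assuming a hypothetical stg_orders staging model and illustrative column names (none of these identifiers come from this posting):

    -- models/marts/fct_orders.sql (hypothetical model; all names are illustrative)
    {{ config(
        materialized='incremental',
        unique_key='order_id',
        cluster_by=['order_date']
    ) }}

    select
        order_id,
        customer_id,
        order_date,
        amount_usd
    from {{ ref('stg_orders') }}

    {% if is_incremental() %}
      -- On incremental runs, only pick up rows newer than what the target already holds.
      where order_date > (select max(order_date) from {{ this }})
    {% endif %}

Materializing as incremental with a unique_key lets DBT merge new rows into the existing Snowflake table instead of rebuilding it, and cluster_by maps to Snowflake clustering keys for query pruning.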
Functional
• Prior experience working with sources such as SAP ECC and S/4HANA
• Functional understanding of at least one of these SAP modules: Supply Chain, Finance (FICO), or Sales & Distribution
• Prior experience pulling data from SAP sources
Required Skills
• 3–6 years of hands-on experience with Snowflake, including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel (see the sketch after this list)
• 2 years of hands-on experience with DBT (Core or Cloud) in a production environment
• Strong SQL skills and experience with data modeling (star/snowflake schemas, normalization/denormalization)
• Deep understanding of DBT features: materializations (table, view, incremental, ephemeral), macros, seeds, snapshots, tests, and documentation
• Experience with cloud data warehouses (Snowflake)
• Proficiency in Git, CI/CD, and workflow orchestration (e.g., Airflow, dbt Cloud)
• Familiarity with Jinja templating, YAML configuration, and DBT project structure
• Strong communication skills and the ability to work cross-functionally
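For the Snowflake-native ELT features named above (Snowpipe, Streams, Tasks), a minimal sketch follows; the table, stream, task, and warehouse names are hypothetical, not taken from this posting:

    -- A stream captures row changes on a (hypothetically Snowpipe-loaded) raw table.
    create or replace stream raw_orders_stream on table raw.orders;

    -- A task periodically merges captured changes into the analytics table.
    create or replace task merge_orders_task
      warehouse = transform_wh        -- hypothetical warehouse
      schedule = '5 minute'
    when
      system$stream_has_data('raw_orders_stream')  -- skip runs when nothing is new
    as
      merge into analytics.orders t
      using raw_orders_stream s
        on t.order_id = s.order_id
      when matched then update set t.amount_usd = s.amount_usd
      when not matched then insert (order_id, amount_usd)
        values (s.order_id, s.amount_usd);

    -- Tasks are created suspended; resume to start the schedule.
    alter task merge_orders_task resume;

The WHEN clause keeps the warehouse from spinning up on empty runs, which is one common way to balance the freshness and cost-efficiency concerns mentioned in the responsibilities.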