Data Engineer (dbt, SQL, Snowflake, Python) | 100% Remote | 12+ Month Contract

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with expertise in DBT, SQL, Snowflake, and Python, offering a 12+ month remote contract. Key skills include database architecture, Git/GitHub proficiency, and data quality monitoring. Experience with financial data reconciliation is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
June 6, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Fixed Term
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Snowflake #SQL (Structured Query Language) #Database Architecture #Python #YAML (YAML Ain't Markup Language) #Data Transformations #"ETL (Extract, Transform, Load)" #GIT #Data Quality #Macros #Data Engineering #dbt (data build tool) #Airflow #GitHub #Metadata #Monitoring #Databases #Logging
Role description
We have a 12-month contract, with opportunity for extension or conversion, for a Senior Database Engineer with experience in dbt and SQL; Snowflake and Python experience is highly preferred.

Must Haves:
• dbt (Data Build Tool) is an absolute must, including:
  • directory structure and rule hierarchy
  • YAML
  • macros
  • pre/post-run hooks
  • tests: core generic, singular, custom generic, and unit tests
• SQL experience (a plus if it's with Snowflake)
• Python experience (enough to decipher code)
• Knowledge of database architecture concepts (relational databases, keys)
• Git/GitHub proficiency (CI/CD workflow knowledge is a plus)

Responsibilities/JD:
• Design and implement Snowflake databases and schemas through dbt models.
• Logically organize models within the dbt directory hierarchy to optimize rule enforcement.
• Document all data models and transformations by maintaining metadata in YAML files (see the example schema file after this list).
• Implement data quality monitoring and alerting, aligned with financial reconciliation needs, through core generic, custom generic, package generic, and singular data tests in dbt (sketches below).
• Configure robust logging utilizing macros and pre/post-run hooks (see the hook sketch below).
• Collaborate with SMEs to profile source systems and define data models, keys, and relationships.
• Decipher Python utilities for data transformations, identifying inefficiencies in current pipelines and designing improved pipelines in dbt.
• Design Airflow DAGs for ingestion of new files (optional).
• Collaborate with the data platform teams to address any issues that arise during development.
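For illustration, a minimal dbt schema file showing how model metadata and core generic tests are maintained in YAML. The model and column names (stg_transactions, transaction_id, amount) are hypothetical placeholders, not part of this role's actual project:

```yaml
# models/staging/schema.yml -- model and column names are hypothetical
version: 2

models:
  - name: stg_transactions
    description: "Staged transaction records from the source ledger."
    columns:
      - name: transaction_id
        description: "Unique identifier for each transaction."
        tests:
          - unique      # core generic test
          - not_null    # core generic test
      - name: amount
        description: "Transaction amount in USD."
        tests:
          - not_null    # core generic test
```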
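A singular test is a one-off SQL file under tests/ that fails if it returns any rows. A sketch of what a financial reconciliation check could look like, again with hypothetical model and column names:

```sql
-- tests/assert_balances_reconcile.sql
-- Singular test: any row returned is reported as a failure.
-- Model and column names below are hypothetical.
select
    l.account_id,
    l.ledger_total,
    s.source_total
from {{ ref('fct_ledger_balances') }} as l
join {{ ref('stg_source_balances') }} as s
    on l.account_id = s.account_id
where l.ledger_total <> s.source_total
```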
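Custom generic tests are defined once in a {% test %} block and then attached to any column from a schema file. A minimal sketch; the test name non_negative is an assumption for illustration:

```sql
-- tests/generic/non_negative.sql
-- Custom generic test: compiles against whatever model/column it is
-- attached to and fails for rows where the column value is negative.
{% test non_negative(model, column_name) %}

select *
from {{ model }}
where {{ column_name }} < 0

{% endtest %}
```

Once defined, it is referenced from a schema.yml column entry as `- non_negative`, alongside the core generic tests shown earlier.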
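For the logging responsibility, one common pattern is a small audit macro invoked from model-level pre/post-run hooks. This is a sketch only; the audit.run_log table is a hypothetical target:

```sql
-- macros/log_run_event.sql -- hypothetical audit-logging macro
{% macro log_run_event(event) %}
    insert into audit.run_log (model_name, event, logged_at)
    values ('{{ this }}', '{{ event }}', current_timestamp())
{% endmacro %}
```

```sql
-- models/marts/fct_ledger_balances.sql (config excerpt)
-- Hooks are passed as Jinja strings and run before/after the model builds.
{{ config(
    materialized = 'table',
    pre_hook  = "{{ log_run_event('start') }}",
    post_hook = "{{ log_run_event('end') }}"
) }}

select
    account_id,
    sum(amount) as ledger_total
from {{ ref('stg_transactions') }}
group by account_id
```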