

Bee Talent Solutions
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a 6-month contract, offering a pay rate of "X" per hour. Key requirements include 7+ years of experience with Snowflake and dbt pipelines, plus proficiency in Python, SQL, and Terraform. Remote work is available.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
1000
-
🗓️ - Date
January 7, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Terraform #Scala #dbt (data build tool) #Data Engineering #Infrastructure as Code (IaC) #Snowflake #Documentation #Monitoring #SQL (Structured Query Language) #Python #ETL (Extract, Transform, Load)
Role description
This is a hands-on builder and implementer role on the data platform: delivering pipelines, transformations, and production readiness across both phases, with an emphasis on scalability, quality, and client handover.
Key Responsibilities:
• Build and deploy the full stack (ingestion, dbt models/tests, orchestration) in Canopy accounts.
• Develop reusable pipelines/transformations for 18+ OpCos, including custom logic/reconciliations (see the Python sketch after this list).
• Implement dbt tests, monitoring, lineage, backups/DR.
• Support prototyping in Phase 1 and iterative sprints in Phase 2.
• Create documentation/runbooks and assist with training/knowledge transfer.
• Collaborate on MVP backlog execution and go-live validation.
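To give a flavor of the reusable-pipeline work described above, here is a minimal Python sketch of a per-OpCo dbt invocation. The OpCo names, the `tag:core` selector, and the `opco` variable are hypothetical illustrations, not details from the posting.

```python
import json
import subprocess

# Hypothetical OpCo identifiers; the posting mentions 18+ OpCos but does not name them.
OPCOS = ["opco_a", "opco_b", "opco_c"]

def run_opco_pipeline(opco: str) -> None:
    """Run a dbt build for one OpCo against a shared, parameterized project.

    `dbt build` executes models, tests, seeds, and snapshots in DAG order,
    so a failed dbt test halts the pipeline for that OpCo.
    """
    subprocess.run(
        [
            "dbt", "build",
            "--select", "tag:core",                 # hypothetical tag on the shared models
            "--vars", json.dumps({"opco": opco}),   # parameterize shared SQL per OpCo
        ],
        check=True,  # raise CalledProcessError if dbt exits non-zero
    )

if __name__ == "__main__":
    for opco in OPCOS:
        run_opco_pipeline(opco)
```

One shared dbt project driven by variables, rather than 18+ copied projects, is a common way to keep per-OpCo custom logic maintainable.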
Required Skills & Experience:
• 7+ years in data engineering, focused on Snowflake + dbt production pipelines.
• Proficiency in Python/SQL, IaC (Terraform), CI/CD, and multi-source ingestion (a reconciliation sketch follows this list).
• Experience with enterprise DR, testing, and handover to client teams.
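As a rough illustration of the reconciliation and testing work the role calls for, here is a minimal sketch using the Snowflake Python connector. The environment variables, warehouse, and table names are placeholders, and the row-count comparison stands in for whatever reconciliation logic an OpCo actually needs.

```python
import os

import snowflake.connector  # pip install snowflake-connector-python

def fetch_count(cur, table: str) -> int:
    """Return the row count of a table (identifiers here are trusted, internal names)."""
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

def reconcile(source_table: str, target_table: str) -> None:
    """Fail loudly if the raw landing table and the modeled table disagree."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],   # placeholder credentials via env vars
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",                  # hypothetical warehouse name
    )
    try:
        cur = conn.cursor()
        src = fetch_count(cur, source_table)
        tgt = fetch_count(cur, target_table)
        if src != tgt:
            raise ValueError(
                f"Reconciliation failed: {source_table}={src} rows, {target_table}={tgt} rows"
            )
    finally:
        conn.close()

if __name__ == "__main__":
    # Illustrative table names only.
    reconcile("RAW.OPCO_A.INVOICES", "ANALYTICS.FINANCE.INVOICES")
```

A check like this would typically run in CI/CD after each load, alongside dbt's own schema and data tests.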





