

Senior Data Engineer - DBT
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 22-month contract paying $75/hr, hybrid in Orlando, FL. It requires 5+ years of data engineering experience plus hands-on skills in DBT, Snowflake, AWS, and infrastructure-as-code.
Country: United States
Currency: $ USD
Day rate: 600
Date discovered: August 17, 2025
Project duration: More than 6 months
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: Orlando, FL 32819
Skills detailed: #Terraform #dbt (data build tool) #Deployment #Data Modeling #DevOps #S3 (Amazon Simple Storage Service) #Data Engineering #Python #Infrastructure as Code (IaC) #Data Orchestration #Snowflake #Cloud #Lambda (AWS Lambda) #Monitoring #GitHub #AWS (Amazon Web Services) #IAM (Identity and Access Management) #Scala #SQL (Structured Query Language) #Data Pipeline #DataOps
Role description
Job Description: Senior Data Engineer
We're looking for a Senior Data Engineer to help us build and maintain a modern, reliable data platform. This role will focus on developing automated DBT models, managing Snowflake data infrastructure, and supporting our engineering teams with high-quality, scalable data pipelines. You should be comfortable working across tools like DBT, Snowflake, and AWS, and bring a solid understanding of data modeling and DevOps/DataOps best practices. This is a hands-on role where you'll collaborate closely with other engineers to deliver clean, well-tested, production-grade data workflows.
Job at a Glance:
22-month contract (Possibility of extension)
$75/hr + optional benefits (Medical, Dental, Vision, and 401k)
Hybrid (Orlando, FL)
What You'll Do:
Design, build, and maintain DBT models to support analytics and production use cases (a minimal model sketch follows this list)
Develop and manage data pipelines using AWS and other cloud-based tools
Collaborate with software engineers and data teams to integrate data into applications and workflows
Implement infrastructure as code using CDK, Terraform, and GitHub workflows
Contribute to improving performance, reliability, and maintainability of our data stack
Support versioning, testing, and deployment processes for data assets
Help shape best practices around data modeling, pipeline architecture, and DevOps/DataOps in a modern cloud environment
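By way of illustration only (this sketch is not part of the original posting): dbt models on Snowflake are usually written in SQL with Jinja, but dbt also supports Python models via Snowpark, which keeps this example in the same language as the infrastructure sketch further down. The upstream model stg_orders and its columns are hypothetical.
```python
# Illustrative dbt Python model for Snowflake (dbt runs Python models via Snowpark).
# The upstream model "stg_orders" and its columns are hypothetical placeholders.
from snowflake.snowpark.functions import col, sum as sum_


def model(dbt, session):
    # Materialize the result as a table in Snowflake
    dbt.config(materialized="table")

    # Reference an upstream staging model; on Snowflake this returns a Snowpark DataFrame
    orders = dbt.ref("stg_orders")

    # Aggregate completed orders into a per-customer lifetime value
    return (
        orders
        .filter(col("status") == "completed")
        .group_by("customer_id")
        .agg(sum_("amount").alias("lifetime_value"))
    )
```
An equivalent SQL model would express the same aggregation as a SELECT over {{ ref('stg_orders') }}, with dbt tests and documentation defined alongside it.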
What We're Looking For:
5+ years of experience in data engineering, software engineering, or related roles
Strong hands-on experience with DBT and Snowflake
Solid SQL skills and understanding of data modeling concepts
Working knowledge of AWS (e.g., S3, Lambda, IAM)
Experience with infrastructure-as-code (CDK preferred; Terraform or CloudFormation also acceptable; see the sketch after this list)
Familiarity with GitHub, CI/CD pipelines, and software development best practices
Comfortable writing clean, maintainable, and well-documented code
Bonus: Experience with Python, data orchestration tools, or monitoring systems
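To make the infrastructure-as-code expectation concrete, here is a minimal, hypothetical AWS CDK v2 sketch in Python (CDK being the preferred tool above). The stack, bucket, function, handler, and asset path are placeholder names, not details from the posting.
```python
# Illustrative AWS CDK v2 stack in Python; all resource names and paths are placeholders.
from aws_cdk import App, Stack, Duration, aws_s3 as s3, aws_lambda as _lambda
from constructs import Construct


class DataPlatformStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Versioned S3 bucket for raw landed data
        raw_bucket = s3.Bucket(self, "RawLandingBucket", versioned=True)

        # Lambda that kicks off downstream loading/transformation
        loader_fn = _lambda.Function(
            self,
            "LoaderFunction",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="loader.handler",
            code=_lambda.Code.from_asset("lambda"),  # placeholder asset directory
            timeout=Duration.minutes(5),
        )

        # CDK generates a least-privilege IAM policy granting the function read access
        raw_bucket.grant_read(loader_fn)


app = App()
DataPlatformStack(app, "DataPlatformStack")
app.synth()
```
Terraform or CloudFormation would declare the same resources directly; CDK simply synthesizes the CloudFormation template, including the generated IAM policy, from code like this.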
#INDGEN