

GTN Technical Staffing
Senior Data Engineer - Snowflake / dbt
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer specializing in Snowflake and dbt, offered as a 3–6 month contract-to-hire at $65–$75/hour, hybrid in Miami, FL. Key skills include SQL, Python, and cloud data pipeline experience, particularly with Snowflake and Informatica IICS.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date
May 14, 2026
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Miami-Fort Lauderdale Area
-
🧠 - Skills detailed
#IICS (Informatica Intelligent Cloud Services) #Data Lineage #Monitoring #Snowflake #Data Warehouse #Cloud #Data Pipeline #DataOps #BI (Business Intelligence) #Data Governance #Microsoft Power BI #Git #Automation #Compliance #Deployment #Python #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Data Modeling #dbt (data build tool) #Security #Datasets #Informatica #Forecasting #Observability #DevOps #Data Engineering
Role description
Sr Data Engineer
📍 Hybrid – Miami, FL
💼 3–6 Month Contract-to-Hire
💰 $65–$75/hour
Overview
We are seeking a hands-on Sr Data Engineer to design, build, and operate reliable, governed, and cost-efficient data pipelines that support analytics, operations, and enterprise decision-making.
This role blends deep technical execution with ownership of enterprise data movement, automation, and observability. The ideal candidate brings strong cloud data engineering experience, production expertise in Snowflake and Informatica IICS, and a passion for building scalable, high-performance data platforms.
You will collaborate closely with analysts, product owners, automation engineers, and architects to transform complex business requirements into well-designed, automated data products.
Key Responsibilities
Data Engineering & Pipeline Development
• Design, develop, and optimize ETL/ELT pipelines
• Build reusable, governed data models
• Translate business requirements into scalable data products
• Implement monitoring, alerting, and data observability practices
• Optimize performance and manage data platform costs
Data Platforms & Analytics Support
• Develop pipelines using Snowflake, Informatica IICS (CDI/CAI), and dbt
• Build analytics-ready datasets supporting forecasting, logistics, and member analytics
• Support BI and reporting use cases, including Power BI
Automation & DevOps
• Implement CI/CD pipelines for data workflows
• Automate testing and deployments
• Manage source control using Git
• Support DataOps best practices
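To make the automation bullets above concrete, here is a minimal sketch of the kind of data-quality check a CI pipeline might run before a deployment. All table and column names are hypothetical; in practice this would typically be expressed as dbt `unique`/`not_null` tests or a Snowflake query rather than in-memory rows.

```python
# Illustrative only: simple uniqueness and not-null checks over rows,
# mimicking what a CI gate for a data workflow might validate.

def check_not_null(rows, column):
    """Return the rows where the given column is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return True if every present value in the column is unique."""
    values = [r[column] for r in rows if r.get(column) is not None]
    return len(values) == len(set(values))

# Hypothetical sample data for an "orders" table.
orders = [
    {"order_id": 1, "member_id": "A"},
    {"order_id": 2, "member_id": "B"},
    {"order_id": 3, "member_id": None},
]

assert check_unique(orders, "order_id")      # primary key is unique
failed = check_not_null(orders, "member_id") # one row fails not-null
print(f"{len(failed)} row(s) failed the not-null check")
```

A CI job would run checks like these against a staging dataset and fail the build on any violation, which is the gate the "automate testing and deployments" responsibility describes.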
Governance & Security
• Apply data governance, security, and compliance standards
• Document data lineage, transformations, and operational processes
• Support cost governance and FinOps initiatives
Collaboration & Standards
• Partner cross-functionally with business and technical teams
• Mentor junior engineers
• Contribute to data engineering standards and best practices
Required Qualifications
• 3–5+ years of experience as a Data Engineer, Software Developer, or Automation Engineer
• Strong hands-on experience with SQL and Python
• Experience building cloud-based data pipelines and data warehouses
• Production experience with Snowflake and Informatica IICS
• Experience with Git and CI/CD pipelines
• Strong understanding of data modeling (Kimball or Inmon methodologies)
• Ability to translate business needs into automated, scalable data solutions
• Comfortable working in cross-functional, multicultural environments
• Fluent in English and Spanish
Nice to Have
• Experience with dbt
• Experience implementing monitoring and alerting frameworks
• Experience with FinOps or cloud cost governance
• Experience supporting analytics and BI teams
What We’re Looking For
• Self-directed and comfortable working with minimal supervision
• Calm under pressure and adaptable to changing constraints
• Comfortable moving forward with imperfect information
• Curious, collaborative, and energized by solving complex problems