

Agility Partners
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with an unknown contract length and a day rate of $584 USD, based in the Cincinnati Metropolitan Area (United States). Key skills include 4+ years of Snowflake experience, advanced SQL, ETL pipeline development, and dbt expertise.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
584
-
🗓️ - Date
April 2, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Cincinnati Metropolitan Area
-
🧠 - Skills detailed
#Data Engineering #Python #Scala #Data Quality #ETL (Extract, Transform, Load) #Programming #Data Pipeline #SQL (Structured Query Language) #Java #Monitoring #Snowflake #Cloud #Data Integration #dbt (data build tool) #API (Application Programming Interface) #Compliance #BI (Business Intelligence)
Role description
Must be able to work on a W2. C2C is not permitted for this engagement.
Agility Partners is seeking a qualified Senior Data Engineer to fill an open position with one of our banking clients. This role supports an Enterprise Risk and Governance technology team and is focused on building, enhancing, and maintaining scalable data products in Snowflake. You’ll work hands‑on with data movement, transformation, and quality, partnering closely with engineering, architecture, and risk stakeholders to enable reliable data used across compliance and governance initiatives.
A Little About This Gig
• Design, build, test, and maintain scalable ETL and data pipelines
• Develop and enhance data products that move data from source systems into Snowflake
• Build pipelines across both on‑prem and cloud environments
• Embed data quality checks, validation, testing, and monitoring into ETL processes
• Write advanced SQL to support analytics and downstream data consumption
• Collaborate with data engineers, architects, and IT partners supporting enterprise risk and governance initiatives
• Contribute in a mix of independent and highly collaborative team settings
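The data quality bullet above can be sketched in Python, the language the posting names for data engineering work. This is a minimal, hypothetical illustration of embedding a validation step in an ETL flow; the function and field names are illustrative, not taken from the client's actual pipelines:

```python
# Hypothetical sketch: a validation step embedded in an ETL batch.
# Rows missing required fields are quarantined instead of loaded,
# so downstream consumers (e.g. Snowflake tables) see only clean data.

def validate_rows(rows, required_fields=("id", "amount")):
    """Partition rows into (valid, rejected) based on required non-null fields."""
    valid, rejected = [], []
    for row in rows:
        if all(row.get(field) is not None for field in required_fields):
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected

# Illustrative batch: the second row fails the null check on "amount".
batch = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": None},
    {"id": 3, "amount": 42.5},
]
valid, rejected = validate_rows(batch)
```

In a real engagement this kind of check would typically live in a dbt test or an orchestrated pipeline task rather than a standalone function, with rejected rows logged for monitoring.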
The Ideal Candidate
• Bachelor’s degree in a technical field or equivalent work experience
• 4+ years of hands‑on Snowflake experience
• Advanced SQL skills
• Strong experience building and enhancing ETL pipelines
• dbt experience (an absolute must)
• Experience programming or architecting backend systems using Java or J2EE
• Python for data engineering use cases
• Experience with API‑based data integrations
• Experience embedding quality, testing, and monitoring into data pipelines
• Familiarity with BI and analytics use cases from a data engineering perspective
• Experience with Archer or other GRC platforms (nice to have)
• Strong written and verbal communication skills
• Comfortable working in fast‑moving, regulated environments