

Cypress HCM
Manager, Data Engineering
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a "Manager, Data Engineering" with a contract length of more than 6 months, offering $150K - $175K plus an 8% bonus. Key skills include data engineering, SQL, ETL/ELT pipelines, and experience in regulated financial services.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
795
-
🗓️ - Date
May 14, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Tinley Park, IL
-
🧠 - Skills detailed
#Snowflake #Code Reviews #Integration Testing #Agile #Cloud #Data Pipeline #GCP (Google Cloud Platform) #Azure #Leadership #Scripting #Oracle #Automation #Data Quality #Strategy #Compliance #Java #Python #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Groovy #Unit Testing #Computer Science #Data Integration #dbt (data build tool) #Business Analysis #GitLab #AWS (Amazon Web Services) #Scala #Security #Data Architecture #Documentation #PostgreSQL #Data Processing #Jenkins #Data Engineering
Role description
Location: Remote, USA
Employees: 1,000 | Team Size: 5-8
Industry: FinTech | Payments Domain
Reports To
About the Role
We are seeking an experienced and hands-on Director of Data Engineering to lead and scale a high-performing data engineering team. In this role, you will own the strategy and execution of an enterprise-scale transaction matching and data platform, partnering closely with business stakeholders and senior leadership to deliver robust, scalable data solutions. You thrive in a fast-paced, high-growth environment, bring a bias for action, and know how to balance long-term platform vision with near-term delivery.
What You'll Do
• Lead and develop a team of data engineers focused on building and maintaining data products centered around transaction matching and reconciliation systems.
• Collaborate with business stakeholders and senior leadership to define, plan, and execute the data engineering roadmap, aligning technical investments with business priorities.
• Identify and drive automation opportunities across pipelines, operational workflows, and reporting to reduce manual effort and improve scalability.
• Remain hands-on with coding, code reviews, and the design and development of high-performance ETL/ELT pipelines using modern tooling.
• Design and implement enterprise-scale data architectures spanning on-prem and cloud environments.
• Partner with Product and Business Analyst teams to translate functional requirements into scalable technical solutions.
• Establish and enforce data quality standards, testing frameworks, and SLAs across all integrated data sources and pipelines.
• Oversee the design, implementation, and maintenance of data integrations between on-prem and cloud services, ensuring reliability, performance, and security.
• Drive continuous improvement in engineering practices, tools, and processes; implement long-term solutions to stabilize and scale the data platform.
• Champion documentation culture — architecture diagrams, data dictionaries, pipeline specs, and runbooks.
• Manage team capacity, sprint planning, and delivery in an agile environment; remove blockers and set the team up for success.
• Lead hiring, onboarding, and performance management for the data engineering team; develop talent and foster a high-trust, collaborative culture.
Required Skills & Qualifications
• Bachelor's degree in Computer Science, Engineering, or equivalent technical discipline; advanced degree a plus.
• 8+ years of experience in data engineering, with at least 3 years in a people management or technical lead capacity.
• Strong hands-on proficiency with SQL (Oracle, PostgreSQL, MSSQL, or Snowflake) and a JVM-based language (Java or Groovy); Python scripting is a plus.
• Proven track record designing and delivering large-scale ETL/ELT pipelines and data transformation applications.
• Hands-on experience with dbt and Snowflake.
• Experience working in regulated financial services environments (payments, reconciliation, or safeguarding); exposure to compliance-driven delivery under regulatory deadlines preferred.
• Familiarity with CI/CD tooling (GitLab, Jenkins, or equivalent) and software engineering best practices, including JVM-based microservice architectures.
• Experience with automation frameworks, unit testing, and integration testing for data pipelines.
• Strong working knowledge of cloud platforms (AWS, Azure, or GCP) and cloud data architecture concepts.
• Demonstrated experience leading teams within an Agile delivery environment.
• Excellent stakeholder management and communication skills; ability to translate complex technical concepts for business audiences.
Preferred
• Familiarity with Oracle EPM or similar reconciliation/financial close platforms.
• Experience in payments or financial data processing at enterprise scale.
Benefits: Medical, Vision, Dental, PTO, Sick Time, HSA, 401(k)
Compensation: $150K - $175K + 8% bonus opportunity