

h3 Technologies, LLC
Lead Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Engineer (Level 9) with a contract length of unspecified duration, offering a competitive pay rate. It requires expertise in Ab Initio, AWS (S3/Redshift), and Hadoop, along with financial services experience. Remote work in the USA is expected.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 27, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Migration #Data Migration #AWS (Amazon Web Services) #Leadership #Cloud #Agile #PySpark #Redshift #Spark (Apache Spark) #AWS S3 (Amazon Simple Storage Service) #HDFS (Hadoop Distributed File System) #S3 (Amazon Simple Storage Service) #Programming #Ab Initio #Data Ingestion #AI (Artificial Intelligence) #Strategy #Data Engineering #Hadoop #ETL (Extract, Transform, Load)
Role description
The client operates on EST; CST is acceptable as well, but candidates must be able to work EST or CST hours.
Role Title: RM - Lead Data Engineer (Level 9)
Location: Remote (USA)
Work Setup: 100% Remote; candidates must have valid US work authorization.
Business Driver: Financial Services client requires a Level 9 lead to manage cloud migration and ETL modernization.
Scope: Heavy focus on Ab Initio (Metaprogramming/PDL) and transitioning data to AWS (Redshift/S3).
Leadership: This is a "Lead" role; candidate must be able to coach others and estimate efforts for Agile sprints.
TOP HARD SKILLS
1. Ab Initio (Expert) - 5+ years
2. AWS (S3/Redshift) - 5+ years
3. Hadoop/Hive - 5+ years
NICE-TO-HAVES
• PySpark
• Cloud Migration project experience
• Banking/Financial Services background
DISQUALIFIERS
• Lack of expert-level Ab Initio (Metaprogramming/PDL) experience.
• No experience with AWS or Hadoop environments.
Title: Lead Data Engineer (Level 9)
Location: Remote (USA)
POSITION OVERVIEW
This role joins the AI & Data team within a major Financial Services environment, focusing on large-scale data warehousing and ETL modernization. As a Lead Data Engineer, you will drive the design and delivery of business solutions that bridge legacy Ab Initio frameworks with modern AWS and Hadoop ecosystems.
DAY-TO-DAY RESPONSIBILITIES
• Design, code, and deliver complex business solutions using Ab Initio, including Metaprogramming, PDL, and generic frameworks.
• Lead ETL projects and technical solution design within the Banking Domain.
• Manage data migration and integration tasks for cloud-based applications.
• Identify technical enablers and calculate the level of effort for data ingestion and transformation in DWH environments.
• Oversee and coordinate deliverables for DWH and Cloud Agile teams.
• Coach team members by sharing best practices, technical insights, and domain expertise.
TOP 3 REQUIRED SKILLS
1. ETL - Ab Initio (Expert level, 5+ years)
2. AWS - S3 and Redshift (5+ years)
3. Hadoop Ecosystem - HDFS, Spark, Hive (5+ years)
ADDITIONAL DESIRED SKILLS
• Cloud Migration strategy
• PySpark development
• Financial Services / Banking Domain experience
• Agile team leadership
IDEAL CANDIDATE PROFILE
The ideal candidate is a seasoned technical lead with a deep "under the hood" understanding of Ab Initio and modern cloud data stacks. They should possess the professional maturity to guide Agile teams while remaining hands-on in complex ETL design and AWS data migration.
Job Responsibilities
• Lead the design and development of complex ETL solutions using Ab Initio.
• Integrate and migrate data to AWS environments using S3 and Redshift.
• Collaborate with Agile teams to ingest and transform data for high-scale warehousing.
• Provide technical mentorship and best-practice guidance to junior and mid-level engineers.





