

CBTS
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with a contract length of "X months" and a pay rate of "$X/hour". Key skills required include ETL, SQL, Python, and cloud services (AWS, Azure, GCP). Experience in large-scale database migrations and Payments/Treasury Management products is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 9, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Cincinnati, OH
-
🧠 - Skills detailed
#Data Migration #Automation #GCP (Google Cloud Platform) #Compliance #Azure #ETL (Extract, Transform, Load) #Migration #SQL (Structured Query Language) #AWS (Amazon Web Services) #Data Processing #Python #Database Migration #Cloud #Data Quality #Data Engineering
Role description
Job Description:
Qualifications
• Proven experience with large-scale database migrations (10K+ customers, multiple LOBs).
• Strong understanding of Payments and Treasury Management products.
• Expertise in ETL processes, SQL, and Python for automation.
• Familiarity with Cloud data services (AWS, Azure, or GCP).
• Knowledge of data quality, reconciliation, and compliance best practices.
Key Responsibilities
• Design and implement data migration strategies from DB2 to Cloud.
• Build and optimize ETL pipelines and Python automation scripts for data processing and reconciliation (a minimal sketch follows this list).
• Define and execute migration phases, checkpoints, and quality controls.
• Perform data validation and reconciliation to ensure accuracy and compliance.
• Collaborate with DBAs, architects, application teams, and business stakeholders across multiple lines of business.
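The responsibilities above describe migration tooling work rather than a specific implementation. As a rough illustration of the reconciliation piece, here is a minimal Python sketch that compares row counts and content hashes between a DB2 source and a cloud target. The connection URLs, table names, and key columns are placeholders, not details from the posting.

```python
# Minimal count-and-checksum reconciliation sketch, assuming SQLAlchemy
# connections to the DB2 source and the cloud target.
# All connection URLs, table names, and key columns below are placeholders.
import hashlib

import pandas as pd
from sqlalchemy import create_engine

SOURCE_URL = "ibm_db_sa://user:pass@db2-host:50000/SRCDB"    # placeholder
TARGET_URL = "postgresql://user:pass@cloud-host:5432/TGTDB"  # placeholder


def table_fingerprint(engine, table: str, key: str) -> tuple[int, str]:
    """Return (row count, content hash) for one table, ordered by its key."""
    df = pd.read_sql(f"SELECT * FROM {table} ORDER BY {key}", engine)
    digest = hashlib.sha256(
        pd.util.hash_pandas_object(df, index=False).values.tobytes()
    ).hexdigest()
    return len(df), digest


def reconcile(table: str, key: str) -> bool:
    """Compare one table's fingerprint across source and target."""
    src = table_fingerprint(create_engine(SOURCE_URL), table, key)
    tgt = table_fingerprint(create_engine(TARGET_URL), table, key)
    if src != tgt:
        print(f"MISMATCH {table}: source={src} target={tgt}")
        return False
    print(f"OK {table}: rows={src[0]}")
    return True


if __name__ == "__main__":
    # Table and key names are illustrative only.
    checks = [("payments", "payment_id"), ("accounts", "account_id")]
    all_ok = all(reconcile(t, k) for t, k in checks)
    raise SystemExit(0 if all_ok else 1)
```

At the scale mentioned above (10K+ customers across multiple LOBs), a production pipeline would typically push hashing into SQL or process in keyed chunks rather than loading whole tables into memory, but the count-and-checksum structure of the reconciliation stays the same.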