BCforward

Data Architect / Principal Data Engineer III

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Architect / Principal Data Engineer III in Cincinnati, OH, for a 9-month contract at $85/hr on W-2. Key skills include DB and SQL knowledge, large-scale migration experience, and expertise in ETL processes and Python.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
680
-
πŸ—“οΈ - Date
October 9, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Cincinnati, OH
-
🧠 - Skills detailed
#DevOps #Data Engineering #BI (Business Intelligence) #Big Data #Spark (Apache Spark) #Java #Azure #Database Migration #Automation #GCP (Google Cloud Platform) #Compliance #Data Architecture #Python #Data Framework #Cloud #Data Modeling #SQL (Structured Query Language) #Data Migration #Data Pipeline #ETL (Extract, Transform, Load) #Migration #AWS (Amazon Web Services) #Data Processing #Hadoop #Data Quality
Role description
BCforward is seeking a highly motivated and experienced Data Architect / Principal Data Engineer III.

Note: Must be local to Ohio and willing to work on a W-2. This is an onsite job.

Job Title: Data Architect / Principal Data Engineer III
Location: 38 Fountain Square Plaza, Cincinnati, OH 45202 - Onsite Job
Duration: 9 Months Contract
Pay Rate: $85/Hr. on W-2

Must Have
- Business Intelligence / Data Engineering
- DB and SQL knowledge
- Previous large-scale migration experience

Nice To Have
- Experience with Java and WebSphere Application Server (WAS 9)
- Familiarity with DevOps for data pipelines and big data frameworks (Spark, Hadoop)
- Knowledge of data modeling and automation frameworks
- Python or other ETL language skills

Overview:
We're looking for a Data Architect to partner with other LOBs to help lead and execute the migration of commercial enterprise data from DB2 to a modern cloud platform. This role combines hands-on development with strategic planning, ensuring a phased migration with strong data quality and reconciliation.

Key Responsibilities:
- Design and implement data migration strategies from DB2 to the cloud.
- Build and optimize ETL pipelines and automation scripts.
- Use Python for data processing, validation, and reconciliation.
- Define migration phases, checkpoints, and quality controls.
- Collaborate with DBAs, architects, and business teams to ensure success.

Qualifications:
- Experience with large-scale database migrations (10K+ customers, multiple LOBs).
- Strong understanding of Payments and Treasury Management products.
- Expertise in ETL processes, SQL, and Python for automation.
- Familiarity with cloud data services (AWS, Azure, or GCP).
- Knowledge of data validation, reconciliation, and compliance best practices.
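For candidates wondering what "Python for data processing, validation, and reconciliation" looks like in practice, here is a minimal, generic sketch of a migration reconciliation check. Everything in it (the function names, the tuple layout, the sample rows) is illustrative and not taken from the posting; in a real DB2-to-cloud migration the row sets would come from queries against the source and target systems.

```python
import hashlib

def row_fingerprint(row):
    """Stable hash of a row's values, used to compare records across systems."""
    joined = "|".join(str(v) for v in row)
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()

def reconcile(source_rows, target_rows, key_index=0):
    """Compare a source extract against the loaded target row set.

    Returns counts plus the keys that are missing, extra, or mismatched,
    so a migration checkpoint can pass or fail on concrete evidence.
    """
    src = {row[key_index]: row_fingerprint(row) for row in source_rows}
    tgt = {row[key_index]: row_fingerprint(row) for row in target_rows}

    return {
        "source_count": len(src),
        "target_count": len(tgt),
        "missing_in_target": sorted(set(src) - set(tgt)),
        "extra_in_target": sorted(set(tgt) - set(src)),
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),
    }

if __name__ == "__main__":
    # Hypothetical sample data: one row drifted, one row never loaded.
    source = [(1, "ACME", 100.0), (2, "Globex", 250.5), (3, "Initech", 75.25)]
    target = [(1, "ACME", 100.0), (2, "Globex", 999.9)]
    report = reconcile(source, target)
    print(report["missing_in_target"], report["mismatched"])  # [3] [2]
```

A per-row fingerprint like this scales better than column-by-column comparison for wide tables, and the key-level output feeds directly into the phased checkpoints the role description calls for.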