

ProntoDigital, LLC
Data Architect / Principal Data Engineer III
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Architect / Principal Data Engineer III in Cincinnati, OH, hybrid (3-4 days onsite). Contract length exceeds 6 months, with a pay rate of $55-60/hour. Requires 8+ years in data engineering and strong SQL, ETL/ELT, and cloud experience.
Country: United States
Currency: $ USD
Day rate: $440
Date: October 9, 2025
Duration: More than 6 months
Location: Hybrid
Contract: W2 Contractor
Security: Unknown
Location detailed: Cincinnati, OH
Skills detailed: #Scala #Informatica #DevOps #Data Engineering #BI (Business Intelligence) #Big Data #Spark (Apache Spark) #Database Design #Java #Azure #BigQuery #Data Integrity #GCP (Google Cloud Platform) #Compliance #Data Architecture #DataStage #Python #Azure Data Factory #Metadata #Cloud #Databricks #Data Modeling #Scripting #ADF (Azure Data Factory) #SQL (Structured Query Language) #Data Management #Data Migration #Data Ingestion #AWS Glue #ETL (Extract, Transform, Load) #Migration #Strategy #AWS (Amazon Web Services) #Datasets #Hadoop #Synapse #Data Quality
Role description
Job Title: Data Architect / Principal Data Engineer III
Location: Cincinnati, OH (Hybrid, 3 to 4 days onsite per week)
Employment Type: W-2 only (No sponsorship available)
Compensation: $55-60/hour + benefits
Position Overview
We are looking for an experienced Data Architect / Principal Data Engineer to lead a strategic data migration initiative, moving from DB2 to a modern cloud-based data platform (Azure, AWS, or GCP). This role is ideal for a hands-on technical leader with a strong background in data architecture, ETL/ELT development, and data quality within complex enterprise environments.
You will collaborate with business, technology, and operations teams to deliver a phased, validated migration strategy that ensures data integrity, governance, and performance throughout the transition.
Key Responsibilities
Lead the architecture and implementation of data migration from DB2 to a cloud data platform, ensuring data consistency, lineage, and traceability.
Design and develop scalable ETL/ELT pipelines for data ingestion, transformation, and delivery.
Automate data quality checks and validation processes using Python or scripting tools.
Define and manage structured migration phases with clear checkpoints and metrics.
Collaborate closely with DBAs, data architects, and business stakeholders to align with compliance and performance standards.
Contribute to enterprise data modeling, metadata management, and governance frameworks.
Support post-migration testing and reconciliation to ensure accurate reporting and business continuity (a minimal illustrative sketch of this kind of check follows this list).
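Note on the data quality and reconciliation responsibilities above: in practice these usually take the form of automated comparisons between the DB2 source and the migrated cloud copy. The Python below is a minimal sketch of such a check, not a prescribed implementation; it assumes both extracts are already loaded as pandas DataFrames, and the payment_id and amount names are purely illustrative.

import pandas as pd

def reconcile(source: pd.DataFrame, target: pd.DataFrame, key: str) -> dict:
    """Compare row counts, key coverage, and numeric column totals between extracts."""
    report = {
        "source_rows": len(source),
        "target_rows": len(target),
        "row_count_match": len(source) == len(target),
        # Keys present in the DB2 source but missing from the migrated target.
        "missing_keys": sorted(set(source[key]) - set(target[key])),
    }
    # Sum checks for numeric columns shared by both extracts.
    numeric_cols = source.select_dtypes("number").columns.intersection(target.columns)
    report["sum_mismatches"] = {
        col: {"source": float(source[col].sum()), "target": float(target[col].sum())}
        for col in numeric_cols
        if abs(float(source[col].sum()) - float(target[col].sum())) > 1e-6
    }
    return report

if __name__ == "__main__":
    # Toy data standing in for a DB2 extract and its migrated copy.
    src = pd.DataFrame({"payment_id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})
    tgt = pd.DataFrame({"payment_id": [1, 2], "amount": [10.0, 20.0]})
    print(reconcile(src, tgt, key="payment_id"))

Checks of this kind would typically run at each migration checkpoint, with key columns, tolerances, and table lists driven by configuration rather than hard-coded.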
Top Required Skills
Proven experience in Data Engineering and Business Intelligence in large-scale enterprise environments
Strong database design and SQL performance optimization skills
Demonstrated experience leading or supporting enterprise-scale data migration projects
Solid understanding of data validation, reconciliation, and quality control practices
Preferred Qualifications
Familiarity with Java and WebSphere Application Server (WAS 9)
Experience with DevOps practices in data engineering, including CI/CD pipeline integration
Exposure to big data tools such as Spark or Hadoop
Proficiency in ETL tools like Informatica, DataStage, or Azure Data Factory
Cloud experience with Azure Synapse, Databricks, AWS Glue, or GCP BigQuery
Background in Payments or Treasury Management systems is a plus
Qualifications
8+ years of experience in data engineering, data architecture, or analytics
3+ years working on data migration initiatives involving high-volume datasets and multiple business units
Proficient in SQL, DB2, ETL/ELT, and Python scripting
Working knowledge of cloud data services (Azure, AWS, or GCP)
Excellent communication and cross-functional collaboration skills
What's Offered
Competitive hourly rate with benefits
Medical, dental, and vision insurance
Optional life and disability coverage
401(k) with matching contributions
Paid vacation and bench time
Training allowances
Employee referral bonuses
Job Types: Full-time, Contract
Pay: From $55.00 per hour
Work Location: Hybrid in Cincinnati, OH