

Jobs via Dice
ETL Datastage Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an ETL Datastage Engineer in Cincinnati, OH, with a contract of unspecified duration and an undisclosed pay rate (candidates are asked to share their expected hourly rate). Key skills include IBM DataStage, Snowflake, DBT, and MettleCI, with experience in data pipeline testing required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 11, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Cincinnati, OH
-
🧠 - Skills detailed
#Data Quality #Regression #Monitoring #Automation #dbt (data build tool) #Data Management #Data Mining #Data Processing #Scala #Data Modeling #Programming #ETL (Extract, Transform, Load) #Snowflake #Data Science #Data Accuracy #Data Engineering #Data Pipeline #Automated Testing #Disaster Recovery #Unit Testing #DataStage
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Javen Technologies, Inc, is seeking the following. Apply via Dice today!
Seeking a Data Engineer with experience building ETL pipelines using IBM DataStage, Snowflake, DBT, and MettleCI for an immediate on-site contract opportunity with my direct client in Cincinnati, OH.
If interested, please share your Word-document resume, work authorization, expected hourly pay rate, current location, availability for on-site work, skill stack, and LinkedIn profile.
Reach out to me for more details. Thanks.
If not interested, any referrals are appreciated.
Candidates local to the Cincinnati, OH area only.
Openings: 3
Technical Skills
Must Have
Building and optimizing ETL pipelines using IBM DataStage, Snowflake, DBT, and MettleCI
Experience with data pipeline testing (Unit testing for ETL/ELT components, Integration and regression testing, Data quality and validation frameworks, Automated testing within CI/CD workflows)
Strong technical expertise to deliver efficient, accurate, and scalable data solutions that follow established governance and coding standards
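The "unit testing for ETL/ELT components" requirement above can be illustrated with a minimal sketch. The `normalize_order` transform and its field names are hypothetical examples, not part of the posting; they stand in for the kind of component a DataStage/dbt pipeline step might wrap.

```python
# Minimal sketch of unit-testing an ETL transform step.
# normalize_order is a hypothetical example transform.

def normalize_order(raw: dict) -> dict:
    """Trim strings, coerce the amount to integer cents, lowercase status."""
    return {
        "order_id": raw["order_id"].strip(),
        "amount_cents": round(float(raw["amount"]) * 100),
        "status": raw.get("status", "unknown").lower(),
    }

def test_normalize_order():
    raw = {"order_id": " A-100 ", "amount": "12.50", "status": "SHIPPED"}
    out = normalize_order(raw)
    assert out == {"order_id": "A-100", "amount_cents": 1250, "status": "shipped"}

test_normalize_order()
```

In a CI/CD workflow of the kind the posting describes, tests like this would run automatically (e.g. via pytest) before a pipeline change is promoted.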
Essential Job Requirements:
This Data Engineer III contract role will support the Principal Data Engineer by developing, testing, and monitoring high-quality ETL code for several time-sensitive, enterprise-level initiatives. The position requires strong technical expertise to deliver efficient, accurate, and scalable data solutions that follow established governance and coding standards. Core responsibilities include building and optimizing ETL pipelines using IBM DataStage, Snowflake, dbt, and MettleCI, with a focus on code quality, data accuracy, and operational reliability. The ideal candidate will be able to rapidly contribute to critical project milestones while maintaining strict adherence to best practices and governance processes.
Qualifications:
• Design, build, test, and maintain scalable data management systems and pipelines.
• Develop high-performance algorithms, predictive models, prototypes, and custom software components to support analytics and data processing needs.
• Ensure all data systems adhere to business requirements, governance standards, and industry best practices.
• Demonstrated experience with data pipeline testing, including: Unit testing for ETL/ELT components, Integration and regression testing, Data quality and validation frameworks, Automated testing within CI/CD workflows
• Integrate new data management tools, engineering technologies, and automation capabilities into existing architectures.
• Establish and optimize processes for data modeling, data mining, and data production workflows.
• Research and identify new opportunities for leveraging existing data assets.
• Utilize a variety of programming languages, integration tools, and platforms to connect systems and enable seamless data flow.
• Collaborate closely with cross-functional partners, including solution architects, IT teams, and data scientists, to achieve project objectives.
• Recommend and implement improvements to enhance data quality, reliability, and performance.
• Maintain and update disaster recovery procedures to ensure system resilience.
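The "data quality and validation frameworks" qualification above can be sketched as a simple validation pass over a batch of rows. The column names and rules here are illustrative assumptions; a real pipeline would typically use a dedicated framework or dbt tests for this.

```python
# Sketch of a data-quality validation pass: flag missing keys,
# duplicate keys, and out-of-range values in a batch of rows.
# Column names and rules are illustrative assumptions.

def validate_rows(rows):
    """Return a list of (row_index, problem) findings."""
    findings = []
    seen_ids = set()
    for i, row in enumerate(rows):
        if row.get("order_id") in (None, ""):
            findings.append((i, "missing order_id"))
        elif row["order_id"] in seen_ids:
            findings.append((i, "duplicate order_id"))
        else:
            seen_ids.add(row["order_id"])
        if row.get("amount_cents", 0) < 0:
            findings.append((i, "negative amount"))
    return findings

rows = [
    {"order_id": "A-1", "amount_cents": 100},
    {"order_id": "A-1", "amount_cents": -5},  # duplicate id, negative amount
    {"order_id": "", "amount_cents": 20},     # missing id
]
print(validate_rows(rows))
# → [(1, 'duplicate order_id'), (1, 'negative amount'), (2, 'missing order_id')]
```

A check like this would run as a gate in the CI/CD workflow or as a post-load validation step, failing the pipeline run when findings are non-empty.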






