

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer position for a 3-month contract in Abbott Park, IL, with an unspecified pay rate. Key skills required include Azure PostgreSQL, ETL/ELT pipeline development, Python proficiency, and experience with OCR tools. Familiarity with SAP Ariba integration is preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: September 27, 2025
Project duration: 3 to 6 months
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Green Oaks, IL
Skills detailed: #AI (Artificial Intelligence) #Consulting #ETL (Extract, Transform, Load) #Agile #Scala #Data Pipeline #Scripting #SAP #Data Extraction #Azure #Databases #PostgreSQL #Python #Security #Automation #Data Engineering #Documentation #Database Schema #ML (Machine Learning) #API (Application Programming Interface) #Schema Design #GitHub #REST (Representational State Transfer) #Database Architecture #Metadata
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, DivIHN Integration Inc., is seeking the following. Apply via Dice today!
DivIHN (pronounced "divine") is a CMMI ML3-certified Technology and Talent solutions firm. Driven by a unique Purpose, Culture, and Value Delivery Model, we enable meaningful connections between talented professionals and forward-thinking organizations. Since our formation in 2002, organizations across commercial and public sectors have trusted us to help build their teams with exceptional temporary and permanent talent.
Visit us at to learn more and view our open positions.
Please apply or call one of us to learn more
For further inquiries regarding the following opportunity, please contact one of our Talent Specialists.
Ragu Mohan at
Nithiya at
Title: Data Engineer
Duration: 3 Months
Location: Abbott Park, IL
Description:
This is a non-exempt Data Engineer position (3-month contract).
Project: Supplier Contract Ingestion & Data Pipeline for Negotiation AI
About The Project
Our client is launching a focused 3-month initiative to:
• Bulk-ingest over 50,000 supplier contracts into SAP Ariba, with metadata extraction powered by OCR.
• Design and implement the database architecture and data flows that will feed our Negotiation AI, including contract detail extraction and supplier spend analytics.
• Keep this work separate from the Negotiation AI MVP for now, while ensuring it is future-ready for seamless integration.
Role Overview
As Data Engineer, you will own the end-to-end data pipelines. This includes designing scalable databases, developing ingestion workflows, collaborating with our internal Machine Learning Engineering team, and structuring supplier spend data. You'll work closely with the Full Stack Developer to co-design the database schema for the Negotiation AI and ensure future compatibility with the ingestion pipeline.
Key Deliverables
• Ingestion Pipeline: Build and deploy a robust ETL/ELT pipeline using Azure to ingest 50,000+ contracts.
• Metadata Extraction: Configure and run OCR workflows (e.g., OlmOCR, Azure Document Intelligence) to extract key contract fields such as dates, parties, and terms.
• Scalable Database Schema: Design and implement a schema in Azure PostgreSQL to store contract metadata, OCR outputs, and supplier spend data. Collaborate with the Software Developer to design a future-ready schema for AI consumption.
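The posting doesn't specify the schema, so purely for illustration, here is a minimal sketch of what a contract-metadata table and a record-normalization step might look like. All table, column, and key names are assumptions, not details from the client.

```python
# Hypothetical sketch of the contract-metadata schema described above.
# Table and column names are illustrative assumptions only.

CONTRACTS_DDL = """
CREATE TABLE IF NOT EXISTS contracts (
    contract_id    BIGSERIAL PRIMARY KEY,
    supplier_name  TEXT NOT NULL,
    effective_date DATE,
    expiry_date    DATE,
    raw_ocr_text   TEXT,           -- full OCR output kept for later AI consumption
    spend_usd      NUMERIC(14, 2)  -- supplier spend linked to this contract
);
"""

def normalize_record(ocr_fields: dict) -> dict:
    """Map loosely named OCR output keys onto the schema's column names."""
    return {
        "supplier_name": (ocr_fields.get("party") or "").strip().title(),
        "effective_date": ocr_fields.get("start_date"),
        "expiry_date": ocr_fields.get("end_date"),
        "raw_ocr_text": ocr_fields.get("text", ""),
    }
```

In practice the DDL would be applied to Azure PostgreSQL (e.g., via psycopg), and `normalize_record` would sit between the OCR stage and the load stage of the pipeline.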
Required Skills & Experience
Data Engineering & ETL/ELT
• Experience with Azure PostgreSQL or similar relational databases
• Skilled in building scalable ETL/ELT pipelines (preferably using Azure)
• Proficient in Python for scripting and automation
OCR Collaboration
• Ability to work with internal Machine Learning Engineering teams to validate and structure extracted data
• Familiarity with OCR tools (e.g., Azure Document Intelligence, Tesseract) is a plus
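The OCR step itself would be handled by a tool such as Azure Document Intelligence or Tesseract; the data-engineering work described above is structuring and validating what comes out. A stdlib-only sketch of that downstream step, with regex patterns that are illustrative assumptions rather than production-grade contract parsing:

```python
import re
from datetime import datetime

# Illustrative patterns only; real contract text needs far more robust parsing.
DATE_RE = re.compile(r"\b(\d{1,2}/\d{1,2}/\d{4})\b")
PARTY_RE = re.compile(r"between\s+(.+?)\s+and\s+(.+?)[.,]", re.IGNORECASE)

def extract_fields(ocr_text: str) -> dict:
    """Pull dates and party names out of raw OCR text for schema loading."""
    dates = [datetime.strptime(d, "%m/%d/%Y").date() for d in DATE_RE.findall(ocr_text)]
    match = PARTY_RE.search(ocr_text)
    return {
        "dates": sorted(dates),
        "parties": list(match.groups()) if match else [],
    }
```

Validation rules like these are typically reviewed with the ML team, since OCR noise (mis-read digits, broken line wraps) drives most of the edge cases.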
SAP Ariba Integration
• Exposure to cXML, ARBCI, SOAP/REST protocols is a plus
• Comfortable with API authentication (OAuth, tokens) and enterprise-grade security
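Enterprise REST APIs of this kind commonly use an OAuth 2.0 client-credentials flow. A minimal, stdlib-only sketch of constructing such a token request; the field names here are generic OAuth 2.0, not Ariba-specific, and the actual endpoint URL and credentials are omitted:

```python
import base64
from urllib.parse import urlencode

def build_token_request(client_id: str, client_secret: str) -> tuple[dict, str]:
    """Build headers and body for a generic OAuth2 client-credentials token request.

    Hypothetical sketch: endpoint, credential storage, and any extra
    provider-specific fields are intentionally left out.
    """
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {creds}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    body = urlencode({"grant_type": "client_credentials"})
    return headers, body
```

The returned access token would then be sent as a bearer token on subsequent API calls, with secrets kept in a vault rather than in code.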
Agile Collaboration & Documentation
• Comfortable working in sprints and cross-functional teams
• Able to use GitHub Copilot to document practices for handover
Preferred Qualifications
• Experience with large-scale contract ingestion projects
• Familiarity with procurement systems and contract lifecycle management
• Background in integrating data pipelines with AI or analytics platforms
Why join our client?
• Focused Scope with Future Impact: Lay the foundation for an AI-driven negotiation platform
• Cutting-Edge Tools: Work with SAP Ariba, OCR, Azure, and advanced analytics
• Collaborative Environment: Partner with Software Developers and AI specialists
About us:
DivIHN, the 'IT Asset Performance Services' organization, provides Professional Consulting, Custom Projects, and Professional Resource Augmentation services to clients in the Mid-West and beyond. The strategic characteristics of the organization are Standardization, Specialization, and Collaboration.
DivIHN is an equal opportunity employer. DivIHN does not and shall not discriminate against any employee or qualified applicant on the basis of race, color, religion (creed), gender, gender expression, age, national origin (ancestry), disability, marital status, sexual orientation, or military status.