

Supply Chain Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Supply Chain Data Engineer on a 3-month remote contract focused on Azure ETL/ELT pipelines, OCR integration, and database design. Key skills include Azure Data Factory, PostgreSQL, and Python.
Country
United States
Currency
$ USD
Day rate
440
Date discovered
September 27, 2025
Project duration
3 to 6 months
Location type
Remote
Contract type
Unknown
Security clearance
Unknown
Location detailed
United States
Skills detailed
#AI (Artificial Intelligence) #ETL (Extract, Transform, Load) #ADF (Azure Data Factory) #Agile #Data Architecture #Storage #Scala #Data Pipeline #SAP #Data Extraction #Azure #Databases #PostgreSQL #Python #Azure Data Factory #Data Quality #Automation #Data Engineering #Documentation #Database Schema #ML (Machine Learning) #API (Application Programming Interface) #Schema Design #Metadata
Role description
Job Title: Supply-Chain Data Engineer
Location: Remote
Duration: 3 Months + Possible Extension
About the Role:
We are seeking a skilled Data Engineer to lead the design and development of scalable data pipelines and databases for a high-impact 3-month project. The role involves building ingestion workflows, integrating OCR for contract metadata extraction, and preparing data flows for future AI integration. You'll collaborate with Software Developers and Machine Learning Engineers to ensure seamless data architecture and delivery.
Key Responsibilities:
• Build and deploy ETL/ELT pipelines in Azure to process large-scale contract data.
• Configure and manage OCR workflows (e.g., Azure Document Intelligence, Tesseract) for metadata extraction; a sketch of this step follows the list.
• Design and implement scalable database schemas in Azure PostgreSQL.
• Collaborate with cross-functional teams on schema design for AI readiness.
• Support integration with procurement systems (e.g., SAP Ariba) and APIs.
• Ensure data quality, error handling, and proper documentation.
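As a rough illustration of the OCR-to-database step above, here is a minimal Python sketch assuming Tesseract (via pytesseract), pdf2image for rasterizing scanned PDFs, and a hypothetical contract_metadata table in Azure PostgreSQL. The regex and all field names are placeholders, not the project's actual schema.

```python
import re

import psycopg2                     # Azure Database for PostgreSQL speaks the standard protocol
import pytesseract                  # requires a local Tesseract install
from pdf2image import convert_from_path  # requires poppler

# Hypothetical pattern: pull a contract number out of the raw OCR text.
# Real patterns would come from the project's actual contract templates.
CONTRACT_NO = re.compile(r"Contract\s*(?:No\.?|Number)[:\s]+([A-Z0-9-]+)", re.I)

def extract_metadata(pdf_path: str) -> dict:
    """OCR each page of a scanned contract and scrape basic metadata."""
    pages = convert_from_path(pdf_path)           # rasterize PDF pages to images
    text = "\n".join(pytesseract.image_to_string(p) for p in pages)
    match = CONTRACT_NO.search(text)
    return {
        "source_file": pdf_path,
        "contract_number": match.group(1) if match else None,
        "raw_text": text,
    }

def load_metadata(conn, meta: dict) -> None:
    """Insert one extracted record; contract_metadata is a hypothetical table."""
    with conn, conn.cursor() as cur:              # commits on clean exit
        cur.execute(
            """
            INSERT INTO contract_metadata (source_file, contract_number, raw_text)
            VALUES (%s, %s, %s)
            """,
            (meta["source_file"], meta["contract_number"], meta["raw_text"]),
        )
```

In practice the raw OCR text would likely pass a validation step (the "validating extracted data" skill below) before anything is committed.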
Required Skills:
• Strong experience with Azure Data Factory, Blob Storage, and PostgreSQL (a minimal pipeline-trigger sketch follows this list).
• Proficiency in Python for automation and data engineering.
• Hands-on experience with ETL/ELT pipelines and large-scale ingestion.
• Familiarity with OCR tools and validating extracted data.
• Exposure to SAP Ariba / procurement systems and API integration.
• Strong collaboration skills in Agile environments.
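For a sense of the Azure Data Factory work, here is a hedged sketch of triggering and polling a pipeline run with the azure-mgmt-datafactory SDK. The subscription, resource group, factory, pipeline, and parameter names are all placeholders, not details from this posting.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# All names below are placeholders; substitute the real subscription,
# resource group, factory, and pipeline for the project.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-supply-chain"
FACTORY_NAME = "adf-contracts"
PIPELINE_NAME = "ingest_contracts"

def trigger_ingestion() -> str:
    """Kick off an ADF pipeline run and return its run ID."""
    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
    run = client.pipelines.create_run(
        RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
        parameters={"container": "contracts-inbound"},  # illustrative parameter
    )
    return run.run_id

def check_status(run_id: str) -> str:
    """Poll a run for its current status, e.g. InProgress, Succeeded, Failed."""
    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
    return client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run_id).status
```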
Preferred Qualifications:
• Experience with contract ingestion or procurement data projects.
• Background in integrating data pipelines with AI/analytics platforms.
Why Join:
• Work on a focused 3-month project with high future impact.
• Exposure to cutting-edge tools in data engineering, OCR, and procurement systems.
• Collaborative, cross-functional environment with software and AI specialists.