

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 3-5 years of experience, focused on supply chain initiatives. Contract length and pay rate are unspecified. Required skills include Python, SQL, and cloud platforms (GCP preferred).
Country: United States
Currency: $ USD
Day rate: Unknown
Date discovered: June 5, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: Atlanta, GA
Skills detailed: #SQL (Structured Query Language) #Forecasting #Data Integration #Python #Terraform #ETL (Extract, Transform, Load) #Cloud #Model Deployment #AI (Artificial Intelligence) #Scala #Deployment #Azure #Agile #Data Engineering #Data Science #Datasets #GCP (Google Cloud Platform) #Scrum #AWS (Amazon Web Services) #ML (Machine Learning) #Data Pipeline #Data Transformations
Role description
About the Role
[We cannot provide sponsorship at this time]
We are hiring a data engineer to join the supply chain pod within a broader cross-functional data organization that also includes digital sales and service sales pods. While this role is focused on supply chain initiatives, the team operates as one unit, and resources may support different pods as needed. Each pod includes a dedicated pod leader, scrum master, and other support roles.
This engineer will work alongside senior engineers who will delegate and guide project work. The primary focus is to support initiatives such as route optimization, slot optimization, AI-driven efforts, and demand forecasting. A key part of the role involves helping transition machine learning models from data scientists to long-term ownership by the GTS team.
The role requires strong data engineering fundamentals, an ability to work closely with data scientists, and a focus on productizing models and building data pipelines that move data into and out of the cloud efficiently.
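For a concrete sense of that pipeline work, here is a minimal sketch of a batch load into BigQuery on GCP, the kind of step this role would own. It assumes the google-cloud-bigquery and pandas libraries; the project, dataset, table, and file names are hypothetical placeholders, not details from this posting.

# Minimal sketch: batch-load a CSV of supply chain records into BigQuery.
# The project, dataset, table, and file path below are illustrative placeholders.
import pandas as pd
from google.cloud import bigquery

def load_orders_to_bigquery(csv_path: str, table_id: str) -> None:
    """Read a local extract and append it to a BigQuery table."""
    df = pd.read_csv(csv_path, parse_dates=["order_date"])

    client = bigquery.Client()  # uses application-default credentials
    job_config = bigquery.LoadJobConfig(write_disposition="WRITE_APPEND")

    job = client.load_table_from_dataframe(df, table_id, job_config=job_config)
    job.result()  # block until the load job completes
    print(f"Loaded {job.output_rows} rows into {table_id}")

if __name__ == "__main__":
    load_orders_to_bigquery(
        "orders_extract.csv",
        "my-project.supply_chain.orders",  # hypothetical destination table
    )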
Responsibilities
• Build and maintain robust, scalable data pipelines using Python and SQL
• Operationalize and optimize machine learning models developed by data scientists (see the sketch after this list)
• Move and transform large datasets across cloud platforms (GCP preferred; AWS or Azure acceptable)
• Work with demand planning software and help integrate models into tools like o9
• Integrate external datasets, including third-party data, with internal operational data and file systems
• Collaborate with GTS and cross-functional teams to ensure reliable data delivery and model deployment
• Use tools such as Terraform and Dataform to manage infrastructure and data transformations
• Support sales and supply chain insights by contributing to attribution modeling and data integration efforts
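As an illustration of the model hand-off responsibility above, the following sketch shows a simple batch scoring step: loading a trained model artifact produced by a data scientist and writing forecasts back out. The artifact name, feature file, and output column are assumptions made for the example, not details from this posting.

# Minimal sketch of a batch scoring step for a handed-off model.
# Artifact name, feature columns, and output path are hypothetical.
import joblib
import pandas as pd

def score_demand_forecast(model_path: str, features_csv: str, output_csv: str) -> None:
    """Load a trained model artifact and score a batch of feature rows."""
    model = joblib.load(model_path)      # e.g. a scikit-learn regressor
    features = pd.read_csv(features_csv)

    features["forecast_units"] = model.predict(features)
    features.to_csv(output_csv, index=False)

if __name__ == "__main__":
    score_demand_forecast("demand_model.joblib", "weekly_features.csv", "forecasts.csv")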
Requirements
• 3 to 5 years of experience in a data engineering role
• Proficiency in Python and SQL
• Experience with one or more cloud platforms (GCP preferred; AWS or Azure acceptable)
• Familiarity with infrastructure-as-code and transformation tools like Terraform and Dataform
• Experience working with data scientists and operationalizing ML models
• Comfortable working with external data sources and integrating them into existing systems
• Ability to work in a fast-paced, agile environment with a pod-based team structure