

Brooksource
Machine Learning Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Junior Machine Learning Ops Engineer in Columbus, Ohio, with a contract-to-hire arrangement. The position requires 1 year of software development experience, proficiency in Python or Java, and familiarity with CI/CD pipelines and cloud services.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
November 20, 2025
Duration
Unknown
-
Location
On-site
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Columbus, Ohio Metropolitan Area
-
Skills detailed
#AI (Artificial Intelligence) #Docker #GCP (Google Cloud Platform) #Documentation #Google Cloud Storage #C++ #Swagger #GitLab #EC2 #Python #S3 (Amazon Simple Storage Service) #Storage #Data Science #Jenkins #Data Management #Version Control #Security #Data Engineering #AWS SageMaker #Automation #Automated Testing #GIT #Quality Assurance #Scripting #ML (Machine Learning) #Agile #Cloud #Data Pipeline #DataOps #AWS (Amazon Web Services) #SageMaker #Java #Lambda (AWS Lambda)
Role description
Junior Machine Learning Ops (MLOps) Engineer
Columbus, Ohio
Contract to Hire
Brief Description:
The MLOps Engineer on the Enterprise Data and Analytics Team will collaborate with Data Scientists to productionize models that help business users make better decisions for Huntington and our customers.
Detailed Description:
This role is an entry-level position requiring the individual to be curious and have the aptitude to learn while supporting the team's overall efforts to productionize models created by Huntington's Data Scientists.
To be successful, the candidate must be driven and well-organized, with strong communication skills. The individual must be self-motivated and tenacious, thrive in a collaborative, fast-paced environment, and complete tasks on agreed schedules.
Primary Responsibilities:
• Understand the current process and technical complexities of developing and deploying data pipelines and model builds, and develop automation solutions that improve and extend the existing process into an unattended delivery pipeline.
• Collaborate closely with product development, architecture, data engineering, and testing teams to understand their current build and release processes and recommend improvements through the automation of various tasks.
• Partner with cross-functional stakeholders, including development, operations, quality assurance, and security, to streamline processes.
β’ Develop and continuously improve automation solutions to enable teams to build and deploy quality data and code efficiently and consistently.
β’ Build automated testing solutions in support of quality management objectives to reduce manual effort.
• Debug and troubleshoot machine learning model issues in production to ensure performance and stability.
β’ Work closely with cross-functional stakeholders to analyze and troubleshoot complex production issues.
β’ Prepare and present design and implementation documentation to multiple stakeholders.
β’ Promote automation across the data management and analytics delivery organization.
β’ Schedule and facilitate meetings as needed.
β’ Perform other duties as assigned.
Job Requirements:
Minimum Requirements:
• 1 year of experience in a software development or automation development role, with a bachelor's degree or equivalent transferable experience through coursework, internships, or work experience in lieu of participation in the Elevate Program.
Skills:
β’ Experience with one or more coding languages such as Python, C++, or Java.
β’ Experience with or understanding of CI/CD pipelines using containerization tools such as Docker.
β’ Experience with automation using platforms such as GitLab CI, Jenkins, etc.
• Experience with or exposure to basic cloud storage services (e.g., S3, Google Cloud Storage), compute services (e.g., EC2, Lambda), or managed machine learning services (e.g., AWS SageMaker, GCP AI Platform).
β’ Experience with OpenAPI or Swagger.
• Experience with version control tools such as Git.
β’ Ability to investigate and analyze information and draw conclusions based upon data and facts.
β’ Strong technical skills and aptitude with a willingness to learn new languages and technologies.
• Excellent verbal and written communication skills.
β’ Strong technical writing capability.
β’ Ability to translate technology into relatable concepts for business partners and stakeholders.
β’ Highly motivated with strong organizational, analytical, decision making, and problem-solving skills.
β’ Ability to build strong partnerships and to work collaboratively with all business and IT areas.
• Ability to effectively handle multiple priorities and to prioritize and execute tasks in a high-pressure environment.
β’ High level of professionalism, confidence, and ability to build credibility with team members and business partners.
Preferred Requirements:
β’ Experience with or exposure to Agile methodologies with an understanding of DataOps and ModelOps principles.
β’ Experience developing CI/CD workflows and tools.
• Experience with automation scripting.
β’ Experience with or understanding of configuration management, test-driven development, and release management.
• Experience in the financial services industry.