

LTIMindtree
Data Engineer with MLOps
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with MLOps, offering a remote contract. Required skills include AWS or Azure Databricks, MLOps frameworks, and programming in Python and SQL. A Bachelor's degree and 4+ years of relevant experience are essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#Azure #"ETL (Extract, Transform, Load)" #GitHub #"SQL (Structured Query Language)" #Data Modeling #Python #Big Data #Computer Science #"AWS (Amazon Web Services)" #Data Framework #Databricks #Azure Databricks #Data Pipeline #Data Manipulation #Data Integration #Scala #"Spark (Apache Spark)" #PySpark #Data Quality #Programming #Data Engineering
Role description
Title: Data Engineer with MLOps
Location: Remote
Contract
Primary Skill: AWS Databricks or Azure Databricks
Secondary Skill: MLOps frameworks
Job Description:
KEY RESPONSIBILITIES
• Design, develop, and maintain robust data pipelines and architectures to support data-driven decision-making across the organization.
• Manage end-to-end data engineering projects, ensuring alignment with business objectives and the successful delivery of high-quality data solutions.
• Contribute to the strategic vision of the Data Engineering & MLOps team, providing input on initiatives that align with business objectives.
• Apply strong analytical and problem-solving skills to address complex data challenges and optimize data workflows.
MUST HAVE
• Bachelor's or advanced degree in Computer Science, Information Technology, Engineering, or a related discipline.
• 4+ years of experience in data engineering, with a focus on designing and implementing scalable data solutions.
• 4+ years of working experience with AWS Databricks or Azure Databricks, including end-to-end data pipeline development in these environments.
• Expertise in more than two areas of data technologies, such as ETL processes, data warehousing, or big data frameworks.
• Expertise in GitHub Actions, CI/CD pipelines, and MLOps frameworks.
• 4+ years of working experience with programming languages such as Python and PySpark, along with SQL for data manipulation.
• Mastery experience in data modeling, data integration, and data quality management.
• Excellent communication and collaboration skills.
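The must-haves above emphasize ETL processes together with Python and SQL for data manipulation and data quality management. As a minimal sketch of what that combination means in practice (not part of the role description; it uses Python's standard-library sqlite3 with hypothetical table names, whereas the role itself targets Databricks/PySpark):

```python
# Illustrative only: a minimal extract-transform-load (ETL) step with a
# basic data-quality filter. Table and column names are hypothetical.
import sqlite3


def run_etl(conn: sqlite3.Connection) -> list:
    cur = conn.cursor()
    # Extract: read raw records from a staging table.
    rows = cur.execute("SELECT id, email FROM staging_users").fetchall()
    # Transform: normalize emails; drop rows that fail a simple quality check.
    cleaned = [(i, e.strip().lower()) for i, e in rows if e and "@" in e]
    # Load: write the cleaned records to the target table.
    cur.executemany("INSERT INTO users (id, email) VALUES (?, ?)", cleaned)
    conn.commit()
    return cleaned


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_users (id INTEGER, email TEXT)")
conn.execute("CREATE TABLE users (id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO staging_users VALUES (?, ?)",
    [(1, " Alice@Example.com "), (2, "not-an-email"), (3, None)],
)
loaded = run_etl(conn)
```

In a Databricks environment the same extract/transform/load shape would typically be expressed with PySpark DataFrames and Spark SQL rather than sqlite3.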