A4 Solutions LLC

AI/ML Tech Lead With AWS Tools

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AI/ML Tech Lead with an unspecified contract length and pay rate. It requires 4+ years of experience in AI/ML, strong Python skills, proficiency with AWS services such as SageMaker and Bedrock, and experience with Docker.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
December 18, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Scala #SageMaker #ML (Machine Learning) #Data Processing #AWS Glue #S3 (Amazon Simple Storage Service) #Aurora #Lambda (AWS Lambda) #Leadership #Python #MySQL #Databricks #Security #MongoDB #AWS (Amazon Web Services) #Data Pipeline #Deployment #Datasets #Data Science #VPC (Virtual Private Cloud) #Cloud #Docker #PostgreSQL #Data Lake #AI (Artificial Intelligence) #Data Engineering #ETL (Extract, Transform, Load) #IAM (Identity and Access Management)
Role description
We are looking for an experienced AI/ML Tech Lead to drive end-to-end delivery of advanced AI and machine learning solutions across our enterprise platforms. In this role, you will lead the architecture, development, and deployment of production-grade ML models, GenAI applications, and scalable data processing pipelines using AWS and modern data engineering frameworks.
Key Responsibilities
• Lead the full lifecycle of AI/ML solution delivery, from discovery and model selection to production deployment and performance optimization.
• Architect and implement solutions using AWS Bedrock, SageMaker, Textract, Lambda, ECS, and EKS (an illustrative Bedrock sketch follows this listing).
• Develop ML and GenAI models using Python, including data preprocessing, feature engineering, and algorithm selection.
• Build scalable data pipelines integrating unstructured and structured data using AWS Glue, Lambda, Step Functions, and Databricks.
• Design and manage Docker-based containerized applications and deploy them using AWS ECS/EKS.
• Work with Data Lake architectures (S3, Lake Formation) and integrate data from Aurora and MongoDB.
• Manage training datasets, labeling workflows, model retraining, and automated ML pipelines.
• Collaborate with engineering, cloud, and product teams to translate business requirements into scalable ML architectures.
• Ensure solutions meet enterprise standards for security, performance, reliability, and cost optimization.
• Mentor ML engineers and data engineers, driving best practices in cloud engineering, MLOps, and AI development.
Required Skills & Experience
• 4+ years of hands-on experience in machine learning, AI engineering, or data science.
• Strong proficiency with Python, ML frameworks, and building end-to-end ML solutions.
• Deep experience with AWS AI/ML services, including Bedrock, SageMaker, Textract, and Lambda.
• Expertise in containerization with Docker and orchestration using ECS/EKS.
• Experience designing solutions using Data Lakes, Databricks notebooks, and scalable ETL pipelines.
• Practical experience with Aurora (MySQL/PostgreSQL) and MongoDB.
• Solid knowledge of cloud security, IAM permissions, VPC networking, and MLOps practices.
• Strong communication skills, technical leadership, and the ability to collaborate across teams.
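For context on the kind of AWS GenAI tooling named above, here is a minimal illustrative Python sketch of calling a text model through Amazon Bedrock with boto3. The region, model ID, and prompt are placeholders chosen for the example, not details from this posting, and the team's actual stack and model choices may differ.
```python
import json

import boto3

# Bedrock runtime client; the region is an assumption for illustration only.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Request body in the Anthropic messages format used on Bedrock.
# The model ID below is an example; use whichever model is enabled in the account.
body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Summarize this document in two sentences: ..."}
    ],
}

response = client.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps(body),
    contentType="application/json",
    accept="application/json",
)

# The response body is a streaming payload; parse it and print the model's text.
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```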