

Rivago Infotech Inc
AI/ML Architect with Databricks
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AI/ML Architect with Databricks, requiring 3+ years of data science experience, advanced Databricks skills, and a Bachelor’s degree in a related field. The contract length exceeds 6 months, with a pay rate of $150,000–$155,000 per year.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
704
-
🗓️ - Date
February 5, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Los Angeles, CA 90001
-
🧠 - Skills detailed
#MLflow #Spark SQL #Delta Lake #Data Pipeline #Deep Learning #PySpark #Scala #Data Processing #Azure DevOps #GitHub #TensorFlow #Databricks #PyTorch #Data Engineering #"ETL (Extract, Transform, Load)" #Statistics #AWS (Amazon Web Services) #ML (Machine Learning) #Forecasting #Mathematics #Azure Databricks #AI (Artificial Intelligence) #Python #Data Science #A/B Testing #NumPy #Data Analysis #Azure #Pandas #SQL (Structured Query Language) #Datasets #Spark (Apache Spark) #DevOps #Anomaly Detection #Programming #Computer Science
Role description
We are seeking a skilled AI/ML Architect with hands-on experience in Databricks to join our team. The ideal candidate has strong analytical capabilities, experience building scalable data pipelines and machine learning models, and the ability to collaborate with cross‑functional teams to drive data‑driven decision‑making.
This role involves working with large datasets, advanced analytics, and modern data engineering and ML frameworks—primarily using Databricks on Azure/AWS.
Skills & Qualifications
Required
Bachelor’s degree or higher in Computer Science, Data Science, Mathematics, Statistics, Engineering, or related field.
3+ years of experience in data science or machine learning roles.
Advanced knowledge of Databricks, including:
PySpark / Spark SQL
Databricks notebooks
Delta Lake
MLflow
Databricks Jobs & Workflows
Strong programming skills in Python (pandas, numpy, scikit‑learn).
Experience working with large-scale data processing.
Solid understanding of machine learning algorithms and statistical techniques.
Key Responsibilities
Data Science & Machine Learning
Develop, train, and optimize machine learning and statistical models using Databricks, Python, PySpark, and MLflow.
Perform exploratory data analysis (EDA) to identify trends, patterns, and insights in large datasets.
Deploy ML models into production using Databricks MLflow, Delta Live Tables, or other MLOps pipelines.
Conduct A/B testing, forecasting, segmentation, and anomaly detection, and build recommendation systems as required by the business.
Data Engineering & Databricks Platform
Build scalable, high‑performance ETL/ELT pipelines using PySpark, SQL, and Databricks workflows.
Work with Delta Lake to ensure high-quality, reliable, and performant data.
Optimize cluster usage and job performance within the Databricks environment.
Collaborate with data engineers to ensure high-quality data availability for modeling.
Business Collaboration
Translate business problems into analytical solutions and present findings to non‑technical stakeholders.
Partner with product, engineering, and business teams to drive data-informed decisions.
Communicate complex statistical concepts in a clear and concise manner.
Preferred
Experience deploying models in production using MLOps frameworks.
Knowledge of Azure Databricks or AWS Databricks environments.
Understanding of CI/CD pipelines and DevOps concepts (Azure DevOps, GitHub Actions, etc.).
Familiarity with deep learning frameworks (TensorFlow, PyTorch) is a plus.
Key Competencies
Strong analytical and problem‑solving skills
Ability to work in a fast-paced, collaborative environment
Excellent communication and presentation skills
Self-driven with high attention to detail
Job Types: Full-time, Contract
Pay: $150,000.00 - $155,000.00 per year
Work Location: In person
