Clara IT Systems

Data AI Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data AI Engineer; the contract length and pay rate are listed as "unknown." Key skills include Databricks Lakehouse, SQL, AI/ML model development, and strong Python/PySpark expertise. Experience in data governance and Azure integration is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 8, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#PySpark #Spark SQL #BI (Business Intelligence) #Deployment #Anomaly Detection #AI (Artificial Intelligence) #Azure #Automation #Data Governance #MLflow #Data Engineering #Spark (Apache Spark) #Databricks #Data Quality #SQL (Structured Query Language) #ML (Machine Learning) #Python
Role description
Please see the required skillset for the AI Engineer (Databricks) role. This role is aligned to the AI & Automation Enablement workstream, including autonomous AI agents, data quality AI, decision intelligence, advanced analytics, ML/AI, Databricks SQL, Unity Catalog, Databricks Lakehouse, AI/BI Genie, and Agent Bricks capabilities.
Required Skillsets: AI Engineer (Databricks)
• Databricks Lakehouse architecture and implementation
• Databricks SQL, notebooks, workflows, and job orchestration
• AI/BI Genie implementation and business-facing conversational analytics
• Agent Bricks and autonomous AI agent development within Databricks
• MLflow for model tracking, deployment, and lifecycle management
• AI/ML model development, validation, and operationalization
• GenAI and AI agent development using Databricks capabilities
• Data quality automation, anomaly detection, and intelligent validation
• Unity Catalog, data governance, lineage, and access control
• Integration with Azure, APIs, and enterprise data sources
• Experience building AI-driven recommendations, smart notifications, and decision intelligence workflows
• Strong Python, PySpark, SQL, and data engineering fundamentals
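To give a flavor of the data quality automation and anomaly detection work named above, here is a minimal sketch of a statistical outlier check in plain Python. The function name, threshold, and sample values are illustrative assumptions, not part of the role; a production Databricks job would typically express an equivalent check in PySpark over governed tables.

```python
import statistics

def zscore_anomalies(values, threshold=3.0):
    """Flag values whose z-score exceeds the threshold.

    A minimal stand-in for a per-column data-quality check;
    names and threshold here are illustrative only.
    """
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []
    return [v for v in values if abs(v - mean) / stdev > threshold]

# Example: a single spike in an otherwise stable metric.
readings = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 55.0]
print(zscore_anomalies(readings, threshold=2.0))  # → [55.0]
```

With only a handful of points the sample standard deviation is inflated by the outlier itself, so a lower threshold (2.0 here) is used in the example; intelligent validation in practice would tune such thresholds per metric.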