AI Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AI Engineer on a 6-month contract-to-hire, remote (EST hours preferred), with a pay rate of $70–$80/hr. Key requirements include 5+ years of AI/ML development, Microsoft Fabric, Python, and enterprise data integration experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date discovered
September 17, 2025
🕒 - Project duration
6 months (contract-to-hire)
-
🏝️ - Location type
Remote
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Docker #Langchain #BI (Business Intelligence) #Data Pipeline #Big Data #ML (Machine Learning) #Microsoft Power BI #Cloud #Scala #ETL (Extract, Transform, Load) #Transformers #Azure #CRM (Customer Relationship Management) #Data Lake #Security #Kubernetes #SQL (Structured Query Language) #Spark (Apache Spark) #AI (Artificial Intelligence) #SharePoint #KQL (Kusto Query Language) #DevOps #Data Processing #API (Application Programming Interface) #Delta Lake #Python #Deployment #Data Integration
Role description
AI Engineer – Remote (EST Hours Preferred)

Location: Remote (EST hours preferred)
Duration: 6-Month Contract-to-Hire
Comp: $70–$80/hr W2 | Conversion: $140K–$150K + bonus + benefits
Work Authorization: GC/USC only – client cannot sponsor or transfer H1B

Join a cutting-edge AI initiative to build intelligent agents that integrate seamlessly with a Microsoft Fabric Data Lake connecting ERP, CRM, PLM, planning, and helpdesk systems, plus unstructured data from SharePoint, file servers, and document repositories. Your work will enable next-generation RAG systems, vector search, and multi-LLM orchestration to transform how data powers decision-making across manufacturing, supply chain, and customer operations.

What You'll Do:
• Architect and implement AI agents leveraging Microsoft Fabric as the central data hub
• Build RAG pipelines with FAISS, Pinecone, Weaviate, or ChromaDB for hybrid structured/unstructured search
• Integrate with enterprise systems (Infor M3, Salesforce, Anaplan, PLM, Freshworks) and process unstructured data from SharePoint and file systems
• Develop multi-LLM solutions using OpenAI GPT-4, Anthropic Claude, Google Gemini, and Cohere
• Deploy scalable solutions with Docker, Kubernetes, and CI/CD pipelines
• Build natural language interfaces for querying data across the organization

Must-Have Skills:
• 5+ years of AI/ML development with enterprise data integration
• 3+ years of experience with Microsoft Fabric, Azure Data Lake, or Delta Lake platforms
• Strong Python (LangChain, LlamaIndex, Transformers), SQL/KQL, and vector database expertise
• Experience designing RAG systems and multi-LLM orchestration strategies
• Familiarity with SharePoint API integration, event-driven data pipelines, and enterprise security (OAuth/SSO)
• Cloud deployment experience (Azure preferred), containerization (Docker/Kubernetes), and DevOps practices

Nice to Have:
• Microsoft Fabric or Azure AI certifications
• AI implementation experience in manufacturing or finance
• Power BI development and Spark/big data processing experience
• Experience mentoring or leading AI/ML projects

#Tech #remote
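For candidates unfamiliar with the retrieval step this role centers on, here is a minimal, self-contained sketch of vector search as used in a RAG pipeline. It is purely illustrative: the `embed` function is a toy hashed bag-of-words stand-in, not a real embedding model, and the brute-force loop stands in for what FAISS, Pinecone, Weaviate, or ChromaDB would do at scale.

```python
# Toy sketch of RAG retrieval: embed documents, embed the query,
# rank documents by cosine similarity. embed() is a hashed
# bag-of-words stand-in, NOT a production embedding model.
import math
from collections import Counter

DIM = 256  # toy embedding dimensionality

def embed(text: str) -> list[float]:
    """Hash each token into a fixed-size, L2-normalized vector."""
    vec = [0.0] * DIM
    for token, count in Counter(text.lower().split()).items():
        vec[hash(token) % DIM] += count
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k docs whose embeddings are most cosine-similar to the query."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: -sum(a * b for a, b in zip(q, embed(d))))
    return ranked[:k]

# Hypothetical corpus mixing structured and unstructured sources:
docs = [
    "Fabric lakehouse tables hold ERP order history",
    "SharePoint stores unstructured quality documents",
    "Helpdesk tickets are synced nightly",
]
top = retrieve("ERP order history tables", docs, k=1)
```

In a production pipeline the retrieved passages would then be injected into an LLM prompt; here the point is only the embed-and-rank shape that the listed vector databases implement with approximate nearest-neighbor indexes.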