

Artificial Intelligence Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for an Artificial Intelligence Engineer on a 6-month contract-to-hire, paying a day rate of approximately $681.82. It requires expertise in AI development, data processing, and integration with Microsoft Fabric Data Lake. Remote work is available for US citizens and Green Card holders only.
Country: United States
Currency: $ USD
Day rate: $681.82
Date discovered: September 17, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Docker #Langchain #Kubernetes #CRM (Customer Relationship Management) #SQL (Structured Query Language) #AI (Artificial Intelligence) #SharePoint #Scala #Data Processing #Delta Lake #Monitoring #Data Lake
Role description
• THIS IS A 6 MONTH CONTRACT TO HIRE OPPORTUNITY
• US CITIZENS AND GREEN CARD HOLDERS ONLY
REMOTE (EST)
Day to day:
AI Agent Architecture
• Design agents connecting primarily to Microsoft Fabric Data Lake as the central data repository
• Access structured data from Infor M3 ERP, Salesforce CRM, Anaplan, Uncountable PLM, and Freshworks stored in Fabric
• Process unstructured data from SharePoint, file systems, and document repositories outside Fabric
• Implement RAG systems using FAISS, Pinecone, Weaviate, or ChromaDB for hybrid structured/unstructured search (see the sketch after this list)
• Build natural language interfaces for querying both lakehouse tables and external document sources
• Create unified data processing pipelines combining Fabric data with external unstructured content
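A minimal sketch of one such hybrid RAG flow, assuming LangChain with a FAISS index over external documents and pyodbc against the lakehouse SQL analytics endpoint; the connection string, schema, table, column, and model names are illustrative placeholders rather than details from the posting:

```python
# Hybrid RAG sketch: structured rows from the Fabric lakehouse SQL endpoint
# plus a FAISS vector search over external documents feed one LLM prompt.
# Connection string, schema, table, and column names are hypothetical.
import pyodbc
from langchain_community.vectorstores import FAISS
from langchain_core.documents import Document
from langchain_openai import ChatOpenAI, OpenAIEmbeddings


def fetch_structured_context(conn_str: str, customer: str) -> str:
    """Pull a handful of structured rows from the lakehouse SQL endpoint."""
    with pyodbc.connect(conn_str) as conn:
        cur = conn.cursor()
        cur.execute(
            "SELECT TOP 5 order_id, order_date, total_amount "
            "FROM sales.orders WHERE customer_name = ?",
            customer,
        )
        rows = cur.fetchall()
    return "\n".join(f"{r.order_id} | {r.order_date} | {r.total_amount}" for r in rows)


def answer(question: str, customer: str, docs: list[Document], conn_str: str) -> str:
    # Vector index over external documents (SharePoint exports, PDFs, etc.).
    index = FAISS.from_documents(docs, OpenAIEmbeddings())
    unstructured = "\n\n".join(
        d.page_content for d in index.similarity_search(question, k=4)
    )
    structured = fetch_structured_context(conn_str, customer)
    prompt = (
        "Answer the question using only the context below.\n"
        f"Structured data:\n{structured}\n\n"
        f"Documents:\n{unstructured}\n\n"
        f"Question: {question}"
    )
    return ChatOpenAI(model="gpt-4o").invoke(prompt).content
```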
System Integration & Data Processing
• Connect to Microsoft Fabric Data Lake using the Delta Lake format and SQL endpoints
• Access structured business data from all enterprise systems centralized in the Fabric lakehouse
• Integrate unstructured data sources: SharePoint documents, file servers, email archives
• Process PDFs, Word docs, Excel files, images, and multimedia content from external systems (an ingestion sketch follows this list)
• Implement real-time data streaming from Fabric Event Streams and external file monitoring
• Build hybrid search capabilities combining Fabric structured data with external document vectors
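As a rough illustration of the document-processing side, the sketch below ingests PDFs and Word files into a FAISS index using LangChain loaders and a recursive splitter; the file paths, chunk sizes, and example query are hypothetical:

```python
# Ingestion sketch for unstructured sources feeding the hybrid search index.
# File paths, chunk sizes, and the example query are illustrative only.
from langchain_community.document_loaders import Docx2txtLoader, PyPDFLoader
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter


def ingest(paths: list[str]) -> FAISS:
    docs = []
    for path in paths:
        loader = PyPDFLoader(path) if path.endswith(".pdf") else Docx2txtLoader(path)
        docs.extend(loader.load())
    # Overlapping chunks keep each retrieved passage within model context limits.
    splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=150)
    chunks = splitter.split_documents(docs)
    return FAISS.from_documents(chunks, OpenAIEmbeddings())


index = ingest(["specs/product_sheet.pdf", "sops/quality_manual.docx"])
hits = index.similarity_search("What is the cure time for resin batch X?", k=3)
```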
Multi-Platform AI Development
• Utilize OpenAI GPT-4, Anthropic Claude, Google Gemini, Meta LLaMA, and Cohere APIs
• Implement model routing and fallback strategies across AI providers (see the fallback sketch after this list)
• Build agents using LangChain, LlamaIndex, AutoGen, and CrewAI frameworks
• Deploy containerized solutions with Docker/Kubernetes for scalability
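One hedged way to express the routing/fallback requirement, assuming the LangChain partner packages for OpenAI, Anthropic, and Google are available; the model identifiers and timeout are placeholders, not requirements from the posting:

```python
# Provider routing sketch: try a GPT-4-class model first, fall back to Claude
# and Gemini on failure. Model identifiers and timeouts are placeholders.
from langchain_anthropic import ChatAnthropic
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain_openai import ChatOpenAI

primary = ChatOpenAI(model="gpt-4o", timeout=30)
llm = primary.with_fallbacks(
    [
        ChatAnthropic(model="claude-3-5-sonnet-20240620"),
        ChatGoogleGenerativeAI(model="gemini-1.5-pro"),
    ]
)

reply = llm.invoke("Summarize yesterday's open sales orders in one paragraph.")
print(reply.content)
```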