SRM Digital LLC

AI Agent Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AI Agent Architect; the contract length and pay rate are unspecified, and the position is remote. Key skills include Rust, C#, and Python, plus experience with MCP, A2A, and data catalog systems.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
November 1, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#GCP (Google Cloud Platform) #Azure #Data Governance #Indexing #Python #Metadata #Databricks #Data Access #AI (Artificial Intelligence) #Microservices #Programming #AWS (Amazon Web Services) #Data Catalog #Compliance #C# #Scala #Cloud #Langchain
Role description
About the Role
We're seeking an exceptional AI Agent Engineer/Architect to design and implement advanced, protocol-compliant agent systems. The ideal candidate will have deep hands-on experience building Model Context Protocol (MCP) and Agent-to-Agent (A2A) integrations, as well as a background in data catalog engineering within major hyperscaler ecosystems. This is a high-impact, hands-on role focused on building next-generation AI agent interoperability across enterprise data platforms.

🔧 Key Responsibilities
• Architect and develop AI agent frameworks compliant with the MCP (Model Context Protocol) and A2A (Agent-to-Agent) communication standards.
• Design integrations between AI agents and the Microsoft Fabric, OneLake, and Purview data catalogs.
• Build high-performance data access, indexing, and semantic enrichment pipelines for metadata and catalog intelligence.
• Develop APIs, connectors, and microservices in Rust, C#, and Python.
• Optimize agent context management and secure cross-agent communication.
• Work closely with AI, data, and platform engineering teams to ensure scalability, interoperability, and compliance.
• Contribute to the architecture of a next-generation data intelligence product leveraging LLMs, metadata-driven reasoning, and context-aware agents.

🎯 Required Skills & Qualifications
• Prior experience as a Systems Engineer, Architect, or Senior Developer on a Dataplex, Databricks Unity Catalog, or OneLake/Purview team.
• Strong programming expertise in Rust, C#, and Python.
• Deep understanding of the Model Context Protocol (MCP) and Agent-to-Agent (A2A) interoperability standards.
• Experience with data catalog systems, metadata services, and enterprise data governance.
• Familiarity with the Microsoft Fabric ecosystem and OneLake architecture.
• Proven ability to build distributed, secure, high-performance backend systems.
• Excellent problem-solving, system design, and communication skills.
🧩 Nice to Have
• Familiarity with LLM orchestration frameworks (LangChain, Semantic Kernel, AutoGen).
• Background in cloud-native development (Azure, GCP, or AWS).
• Knowledge of metadata graph systems or knowledge catalogs.
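For context on the MCP work this role involves: MCP is built on JSON-RPC 2.0 message exchange between a client and a tool-hosting server. As a rough, illustrative sketch only (not the employer's codebase or the official MCP SDK; the `catalog_lookup` tool and its fields are hypothetical), a minimal tool-dispatch handler in Python might look like this:

```python
import json

# Hypothetical metadata-catalog tool, for illustration only.
# A real implementation would query a catalog service (e.g. Purview or OneLake).
def catalog_lookup(table: str) -> dict:
    return {"table": table, "owner": "data-platform", "classification": "internal"}

# Registry of tools the server exposes to agents.
TOOLS = {"catalog_lookup": catalog_lookup}

def handle_request(raw: str) -> str:
    """Dispatch a JSON-RPC 2.0 'tools/call' request, MCP-style."""
    req = json.loads(raw)
    name = req["params"]["name"]
    args = req["params"].get("arguments", {})
    try:
        result = TOOLS[name](**args)  # KeyError if the tool is unregistered
        resp = {"jsonrpc": "2.0", "id": req["id"], "result": result}
    except KeyError:
        resp = {"jsonrpc": "2.0", "id": req["id"],
                "error": {"code": -32601, "message": f"Unknown tool: {name}"}}
    return json.dumps(resp)

# Example client request asking the server to invoke the hypothetical tool.
request = json.dumps({
    "jsonrpc": "2.0", "id": 1, "method": "tools/call",
    "params": {"name": "catalog_lookup", "arguments": {"table": "sales.orders"}},
})
print(handle_request(request))
```

The production role would of course use the real MCP server machinery rather than hand-rolled dispatch, but the request/response shape above is the core pattern the posting's "protocol-compliant agent systems" refers to.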