

SRM Digital LLC
AI Agent Architect
Featured Role | Apply directly with Data Freelance Hub
This is a remote role for an AI Agent Architect; the contract length and pay rate are unspecified. Key skills include Rust, C#, and Python, plus experience with MCP, A2A, and data catalog systems.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
November 1, 2025
Duration
Unknown
Location
Unknown
Contract
Unknown
Security
Unknown
Location detailed
United States
Skills detailed
#GCP (Google Cloud Platform) #Azure #Data Governance #Indexing #Python #Metadata #Databricks #Data Access #AI (Artificial Intelligence) #Microservices #Programming #AWS (Amazon Web Services) #Data Catalog #Compliance #C# #Scala #Cloud #LangChain
Role description
About the Role
We're seeking an exceptional AI Agent Engineer/Architect to design and implement advanced, protocol-compliant agent systems. The ideal candidate will have deep hands-on experience building Model Context Protocol (MCP) and Agent-to-Agent (A2A) integrations, as well as a background in data catalog engineering within major hyperscaler ecosystems.
This is a high-impact, hands-on role focused on building next-generation AI agent interoperability across enterprise data platforms.
Key Responsibilities
• Architect and develop AI agent frameworks compliant with MCP (Model Context Protocol) and A2A (Agent-to-Agent) communication standards.
• Design integrations between AI agents and Microsoft Fabric, OneLake, and Purview data catalogs.
• Build high-performance data access, indexing, and semantic enrichment pipelines for metadata and catalog intelligence.
• Develop APIs, connectors, and microservices using Rust, C#, and Python.
• Optimize agent context management and secure cross-agent communication.
• Work closely with AI, data, and platform engineering teams to ensure scalability, interoperability, and compliance.
• Contribute to the architecture of a next-gen data intelligence product leveraging LLMs, metadata-driven reasoning, and context-aware agents.
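For context on the protocol work described above: MCP is built on JSON-RPC 2.0, so an MCP-compliant framework ultimately routes JSON-RPC requests to tool implementations. The sketch below illustrates that request/response plumbing only; the `catalog/search` tool name and the registry shape are hypothetical, and a real build would use an official MCP SDK rather than hand-rolled dispatch.

```python
import json

# Hypothetical tool registry: maps a JSON-RPC method name to a handler.
TOOLS = {}

def tool(name):
    """Register a callable under a method name."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("catalog/search")
def catalog_search(query: str) -> list:
    # Placeholder: a real implementation would query a metadata catalog
    # (e.g. Purview or Unity Catalog) instead of this in-memory list.
    corpus = ["sales.orders", "sales.customers", "hr.payroll"]
    return [t for t in corpus if query in t]

def handle(raw: str) -> str:
    """Dispatch one JSON-RPC 2.0 request string to a registered tool."""
    req = json.loads(raw)
    fn = TOOLS.get(req.get("method"))
    if fn is None:
        resp = {"jsonrpc": "2.0", "id": req.get("id"),
                "error": {"code": -32601, "message": "Method not found"}}
    else:
        resp = {"jsonrpc": "2.0", "id": req.get("id"),
                "result": fn(**req.get("params", {}))}
    return json.dumps(resp)

request = json.dumps({"jsonrpc": "2.0", "id": 1,
                      "method": "catalog/search",
                      "params": {"query": "sales"}})
print(handle(request))  # result lists the tables matching "sales"
```

The same dispatch pattern applies whether the transport is stdio, HTTP, or an A2A channel; only the framing around the JSON-RPC payload changes.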
Required Skills & Qualifications
• Prior experience as a Systems Engineer, Architect, or Senior Developer in Dataplex, Databricks Unity Catalog, or OneLake/Purview teams.
• Strong programming expertise in Rust, C#, and Python.
• Deep understanding of Model Context Protocol (MCP) and Agent-to-Agent (A2A) interoperability standards.
• Experience with data catalog systems, metadata services, and enterprise data governance.
• Familiarity with the Microsoft Fabric ecosystem and OneLake architecture.
• Proven ability to build distributed, secure, and high-performance backend systems.
• Excellent problem-solving, system design, and communication skills.
Nice to Have
• Familiarity with LLM orchestration frameworks (LangChain, Semantic Kernel, AutoGen).
• Background in cloud-native development (Azure, GCP, or AWS).
• Knowledge of metadata graph systems or knowledge catalogs.






