Microsoft aims to improve agent versatility with Copilot Studio updates
Monday, May 19, 2025, 18:00, by InfoWorld
Microsoft is adding new features and capabilities to Copilot Studio, its low-code tool for creating generative AI-based autonomous agents, in an effort to improve the overall versatility of agents for enterprises.
The more versatile the agents, the more widely they can be adopted across enterprise use cases. The new updates, unveiled at the company's ongoing Build 2025 developer conference, focus on easier agent creation, interoperability, and overall usefulness while doubling down on lifecycle management, security, and governance, analysts say.

Copilot Studio gets agent orchestration features

One of the key updates introduced to Copilot Studio is a set of agent orchestration features. Agent orchestration is becoming an increasingly important capability for enterprises: they need multiple agents working in tandem to achieve the desired autonomy in target processes and workflows while also generating a return on their investment in agentic software or platforms.

In contrast to the existing ability to build and manage a single agent for a specific task, enterprises will be able to use Copilot Studio to create a multi-agent setup that automates a process or workflow from scratch, Lili Cheng, CVP of business applications and platform at Microsoft, explained in a blog post. In addition, enterprises will be able to design a multi-agent system from agents that have already been created, whether in Copilot Studio, in other Microsoft services such as Azure AI Foundry, or on a different platform altogether, such as Bedrock or Vertex AI, Cheng said.

Orchestrating the interaction of agents built on a single platform may not be enough to meet enterprises' needs, however. Forrester senior analyst William McKeon-White expressed doubt about how seamlessly agents, especially those created on other platforms, will communicate with each other when driven by large language models (LLMs) instead of a predetermined workflow.
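The orchestration pattern described above, in which a plan routes subtasks to multiple cooperating agents, can be pictured with a minimal sketch. All names here (`Orchestrator`, `register`, the toy agents) are hypothetical illustrations of the general pattern, not Copilot Studio APIs; in a real system an LLM would produce the plan dynamically rather than it being hard-coded.

```python
# Minimal sketch of multi-agent orchestration: an orchestrator routes
# subtasks, in order, to registered agents. Purely illustrative.
from typing import Callable, Dict, List, Tuple

class Orchestrator:
    def __init__(self) -> None:
        self.agents: Dict[str, Callable[[str], str]] = {}

    def register(self, name: str, agent: Callable[[str], str]) -> None:
        self.agents[name] = agent

    def run(self, plan: List[Tuple[str, str]]) -> List[str]:
        # A plan is an ordered list of (agent_name, task) pairs; in practice
        # an LLM-driven planner would generate and adapt this at runtime.
        return [self.agents[name](task) for name, task in plan]

orch = Orchestrator()
# Toy "agents": stand-ins for agents built in Studio, Foundry, or elsewhere.
orch.register("invoice_reader", lambda t: f"extracted fields from {t}")
orch.register("approver", lambda t: f"approved: {t}")
print(orch.run([("invoice_reader", "invoice_001.pdf"),
                ("approver", "invoice_001")]))
```

The interoperability concerns the analysts raise live in the `register` step: an agent built elsewhere only fits this loop if it speaks a protocol the orchestrator understands.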
And Moor Insights and Strategy principal analyst Jason Andersen warned that enterprises incorporating an already-created agent into a multi-agent system may need to modify it to ensure that it supports the relevant agent communication protocols.

Microsoft isn't the only cloud services provider enabling enterprises to build multi-agent systems. Its new orchestration capabilities can be compared with Google's Agent Development Kit and AWS' multi-agent orchestration features in Bedrock. IBM, too, has been expanding multi-agent and agent communication capabilities via its Bee Agent Framework and the Agent Communication Protocol (ACP).

Copilot Studio's orchestration capabilities are currently in private preview, and Microsoft expects to move to public preview soon. However, it hasn't said when this will happen, nor when it expects to make the feature generally available.

Computer control

Microsoft is taking a leaf out of the playbook of Anthropic, the company that first showcased an LLM-driven agent controlling a computer just like a human: enterprises will soon be able to experiment with a new computer use capability in Copilot Studio. Available only in the US via Microsoft's Frontier program, the computer use capability will enable enterprises to automate complex, UI-based tasks such as data entry, invoice processing, and market research with visible reasoning, according to Microsoft's Cheng.

Google, too, has been building a project named Jarvis, which will allow users to automate tasks such as research and shopping in the Chrome browser with the help of the company's Gemini 2.0 LLM.

The obvious application of such functionality is interacting with legacy desktop applications that have no APIs, but Andersen at Moor Insights and Strategy warned that enterprises may find it difficult to put the feature to work in this context because of challenges such as incompatible user interfaces.
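Computer-use agents of the kind described here generally follow a perceive-reason-act loop: capture the screen, ask a model for the next UI action, apply it, and repeat. The sketch below shows only that shape; every name in it is a hypothetical placeholder (not a Microsoft or Anthropic API), and the toy harness stands in for real screenshot capture and input injection.

```python
# Illustrative perceive-reason-act loop behind "computer use" agents.
# All function names are placeholders, not real platform APIs.

def run_ui_task(goal, capture_screen, decide_action, apply_action, max_steps=10):
    """Observe the screen, ask a model for the next UI action, apply it,
    and repeat until the model signals it is done."""
    for _ in range(max_steps):
        screen = capture_screen()              # perceive the current UI state
        action = decide_action(goal, screen)   # model picks click/type/done
        if action["type"] == "done":
            return action["result"]
        apply_action(action)                   # e.g. a click or keystroke
    raise TimeoutError("goal not reached within the step budget")

# Toy harness: a fake one-field "screen" filled in by a single typing action.
state = {"field": ""}
capture = lambda: dict(state)

def decide(goal, screen):
    if screen["field"]:
        return {"type": "done", "result": screen["field"]}
    return {"type": "type", "text": "42.00"}

def apply(action):
    state["field"] = action["text"]

print(run_ui_task("enter invoice total", capture, decide, apply))  # -> 42.00
```

The "incompatible user interfaces" problem Andersen flags shows up in `decide_action`: if the model cannot reliably locate controls in a legacy UI, the loop stalls or mis-clicks.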
Model Context Protocol support raises questions

Anthropic has also been setting the pace with its development of the Model Context Protocol (MCP), which can be used to provide agents with more contextual knowledge and external tools. MCP is fast becoming the most popular agent communication protocol, and Microsoft has now added support for it inside Copilot Studio. The idea is to enable enterprises to use agents for a variety of tasks or give those agents access to the tools needed to complete them, Microsoft said.

It's hard to say how useful Microsoft's implementation of MCP will be, as there's some confusion, among analysts at least, about how the MCP integration works inside Copilot Studio. McKeon-White suggested that MCP may be integrated as a connector inside Copilot Studio, where it can be consumed as a standard integration, while Andersen leaned toward Copilot Studio itself acting as an MCP client, enabling users to connect agents to a variety of services from disparate software vendors.

Microsoft also hasn't provided details of how it plans to manage the security issues that are showing up in other MCP implementations. McKeon-White expects Microsoft to provide the necessary security measures through the connector interface itself, although he still has questions about how authentication and other permissions are maintained during an interaction.

Code interpreter for Agents in Copilot Studio

As part of its strategy to make agents more powerful and adaptable to a wider range of use cases, Microsoft is previewing a code interpreter for Copilot Studio that can be used either dynamically or with the Prompt Builder. In dynamic mode, the interpreter generates Python code and executes it live at runtime, enabling agents to analyze Excel files, generate pipelines and visualizations, and solve complex math problems, Cheng said. Or, with the Prompt Builder, "Enterprise users can define and edit Python code at design time.
When the prompt runs, it executes the same preconfigured logic — ideal for repeatable tasks like Create, Read, Update, and Delete operations on Dataverse tables," Cheng said.

Regardless of how it's accessed, analysts said, the interpreter opens up the potential for agents to perform tasks they are not usually expected to do, such as solving math problems. "What this capability allows is for Copilot to dynamically create code to hand off to and to execute tasks on its behalf while enabling more deterministic work," McKeon-White said. Both Google and AWS already support similar features, he added.

Other updates to Copilot Studio include new controls for adding data sources for context, grounding responses, security features, and a Visual Studio Code extension — already generally available via the Visual Studio Marketplace — that will allow developers to connect to Copilot Studio and edit agents directly.
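The dynamic mode Cheng describes, where generated Python runs live and the agent reads back a result, can be pictured with a bare-bones sketch. This is only the shape of the idea under the assumption that generated code is executed in a throwaway namespace; real code interpreters sandbox execution far more strictly than a plain `exec` call, and the "generated" snippet below is hand-written for illustration, not LLM output.

```python
# Illustrative sketch of a "dynamic" code interpreter: run model-generated
# Python in a throwaway namespace, then read back a named result.
# Real products isolate this execution; this shows only the mechanism.

generated_code = """
import statistics
# Pretend an LLM wrote this to answer "what is the mean of these values?"
result = statistics.mean([12.5, 7.5, 10.0])
"""

namespace: dict = {}
exec(generated_code, namespace)   # executed live at runtime, as in dynamic mode
print(namespace["result"])        # -> 10.0
```

The Prompt Builder variant differs only in when the code is authored: the same mechanism runs a snippet fixed at design time instead of one generated per request.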
https://www.infoworld.com/article/3989489/microsoft-aims-to-improve-agent-versatility-with-copilot-s...