Azure AI Foundry tools for changes in AI applications
Wednesday, November 20, 2024, 10:00, by InfoWorld
The way we use artificial intelligence is changing. Chatbots aren't going away; we'll continue to use them to deliver basic, natural language, self-service applications. But the future belongs to multimodal applications, built on large language models (LLMs) and other AI models, that act as self-organizing software agents. These more complex AI applications will require more thought, more code, more testing, and more safeguards.
An AI evolution requires a similar evolution in our development tools. Although we've seen Power Platform's Copilot Studio begin to deliver tools for building task-focused agents, more complex AI applications will require a lot more work, even with support from frameworks like Semantic Kernel. Many of Azure's current AI tools, beyond its Cognitive Services APIs, are focused on building grounded chatbots, using Microsoft's Prompt Flow framework to add external vector indexes to LLMs for retrieval-augmented generation (RAG) and wrapping calls and outputs in its own AI safety tools. It's a proven approach to building and running Microsoft's own Copilot services, but if enterprises are to get the next generation of AI services, they need new tools that can help deliver custom agents.

Introducing Azure AI Foundry

At Ignite 2024, Microsoft released its Azure AI Foundry SDK. Instead of focusing on services like Azure AI Studio (as good as it is), the company takes Azure AI development to where the developers are: their development environments. Azure AI Foundry will plug into IDEs and editors such as Visual Studio and Visual Studio Code, as well as into platforms like GitHub. Microsoft describes Azure AI Foundry as "a soup-to-nuts platform for building and evaluating and deploying at-scale AI applications."

That doesn't mean an end for Azure AI Studio. Instead, it's going to get a new role as a portal where you can manage your models and the applications using them. It will serve as a bridge between business and development, allowing application owners, stakeholders, and architects to share necessary metrics about your code. The new portal will help manage access to tools and services, using your Azure subscription to bring key information into a single view. This helps manage resources and privileges and reduces the risk of security breaches.
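The grounded-chatbot pattern the article mentions can be sketched in a few lines: retrieve the documents most similar to a query from a vector index, then use them to ground the LLM prompt. This is a minimal, self-contained illustration of the RAG pattern only; the in-memory index and toy embeddings are hypothetical stand-ins, not the Prompt Flow or Azure AI Foundry APIs.

```python
# Minimal RAG sketch: cosine-similarity retrieval over a toy vector index,
# then a grounded prompt. All data here is illustrative.
from math import sqrt

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

# A toy "vector index": (document text, precomputed embedding).
INDEX = [
    ("Refunds are processed within 5 business days.", [0.9, 0.1, 0.0]),
    ("Our office is open Monday to Friday.",          [0.1, 0.9, 0.0]),
    ("Returns require a receipt and original box.",   [0.7, 0.2, 0.1]),
]

def retrieve(query_embedding: list[float], k: int = 2) -> list[str]:
    """Return the top-k documents most similar to the query embedding."""
    ranked = sorted(INDEX, key=lambda d: cosine(query_embedding, d[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

def grounded_prompt(question: str, query_embedding: list[float]) -> str:
    """Wrap the user's question with retrieved context before calling an LLM."""
    context = "\n".join(retrieve(query_embedding))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

prompt = grounded_prompt("How long do refunds take?", [0.8, 0.1, 0.05])
print(prompt)
```

In a production system the index would live in a service such as Azure AI Search and the embeddings would come from an embedding model; the retrieval-then-ground flow stays the same.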
Knowing what resources you're using is key to ensuring that you have the right controls in place and that you aren't overlooking critical infrastructure and services.

More than code with Azure AI Foundry

Part of Azure AI Foundry is an update to the Azure Essentials best-practices documentation, which now sensibly sits alongside tools like Azure Migrate, the Cloud Adoption Framework, and the Well-Architected guidelines. Development teams should visit this portal for architectural and design best practices, developed across Microsoft's partners and its own services team, to help build cloud-powered applications.

Azure AI Foundry will include tools that help you benchmark models and choose the right model for your application. Using the same metrics for different models, you can see which fits your data best, which is most efficient, which is most coherent, and how much each will cost to run. Run this new benchmarking tool on both public training data and your own to make the right decision early, reducing the risk of choosing a model that doesn't fit your requirements or your data.

The result should be a common platform for developer and business-stakeholder collaboration. As modern AI migrates from chatbots to intelligent process automation via agents, this approach is going to become increasingly important. Development teams must understand the business problems they are trying to solve, while business analysts and data scientists will be needed to help deliver the necessary prompts to guide the AI agents.

Alongside the new development tool, Azure is expanding its library of AI models. With nearly 2,000 options, you can find and fine-tune the appropriate model for your business problems. In addition to OpenAI's LLMs and Microsoft's own Phi small language models, other options include a range of industry-specific models from well-known vendors, such as Rockwell Automation and Bayer.
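The idea of comparing models on the same metrics can be made concrete with a small scoring sketch. Everything here is hypothetical: the model names, the metric values, and the weights are placeholders for whatever your benchmarking run and business priorities produce, not output from any Azure tool.

```python
# Sketch of model selection via a shared set of metrics and a weighted
# score. Model names, metric values, and weights are all hypothetical.
CANDIDATES = {
    "model-a": {"groundedness": 0.91, "coherence": 0.88, "cost_per_1k_tokens": 0.030},
    "model-b": {"groundedness": 0.85, "coherence": 0.90, "cost_per_1k_tokens": 0.002},
    "model-c": {"groundedness": 0.78, "coherence": 0.75, "cost_per_1k_tokens": 0.001},
}

# Weights express what matters for this application; cost counts against a model.
WEIGHTS = {"groundedness": 0.5, "coherence": 0.3, "cost_per_1k_tokens": -5.0}

def score(metrics: dict[str, float]) -> float:
    """Combine one model's metrics into a single comparable number."""
    return sum(WEIGHTS[name] * value for name, value in metrics.items())

best = max(CANDIDATES, key=lambda name: score(CANDIDATES[name]))
for name, metrics in CANDIDATES.items():
    print(f"{name}: {score(metrics):.3f}")
print("selected:", best)
```

With these weights the cheap-but-coherent middle model wins; shifting weight toward groundedness would flip the choice, which is exactly the early trade-off the benchmarking tools are meant to surface.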
Additional new features will make it easier and faster to prepare training data and use it to fine-tune models.

Merging AutoGen and Semantic Kernel

Closely related to the launch of Azure AI Foundry is the planned merger of Microsoft Research's AutoGen agentic AI framework with Semantic Kernel. The combination will help you develop and operate long-running, stateful business processes, hosting components in Dapr and Orleans. As AutoGen builds on Orleans, there's already enough convergence between the stable Semantic Kernel and AutoGen's multi-agent research project. AutoGen will remain a research platform for experimenting with complex contextual computing projects. Projects can then be ported to Semantic Kernel, giving you a supported runtime for running your agents in production. Microsoft is giving plenty of notice for this transition, which is expected in early 2025.

If we're to deliver business process automation with agentic AI, we need to connect our agents to business processes. That's easy for Copilot Studio, as it can take advantage of the Power Platform's existing connector architecture. However, building and managing your own connection infrastructure can be complex, even with services like Azure API Management. Access to enterprise data helps orchestrate agents and, at the same time, provides grounding for LLMs using RAG.

Managing AI integrations with Azure AI Agent Service

Alongside Azure AI Foundry, Microsoft is rolling out Azure AI Agent Service to support these necessary integrations with line-of-business applications. Azure AI Agent Service simplifies connections to Azure's own data platform, as well as to Microsoft 365. If Azure AI Foundry is about taking AI to where developers are, Azure AI Agent Service is about taking it to where your business data is, in tools like the data lakes in Microsoft Fabric and the enterprise content stored in SharePoint.
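The long-running, stateful multi-agent pattern behind the AutoGen/Semantic Kernel merger can be sketched without either framework: agents keep their own state across turns, exchange messages, and an orchestrator runs the conversation until one of them signals completion. The agent roles, message shapes, and orchestration loop below are hypothetical illustrations, not the AutoGen or Semantic Kernel APIs.

```python
# Sketch of stateful multi-agent orchestration: per-agent state survives
# across turns, and a round-robin loop routes messages until done.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Agent:
    name: str
    handler: Callable            # how this agent responds to a message
    state: dict = field(default_factory=dict)  # survives across turns

def triage(agent: "Agent", msg: str) -> str:
    """Hypothetical first agent: counts messages and routes invoices onward."""
    agent.state["seen"] = agent.state.get("seen", 0) + 1
    return f"route:{msg}" if "invoice" in msg else "done"

def approver(agent: "Agent", msg: str) -> str:
    """Hypothetical second agent: records an approval and finishes."""
    agent.state["approved"] = agent.state.get("approved", 0) + 1
    return "done"

def run(agents: list[Agent], msg: str, max_turns: int = 10) -> str:
    """Round-robin orchestration until an agent signals completion."""
    for turn in range(max_turns):
        agent = agents[turn % len(agents)]
        msg = agent.handler(agent, msg)
        if msg == "done":
            return f"finished at turn {turn + 1}"
    return "turn limit reached"

a = Agent("triage", triage)
b = Agent("approver", approver)
result = run([a, b], "process invoice 1234")
print(result)
```

In the merged stack, the per-agent state would be durably hosted in Orleans or Dapr actors rather than in-process dictionaries, which is what makes the processes long-running.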
Azure AI Agent Service builds on Azure's infrastructure capabilities, adding support for private networks and your own storage. The intent is to take advantage of Azure's existing certifications and regulatory approvals to quickly deliver compliant AI applications. This move should help enterprises adopt AI, using Azure AI Foundry to bring relevant stakeholders together and Azure AI Agent Service to apply the necessary controls, for both internal and external approvals.

Improving Azure's AI infrastructure

As well as new software features, Azure is adding more AI-specific infrastructure tools. AI applications hosted in Azure Container Apps can now use serverless GPUs for inferencing, scaling Nvidia hardware to zero when not in use to help keep costs down. Other options improve container security to reduce the risks associated with using LLMs on sensitive data, whether it's personally identifiable information or commercially sensitive data from line-of-business platforms.

Ignite is where Microsoft focuses on its business software, so it's the right place to launch a developer product like Azure AI Foundry. Azure AI Foundry is designed to build AI into the complete software development life cycle, from design and evaluation to coding and operation, providing a common place for developers, AIOps engineers, data scientists, and business analysts to work together, with the tools to build the next generation of AI applications. With agentic AI apps, it's clear that Microsoft thinks it's time for enterprises to go beyond the chatbot and use AI to get the benefits of flexible, intelligent business process automation. Building on Azure AI Foundry and Semantic Kernel, we should be able to deliver the context-aware, long-transaction applications we've wanted to build, while ensuring they're both trustworthy and compliant with regulations.
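Why scale-to-zero matters for GPU inferencing is simple arithmetic: if the hardware is billed only while replicas are running, a workload that is busy a fraction of the day pays for that fraction alone. The hourly rate and utilization figure below are hypothetical, not Azure Container Apps pricing.

```python
# Back-of-the-envelope cost comparison for an always-on GPU versus one
# that scales to zero when idle. All figures are hypothetical.
GPU_RATE_PER_HOUR = 3.00   # hypothetical cost of one GPU instance per hour
HOURS_PER_MONTH = 730

def monthly_cost(busy_fraction: float, scale_to_zero: bool) -> float:
    """Cost of one GPU for a month at a given busy fraction."""
    billed_hours = HOURS_PER_MONTH * (busy_fraction if scale_to_zero else 1.0)
    return billed_hours * GPU_RATE_PER_HOUR

always_on = monthly_cost(0.15, scale_to_zero=False)
serverless = monthly_cost(0.15, scale_to_zero=True)
print(f"always-on: ${always_on:.2f}, scale-to-zero: ${serverless:.2f}")
```

The trade-off is cold-start latency when the first request arrives after an idle period, which is why scale-to-zero suits bursty inferencing better than latency-critical serving.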
https://www.infoworld.com/article/3609020/azure-ai-foundry-tools-for-changes-in-ai-applications.html