
Open source has a ‘massive role to play’ in AI orchestration platforms, says Microsoft CEO

Thursday, May 1, 2025, 04:10, by InfoWorld
Microsoft CEO Satya Nadella says he is “very optimistic” that technology has sufficiently advanced to support more complex, next-gen capabilities such as multi-agent AI orchestration, and open source is a key component.

Chips are getting better, cycle times are faster, and system software, model architecture, and kernels are constantly being optimized, resulting in what he described as a 10x performance boost every six to 12 months.

“We are in some crazy sort of hyperdrive Moore’s Law,” Nadella said during a fireside chat with Meta CEO Mark Zuckerberg at Meta’s inaugural LlamaCon developer event. “Any one of these tech platform shifts has not been about one S curve, it’s been multiple S curves that compound.”

Distillation is ‘like magic’

Microsoft’s ideal, Nadella said, is an orchestration layer that will offer the ability to mix and match AI models, with users pulling different aspects of intelligence from different models in areas where they excel. Open source “absolutely has a massive, massive role to play” in the building out of such platforms.
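
Such an orchestration layer can be pictured as a router that sends each request to whichever model is strongest for the task. The sketch below is purely illustrative; the model names and routing table are hypothetical, not real products or APIs.

```python
# Hypothetical sketch of a model-orchestration layer that routes each
# request to the model best suited to the task category.
# Model names and the routing table are made up for illustration.

ROUTING_TABLE = {
    "code": "open-code-model",        # open-weight model tuned for coding
    "summarize": "small-fast-model",  # cheap distilled model
    "reasoning": "frontier-model",    # large general-purpose model
}

def route(task_type: str, prompt: str) -> tuple[str, str]:
    """Pick a model for the task, falling back to a general model."""
    model = ROUTING_TABLE.get(task_type, "general-model")
    # A real orchestrator would call the chosen model's API here;
    # this sketch only reports which model would handle the prompt.
    return model, prompt

model, _ = route("code", "Write a quicksort in Python")
```

A production system would add cost and latency budgets per route, but the mix-and-match idea Nadella describes reduces to exactly this kind of dispatch.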

“I’m not dogmatic about closed source or open source, both of them are needed in the world,” he said. “And customers will demand them, right?”

Whether it is interplay between SQL Server, MySQL, Postgres, Linux, Windows, or other products, having a posture that allows interoperability is incredibly important, Nadella noted. Enterprise customers want to be able to distill custom models built with their intellectual property (IP), and open weight models will have a “huge structural advantage” in supporting that, compared to closed models.

“Taking a large model and being able to distill it into a smaller model that has even that same model shape is a big use case,” he said.

The challenge is making that available to those unable to build their own infrastructure, or those less technically sophisticated, he noted. The goal for a company like Microsoft is to build the tooling for these models as a service. Hyperscalers can deploy this infrastructure as a cloud service and build tools around it.

So, Nadella pointed out, every tenant of Microsoft 365 could have a distilled, task-specific model that they could create as an agent or a workflow and invoke from within Copilot. “That, to me, is a breakthrough scenario,” he said.

Zuckerberg agreed: “That just seems like one of the coolest things that I think is going to get built.” He also called model distillation “like magic.”

“You basically can make it so that you can get 90% or 95% of the intelligence of something that is 20 times larger in a form factor that is so much cheaper and more efficient to use,” he said.
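The core mechanism behind that compression is training the small student model to match the large teacher's softened output distribution rather than hard labels. The toy numbers below are invented for illustration; a real setup would run gradient descent over real data.

```python
import numpy as np

# Toy illustration of knowledge distillation: the student is trained to
# match the teacher's temperature-softened output distribution.
# All numbers here are made up for illustration.

def softmax(logits, temperature=1.0):
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()                      # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

teacher_logits = [4.0, 1.5, 0.5]      # outputs from the large model
T = 2.0                               # temperature > 1 softens the distribution

soft_targets = softmax(teacher_logits, temperature=T)

def kl_divergence(p, q):
    """KL(p || q): the distillation loss pushes the student's q toward p."""
    return float(np.sum(p * np.log(p / q)))

student_logits = [3.0, 2.0, 1.0]      # outputs from the small model
loss = kl_divergence(soft_targets, softmax(student_logits, temperature=T))
```

Minimizing this loss over a training corpus is what lets a far smaller model recover most of the teacher's behavior at a fraction of the inference cost.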

As powerful as it is, though, there are concerns about doing distillation in a safe and secure way, he conceded, particularly when open source models are coming from different countries (such as DeepSeek, from China). He noted: “How do you make sure that you’re not inheriting security vulnerabilities or different values that are kind of problematic?”

AI takes a greater role in coding

Nadella and Zuckerberg also agreed on the dramatic possibilities for AI in coding. Nadella said that AI generates “fantastic” Python code, and estimated that up to 30% of the code currently in Microsoft’s repositories has been written by software.

Zuckerberg said Meta’s bet is that, in the next year, “maybe half” of software development will be completed by AI. Eventually, he said, “every engineer is effectively going to end up being more like a tech lead with their own little army of engineering agents.”

But successful integration is the sweet spot: AI must fit seamlessly into existing repos and developer workflows, Nadella emphasized. “It’s one thing to build a new greenfield app, but none of us get to work on complete greenfield all the time,” he said.

Instead, devs are working in large code bases with many workflows, so effective integration within the tool chain is paramount. This is the systems work that any engineering team needs to be doing, said Nadella, and it’s where teams will see the greatest productivity gains.

AI is an ‘existential priority’

Ultimately, Nadella described AI as “a pretty existential priority” and said its success “will come down to developers being able to go at it fearlessly.”

The world requires productivity gains in every function, from healthcare to retail to broad knowledge work, he said, and AI has the promise to deliver that. “Software in this new form of AI is the most malleable resource we have to solve hard problems.”

Still, it’s not enough that the technology is just there, he noted; it also requires a management and cultural change. Case in point: Electricity was around for 50 years before people figured out that it could revolutionize factories.

“Just thinking of this as the horseless carriage is not going to be the way we are going to get to the other side,” Nadella said. “So it’s not just tech. Tech has got to progress. You’ve got to put that into systems that actually deliver the new work, work artifact and workflow.”

Databricks: Everything’s going towards open source

Ali Ghodsi, Databricks co-founder and CEO, also sat down with Zuckerberg at LlamaCon; one of his big takeaways was that, in the long run, everything is going to move towards open source.

“You get this sort of open research and open sharing,” he said. “You get a whole world working on these models, and you just have much, much more rapid progress.”

Like Nadella, Ghodsi described a “cross pollination” between different models. People are slicing, mixing and matching, and cobbling different elements together, tasks that would be “completely impossible” without open source.

Zuckerberg added that it’s “really gratifying” to see the move towards open source, pointing out that just a year ago Llama was the only major open source model, which, of course, is no longer the case.

“You have the ability to take the best parts of the intelligence from the different models and produce exactly what you need,” he said. “It feels like sort of an unstoppable force.”

At the same time, there’s a big question in open source around the fact that, while models are open, the data they’re trained on is sometimes withheld for any number of reasons, Zuckerberg noted. By contrast, reasoning models are trained using problems with verifiable answers, so if devs are giving them hard math or coding problems, the data is much less proprietary.

This is an interesting area to explore, he said: New types of open source releases that aren’t just the weights of the model, but also some of its reasoning traces.

This kind of ongoing exploration led Ghodsi to conclude, “It’s Day Zero of the AI era. The most amazing applications are yet to be invented. This is a complete white space.”

The “data advantage” will be the underpinning of success, Ghodsi said. Builders need to collect the right data, refine it, pass it through models — and repeat — to create a ‘flywheel’ or network effect, continually iterating on and improving products with use.

He urged devs: “Try it out, use it, explore those crazy new ideas. It’s going to be exciting times ahead.” 
https://www.infoworld.com/article/3975472/open-source-has-a-massive-role-to-play-in-ai-orchestration...


News copyright owned by their original publishers | Copyright © 2004 - 2025 Zicos / 440Network