Someone needs to make AI easy

Monday, January 6, 2025, 10:00, by InfoWorld
2024 was a year filled with AI possibilities; 2025 needs to turn those possibilities into realities. For that, we need the cloud companies to stop churning out “primitives” such as large language models (LLMs) and instead give developers solutions. In mid-2024 I suggested we need a “Red Hat for AI,” a company that can deliver “more trust (open models) and fewer moving parts (opinionated platforms that remove the guesswork of choosing and applying models).”

I still think that’s what we need. If not a Red Hat, then an AWS to demystify AI and remove the “undifferentiated heavy lifting” that AI vendors keep foisting on developers.

So much possibility, so much confusion

Keeping up with AI in 2024 was an exercise in futility. Every other day a new LLM was released, each one bigger, faster, better, or cheaper than the one before. (Here’s one attempt to summarize what happened just in generative AI.) Some developers put in the work to decipher which LLM to use where, but for mainstream developers, AI became one big “WTH?!?”

Few developers did a better job of figuring out how to effectively use AI than Simon Willison. In his article “Things we learned about LLMs in 2024,” he simultaneously susses out how much happened in 2024 and why it’s confusing. For example, we’re all told to aggressively use genAI or risk falling behind, but we’re awash in AI-generated “slop” that no one really wants to read. He also points out that LLMs, although marketed as the easy path to AI riches for all who master them, are actually “chainsaws disguised as kitchen knives.” He explains that “they look deceptively simple to use … but in reality you need a huge depth of both understanding and experience to make the most of them and avoid their many pitfalls.”

If anything, this quagmire got worse in 2024. Incredibly smart people are building incredibly sophisticated systems that leave most developers incredibly frustrated as they try to figure out how to use them effectively. “The default LLM chat UI is like taking brand-new computer users,” Willison argues, “dropping them into a Linux terminal and expecting them to figure it all out.” It’s not pretty, and it’s getting worse, not better.

Small wonder that the biggest AI flops of 2024, as captured by MIT Technology Review, largely relate to our collective inability to turn AI into applications that normal people want to use. Even basic things, like Apple’s attempt to summarize my incoming text messages, are more confusing than useful. Some of this stems from the inability to trust AI to deliver consistent results, but much of it derives from the fact that we keep loading developers up with AI primitives (similar to cloud primitives like storage, networking, and compute) that force them to do the heavy lifting of turning those foundational building blocks into applications.

Calling AWS and Microsoft

We may need an AWS or perhaps a Microsoft to make sense of the surfeit of AI options. I say “perhaps a Microsoft” because the market seems to need something akin to what Microsoft did for networking newbies: clear documentation, intuitive user interfaces, etc. AWS won big in the first decade of cloud computing by giving developers familiar primitives, i.e., the same LAMP building blocks they had in on-premises environments, but with the added flexibility of elasticity.

By contrast, read through the marketing description of Amazon SageMaker. AWS talks about “an integrated experience for analytics and AI with unified access to all your data” (sounds good) using “familiar AWS tools for model development, generative AI, data processing, and SQL analytics” (also good; don’t make developers learn new tools). But then AWS falls into the trap of insisting that developers want and need “purpose-built tools.” “Purpose-built” feels like a euphemism for “we’re going to offer you everything,” so much so, in fact, that figuring out which model to use may start to seem like a coin toss rather than a clear decision.

Again, Microsoft won big in networking, operating systems, and developer tools by offering opinionated, easy-to-use options for mainstream IT administrators, developers, etc. These never appealed to the alpha geeks, but guess what? The real money isn’t in appeasing the alpha geeks’ appetite for arcane options of infinite configurability. The real money is in providing easy options for people who may like technology but care even more about getting home in time for their kids’ games, bowling night, or whatever.

Keep in mind that it’s the alphas talking on X (or more likely Mastodon or Bluesky) about the latest and greatest models. For them, offering a wealth of AI primitives is a great idea. But for mainstream developers and data scientists, there’s a real opportunity for the cloud vendors that stop pushing AI application development onto their would-be customers. When they do, AI will stop being a market rich in hope but poor in widespread adoption.
https://www.infoworld.com/article/3631811/someone-needs-to-make-ai-easy.html
