The cost of abandoned genAI projects? Garbage code, orphan apps, and security issues
Monday, December 1, 2025, 13:38, by ComputerWorld
Gunslinging IT leaders with high generative AI (genAI) experiment failure rates are creating high-tech junk that will cost money to maintain after projects are abandoned.
Recent surveys indicate that failed genAI efforts will leave in their wake a lot of garbage code, abandoned apps, and security issues, all of which might not be visible to IT leaders. By 2030, 50% of enterprises are expected to be dealing with delayed AI deployments or higher maintenance costs stemming from delayed or abandoned projects, Gartner said in a survey released last month. “The punitively high cost of maintaining, fixing or replacing AI-generated artifacts such as code, content and design can erode genAI’s promised return on investments,” Arun Chandrasekaran, a distinguished vice president analyst at Gartner, said in a statement.

GenAI is developing fast, with new features seemingly arriving every few weeks or months, making it hard for IT leaders to keep up. As a result, poorly architected AI upgrades could create what the industry calls “technical debt,” in which the cost of maintenance rises over time. Short-term fixes could leave tools and code with limited reuse value, increasing maintenance costs. Quick fixes that bolt AI tools, which operate differently, onto legacy enterprise systems could also create technical debt, venture capitalists recently said.

Studies from Omdia, McKinsey, MIT, and Forrester put projected failure rates for genAI projects as high as 95%. And while the new tools could reduce costs and increase productivity, they can also pile on new technical debt, HFS Research said in a study last month conducted in conjunction with software firm Unqork. About 43% of the participants surveyed by HFS expect AI to create new technical debt, while more than 80% expect cost reductions and productivity gains. In the long run, 55% expect AI to reduce total tech debt; 45% expect it to increase. AI will “accelerate” tech debt in brittle and code-heavy architectures, HFS Research CEO Phil Fersht said, adding that enterprises should “re-engineer their foundations, productize integration, embed governance.”

For many IT leaders, genAI’s value lies more in business transformation than in the underlying technology, panelists said during a fireside chat at this year’s Microsoft Ignite conference. At telecom firm Lumen, IT decision-makers first identify the business problem they want to solve, then figure out the AI tools and technologies needed to achieve that outcome, Sean Alexander, senior vice president of connected ecosystem at Lumen, said at Ignite. After that, the company measures outcomes.

At Pfizer, executives have settled on an approach of first trusting AI, then bringing business functions into it, Tim Holt, vice president of consumer technology and engineering at the company, said during the panel discussion. “I think that’s where we’re definitely heading — having everyone be thinking about like, ‘I’ve solved this process, I’ve been following exactly the way it exists today,” he said. “Now let’s blow it up and reimagine it….’ And that’s exciting.”

Leaders with limited technical knowledge are driving AI strategies and focusing on the outcomes side of the equation. For BASF Agricultural Solution, AI is a fundamental pillar of future business strategy, said Mona Riemenschneider, head of global online communications at the company. “It is part of … how can we create value by using AI technologies,” she said.

Even so, IT leaders will need to keep close tabs on implementations and consider the architectural stability of AI systems, Gartner said.
Otherwise, blind spots in areas such as security and compliance could come back to bite IT leaders. Security issues could include rogue apps and data leaks from poorly maintained AI projects. By 2030, about 40% of enterprises will “experience security or compliance incidents linked to unauthorized shadow AI,” Gartner said. Security vulnerabilities were also a concern for 59% of the participants surveyed by HFS, followed by legacy integration at 50%.

Gartner recommended that IT leaders authorize AI tools and set usage guidelines for them, avoid vendor lock-in, and ensure interoperability. (Vendors such as Nvidia maintain proprietary software stacks that force companies to use their GPUs, but many enterprises are turning to open standards to implement AI.)
https://www.computerworld.com/article/4096138/the-cost-of-abandoned-genai-projects-garbage-code-orph...








