Navigating the AI frontier

Friday, August 23, 2024, 11:00, by InfoWorld
The cloud computing boom brought innovation, but it also showed the need for strategic planning to avoid costly mistakes such as “shadow IT” (IT activities or purchases made without the knowledge of the IT department), which led to unexpected expenses and security issues. With generative AI on the rise, IT leaders can apply those lessons to avoid a similar pitfall: “shadow AI,” the unchecked use of AI without careful planning.

Generative AI promises significant economic impact, but it demands careful strategic planning. Public clouds shouldn’t be the default option for AI; you must also consider the value of on-premises deployments. Most cloud fans see this as blasphemy, but it will likely save many enterprises millions of dollars over the next few years. Studies show that running AI on premises can be more cost-effective than running it in the cloud. An on-premises approach prioritizes data safety, sovereignty, and proper management, sidestepping potential issues such as data gravity and costly reconfigurations.

Why cloud isn’t always the answer for AI

The cloud computing revolution heralded a new era of innovation, offering unparalleled access to computing resources and enabling digital transformation on a massive scale. However, rapid adoption and hasty implementation also resulted in escalating costs, security vulnerabilities, and governance challenges, all hallmarks of shadow IT.

Just as enthusiasts once rushed to the cloud for immediate access and flexibility, there is now a similar drive to adopt AI technologies. McKinsey estimates that generative AI could add between $2.6 trillion and $4.4 trillion to the global economy annually. However, unchecked enthusiasm can lead to high costs and strategic missteps, particularly when companies rely heavily on public cloud services without monitoring cloud costs or ensuring that those expenses align with business goals.

One of the primary lessons from the cloud era is the importance of strategizing for cost optimization from the outset. A study by the Enterprise Strategy Group found that hosting an open source large language model (LLM) with retrieval-augmented generation (RAG) on premises was 38% to 75% more cost-effective than comparable services in the public cloud. API-based approaches were even less efficient. This highlights the importance of scrutinizing AI deployment options, favoring on-premises or hybrid solutions to manage costs better and maintain control over the data.
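
To make that kind of comparison concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (workload, hardware price, amortization period, API price) is a placeholder assumption for illustration only; it is not data from the Enterprise Strategy Group study, and real costs depend on utilization, model size, and operational overhead.

# Back-of-the-envelope comparison of on-premises LLM serving vs. a
# per-token API. All numbers below are illustrative placeholders,
# not real prices and not results from the ESG study.

MONTHLY_TOKENS = 500_000_000          # assumed workload: tokens processed per month

# On-premises: amortize hardware over 36 months, then add power and operations.
GPU_SERVER_COST = 250_000             # hypothetical server purchase price (USD)
AMORTIZATION_MONTHS = 36
MONTHLY_POWER_AND_OPS = 3_000         # hypothetical power, cooling, staff share (USD)

on_prem_monthly = GPU_SERVER_COST / AMORTIZATION_MONTHS + MONTHLY_POWER_AND_OPS

# API-based: pay per million tokens processed.
API_PRICE_PER_MILLION_TOKENS = 30.0   # hypothetical blended price (USD)

api_monthly = MONTHLY_TOKENS / 1_000_000 * API_PRICE_PER_MILLION_TOKENS

savings = 1 - on_prem_monthly / api_monthly
print(f"On-premises: ${on_prem_monthly:,.0f}/month")
print(f"API-based:   ${api_monthly:,.0f}/month")
print(f"On-premises is {savings:.0%} cheaper under these assumptions")

The value of the exercise is the structure, not the numbers: once an organization plugs in its own workload and pricing, the break-even point between on-premises, hybrid, and API-based deployments becomes explicit.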

The second critical takeaway is to bring AI capabilities to where the data resides. Gartner predicts that by 2025, 75% of enterprise-generated data will be created and processed outside traditional, centralized data centers. This shift requires organizations to rethink their data management strategies to ensure data safety, sovereignty, and compliance.

An on-prem-first approach maintains these priorities while leveraging existing infrastructure. It allows businesses to strategically utilize public cloud resources when it makes sense, avoiding the need for significant re-engineering later due to challenges such as data gravity. Furthermore, continuous evaluation and adaptation are crucial as genAI evolves. The technology landscape is ever-changing, and strategies that work today may not be effective tomorrow. This requires built-in adaptability, enabling organizations to pivot as technologies and business needs evolve.

By adopting these measures, IT organizations can deploy genAI in a way that fosters innovation and ensures sustainability. That means serving the business rather than chasing current trends. It means not shopping for solutions at cloud computing conferences but working from your own requirements to the solution, considering costs all along the way.

A plan to reach your destination

The era of genAI offers a chance to blend technological advancements with strategic foresight, ensuring that AI serves as a cornerstone for innovation rather than a costly experiment. Prioritizing cost efficiencies, data management, and adaptive strategies will be essential to realizing the full potential of AI technologies in a world increasingly driven by intelligent insights.

First, companies should assess their existing infrastructure and define clear AI objectives, aligning them with business goals to guide investments and technology choices. Embracing an on-premises approach can be more cost-effective, utilizing existing infrastructure to maintain control over data while strategically leveraging cloud resources where necessary for agility.

Data sovereignty and security are paramount; enterprises need robust governance to block unauthorized AI deployments, ensure compliance, and head off shadow AI. Staying on top of market trends and collaborating with AI experts will help enterprises adapt to technological shifts and refine strategies as needed. Additionally, investing in in-house AI expertise and fostering a culture of innovation will empower teams to manage and optimize AI initiatives. By implementing success metrics and adopting an iterative approach (see the sketch below), businesses can continuously evaluate AI performance and make necessary adjustments.
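
One way to operationalize the success-metrics point is a small tracker that compares observed per-query latency and cost against agreed targets. The sketch below is illustrative only; the metric names, targets, and sample values are assumptions, not an established framework.

from dataclasses import dataclass, field
from statistics import mean

@dataclass
class AIMetricsTracker:
    """Compares observed per-query measurements against targets (illustrative only)."""
    latency_target_s: float = 2.0        # hypothetical service-level target
    cost_target_usd: float = 0.01        # hypothetical cost-per-query target
    latencies: list = field(default_factory=list)
    costs: list = field(default_factory=list)

    def record(self, latency_s: float, cost_usd: float) -> None:
        self.latencies.append(latency_s)
        self.costs.append(cost_usd)

    def report(self) -> dict:
        # Compare observed averages with targets so the team can decide
        # whether to adjust the deployment in the next iteration.
        return {
            "avg_latency_s": mean(self.latencies),
            "latency_ok": mean(self.latencies) <= self.latency_target_s,
            "avg_cost_usd": mean(self.costs),
            "cost_ok": mean(self.costs) <= self.cost_target_usd,
        }

tracker = AIMetricsTracker()
tracker.record(latency_s=1.4, cost_usd=0.008)   # sample values, not real measurements
tracker.record(latency_s=2.6, cost_usd=0.012)
print(tracker.report())

A report like this can feed the iterative loop described above: if cost per query drifts above target, that is a signal to revisit the deployment model rather than waiting for the bill at the end of the quarter.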

This strategic preparation enables enterprises to leverage AI for sustainable innovation and avoid costly setbacks. The lessons from the cloud computing era offer a vital road map to successfully navigating the complexities of the AI-driven future.
https://www.infoworld.com/article/3491416/navigating-the-ai-frontier.html
