Apple, Private Cloud Compute, and trusted AI
Wednesday, November 5, 2025, 15:23, by ComputerWorld
The growing desire for sovereign cloud is evolving into a need for sovereign AI. Companies and individuals want the benefits of artificial intelligence, but don't want to risk their data by sharing it with third-party firms that lack clear security and privacy mandates. Many users want, or need, to keep their data within national boundaries.
All of these desires are an Apple opportunity.

Apple in the middle

Here's how that opportunity could work; indeed, to some extent, it's already happening. Private Cloud Compute (PCC) is Apple's private system for delivering Apple Intelligence services from the cloud. The idea is that tasks its devices can't yet handle at the edge can be handed off to Apple's servers for processing. PCC is built to operate with a high degree of privacy, and to ensure it keeps that promise, Apple has opened the system up to unprecedented scrutiny. Requests made of the service are cryptographically secured so that Apple doesn't know the question, doesn't know the answer, and doesn't know who made the query in the first place. It also doesn't retain the question. That puts it far ahead of many cloud-based AI firms.

Data controllers

What this means is that AI services provided by Apple or via PCC are about as secure as they can be. While that doesn't entirely resolve the need for territorial protection of data, it goes a long way toward ensuring corporate information is well protected. In time, as PCC servers roll off Apple's US production line and get racked up in its server farms worldwide, Apple will be able to provide access to these services on a more localized basis: Apple Intelligence for Europe, or for Japan, for example. The solutions aren't precisely data controllers, as they don't collect any data.

What the system doesn't yet do is act as an intermediary. Think of it like this: you want to make a request of a third-party generative AI service (it doesn't matter which one). You file your request, which is sent to the PCC system, anonymized, and then despatched to the third-party system for additional processing. That arrangement would still leave some things exposed, such as any documents or images you attach, but your identity and the nature of your request would remain obfuscated. While this isn't quite sovereign AI, it comes much closer to it.
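To make the idea concrete, here is a minimal sketch, in Swift, of the relay pattern described above: the device seals a request end to end for a compute node, using a fresh ephemeral key so nothing links one query to the next, and an intermediary only ever handles the resulting ciphertext. The type and function names are illustrative assumptions, not Apple's actual PCC or Apple Intelligence APIs, and real PCC also involves hardware attestation and other machinery this sketch omits.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: these names are hypothetical and do not
// reflect Apple's real PCC interfaces.
struct AnonymizedRequest {
    let ephemeralPublicKey: Data  // lets the compute node derive the shared secret
    let ciphertext: Data          // sealed prompt; a relay sees only this blob
}

// Seal a prompt for a compute node whose Curve25519 public key we trust.
// (In real PCC, that trust would come from attestation, omitted here.)
func seal(prompt: String,
          for nodeKey: Curve25519.KeyAgreement.PublicKey) throws -> AnonymizedRequest {
    // Fresh key pair per request: no long-lived identifier ties queries together.
    let ephemeral = Curve25519.KeyAgreement.PrivateKey()
    let secret = try ephemeral.sharedSecretFromKeyAgreement(with: nodeKey)
    let key = secret.hkdfDerivedSymmetricKey(using: SHA256.self,
                                             salt: Data(),
                                             sharedInfo: Data("pcc-sketch".utf8),
                                             outputByteCount: 32)
    let sealed = try ChaChaPoly.seal(Data(prompt.utf8), using: key)
    return AnonymizedRequest(ephemeralPublicKey: ephemeral.publicKey.rawRepresentation,
                             ciphertext: sealed.combined)
}
```

In the hypothetical intermediary scenario sketched in the article, a relay would forward that opaque blob to a third-party model. Anything placed inside the payload, such as attached documents, would still be visible to whoever ultimately decrypts it, which is why those remain exposed even while the sender's identity and the shape of the request do not.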
Dumb pipes with smart machinery

Of course, AI firms are going to resist becoming service providers to Apple. They will recognize that the very data Apple's systems protect is the data they want to devour to feed their own large language models (LLMs). Perhaps this is why Cupertino's rumored arrangement with Google calls for the latter's Gemini AI to run natively on Apple's own servers. That may deliver the kind of privacy protection people are beginning to demand.

Once again, as PCC servers are installed internationally, it might become possible for Apple to offer access to those services on a regional basis, enabling enterprise users to handle geographically constrained or sensitive data securely with Apple's AI tools. When data does have to be shared externally, Apple's existing system gives users the chance to approve or disallow that task. It injects trust and control into the relationship. If the PCC system were to become an intermediary to third-party AI services, people would be more likely to access those services through Apple's systems than through anything else.

Open markets

Will AI service providers like this? Probably not. They might argue that giving customers the option of trusted access to their services is anticompetitive. But Apple could argue that depriving customers of access to those services within this trust boundary is itself anticompetitive. It is, after all, quite clear that access to trusted AI is something people need, and opening markets is meant to ensure competitors are able to deliver what consumers want. That means third-party AI services must open up so that others can access them in innovative ways, such as via PCC. Markets are either open, or they aren't.

It seems inevitable that pure-play AI companies will become service providers rather than anything else. Apple, as a combined hardware, software, and services company, is therefore well placed to become the most trusted intermediary through which to access all of these services, thanks to PCC. Doing so would support what appears to be its AI game plan: provide its own suite of highly useful AI tools while enabling customers to access whatever other services they need, without sacrificing the privacy and security so important to the Apple experience.

As the cards fall into place

To reiterate: the company's growing presence in enterprise IT means Apple has an opportunity to become the go-to platform for trusted AI, potentially evolving into a provider of trusted, sovereign AI. That's all thanks to smart use of Private Cloud Compute as an adjunct to its proliferating hardware ecosystem.

You can follow me on social media! Join me on BlueSky, LinkedIn, and Mastodon.
https://www.computerworld.com/article/4084959/apple-private-cloud-compute-and-trusted-ai.html