Apple’s Next-Gen Siri May Rely on Google Servers: What It Means for Apple’s AI Strategy
- Editorial Team

- Mar 3
- 5 min read

Reports that Apple's next-generation Siri may run on Google's infrastructure underscore both the company's AI ambitions and its struggle to realize them.
Apple has long promised to turn Siri into a truly capable assistant, but it has repeatedly fallen short. Now, reports say the company is exploring Google's server infrastructure to power the backend of its next-generation, AI-driven assistant. That would be a significant shift for Apple, which has long prided itself on owning and controlling nearly every layer of its technology stack.
According to The Verge, Apple has asked Google to explore building servers that could help power a more advanced, Gemini-based version of Siri while meeting Apple's strict privacy requirements. Apple had previously said the new Siri would run on Apple devices and its own "Private Cloud Compute," but the latest reporting suggests Apple may lean on Google's infrastructure more heavily than previously indicated.
Why Would Apple Need Google's Help?
AI capability has become a major battleground among the world's biggest tech companies in recent years. Google, Meta, OpenAI, Microsoft, and others have poured billions into custom AI chips, cloud infrastructure, and large language models. Apple initially said its own "Apple Intelligence" features, including a smarter Siri, would perform AI tasks on device whenever possible, falling back to the company's own cloud only when necessary.
But Apple's infrastructure buildout has not kept pace with demand. Apple's Private Cloud Compute (PCC), designed to run cloud-based AI workloads while keeping data private and encrypted, is reportedly underutilized: industry sources say only about 10% of its capacity is in regular use. That could mean the technology has yet to see wide deployment, or that apps simply haven't adopted it.
Meanwhile, competitors like Google and Microsoft have been rapidly expanding their AI server fleets to support language models that can handle increasingly complex requests. Tasks a modern AI assistant is expected to perform, such as answering context-heavy questions, summarizing user content, or planning multi-step actions, demand substantial processing power, and Apple's own data centers may not yet be ready for that kind of load.
An Unexpected Partner: What This Means for Siri and Apple
Word of a possible Apple-Google partnership first surfaced in early 2026, with reports that a new generation of "Apple Foundation Models" would be built on Google's Gemini technology. Gemini is the family of large language models that Google has positioned as the foundation of many of its AI services, from search summaries to task automation. These models are expected to deliver the "personalized" Siri features Apple promised at its 2024 Worldwide Developers Conference, which have yet to ship.
Running these more advanced AI functions on Google's servers, or on servers Google helps build for Apple, would mark a major strategic shift. Apple has long framed its AI work as privacy-first, with on-device processing as a cornerstone of the experience. But compute-intensive AI tasks, such as those requiring deep contextual understanding and generated responses, often demand far more power than a phone can supply in real time.
Tapping Google's infrastructure would not necessarily mean handing user data to a third party. Even if Google or one of its partners owns the raw compute hardware, Apple could still encrypt user inputs and process them in ways that satisfy its privacy standards. Still, the optics are awkward: Apple has spent years arguing that its AI is different because it is private by default, and relying on a competitor for the backend of one of its most important AI products muddies that message.
Delays, Expectations, and Angry Customers
Siri's struggles are well documented. Once a highlight of Apple's ecosystem, the assistant has fallen behind Google Assistant and Amazon Alexa in both capability and accuracy. Apple's effort to revitalize Siri with generative AI was announced with great fanfare but has been delayed repeatedly. The upgraded Siri was widely expected to arrive with iOS 26.4, yet the first beta shipped without it, suggesting Apple is still working through bugs and performance issues.
The delays have had consequences beyond Apple's own roadmap. In 2025, Apple faced a class-action lawsuit alleging it falsely advertised Siri's new AI features, which were not actually available when the devices launched, and that customers were misled into buying devices based on features that did not yet exist.
Meanwhile, rivals keep moving forward. Google and Samsung have recently shipped Gemini-powered AI features on devices already on the market, handling tasks Siri cannot yet manage. That raises the pressure on Apple to get its own AI roadmap back on track.
A Change in Strategy, Not a Simple Surrender
Some industry observers and analysts have framed Apple's potential server partnership with Google as an admission that Apple has lost the AI race. That reading may be too simple. Apple has long partnered with competitors on backend technologies such as cloud hosting and networking infrastructure while still differentiating the user experience. What matters most is how Apple handles the privacy and security questions any such deal raises, ensuring users' private information isn't shared or processed in ways that violate Apple's own rules.
It's also worth noting that Apple is reportedly spending heavily, possibly around $1 billion a year, to license and build Gemini-based AI models that it can run on servers it controls. In other words, Apple isn't simply outsourcing everything; it is trying to blend outside technology with its own ecosystem while preserving privacy and performance.
What's Next for Siri and Apple's AI
Ultimately, Apple's decision to lean on Google's infrastructure, if it happens, illustrates that modern AI demands computing power at a scale even the largest companies struggle to build from scratch. It underscores how fierce and costly the AI arms race has become.
For users, the hope is a smarter, more useful Siri that can summarize content, understand context across apps, and offer real help rather than generic answers. For Apple, the challenge is balancing innovation against its long-standing commitment to user privacy, a balance that grows harder as AI tasks become more complex and increasingly require data and compute to leave a user's device.
One thing is clear: Apple's AI strategy isn't set in stone. It is evolving quickly, and its relationships with rivals like Google are becoming a new front in the broader tech race.