According to Wccftech, Apple is planning to use a customized version of Google’s Gemini AI model with 1.2 trillion parameters to power its revamped Siri voice assistant. The tech giant will pay Google around $1 billion per year for this AI partnership; the Gemini model dwarfs Apple’s current 1.5-billion-parameter model used for Siri. Apple tested both OpenAI’s ChatGPT and Anthropic’s Claude before settling on Google’s technology for handling complex user requests through its Private Cloud Compute framework. The revamped Siri is expected to launch with iOS 26.4 and will bring new capabilities including in-app actions, personal context awareness, and on-screen awareness. This deal represents the latest transaction between the two companies, building on Google’s existing $20 billion annual payments to Apple for default search engine placement.
The billion-dollar band-aid
Here’s the thing about this deal – it’s essentially Apple admitting they’re way behind in the AI race and need a temporary fix. A 1.2-trillion-parameter model versus their current 1.5-billion-parameter one? That’s like going from a bicycle to a spaceship. But Apple’s not just throwing money at the problem blindly – they’re being strategic about what gets outsourced.
The customized Gemini model will handle Siri’s query planner and summarizer functions, while Apple keeps the Knowledge search system running on its own on-device LLMs. That’s smart positioning. It lets them leverage Google’s massive AI infrastructure for the heavy lifting while maintaining control over the core user experience. And crucially, this arrangement won’t put Google’s search AI at the “apex position” within Apple’s ecosystem.
Who wins here?
So who actually benefits from this $1 billion annual deal? Both companies, honestly. Google gets another massive revenue stream and validation of its AI technology. Apple gets to launch a competitive Siri without waiting years to develop comparable technology in-house. But let’s be real – the biggest winner might be users who’ve been frustrated with Siri’s limitations.
The timing is everything. With iOS 26.4 expected to bring the Siri overhaul, Apple needed something that works now, not years down the road. They’ve even given the project an internal codename – Glenwood – and put Vision Pro creator Mike Rockwell and software chief Craig Federighi in charge. That tells you how seriously they’re taking this.
The bigger picture
Now, here’s what’s really interesting – Apple isn’t planning to rely on Google forever. They’re still working on their own in-house solution. This Gemini partnership is basically a very expensive crutch while they build their own AI legs. But it raises questions about how long that will take and whether Apple can ever truly catch up to Google and OpenAI in the AI arms race.
Meanwhile, this deal is just the latest chapter in the complicated Apple-Google relationship. As recent reports show, Google already pays Apple $20 billion annually for search default status. Adding another billion for AI services? That’s pocket change in comparison, but it does make Apple increasingly dependent on its longtime rival.
Basically, Apple’s playing the long game here. They’re using Google’s technology as a stopgap while they develop their own AI capabilities. It’s expensive, it’s temporary, but it might just be what they need to stay relevant in the AI-powered future. The question is whether users will notice the difference – or care which company’s AI is actually powering their interactions with Siri.
