There’s an internal app at Apple’s Cupertino headquarters that looks just like the chatbots you know: multiple conversations, chat history, longer back-and-forth. It remembers context and references past chats. It sounds exactly like what people have been begging Apple for.
But this thing is strictly for testing the new Siri. Engineers are using it to see how far Siri can stretch with large language models.
That means better context, smoother multitasking across apps, and deeper access to your personal data. Basically, the stuff Siri should have done a decade ago.
Why keep it locked up? Because Apple doesn’t ship science experiments. ChatGPT, Gemini, Claude — they’re all impressive and all broken. They spit out confident nonsense every day.
Apple knows if Siri does that, it’s dead. So the company is hiding the messy parts until it’s sure the new version won’t embarrass itself.
The catch is that the new Siri isn’t coming until 2026. Apple wanted it in iOS 18. Didn’t happen. Scrapped the plan. Rebuilt it on a new LLM architecture.
The target is now iOS 26.4, which is a polite way of saying “maybe March, maybe later.” Until then, you get the same Siri you’ve been ignoring.
There’s also a redesign coming. A “humanoid” look, apparently. Which feels like Apple trying very hard to make you forget that the original Siri was basically a microphone icon that couldn’t reliably set a timer.
Still, here’s the real point: Apple’s secret chatbot looks like ChatGPT because Apple needs Siri to be at least that good.
But Apple doesn’t actually want a chatbot. It wants Siri to feel invisible: something you command, not someone you debate.
If the company can pull that off while stealing the best parts of the chatbot craze, then Siri might finally matter again.
And if not? Well, at least the engineers at Apple got to play with the coolest ChatGPT clone you’ll never touch.
Would you trust a ChatGPT-like Siri with your personal data, or is privacy too big a risk? Let us know your thoughts.