Apple spent years beating a single drum: your health data is untouchable. Your steps, your sleep, your ECG readings all stay on your device, locked behind a wall of trust that Apple built brick by brick.
But OpenAI just showed up with a sledgehammer. With the launch of ChatGPT Health, OpenAI is making a massive bet that you’ll value convenience over that hard-won privacy.
And for anyone paying attention, this launch didn’t come out of nowhere; earlier signs suggested OpenAI was already laying the groundwork for deeper Apple Health access.
They’re inviting you to plug in everything: Apple Health, MyFitnessPal, your actual medical records, all so a chatbot can tell you why you’re tired or how to fix your diet.
On paper, it’s a dream assistant. In reality, it undercuts the entire “walled garden” philosophy Apple fans have lived by for a decade.
Apple users are notoriously skeptical for a reason. We’ve been trained to ask: Who sees this? Where is it stored? Is it encrypted? OpenAI says the right things: the data is “sandboxed” and won’t be used to train their models by default.
But let’s be real: this isn’t end-to-end encryption. If a court order comes knocking, that data is accessible. More importantly, HIPAA doesn’t apply here; a consumer chatbot isn’t a covered healthcare entity.
This isn’t some locked-down medical record. Once you upload it, you’re basically just putting your private health history on someone else’s server and hoping for the best.
For anyone used to Apple’s “local-first” storage, moving that data to OpenAI feels like a massive step backward in control.
There’s also the “hallucination” problem. Apple is famously conservative; they’ll show you a graph of your heart rate, but they won’t tell you what it means because they don’t want the liability.
OpenAI is taking the opposite approach. They want to be the interpreter. But we’ve already seen ChatGPT confidently misread data and give questionable medical “advice.”
For a product that sometimes still struggles with basic math, asking it to analyze your bloodwork feels risky.
At the end of the day, Apple wants to own the hardware, but OpenAI wants to own the conclusions.
Consent is one thing, but true control is another. Apple stayed away from giving medical advice because they knew the stakes were too high.
OpenAI is diving in headfirst, betting that we’re all tired enough of “data silos” that we’ll hand over the keys to our medical history for the sake of a personalized workout plan.
It’s a classic trade: control for convenience. And if history tells us anything about tech “land grabs,” the risks aren’t just hypothetical; they’re inevitable.