As ChatGPT Health Launches, Promising to Make Healthcare Simple - Is It Safe, Or Is AI Already Biased Against Women?
ChatGPT Health is here to analyse your medical data - but can AI really replace your GP?
Somewhere between Googling that pain at 2 am and tracking your steps, your AI doctor’s office quietly levelled up - and fast. This January, OpenAI rolled out ChatGPT Health in the US: a dedicated corner of the platform where users can upload medical records, sync apps like Apple Health, Oura, and MyFitnessPal, and receive advice tailored to them, minus the jarring hold music. Meanwhile, from across the pond, we're watching, like having front-row seats to the future of healthcare - popcorn optional.
Built with input from more than 260 doctors across 60 countries, the tool hasn’t exactly skipped medical school. Europe, however, is keeping its distance… for now.
AI has already become the world’s busiest waiting room. One in four of ChatGPT’s 800 million regular users asks a health question each week; more than 40 million people a day treat it as their first port of call - something that once felt experimental, now looks instinctual.
The figures are striking: in 2025 alone, Brits ran nearly 50 million health-related Google searches, while almost two in three admit to using AI to self-diagnose (a pastime I've certainly adopted recently). None of this is new behaviour; it may simply reflect GP wait times stretching into weeks.
This is where I hesitate: AI reflects the data it’s trained on. As Zehra Chatoo, founder of Code for Good Now and former Meta strategy lead, warns, “The biggest mistakes AI makes? Who it doesn't see.” When women’s health has historically been underrepresented in data, blind spots aren’t accidental - they raise serious questions about how reliable these systems will be for women in the long run.
Before we get personal and share our lab results, we must ask the tougher questions. How private is your information? Could AI ever replace a GP? And what happens when technology itself carries inherited bias? For OpenAI, health is big business. For patients? It's more complicated. So, buckle in - we asked the experts what AI in healthcare can do, plus what they fear could go wrong. Keen to read more about how AI could boost your wellbeing? Don't miss our deep dive on AI run coaching apps.
As ChatGPT Health launches in the US, we ask: is it already biased against women?
What is ChatGPT Health?
Boiled down, ChatGPT Health is essentially a hub for your medical records - think lab results, clinical history and visit summaries. Feed it your information, and when you ask a question, the answers are “grounded in the information you’ve provided,” says the OpenAI team.
The company is clear that this isn’t about replacing your GP. Instead, the goal is to help you make sense of your health data, track and spot patterns over time, and feel more prepared ahead of an appointment - not to diagnose or prescribe. That foundation has shaped how the tool behaves; “from when it nudges users to seek medical follow-up to how it balances clarity with care in more sensitive moments.”
To do that, ChatGPT Health can now integrate with health tools that are already staples in our everyday lives. Link Apple Health to enable metrics such as sleep, steps, and activity. Sync with Function to access detailed blood test marker data. Connect MyFitnessPal, and it can pull in nutrition data, recipes, dietary patterns - all designed to give a deeper snapshot of your health, without positioning AI as your GP.
@openai Introducing ChatGPT Health, a dedicated space for conversations to help you feel more informed, prepared, and confident navigating your health. ChatGPT Health allows you to securely connect electronic medical records and wellness apps like Apple Health, MyFitnessPal, and Peloton so ChatGPT can help explain test results, prepare for doctor visits, advise on diet and workouts, and compare insurance plans. Created in close collaboration with physicians, ChatGPT Health is designed to help you navigate medical care, not replace it. To keep your information private and secure, your health chats, files, and memories are kept in a separate dedicated space. Health conversation info never flows into your regular chats and is not used for model training. You can view or delete Health memories in Health or Settings > Personalization. We’re rolling out to a small group of users first so we can learn and improve the experience, and plan to expand to everyone on web & iOS soon. Join the waitlist for early access: https://chatgpt.com/health/waitlist
Is it Safe To Upload Our Medical Data?
Letting AI rummage through your medical history is, understandably, an eye-watering prospect. To tackle that very obvious concern, in the US, OpenAI has partnered with b.well, a secure data connectivity platform that lets users link their records directly to the feature. It’s also guarded behind extra privacy guardrails, with its own separate history, so nothing spills from chat to chat - because the last thing any of us needs is a hormone deep-dive gatecrashing Monday morning’s brainstorm.
For added reassurance, Health chats aren’t used for training, and if you change your mind, you can delete them within 30 days.
Even with these safeguards in place, experts warn caution. "AI can be a helpful guide, but patients should be careful about what they share. Uploading personal data doesn't replace professional assessment, and over-reliance on generalised AI advice can lead to confusion or delays in care," says Dr Kasim Usmani, a GP.
Zehra adds, “AI has enormous potential to make health information more accessible - but we have to engineer trust. It can’t be assumed.” For AI healthcare to feel as safe as talking to your local GP, we need clear rules, genuine consent, and strong standards. “Until these are in place, it’s wise to share highly sensitive health details online with the same caution you would any other deeply personal information. AI should empower people, but only if it’s built on transparency, consent and care.”
If you’re a sceptic like me, that little nagging thought is creeping in right about now - where, exactly, does it all vanish to?
The Hidden Risks: How AI Can Misread Women’s Health
Those blind spots we briefly mentioned? For women's health, they're not minor misses - they can be risky. According to Zehra, "Women are adopting AI at 20 to 22% lower rates than men. That matters because AI is technology shaped by its users. Fewer participants mean less representation by default. I often say the biggest mistakes in AI are rooted in who it doesn't see, who it doesn't count."
The long and short of it? AI can only work with the data it's been fed. For women, whose symptoms often present differently and who are underrepresented in datasets, that means misreads - or even misdiagnoses - are a real risk.
Dr Saia Ghafur, Lead for Digital Health at Imperial College London, notes that the bias could go beyond gender: "AI doesn't just inherit these gaps, it inherits everything the evidence base has overlooked, whether that is ethnicity, socioeconomic status or geography. These systems can't magically correct systemic inequities; they're trained on them."
Take dermatology. “An AI-powered mole checker trained predominantly on lighter skin tones may simply be less sensitive when assessing conditions on darker skin.” The technology isn’t sinister; it’s limited by the data it was fed.
The lesson is loud: AI grows fast, and so do its blind spots - and the fallout. Zehra's advice? Don't hand all your health data over to the machine unthinkingly. Be intentional. Until AI catches up, women's health deserves more than algorithmic guesses; it deserves attention, awareness, and a healthy dose of scepticism.
The Limits of AI Diagnosis: Could It Ever Take Over Your GP?
Let's just say, we're not there yet. While AI can scan symptoms in seconds, it can’t physically see you. “Clinical medicine relies on direct assessment - including examination and observation - which AI cannot perform,” says Dr Kasim. A chatbot cannot capture tone, body language, or the subtle clinical cues that shape a diagnosis.
Then there’s the question of context; “AI outputs are based on generalised data that may not reflect the patient's specific demographic or clinical situation,” he explains. It works on averages; your GP works on you. And because many symptoms are non-specific, he warns that AI can sometimes offer an overly broad list of possibilities - occasionally heightening anxiety rather than easing it.
That said, Dr Kasim wants to stress that there are not only downsides. “AI can improve patient education, encourage appropriate self-care, and even give people a private space to explore symptoms they might feel embarrassed to raise.” Used well, it can make consultations more productive.
So, where does that leave us? AI can be useful as a resource, but it should never be viewed as a substitute. Helpful guide, yes. Replacement GP? Not quite.
The Best Ways to Use ChatGPT Health
If, or when, ChatGPT Health ventures into UK territory, the message from our experts is clear: use it, but don’t hand over the reins.
As Zehra reminds us, AI is predictive. So the best way to ensure it doesn't outsmart you is to co-create with it. "Use AI to sense-check, to fill small gaps, to spark questions - but don't surrender your judgement. Ask for the sources, where the information is coming from. Put the same question to multiple AIs - Gemini, Claude, ChatGPT - and compare answers. Be sceptical, not cynical."
Dr Ghafur raises a sizeable caution flag: "Don't use AI tools as a substitute for seeking care when you're genuinely concerned. Some tools can provide false reassurance, leading patients to delay seeking the appropriate consultation. Others escalate minor concerns, sending people to emergency services unnecessarily."
So, dearest readers, we have our conclusion. You have agency, and AI has predictions. It can support, educate and empower. Absolutely. But it works best as a well-regulated, purpose-driven assistant, not an all-knowing authority we unquestioningly give power to. Use it thoughtfully. Then talk to a human.
Shop MC UK approved health tools now:
Hydration is one of the simplest ways to stay well - and lululemon's Back to Life Sports bottle promises to make it easy. Acting as a visual cue, it helps you hit your daily hydration goals; drinking enough water each day can boost energy, digestion and overall wellbeing.
Last but by no means least, if you are on the fence about using AI for your healthcare data, noting down your symptoms in a journal can be an easy way to keep track of what's going on with you. We're fans of these Papier journals - they're simply organised, but super effective.

Ellie-Mae is a freelance journalist specialising in women's health, with bylines in Vogue, Dazed, The Guardian, and The Evening Standard. A proud advocate for endometriosis and adenomyosis awareness, she's making it her mission to turn whispered women's health stories into bold, open conversations. Outside of work, you'll find her hiking in the hills with her pomeranian (because yes, poms can hike too), digging into the latest women's health trends, or hunting down the best sauna in town.