Society's Hidden Biases: Medical Misogyny and AI Illusions

British society confronts medical misogyny in PCOS diagnosis, AI chatbot health advice gone wrong, and the rise of fake influencers worth millions.

Photo by Vitaly Gariev on Unsplash

Editorial digest, April 19, 2026
Last updated: 08:17

Three stories. One uncomfortable thread. British society keeps outsourcing judgement — to overworked GPs, to chatbots, to influencers who don't exist — and the people paying the price are, predictably, the ones who were already being ignored.

Why does PCOS still take a decade to diagnose?

Lynda Wilkes-Green spent more than ten years in what she describes to the Independent as debilitating pain before anyone put a name to it. Polycystic ovary syndrome. A condition affecting, by most estimates, roughly one in ten women of reproductive age. Not rare. Not obscure. Just dismissed.

Her testimony lands as Health Secretary Wes Streeting pledges to confront what he calls "medical misogyny" in the NHS — the reflex that translates a woman's agony into an overreaction, her symptoms into anxiety, her insistence into neurosis. The rhetoric is welcome. The data isn't new. Study after study has shown women wait longer for pain relief in A&E, longer for autoimmune diagnoses, longer for endometriosis — on average seven to nine years in the UK, depending on whose audit you trust.

Streeting's challenge isn't convincing the public the bias exists. Women have been saying so for decades. It's rebuilding a system where a ten-minute GP slot can plausibly catch the pattern that, when missed, took Wilkes-Green's career, fertility and mental health as collateral. Good intentions don't shorten waiting lists. Funded gynaecology hubs might.

Should you really ask a chatbot about your symptoms?

Into the diagnostic vacuum steps the obvious understudy. The BBC this week profiled Abi, whose experience with health chatbots ranged from uncannily helpful to actively misleading — the kind of mixed results that, in a clinical setting, would trigger a regulatory inquiry.

The appeal is obvious. A chatbot doesn't interrupt. It doesn't roll its eyes at the fifteenth period-pain question. It's free, instant, and available at 3am when the pain is worst. For women told by their doctors that they're fine, the algorithm is sometimes the first thing that takes them seriously.

That's precisely the problem. Being taken seriously is not the same as being correct. Large language models hallucinate confidently, conflate conditions, and have no legal duty of care. They will not notice the pallor that makes a GP order a blood test. The risk isn't that chatbots replace doctors — they can't — but that they become the triage layer for patients the NHS has already failed. A parallel, unaccountable health system, accessible precisely to those most likely to be misled by it.

Regulation lags, as ever. Britain's approach to AI in healthcare remains a patchwork of MHRA guidance and good-faith hope. Meanwhile, millions are already consulting Dr ChatGPT. The question is no longer whether to allow this. It's who carries the liability when it goes wrong.

Are AI influencers really worth millions?

Further down the uncanny valley, another story the Independent ran this week deserves more than a raised eyebrow. AI-generated "influencers" — photogenic, fictional, allegedly attending Coachella — are pulling in real brand deals worth real money, and real followers who, per the reporting, don't particularly mind that the person they're engaging with isn't one.

File it under society, not tech, because the tech is the least interesting part. The interesting part is the shrug. A generation raised on filtered selfies has apparently decided authenticity was always a marketing claim anyway. If the human influencer was already a construct — sponsored posts, performed spontaneity, curated grief — then removing the human just trims overhead.

The consequences are societal, not aesthetic. Advertising standards built on the premise of a real person endorsing a real product are being quietly retired. Young women, already contending with beauty norms set by algorithmically smoothed faces, now compete with faces that were never ageing, never tired, never anything but output. And the economy of attention rewards whichever version is cheapest to produce.

What it all says

The connective tissue is trust — and who gets to receive it. Women whose pain is dismissed turn to machines that hallucinate. Audiences who once followed humans now follow software. In each case, a real human concern has been met by a more convenient, less accountable substitute. Streeting's medical misogyny pledge matters because it names the original failure. The chatbots and the AI influencers are just what rushes in when institutions don't.