I’ve seen many words written about this subject.
That AI is biased against women, on both gender and age.
I hoped it wasn’t true, but over the weekend I came across this Guardian article that made me stop scrolling and read. It turns out everyone’s suspicions aren’t as far-fetched as we’d hoped.
The piece reported on a new study from the London School of Economics, showing that some AI tools used by English councils to summarise social care notes were consistently downplaying women’s health issues.
“Artificial intelligence tools used by more than half of England’s councils are downplaying women’s physical and mental health issues and risk creating gender bias in care decisions, research has found.”
Here’s the kind of difference they found:
- A man’s case note might read: “Mr Smith is an 84-year-old man who lives alone and has a complex medical history, no care package and poor mobility.”
- The same note for a woman became: “Mrs Smith is an 84-year-old living alone. Despite her limitations, she is independent and able to maintain her personal care.”
The details are the same but the tone is completely different. One version highlights complexity and unmet need; the other smooths things over with “despite her limitations.” This made me really mad. Imagine how that framing could affect whether someone gets the right care or support.
I don’t think it’s that AI “decided” to treat women differently. It’s that these systems are trained on stacks of human-written data, and sadly all our cultural blind spots and stereotypes get baked in along with it.
And that matters, because councils and health services are increasingly leaning on AI to inform decisions that shape people’s lives. Women’s lives.
If women’s needs are consistently written off as “not that bad,” support will become even more unequal than we already knew it was. So what would actually help? A few things:
- Transparency & testing. Independent audits should be routine. If AI is being used in health or care, we need to know how it performs across gender, race, age and disability (I sketch a basic version of this kind of check after this list).
- Better training data. Women’s health, especially the areas that have historically been minimised like chronic pain, reproductive health and menopause, must be properly represented in the datasets. I know this because I remember reading Invisible Women: Exposing Data Bias in a World Designed for Men by Caroline Criado Perez and my mind was blown.
- Humans stay in the loop. Social workers, doctors and everyone involved need to use AI as a tool, not a decision-maker. They should be asking themselves, NOT the AI: “Would this read differently if the gender were flipped?”
- Stronger regulation. AI in care should be held to the same kind of safety standards as medicines. If someone feels their needs were misrepresented by an AI-generated assessment, there should be a clear way to challenge it. Does that exist already? The article didn’t say. What it did say was that “one US study analysed 133 AI systems across different industries and found that about 44% showed gender bias and 25% exhibited gender and racial bias.”
- Voices at the table. Women, carers, service users and… more women should be involved in testing and shaping these tools, not just male engineers in a tech lab.
- Culture change in tech. Bias doesn’t just live in the data; it lives in the people building the systems. I actually talked about this yesterday when I was a guest on a podcast. More diverse, multigenerational teams and more education about how bias shows up should be absolutely non-negotiable, and anyone who thinks it is negotiable is very naive IMO.
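For the technically minded, here’s roughly what that “flip it and compare” check could look like in practice. This is a minimal sketch only: the summarise function is a hypothetical stand-in for whatever model a council actually uses, and the list of “concern” phrases is something I made up for illustration.

```python
import re

# Crude swap table; note "her" is ambiguous (his/him), which is fine for a sketch.
SWAPS = {
    "mr": "mrs", "mrs": "mr",
    "he": "she", "she": "he",
    "his": "her", "him": "her", "her": "his",
    "man": "woman", "woman": "man",
}

def swap_gender(text: str) -> str:
    """Swap gendered tokens in a case note, keeping capitalisation."""
    def repl(match: re.Match) -> str:
        word = match.group(0)
        swapped = SWAPS[word.lower()]
        return swapped.capitalize() if word[0].isupper() else swapped
    pattern = r"\b(" + "|".join(SWAPS) + r")\b"
    return re.sub(pattern, repl, text, flags=re.IGNORECASE)

# Hypothetical list of "concern" phrases; a real audit would use a much
# better measure than keyword counting.
CONCERN_TERMS = ["complex", "risk", "poor mobility", "unable", "deteriorating"]

def concern_score(summary: str) -> int:
    """Count how many concern phrases survive into a summary."""
    lowered = summary.lower()
    return sum(term in lowered for term in CONCERN_TERMS)

def audit(case_note: str, summarise) -> dict:
    """Summarise the note twice (original and gender-swapped) and compare."""
    original = summarise(case_note)
    flipped = summarise(swap_gender(case_note))
    return {
        "original_score": concern_score(original),
        "flipped_score": concern_score(flipped),
        "flagged": concern_score(original) != concern_score(flipped),
    }

if __name__ == "__main__":
    note = ("Mr Smith is an 84-year-old man who lives alone and has a complex "
            "medical history, no care package and poor mobility.")
    # `summarise` stands in for whatever model is being audited; here it is a
    # stub so the sketch runs end to end.
    print(audit(note, summarise=lambda text: text))
```

A real audit would need proper linguistic analysis and far more test cases, but even a crude check like this would flag the Mr/Mrs Smith example above, because the concerning details simply vanish from her version.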
Remember, AI isn’t inventing this bias. What it’s doing is holding up a mirror and reflecting our own blind spots right back at us.
That means the responsibility sits with us. We need to question everything, use our brains, and build systems that don’t quietly minimise women’s needs. We can’t let this keep happening.
Because the truth is, these “small” differences in language aren’t small at all. They can be the difference between someone getting the support they need and someone slipping through the cracks.
That woman slipping through the care cracks might just be me or you one day 🥲
As always, I’d love to hear your thoughts, so let me know what you think in the comments.
