News

AI Medical Tools Provide Worse Treatment for Women and Underrepresented Groups

News Room
Last updated: September 21, 2025 7:03 pm

Historically, most clinical trials and scientific studies have primarily focused on white men as subjects, leading to a significant underrepresentation of women and people of color in medical research. You’ll never guess what has happened as a result of feeding all of that data into AI models. It turns out, as the Financial Times calls out in a recent report, that AI tools used by doctors and medical professionals are producing worse health outcomes for the people who have historically been underrepresented and ignored.

The report points to a recent paper from researchers at the Massachusetts Institute of Technology, which found that large language models including OpenAI’s GPT-4 and Meta’s Llama 3 were “more likely to erroneously reduce care for female patients,” and that women were told more often than men to “self-manage at home,” ultimately receiving less care in a clinical setting. That’s bad, obviously, but one could argue that those models are general purpose and not designed to be used in a medical setting. Unfortunately, a healthcare-centric LLM called Palmyra-Med was also studied and suffered from some of the same biases, per the paper. A look at Google’s LLM Gemma (not its flagship Gemini) conducted by the London School of Economics similarly found the model would produce outcomes with “women’s needs downplayed” compared to men.

A previous study found that models similarly failed to offer people of color dealing with mental health matters the same level of compassion they extended to their white counterparts. A paper published last year in The Lancet found that OpenAI’s GPT-4 model would regularly “stereotype certain races, ethnicities, and genders,” making diagnoses and recommendations driven more by demographic identifiers than by symptoms or conditions. “Assessment and plans created by the model showed significant association between demographic attributes and recommendations for more expensive procedures as well as differences in patient perception,” the paper concluded.

That creates a pretty obvious problem, especially as companies like Google, Meta, and OpenAI all race to get their tools into hospitals and medical facilities. It represents a huge and profitable market, but also one where misinformation carries serious consequences. Earlier this year, Google’s healthcare AI model Med-Gemini made headlines for making up a body part. That kind of error should be easy for a healthcare worker to spot. Biases, though, are subtler and often unconscious. Will a doctor know enough to question whether an AI model is perpetuating a longstanding medical stereotype about a patient? No one should have to find that out the hard way.

Read the full article here
