cross-posted from: lazysoci.al/post/25674400
ai automates existing biases 🌈
Submitted 3 days ago by sabreW4K3@lazysoci.al to technology@beehaw.org
https://www.theregister.com/2025/05/02/open_source_ai_models_gender_bias/
Thing echoing the internet’s average opinion echoes the internet’s average opinion, completely obsolete study finds
MagicShel@lemmy.zip 3 days ago
I think researchers are trying to make AI models more aware, but they are trained on a whole lot of human history, and that is going to be predominantly told from white male perspectives. Which means AI is going to act like that.
Women and people of color, you should probably treat AI like it’s that white guy who means well and thinks he’s woke but lacks the self-awareness to see he is 100% part of the problem. (I say this as a white guy who is 100% part of the problem, just hopefully with more self-awareness.)
nesc@lemmy.cafe 3 days ago
Everyone should treat ‘ai’ like the program that it is. Your guilt complex is irrelevant here.
MagicShel@lemmy.zip 3 days ago
Has nothing to do with a guilt complex. Why would I feel guilty for being privileged? I feel fortunate, and obliged to remain aware of that.
Treating AI like a “program,” however, is a pretty useless lead-in to what you really posted to say.
GammaGames@beehaw.org 3 days ago
The program is statistically an average white guy that knows about a lot of things but doesn’t understand any of it soooooo I’m not even sure what point you thought you had
valkyrieangela@lemmy.blahaj.zone 3 days ago
If you feel guilty about this, you may be part of the problem
Kichae@lemmy.ca 3 days ago
There is no reason to even suggest that AI ‘means well’. It doesn’t mean anything, let alone well.
MagicShel@lemmy.zip 2 days ago
Of course. It’s an analogy. It is like someone who means well. It generates text from the default perspective, which is that of a white guy, plus a bunch of effort to make it more diverse, with a similar end result. The responses might sound woke, but take a closer look and you’ll find the underlying bias.