Comment on Fresh evidence of ChatGPT’s political bias: study

Iridium@lemmy.world 10 months ago

Using the Political Compass is a bit of a strange way to conduct research. I do think it is important to identify biases, of course, but at some point you have to look at the bigger picture and realise why the bias exists.

In order to swing ChatGPT more to the right (if you want to balance it at neutral in the end), you'd have to inject it with more racism, anti-science conspiracy theories and American Christian views - none of which are particularly pleasant.

Do we want an LLM that limits facts about COVID-19 so that those who view it as a conspiracy feel validated?

Do we want it to respond that homosexual people don’t exist? Or even to say “I can’t give a response to this that remains politically neutral”?

Or if someone asks how old the earth is, do we want it to reply with “about 3000 years old”?

Or to contest climate change?

Do we want to sacrifice accuracy in favour of neutrality just because one party has a denialist stance on these topics?
