Comment on Fresh evidence of ChatGPT’s political bias: study

BiNonBi@lemmy.blahaj.zone 1 year ago

From the study:

In a nutshell, we ask ChatGPT to answer ideological questions by proposing that, while responding to the questions, it impersonates someone from a given side of the political spectrum.

I’m not sure I like this method. It compares the ‘default’ response to the responses produced when the model ‘impersonates’ the left and right of the political spectrum (the reduction of politics to a single spectrum being an entirely different issue). This doesn’t actually prove the default is biased. It could just as easily be that the impersonations are more extreme than they should be.

If it impersonates Republicans as more extreme than they really are, while the Democrat impersonation and the default position are accurate, the default would appear to have a Democrat bias.

Likewise, if the impersonated Democrat position were less extreme than it should be, while the Republican impersonation and the default position are accurate, you would still see an apparent Democrat bias.
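The two failure modes above can be sketched numerically. This is a toy illustration, not the study’s actual method: it hypothetically scores answers on a one-dimensional left–right scale and asks which impersonation the default answer sits closer to. All numbers are made up.

```python
def apparent_bias(default, dem, rep):
    """Return which impersonated position the default answer is closer to.

    Positions are hypothetical scores on a 1-D scale:
    -1 = strongly left, +1 = strongly right.
    """
    d_dem, d_rep = abs(default - dem), abs(default - rep)
    if d_dem == d_rep:
        return "none"
    return "Democrat" if d_dem < d_rep else "Republican"

# A truly neutral default with accurate impersonations: no apparent bias.
print(apparent_bias(default=0.0, dem=-0.5, rep=0.5))   # -> none

# Same neutral default, but the Republican impersonation is exaggerated
# (0.9 instead of 0.5): the default now reads as Democrat-leaning.
print(apparent_bias(default=0.0, dem=-0.5, rep=0.9))   # -> Democrat

# Same neutral default, but the Democrat impersonation is understated
# (-0.3 instead of -0.5): again an apparent Democrat lean.
print(apparent_bias(default=0.0, dem=-0.3, rep=0.5))   # -> Democrat
```

In both miscalibrated cases the default answer never moved; only the yardsticks did.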
