that you need to get conspiracy theorists to sit down and do the treatment. With their general level of paranoia around a) tech, b) science, and c) manipulation, that's not likely to happen.
You overestimate how hard it is to get a conspiracy theorist to click on something.
you need a level of “AI” that isn’t going to start hallucinating and end up reinforcing the subjects’ conspiracy beliefs instead. Despite techbros’ hype of the technology, I’m not convinced we’re anywhere close.
They used a purpose-fine-tuned GPT-4 model for this study, and it didn’t go off script once.
Butterbee@beehaw.org 2 months ago
It’s not even fundamentally possible with the current LLMs. It’s like saying “Yes, it’s totally possible to do that! We just need to invent something that can do that first!”
halm@leminal.space 2 months ago
I think we agree on the limited capability of (what is currently passed off as) “artificial intelligence”, yes.