There is also that guy in the US that submitted arguments to the judge that referenced hallucinated case law.
qyron@sopuli.xyz 4 days ago
AI models are being proposed in my country to “aid” in court by listening to witnesses to assess whether a person is telling the truth or lying.
Argument: it will speed up trials, unclogging the justice system by extension.
Most lawyers are horrified, as are some judges.
Meanwhile, a judge has been suspended and reprimanded for using AI tools to write his decisions for him.
Yes, the bot did hallucinate arguments and argued in common-law style, while my country follows the civil-law model.
LeninsOvaries@lemmy.cafe 4 days ago
You know, an AI designed to tell if a witness is lying would be really useful…
For autism diagnosis. If the AI thinks the patient is lying no matter what they say, the patient has autism.
qyron@sopuli.xyz 4 days ago
How come?
Blemgo@lemmy.world 4 days ago
Uncommon speech patterns and behaviours. People with ASD are more likely to be suspected of lying when they are telling the truth, due to avoidance of eye contact, a lower stress threshold, going off on tangents that seem unrelated to the topic, and uncommon stress reactions like fawning.
qyron@sopuli.xyz 3 days ago
You’re describing me, but I am not autistic. Can we just agree it is a bad idea altogether?