hubobes@sh.itjust.works 1 day ago
Are you certain the answer actually came from their sources? On multiple occasions Mistral/ChatGPT gave me sources and something felt off. When I followed the sources, I couldn’t find what they claimed to have found there. I then asked them to quote the exact text they used to produce the answer, and after pressing them a few more times they concluded that, yes, what they said was nowhere to be found in the sources they provided.
Even if the answer wasn’t actually from the source, when it adds a link you can avoid hallucinations by reading the original website it supposedly got the information from.
I know this works with Perplexity.ai and the paid version of ChatGPT.
Mmh, perhaps I lucked out or missed something. Everything looked fine when I tested it.
AnUnusualRelic@lemmy.world 3 hours ago
This has also happened to me several times.