Comment on "No, seriously. All those things Google couldn't find anymore? Top of the search pile. Queries that generated pages of spam in Google results? Fucking pristine on Kagi – the right answers, over and over again."

TehPers@beehaw.org 8 months ago

Be careful relying on LLMs for “searching”. I’m speaking from experience here: getting genuinely accurate results from the current generation of LLMs, even with RAG, is difficult. You might get accurate results most of the time (80% or more, even), but the inaccurate ones are hard to spot, because models present hallucinations with the same confidence as correct answers.

Also, if your LLM isn’t doing retrieval-augmented generation (RAG), then it isn’t actually searching anything, and it won’t surface results more recent than its training data.
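To make the distinction concrete, here is a minimal sketch of what RAG adds: documents are fetched at query time and prepended to the prompt, so the model can answer from data newer than its training cutoff. The corpus, the keyword-overlap scoring, and the prompt format below are illustrative assumptions, not any particular product’s implementation.

```python
# Sketch of a RAG pipeline: retrieve relevant documents first, then
# build an augmented prompt for the LLM. Everything here (corpus,
# scoring, prompt wording) is a toy assumption for illustration.

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query."""
    terms = set(query.lower().split())
    scored = sorted(corpus, key=lambda doc: -len(terms & set(doc.lower().split())))
    return scored[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Prepend retrieved context to the query before calling the model."""
    context = "\n".join(retrieve(query, corpus))
    return (
        f"Context:\n{context}\n\n"
        f"Question: {query}\n"
        "Answer using only the context above."
    )

# Hypothetical corpus; a real system would query a live index here.
corpus = [
    "Kagi is a paid search engine with its own index.",
    "Bread rises because yeast produces carbon dioxide.",
    "RAG pipelines fetch documents at query time.",
]
prompt = build_prompt("is Kagi a search engine", corpus)
```

Without the retrieval step, the model sees only the bare question and can answer solely from whatever was in its training data, which is exactly the staleness problem described above.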
