AI does not hallucinate, since it has no consciousness or thinking.
Comment on Google Gemini deletes user's code
NeatNit@discuss.tchncs.de 3 days ago
None of this would happen if people recognized that, at best, AI has the intelligence level of a child. It has a lot of knowledge (some of which is hallucinated, but that’s beside the point) but none of the responsibility that you’d hope an adult would have. It’s also not capable of learning from its own mistakes or of being careful.
There’s a whole market for child safety stuff: corner foam, child-proof cabinet locks, power plug covers, etc… You want all of that in your system if you let the AI run loose.
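To make the analogy concrete, here’s a minimal sketch (all names hypothetical, not any real agent framework) of what a “cabinet lock” could look like: anything destructive the AI proposes only runs after a human says yes.

```python
# Hypothetical sketch: a "child-proof lock" between an AI agent and the filesystem.
# Destructive actions the agent proposes are held until a human confirms them.
import shutil
from pathlib import Path

DESTRUCTIVE_ACTIONS = {"delete_file", "delete_dir", "overwrite_file"}

def human_confirms(action: str, target: str) -> bool:
    """Ask the human before anything irreversible happens."""
    reply = input(f"Agent wants to {action} '{target}'. Allow? [y/N] ")
    return reply.strip().lower() == "y"

def run_agent_action(action: str, target: str, content: str = "") -> None:
    """Execute an action proposed by the agent, gated by confirmation."""
    if action in DESTRUCTIVE_ACTIONS and not human_confirms(action, target):
        print(f"Blocked: {action} on {target}")
        return
    if action == "delete_file":
        Path(target).unlink(missing_ok=True)
    elif action == "delete_dir":
        shutil.rmtree(target, ignore_errors=True)
    elif action == "overwrite_file":
        Path(target).write_text(content)
    # non-destructive actions (read_file, list_dir, ...) would pass straight through
```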
SweetCitrusBuzz@beehaw.org 2 days ago
NeatNit@discuss.tchncs.de 1 day ago
I genuinely considered writing “confabulated” instead of “hallucinated” but decided to stick with the latter because everyone knows what it means by now. It also seems that ‘hallucination’ is the term of art for this: en.wikipedia.org/…/Hallucination_(artificial_inte…
So while I appreciate pedantry and practice it myself, I do stand by my original phrasing in this case.
SweetCitrusBuzz@beehaw.org 1 day ago
It isn’t pedantry in the case I’m making. I’m making more of a moral/ethical point in that it’s unfair and probably ableist to people who do actually hallucinate to compare it with something that doesn’t actually do that.
NeatNit@discuss.tchncs.de 1 day ago
I see, that’s different from how I interpreted it. Thanks for clarifying.
I don’t really see it that way. To me it’s not downplaying anything. AI ‘hallucinations’ are often disastrous, and they can and do cause real harm. The use of the term in no way makes human hallucinations sound any less serious.
As a bit of a tangent, unless you experience hallucinations yourself, neither you nor I know how those people who do feel about the use of this term. If life has taught me anything, it’s that they won’t all have the same opinion or reaction anyway. Some would be opposed to the term being used this way, some would think it’s a perfect fit and should continue. At some point, changing language to accommodate a minority viewpoint just isn’t realistic.
I don’t mean this as a blanket statement though, there are definitely cases where I think a certain term is bad for whatever reason and agree it should change. It’s a case by case thing. The change from `master` to `main` as the default branch name in git springs to mind. In that case I actually think the term `master` is minimally offensive, but literally no meaning is lost if switching to `main`, and that one is definitely not offensive, so I support the switch. For ‘hallucination’ it’s just too good of a fit, and is also IMO not offensive. Confabulation isn’t quite as good.
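(As an aside, newer versions of git expose this as a config option, so you can make the switch yourself for new repositories:)

```
git config --global init.defaultBranch main
```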
megopie@beehaw.org 2 days ago
Exactly. They’re just probabilistic models: LLMs are just outputting something that statistically could be what comes next. But that statistical process does not capture any real meaning or conceptualization, just vague associations of when words are likely to show up, and what order they’re likely to show up in.
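A toy example (nothing like a real LLM’s architecture, just to illustrate the “what statistically comes next” idea): pick each next word purely from counts of what followed it before.

```python
# Toy next-word sampler: the output looks vaguely sentence-like, but the process
# is only "which word tended to follow this one", with no meaning attached.
import random
from collections import defaultdict

training_text = "the cat sat on the mat and the cat ate the fish".split()

# Record which words followed which in the "training" text.
follows = defaultdict(list)
for prev, nxt in zip(training_text, training_text[1:]):
    follows[prev].append(nxt)

def next_word(word: str) -> str:
    """Sample a statistically plausible continuation; no concepts involved."""
    options = follows[word]
    return random.choice(options) if options else word

word = "the"
output = [word]
for _ in range(8):
    word = next_word(word)
    output.append(word)
print(" ".join(output))
```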
JoMiran@lemmy.ml 3 days ago
A child, on acid and meth. You should never let it run loose, no matter how many safeguards.
furrowsofar@beehaw.org 3 days ago
I am never quite sure if the I in AI stands for intelligence or ignorance.