“Yeah, let’s try it. [Korean-style steak sauce is] not something I’ve made before,” says Mancuso, remembering his script, “so I could definitely use the help.”
Then at the end of the article they embed an Instagram video from 2023 of Mancuso making a Korean-style steak sauce. *chef’s kiss*
AppleTea@lemmy.zip 2 days ago
Goddamn, a gaming outlet saying what the serious grown-up press should have been saying from the start!
AFKBRBChocolate@lemmy.ca 2 days ago
I’m an old fart - I got my degree in CS in 1985, and I’ve been paying attention to the predictions and advancements in AI for a very long time. I have at least as much issue with the way people think and talk about it as the author, but probably less of an issue with it being called AI. Remember that for decades, the informal working definition of AI was “A computer doing anything that usually requires a human.” So for ages, they said we’d have AI if a computer could read a page of printed text out loud in English. That seemed almost unattainable when it was first talked about, but now it’s so trivial that no one would consider it AI.
People have tried to make definitions that are crisper than that, but few if any of those definitions require anything we’d call “thinking.” The frustrating thing is that the general public talks all the time about AI as if it’s conscious. Even when we’re talking about its flaws, we use words like “hallucinating,” which is something only thinking beings can do.
To me, LLMs are the worst of it, because to so many people they seem like they are (or could be) thinking entities. They respond to questions in a lifelike manner and can construct (extrapolate?) somewhat novel responses. But they’re also the least useful to us as a society. I’m much more interested in the Machine Learning applications for distilling gobs of data to develop new medicines, or for identifying critical items in images that humans don’t have the mental bandwidth for. But LLMs get all the press.
krunklom@lemmy.zip 2 days ago
No arguments to what you wrote.
I’d add that LLMs are increasingly the only way I can find useful technical information on anything anymore.
Of course this is solving a problem that shouldn’t fucking exist in the first place, and I still need to take that information back to a search engine to verify it and do actual research, which may be the point.
…
Search is so. Fucking. Broken.