Comment on how things become science

TheFogan@programming.dev 12 hours ago

No one is surprised when the dog gets worms after eating poop it found in the yard. Why are we shocked that an AI that doesn’t know fact from fiction treats everything the same?

I think that’s the problem though: the poop-in-the-yard example needs one correction. The key is that the researchers planted that information as speculation themselves. It’s as if Anderson Cooper made up a fake news story, posted it in an anonymous tweet to see how far it would spread, and then Fox News picked the story up and ran with it all day.

That’s the key problem: people trust LLMs to do their research for them, when LLMs just mindlessly gather all the information they can get their hands on.

And here’s what the experiment demonstrates: if they submit a misinformative article to a place for untested, unproven, random speculation with a very low bar for who can submit, and it surfaces later, they can determine that LLMs are scraping that source. The key thing to note is that it’s not their fake disease that’s the threat; it’s that if the LLM found their fake article, it has probably also scooped up a ton of other misinformed or dubious material.

Let’s look at it this way: say it was a cake, and we threw it in the garbage. Two weeks later we find the same cake at Jim’s Bakery, same ID, same distinct marker we put on it.

What does that tell us? It tells us that Jim’s Bakery is clearly, at least sometimes, dumpster diving and selling things that are dangerous.
