Comment on Long Cow is coming

AdolfSchmitler@lemmy.world 4 months ago

There’s an idea about “autistic AI” or something, where you give an AI an objective like “get a person from point A to point B as fast as you can,” and the AI accelerates so hard the g-force kills the person, yet it still counts the trip as a success because you never told it to keep the person alive.

Though I suppose that’s really human error: keeping the person alive is something we take as a given, but a machine will not.
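
A minimal sketch of that misspecified-objective idea (all numbers and names here are made up for illustration): a toy planner picks an acceleration profile to cover a fixed distance, and the literally stated objective happily selects a lethal g-load, while the same planner with the unstated “keep the passenger alive” assumption made explicit picks a survivable one.

```python
import math

# Toy planner: accelerate to the midpoint, then decelerate.
# Travel time for distance d at constant acceleration a is t = 2 * sqrt(d / a).

G = 9.81                # m/s^2
DISTANCE = 1000.0       # metres from point A to point B
CANDIDATE_ACCELS = [1 * G, 5 * G, 50 * G, 500 * G]  # hypothetical options

def travel_time(distance: float, accel: float) -> float:
    return 2.0 * math.sqrt(distance / accel)

def plan(objective):
    """Pick the acceleration that scores best under the given objective."""
    return max(CANDIDATE_ACCELS, key=objective)

# Objective as literally stated: "as fast as you can" -- nothing about survival.
fastest = plan(lambda a: -travel_time(DISTANCE, a))

# Objective with the unstated human assumption made explicit:
# anything above ~9 g is treated as a failure.
survivable = plan(lambda a: -travel_time(DISTANCE, a) if a <= 9 * G else -math.inf)

print(f"literal objective picks {fastest / G:.0f} g "
      f"({travel_time(DISTANCE, fastest):.2f} s)")
print(f"constrained objective picks {survivable / G:.0f} g "
      f"({travel_time(DISTANCE, survivable):.2f} s)")
```

The literal objective selects the 500 g option (under a second of travel, dead passenger); the constrained one settles for 5 g. The only difference is whether the human’s implicit assumption was written into the objective.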
