Comment on ‘Terminator-style’ loss of control is biggest AI risk, Technology Secretary says
SmoothIsFast@citizensgaming.com 11 months ago
The biggest risk is idiots not understanding it's a prediction engine based on probabilities from its training set, and assigning intelligence to it like they're doing here. These systems aren't going to go out of control Skynet-style and gain sentience; far more likely, they hit actual edge cases and fail completely. Like AI target detection flagging bushes as tanks, pickups as tanks, etc. Or self-driving cars running into people. If the environment and image are new, a probability engine has two choices: err toward false negatives and refuse to detect anything unknown, or err toward false positives, which may cause severe harm.
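That false-negative/false-positive trade-off can be sketched with a toy detection threshold. Everything here is made up for illustration (the scores, the labels, the thresholds); it just shows that moving one dial trades one failure mode for the other:

```python
# Toy detector scores on unseen images: 1 = actually a tank, 0 = bush/pickup.
# All numbers are invented for illustration.
scores = [0.9, 0.8, 0.6, 0.55, 0.4, 0.3]  # model confidence "it's a tank"
labels = [1,   1,   0,   1,    0,   0]    # ground truth

def count_errors(threshold):
    """Count false positives and false negatives at a given cutoff."""
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    return fp, fn

# A permissive threshold calls a bush a tank (false positive);
# a conservative one misses real tanks (false negatives).
print(count_errors(0.5))   # -> (1, 0)
print(count_errors(0.85))  # -> (0, 2)
```

No threshold makes both numbers zero once the inputs drift outside the training set, which is exactly the bind described above.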