Mark Rober shows why Tesla's camera-only self-driving system is dangerous
Submitted 1 week ago by Ghostalmedia@lemmy.world to videos@lemmy.world
https://www.youtube.com/watch?v=IQJL3htsDyQ
Comments
Lukeazade@lemmy.world 1 week ago
Coming here because I saw how downvoted this post was on Reddit lol. I love that it’s triggering the Elon fanboys.
imvii@lemmy.ca 1 week ago
Maybe it was downvoted because of Mormon weirdo Mark Rober and not the content itself?
Jollyllama@lemmy.world 1 week ago
Based on the comments, it’s the Tesla stans.
Lukeazade@lemmy.world 1 week ago
Yeah, don’t get me wrong, I’m not a Mark Rober fan, and I don’t even think he’s making this video because he’s anti-Elon; he’s making it because it’s popular to hate Elon and Tesla at the moment. It happens to be a good thing, but unfortunately I don’t think Mark is doing it out of virtue.
danc4498@lemmy.world 1 week ago
Is he a weirdo for being Mormon, or something else?
simplejack@lemmy.world 1 week ago
Seems like most of the downvotes are in the Tesla communities. The other communities upvoted it heavily.
melfie@lemmings.world 1 week ago
Self-driving in general has been overhyped by grifter tech bros like Elon and really shows the current limits of ML. Today, ML models are basically fuzzy, probabilistic functions that map inputs to outputs and are not capable of actual reasoning. There is a long tail of scenarios where a self-driving car will not generalize properly (i.e., will kill people). Throwing increasingly more data and compute at it won’t suddenly make it capable of reasoning like a human. Like other ML use cases, self-driving is a cool concept that can be put to good use under the right conditions, and can even operate mostly without human supervision. However, anyone claiming it’s safe to let today’s “self-driving” cars shuttle humans around at high speeds with no safeguards in place is either an idiot or a sociopath.
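To make the “fuzzy function” point concrete, here’s a deliberately toy sketch (made-up data and model, nothing like a real perception stack): the model happily returns a confident label for an input unlike anything it has seen, because “I don’t know” isn’t an output it can produce.

```python
# Toy sketch: a learned model is just a mapping from inputs to outputs,
# interpolating between training examples. No reasoning involved.
import numpy as np

rng = np.random.default_rng(0)

# Fake training data: 8-dim feature vectors for "clear road" vs "pedestrian".
train_x = np.vstack([rng.normal(0.2, 0.05, (100, 8)),   # label 0: clear road
                     rng.normal(0.7, 0.05, (100, 8))])  # label 1: pedestrian
train_y = np.array([0] * 100 + [1] * 100)

def predict(x):
    """Nearest-neighbor 'model': only as good as its training set."""
    dists = np.linalg.norm(train_x - x, axis=1)
    return train_y[np.argmin(dists)]

print(predict(rng.normal(0.7, 0.05, 8)))  # in-distribution: correctly 1

# Long-tail input the model never saw (say, a kid-shaped blur in fog):
# it still returns *some* label, never "I don't know".
print(predict(rng.normal(0.45, 0.3, 8)))  # essentially arbitrary
```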
Ghostalmedia@lemmy.world 1 week ago
IMHO, this was really a video about camera-only automatic emergency braking, not autonomous driving.
Lots of cars have AEB now since a lot of regulators are requiring it, but most use a combination of cameras and ultrasonic sensors. The top-of-the-line systems have LiDAR, cameras, and ultrasonic sensors.
Tesla’s sensors lack redundancy. If the cameras are obstructed or can’t distinguish shapes, the vehicle can’t fall back to another system.
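Roughly what “fall back to another system” means, as a hypothetical sketch (invented API and numbers, not how any real AEB stack is written):

```python
# With multiple modalities, a blinded camera still leaves radar/LiDAR to act
# on; with cameras only, the same failure leaves nothing to fall back to.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Reading:
    distance_m: Optional[float]  # None = sensor returned no usable data
    confidence: float            # 0..1

def nearest_obstacle(*sensors: Reading) -> Optional[float]:
    """Closest obstacle reported by any sufficiently confident sensor."""
    usable = [s.distance_m for s in sensors
              if s.distance_m is not None and s.confidence >= 0.5]
    return min(usable) if usable else None

# Fog: the camera can't resolve shapes, but radar and LiDAR still return hits.
camera = Reading(distance_m=None, confidence=0.1)
radar = Reading(distance_m=38.0, confidence=0.9)
lidar = Reading(distance_m=37.5, confidence=0.95)

print(nearest_obstacle(camera, radar, lidar))  # 37.5 -> brake
print(nearest_obstacle(camera))                # None -> camera-only is blind
```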
Ghostalmedia@lemmy.world 1 week ago
Bonus deep dive about using LiDAR to map out space mountain
harryprayiv@infosec.pub 1 week ago
I wouldn’t exactly call that a deep dive.
ChaoticNeutralCzech@feddit.org 1 week ago
The channel is for 5-year-olds; they would drown in a real deep dive.
Bahnd@lemmy.world 1 week ago
I’m shocked Disney isn’t throwing a fit over that. Their legal team must be busy this week.
chetradley@lemm.ee 1 week ago
Getting into a legal battle with an immensely popular YouTuber would probably cost them a lot more in bad publicity than they would reasonably make from a lawsuit. I guarantee someone at Disney is doing or already has done the calculations.
Fizz@lemmy.nz 1 week ago
Insane that the Tesla drives into spaces it’s unsure of. So dangerous.
RamblingPanda@lemmynsfw.com 1 week ago
That’s the thing that got me. I would have issues spotting that child through the fog as well, but I wouldn’t have sped through it.
DaveyRocket@lemmy.world 1 week ago
A Tesla stopped for me at a crosswalk and I insisted, you go on ahead, I ain’t trusting Musk Tech with my life.
deranger@sh.itjust.works 1 week ago
What makes you think it’s unsure?
Fizz@lemmy.nz 1 week ago
True, it’s not unsure, but it should be. If it doesn’t have good visibility, it should slow down or disengage Autopilot.
Kbobabob@lemmy.world 1 week ago
Maybe it is sure but that doesn’t make it accurate
danc4498@lemmy.world 1 week ago
Sure, but their sensors will detect if you aren’t paying enough attention and report back to Tesla headquarters to get the lawyers ready before you can even get out of your car.
harryprayiv@infosec.pub 1 week ago
I’ve been shit-talking Elon’s (absolutely boneheaded) decision to intentionally eschew redundancy in systems that are responsible for human life for years now. Since Mark Rober never missed an opportunity to show off his swastikar in MANY of his previous videos, I had assumed he was a sponsored member of the alt-right intellectual dark web. But I’m pleasantly surprised to see that this video is a solid (WELL-justified) smear. 👌
imvii@lemmy.ca 1 week ago
I had assumed Mark Rober was a sponsored member of the alt-right intellectual dark web.
He is.
elfin8er@lemmy.world 1 week ago
Source?
slaacaa@lemmy.world 1 week ago
Thank god it doesn’t have LIDAR sensors, much cheaper to repair the front this way
/s
Sceptique@leminal.space 1 week ago
Ahah Tesla is like a 2000s knock-off of good existing technology
RickC137@lemmy.world 1 week ago
I am not a fan of Tesla/Elon but are you sure that no human driver would fall for this?
jj4211@lemmy.world 1 week ago
Let’s assume that a human driver would fall for it, for the sake of argument.
Would that make it a good idea to potentially run over a kid just because a human would too, when we have a decent option to do better than human senses?
RickC137@lemmy.world 1 week ago
What makes you assume that a vision-based system performs worse than the average human? Or that it can’t be 20 times safer?
I think the main reason to go vision-only is the software complexity of merging mixed sensor data. Radar or Lidar alone also have their limitations.
I wish it was a different company or that Musk would sell Tesla. But I think they are the closest to reaching full autonomy. Let’s see how it goes when FSD launches this year.
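To be fair about the complexity point, here’s a contrived sketch (made-up policy, not any vendor’s real logic) of why merging mixed sensor data is a genuine dilemma: radar loves to return strong echoes from harmless overhead signs, while cameras miss low-contrast obstacles, so whichever sensor you trust in a conflict, you buy one of the two failure modes.

```python
# The hard part of sensor fusion isn't the agreeing cases, it's the conflicts.
def should_brake(camera_sees: bool, radar_sees: bool, trust: str) -> bool:
    if camera_sees == radar_sees:
        return camera_sees          # easy: the sensors agree
    # Hard case: they disagree, and the policy has to pick a failure mode.
    if trust == "radar":
        return radar_sees           # risk: phantom braking under overpasses
    return camera_sees              # risk: hitting what the camera missed

# Overhead sign: radar "sees" it, camera correctly doesn't.
print(should_brake(camera_sees=False, radar_sees=True, trust="radar"))   # True -> phantom brake
# White trailer against a white sky: camera misses it, radar sees it.
print(should_brake(camera_sees=False, radar_sees=True, trust="camera"))  # False -> crash
```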
ThePunnyMan@lemm.ee 1 week ago
Part of the problem is the question of who is at fault if an autonomous car crashes. If a human falls for this and crashes, it’s their fault. They are responsible for their damages and the damages caused by their negligence. We expect a human driver to be able to handle any road hazard. If a self-driving car crashes, whose fault is it? Tesla? They say their self-driving is a beta test, so drivers must remain attentive at all times. The human passenger? Most people would expect a self-driving car to drive itself. If it crashes, I would expect the people who made the faulty software to be at fault, but they are doing everything they can to shift the blame off of themselves. If a self-driving car crashes, they expect the owner to eat the cost.
RickC137@lemmy.world 1 week ago
As soon as we have hard data from real-world use showing FSD is safer than the average human, it would be unethical not to solve the regulatory and legal issues and apply it at a larger scale to save human lives.
If a human driver causes a crash, their insurance pays. Why shouldn’t the same apply when a computer causes the crash, if the computer drives more safely overall, even if only by, say, 10%?
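Back-of-envelope on what even 10% would mean (all numbers assumed just to show the scale, not sourced stats):

```python
# If US drivers log roughly 3 trillion miles a year at roughly 1.5
# police-reported crashes per million miles (ballpark, assumed figures),
# even a modest 10% improvement is a lot of crashes.
human_rate = 1.5                  # crashes per million miles (assumed)
improvement = 0.10                # "safer by, say, 10%"
miles_millions = 3_000_000        # ~3 trillion miles, in millions

crashes_avoided = human_rate * improvement * miles_millions
print(f"~{crashes_avoided:,.0f} crashes avoided per year at full adoption")
# ~450,000
```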
undeffeined@lemmy.ml 1 week ago
The Road Runner thing seems a bit far-fetched, yeah. But there were also tests with heavy rain and fog that the Tesla did not pass.
Ghostalmedia@lemmy.world 1 week ago
The Road Runner thing isn’t far-fetched. Teslas have a track record of t-boning semi trucks in overcast conditions, where the sky matches the color of the truck’s container.
RickC137@lemmy.world 1 week ago
Should be fine if the car reduces speed to account for the conditions. Just like a human driver does.
oplkill@lemmy.world 1 week ago
Isn’t there a rule that if the weather is very heavy and you can’t see, you must stop driving immediately?
ayyy@sh.itjust.works 1 week ago
All the other cars he tested stopped just fine.
TheSealStartedIt@feddit.org 1 week ago
That is a completely legitimate question. That you are downvoted says a lot about the current state of Lemmy. Don’t get me wrong, I’m all for the Musk hate, but it looks like a nuanced discussion on topics where Nazi-Elon is involved is currently not possible.
sir_pronoun@lemmy.world 1 week ago
What about the claims that he only used Autopilot, and not Tesla’s Full Self Driving?
(Context: I hate Tesla, just curious for the sake of an honest argument)
Ulrich@feddit.org 1 week ago
No tangible difference in this scenario. Both use vision only, and both use the same computers.
sir_pronoun@lemmy.world 1 week ago
But do they use different software? Maybe FSD is more advanced than Autopilot and could have reacted better?
Just playing devil’s advocate here.
sour@feddit.org 1 week ago
The other car only used emergency braking, so there’s that.
pelespirit@sh.itjust.works 1 week ago
He was helping Tesla out by doing that. He was helping them get the wins they got instead of the Tesla just massacring the kid every time. Note to self: if you’re a pedestrian and you see a Tesla, don’t cross the street.
thesohoriots@lemmy.world 1 week ago
I like to help a Tesla out by throwing it batteries, Philly style.
ayyy@sh.itjust.works 1 week ago
All the other cars he tested stopped just fine. Who cares about fiddling with modes and shit.
Snapz@lemmy.world 1 week ago
“Full Self Driving” still needs to be in quotes. It’s a brand name for a feature that doesn’t actually have full self-driving capabilities.
Try not to carry water for their attempted, repeated lie.
Manalith@midwest.social 1 week ago
Philip DeFranco had him on yesterday, and he said the reason he didn’t use FSD was that it required you to input an address, but that there isn’t any difference in terms of the sensors being used.
Given that the other car didn’t appear to have a version of FSD either, I’m unclear as to why Autopilot wasn’t the correct move for the most accurate comparison.
GaMEChld@lemmy.world 1 week ago
I’m bearish on TSLA, but I still saw that there’s some controversy surrounding his testing methodology and shortcomings in that video. It was talked about a bit on Philip DeFranco.
Ghostalmedia@lemmy.world 1 week ago
IMHO, at the end of the day, all of those vehicles have emergency braking systems. It doesn’t matter if he was in FSD, Autopilot, or manual control; AEB (Automatic Emergency Braking) should’ve stopped or slowed the vehicle.
NotMyOldRedditName@lemmy.world 1 week ago
Ya, AEB is an always-on thing.
AEB might not always prevent a crash, but it should slow the vehicle at the very least so the crash has less energy.
You could have a system that never prevents a crash and you’d still get an insurance discount due to the slowing benefits.
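The physics behind that: crash energy scales with the square of speed, so even partial braking pays off a lot. Quick sketch with made-up speeds and mass:

```python
# Kinetic energy = 1/2 * m * v^2, so shedding speed pays off quadratically.
def kinetic_energy_j(mass_kg: float, speed_mph: float) -> float:
    speed_ms = speed_mph * 0.44704          # mph -> m/s
    return 0.5 * mass_kg * speed_ms ** 2

car_kg = 2000.0                             # ballpark midsize EV
e_none = kinetic_energy_j(car_kg, 40)       # no AEB: impact at 40 mph
e_aeb = kinetic_energy_j(car_kg, 25)        # AEB scrubbed 15 mph before impact

print(f"impact energy remaining: {e_aeb / e_none:.0%}")  # ~39%
# Slowing from 40 to 25 mph removes over 60% of the impact energy.
```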
devilish666@lemmy.world 1 week ago
I never trust self-driving/autonomous cars, no matter how advanced their tech is.
simplejack@lemmy.world 1 week ago
Having clocked in a lot of hours in San Francisco cabs, Ubers, Lyfts, and Waymos, IMHO, the Waymos are the least terrifying - by far.
My opinion might change if they’re ever allowed to travel at high speeds on a highway, but in a congested city where you can rarely get above 35mph, they feel really good.
No aggressive or distracted driving, no tipping, no stinky-ass air fresheners, and generally no double parking to pick people up.
I’m a convert.
JoMiran@lemmy.ml 1 week ago
Props to Benn Jordan for doing this a year ago on a slightly lower budget.
www.youtube.com/watch?v=2DOd4RLNeT4