Tesla’s recall of 2 million cars relies on a fix that may not even work::Tesla agreed to the recall last week after a federal investigation found that its system for monitoring drivers was defective and required a fix.
Tesla’s website says that Autopilot and the more sophisticated “Full Self Driving” software cannot drive themselves.
Full self driving
Cannot drive themselves
Christ I can smell the bullshit all the way from Canada.
The “we can do it by the end of this year” he’s been touting since 2016 wasn’t a giveaway?
Dude, sit tight. Full self driving is coming at the end of 2017!
So they are pretty much trying to figure out how to make sure the driver is paying attention to the road? IDK, maybe make the car require input from the steering wheel, so that the driver has to move it or the car won’t turn? That would ensure the driver is actually looking at the road.
Alternatively, ask them questions about the surroundings. “Driver, what state is the car in front of you from? You have 3 seconds to answer or FSD will be disabled”.
Just because a driver has their hands on the wheel doesn’t mean they’re watching the road. They might be watching a movie.
As for asking about number plates - that sounds like a distraction that would cause accidents rather than prevent them.
For me these systems need to be really clear. Either the person is driving, in which case they are fully responsible for every crash, or the car is driving, in which case the car is fully responsible. There’s no room for any grey area in the middle.
In my opinion Tesla should be forced to refund anyone who was told their car has “full self driving”. I’m OK with autopilot though, since the airplane and boat versions of that feature have always pretty much been “just keep going in a straight line until a human disengages autopilot”.
Asking questions was obviously a joke.
As for the rest, I don’t know what it would take to make sure the driver is paying attention. Distracted driving is the most common cause of accidents, so clearly even in normal cars we can’t be sure drivers are paying attention. I think we can agree cruise control is generally good, but I have no idea what happens once the car has lane following. Is it the same? Do you focus on the road more? Or do you stop paying attention completely? I think it’s a question for scientists, really. Someone has to test it rigorously before it’s actually added to cars. My feeling is that once you don’t have to drive by yourself (as in turn and brake) you eventually stop paying attention, so yeah, either the car drives itself 100% or you drive.
Note: it’s not NHTSA or any other agency’s responsibility to figure out a solution for Tesla. They just need to figure out what the bar for safety is, and tell Tesla “make it as safe as full low-light eye tracking, with whatever solution you want. But if you can’t make it at least that safe, your cars shouldn’t be allowed back on the roads”.
I was the biggest cheerleader for self driving cars because I hate driving - but “our best self driving car still can’t self drive at all” isn’t good enough, and letting them keep doing half-assed shit like this does more harm than good to bringing people around to the technology.
A better solution, experts say, would be to require Tesla to use cameras to monitor drivers’ eyes to make sure they’re watching the road. Some Teslas do have interior-facing cameras. But they don’t see well at night, unlike those in General Motors or Ford driver monitoring systems, said Philip Koopman, a professor at Carnegie Mellon University who studies vehicle automation safety.
In case you were wondering who wrote the article
This is the best summary I could come up with:
Tesla’s recall of more than 2 million of its electric vehicles — an effort to have drivers who use its Autopilot system pay closer attention to the road — relies on technology that research shows may not work as intended.
But research conducted by NHTSA, the National Transportation Safety Board and other investigators shows that merely measuring torque on the steering wheel doesn’t ensure that drivers are paying sufficient attention.
“I do have concerns about the solution,” said Jennifer Homendy, the chairwoman of the NTSB, which investigated two fatal Florida crashes involving Teslas on Autopilot in which neither the driver nor the system detected crossing tractor trailers.
Missy Cummings, a professor of engineering and computing at George Mason University who studies automated vehicles, said it’s widely accepted by researchers that monitoring hands on the steering wheel is insufficient to ensure a driver’s attention to the road.
But they don’t see well at night, unlike those in General Motors or Ford driver monitoring systems, said Philip Koopman, a professor at Carnegie Mellon University who studies vehicle automation safety.
Kelly Funkhouser, associate director of vehicle technology for Consumer Reports, said she was able to use Autopilot on roads that weren’t controlled access highways while testing a Tesla Model S that received the software update.
The original article contains 1,028 words, the summary contains 212 words. Saved 79%. I’m a bot and I’m open source!
Tesla agreed to the recall? Did they really have a choice?
Here I am hoping that Tesla, Twitter, SpaceX, and any other brand associated with Elon Musk burn to the fucking ground. Burn baby burn, show this wannabe emperor that he’s wearing nothing at all.
This is so fucking stupid it actually makes me mad. A tiny percentage of people died misusing the feature, and now Tesla is forced to upgrade people to a technology that doesn’t exist yet??? For free??? Holy shit this is dumb. Tesla should just relabel it as auto assist or something.
Buyer’s remorse is rough I take it.