Elon Musk: Autopilot Feature Was Disabled In Pennsylvania Crash (latimes.com)
An anonymous reader writes: In response to the third reported Autopilot crash, which was the first of three where there were no fatalities, Tesla CEO Elon Musk says that the Model X's Autopilot feature was turned off. He tweeted Thursday afternoon that the onboard vehicle logs show the semi-autonomous driving feature was turned off at the time of the crash. "Moreover, crash would not have occurred if it was on," he added. The driver of the Model X told police he was using the Autopilot feature, according to the Detroit Free Press. The vehicle flipped over after hitting a freeway guardrail. U.S. auto-safety regulators have been investigating a prior crash that occurred while Tesla's Autopilot mode was activated. Late Thursday afternoon and into early Friday, Musk commented on improvements to Tesla's radar technology and its role in achieving full driving autonomy. "Working on using existing Tesla radar by itself (decoupled from camera) w temporal smoothing to create a coarse point cloud, like lidar," he tweeted. "Good thing about radar is that, unlike lidar (which is visible wavelength), it can see through rain, snow, fog and dust." Musk has dismissed lidar in the past, saying it is unnecessary for achieving full driving autonomy. Consumer Reports is calling on Tesla to "disable hands-free operation until its system can be made safer."
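Musk's "temporal smoothing" tweet describes accumulating noisy single-frame radar returns over time into a denser, more stable picture of the surroundings. As a rough illustration only (nothing here reflects Tesla's actual pipeline; the grid layout, smoothing factor, and detection format are invented for the sketch), the idea might look like this:

```python
# Hypothetical sketch of temporally smoothing radar returns into a coarse
# point cloud. The grid size, smoothing factor, and detection format are
# illustrative assumptions, not Tesla's implementation.
import numpy as np

class RadarPointCloud:
    def __init__(self, grid_shape=(200, 200), cell_m=0.5, alpha=0.2):
        self.occupancy = np.zeros(grid_shape)  # smoothed detection strength
        self.cell_m = cell_m                   # metres per grid cell
        self.alpha = alpha                     # per-frame smoothing weight

    def update(self, detections):
        """detections: iterable of (x_m, y_m, intensity) radar returns."""
        frame = np.zeros_like(self.occupancy)
        for x, y, intensity in detections:
            i, j = int(x / self.cell_m), int(y / self.cell_m)
            if 0 <= i < frame.shape[0] and 0 <= j < frame.shape[1]:
                frame[i, j] = max(frame[i, j], intensity)
        # Exponential moving average: persistent objects reinforce over
        # frames, while single-frame noise (e.g. a ground bounce) decays.
        self.occupancy = (1 - self.alpha) * self.occupancy + self.alpha * frame

    def points(self, threshold=0.5):
        # Cells that stayed "lit" across frames become the coarse cloud.
        return np.argwhere(self.occupancy > threshold) * self.cell_m
```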
Hands-free? (Score:4, Insightful)
Tesla cars don't support hands-free operation. You're supposed to keep your hands on the steering wheel while using Autopilot, and the car will disable Autopilot after a while if you take your hands off the wheel.
Perhaps they should reduce that timeout to discourage people from taking their hands off the wheel entirely.
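For what it's worth, the timeout logic described above is easy to sketch. A toy version follows; the thresholds and the torque-sensing interface are invented purely for illustration, since Tesla's actual limits aren't public in detail:

```python
# Toy sketch of a hands-on-wheel timeout: warn after a grace period,
# disengage if the driver still hasn't touched the wheel. All numbers
# are invented for illustration.
WARN_AFTER_S = 15.0        # shorter than current behaviour, per the suggestion above
DISENGAGE_AFTER_S = 30.0

def monitor_hands(wheel_torque_nm, last_hands_on_s, now_s, chime, disengage):
    """Returns the updated timestamp of the last detected steering input."""
    if abs(wheel_torque_nm) > 0.1:      # any detectable grip on the wheel
        return now_s                    # reset the timer
    elapsed = now_s - last_hands_on_s
    if elapsed > DISENGAGE_AFTER_S:
        disengage()                     # hand back control with a loud alert
    elif elapsed > WARN_AFTER_S:
        chime()                         # escalate: "hold the wheel" warning
    return last_hands_on_s
```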
Better than I thought though (Score:3)
My Driver's Ed teacher thir*cough* years ago had a story; I don't know if it was true, but it was funny. A kid he was teaching was told to put on the cruise control during the freeway portion. He did, and took his hands off the wheel. He thought cruise control was an autopilot.
He ended up passing, but they stopped having people use cruise control.
Re: (Score:2)
The legend is of a boob in a motor home who engaged cruise control and went in the back to lie down.
Re: (Score:2)
Tells, but does not enforce. And so people do the completely predictable thing and release the wheel and stop paying attention to the road when the car is apparently doing a perfectly good job of handling things.
Bottom line - the car *does* do hands-free driving, it just isn't yet competent to do it without constant oversight. Compare to other vehicles (BMW?) that have basically the same system, but include contact sensors on the wheel to ensure drivers keep their hands in position.
Re: (Score:2)
What Tesla and Musk need to do is rename the feature to something like Driving Assist or Safety Assist. It is NOT a completely autonomous driving system; you are not meant to take your hands off the wheel or your mind off the road. I think that the accidents so far are people ignoring warnings and somehow thinking that Autopilot means the car can drive itself.
The name is only an issue in that it discredits Tesla's attempts to throw blame on drivers.
Call it an autopilot or not, drivers will still figure out that they don't need to be paying attention to be driving, and they'll inevitably zone out, watch DVDs, and even drift off to sleep like this guy possibly did.
The fix is simple: if you're sitting in the driver's seat, then you're the one steering the car. The AI can still act as a safeguard, but until it's ready to assume full responsibility for the journey the hu
That radar really worked well in florida eh elon (Score:2)
Face it, LIDAR is too pricey at the moment and all the car makers are trying to get a crap system out ahead of each other. Someone died because of it.
Re:That radar really worked well in florida eh elo (Score:5, Insightful)
And for what it's worth, that doesn't mean the system couldn't/shouldn't be improved. It just means they didn't die because of the system.
Re: (Score:2)
If the accident was preventable, the driver should have prevented it. They should be paying attention to the road and be in a position to respond. If it wasn't preventable by the driver, then the system is working at least as well as the driver in that situation. Either way, the system isn't responsible.
How the hell does this point of view get upmodded? You're basically saying that if the driver fails and causes an accident, it's the driver's fault. If the system fails and causes an accident, it's still the driver's fault?
You're saying that, no matter what happens, it can never be the system's fault.
Re: (Score:3)
Tech fails all the time, and that's OK, because programmers find the holes and improve the technology. Like people, it's not foolproof. It gets better when people use it more. How many Windows patches would the
Re:That radar really worked well in florida eh elo (Score:4, Interesting)
This kind of thing - at the very least the finger pointing surrounding it - is why until now nobody put "beta" heavy machinery in the hands of the general public.
The general public should never be assumed to use things as designed - not all product liability lawsuits are as frivolous as they are sometimes portrayed.
I would probably find Tesla negligent just on the grounds that they are assuming people won't abuse (or even simply misuse!) the feature. Waivers notwithstanding - it would be interesting to see those in court, because I guarantee just about everyone who signed one would have to say "I just signed it to get the shiny, I don't know what it said" if they were being honest. This means there is no evidence of expectation in the general public that these things are "beta".
Put another way: you can call something "beta" all you want in theory, but if you're selling it to the general public, it ain't in practice.
Re: (Score:2)
It's a little more than that. Autopilot combines adaptive cruise control (matches the speed of traffic ahead), lane-centring assist (automatically steers to keep you in the middle of the lane if you drift around), automatic emergency braking (if you get too close to something at speed, it will automatically hit the brakes if you don't), and automatic lane changing (hit the signal and it changes lanes for you).
Except for the lane changing trick, none of these are new things. Adaptive cruise control has exist
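To make that component list concrete: adaptive cruise control on its own reduces to a small gap-keeping controller. A textbook-style sketch, with gains and limits that are illustrative assumptions rather than anything Tesla ships:

```python
# Textbook-style adaptive cruise control: hold a set speed, but fall back
# to tracking the lead car at a fixed time gap. Gains/limits are invented.
def acc_command(ego_speed, set_speed, lead_dist=None, lead_speed=None,
                time_gap_s=2.0, kp_speed=0.5, kp_gap=0.3):
    """Return an acceleration command in m/s^2 (speeds in m/s, distance in m)."""
    # Default behaviour: plain cruise control toward the set speed.
    accel = kp_speed * (set_speed - ego_speed)
    if lead_dist is not None:
        desired_gap = max(time_gap_s * ego_speed, 5.0)  # never closer than 5 m
        follow = kp_gap * (lead_dist - desired_gap) \
                 + kp_speed * (lead_speed - ego_speed)
        accel = min(accel, follow)       # the more cautious command wins
    return max(min(accel, 2.0), -6.0)    # clamp to comfort/braking limits
```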
Re: (Score:2)
Ugh, really? I am concerned about all these things that just add failure modes (will the car only operate in limp-home mode if there is a problem with the auto-brake system?), raise barriers to entry for new vehicle companies, and remove incentives for people to have situational awareness (coupled with failure modes - if people are used to auto-brake and so don't pay attention, then there is a problem with auto-brake, what h
Re: (Score:2)
I think the statistic for auto-brake, last I saw, was an estimated 14 billion Euros a year in savings from crashes that no longer happen. That is a lot, and I mean a lot, of money to be saving, and there was a not-insignificant number of lives to be saved as well. Basically the return on investment is significantly greater than one, so it is a no-brainer really.
Of course it's not so good if you are an auto crash repair company or on the organ transplant waiting list, but that is the old buggy whip problem.
Re: (Score:2)
Any articles? I couldn't seem to find anything other than this one [euroncap.com] that says crashes are reduced by 38%, but there were no cost figures.
That said, I don't think it's a buggy-whip problem: 14B Euro per year might be correct, but I don't think it's a win for society in monetary terms - I'd question the assertion that the return on investment is greater than one.
For example, the mandate in the US for rear back-up camera costs society about $25 million per life saved, because it's a couple hundred bucks times s
Re: (Score:2)
No. No one died because of technology. People died because they were not using the technology correctly. Tesla let the drivers know that the tech was in beta; Tesla let the drivers know to pay attention and still be alert and able to take control if the system fails, to do exactly what it failed to do here.
And *that* is a human factors engineering problem. Here you have a system which allows the driver to pay less attention *and* it expects the driver to take over in an emergency? That combines the worst of two separate systems.
Re: (Score:2)
This article has nothing to do with somebody dying. Are you talking about the semi truck incident in Florida? Could it have at least something to do with a driver using BETA software NOT paying attention to the road as they were instructed to do numerous times?
Test flight OK, except autoland very rough. (Score:1)
Autoland not installed on this aircraft.
Maybe the driver believed it was enabled? (Score:2)
If the driver believed the autopilot was on when it was off, then we have to ask "why did he think it was on when it was off?"
Was the driver not paying attention to the system, and just assumed it was on, or did the system lie and tell the driver that it was on when it wasn't?
Re: (Score:2)
Maybe he disabled the annoying hands-are-off-the-wheel reminder, just not in the way he intended.
Re:Maybe the driver believed it was enabled? (Score:4, Insightful)
Or maybe the driver just lied to cover up the fact that he was a poor driver who lost control and hit a guard rail. After all, a car hitting a guard rail never happened before driver assist was implemented. Nor has any driver ever tried to cover their a$$ by lying to the cops about what they were doing leading up to the crash. "I dropped the roach and was trying to find it when I slammed into the car ahead of me"... err, no, that's not what happened.
Re:Maybe the driver believed it was enabled? (Score:5, Insightful)
Exactly. Autopilot has to be the best excuse for wrecking your car ever invented. "No no, I didn't do it, the car did it itself. Really!"
Re: (Score:2)
Which, as demonstrated in this comment section, people refuse to believe.
Re:Maybe the driver believed it was enabled? (Score:5, Interesting)
Now, although disabling automatic systems on manual input has been the standard for as long as automatic systems have been available, I am beginning to wonder if it really is the right decision here. People seem to be turning it off without realising that they have done it.
Re: (Score:3)
It's hard to not realize you've disabled it. First, there's a distinct two-tone chime. Second, if regenerative braking is enabled at maximum (which I think most people do), the car slows down noticeably unless you press the accelerator. And there's just a "feel" with the torque on the wheel or something. It's just hard to miss unless it's your first day using it or you're just not paying attention at all.
Re:Maybe the driver believed it was enabled? (Score:5, Informative)
The Tesla logs were reported as saying:
Now, you can believe this or not, but it doesn't match up with your hypothesis.
Re: (Score:2)
Thanks! I hadn't read that bit of information.
That looks disturbingly like the driver fell asleep, and didn't wake up fully when they took control. Ouch - how do we fix that one???
By not having an autopilot that requires human intervention.
The problem with the autopilot doing its own thing for a while and then handing control back to the user is that the user may not be in a state where they're able to safely drive.
They might be fiddling with a DVD player, reaching into the glove compartment, or had fallen asleep because they weren't required to pay attention while the car was driving itself.
There's no safe way to hand control back to the driver while the car is in motion, either the
Re: (Score:3)
That's a hard one. Notably, we have yet to fix the problem of drivers drifting off to sleep even when there isn't an autopilot to back them up.
Re:Maybe the driver believed it was enabled? (Score:4, Insightful)
It can. Note that in an airplane, the pilot is expected to remain alert and at the controls while autopilot is engaged.
Re: (Score:2)
It can. Note that in an airplane, the pilot is expected to remain alert and at the controls while autopilot is engaged.
"Expected" doesn't mean "always works that way".
There are a number of aircraft accidents in recent years that occurred just after the a/p kicked out, where the pilots then lost control of a perfectly flyable aircraft. There are others where the pilot's primary response to control or attitude problems was merely to repeatedly try to re-engage the autopilot. There are also aircraft that have crashed due to pilots thinking autopilot (or auto-thrust or auto-anything) was engaged when it wasn't. Look up "automati
Re: (Score:2)
Humans are quite inventive when it comes to ways to screw up. Drivers are expected to not watch TV while driving but I have seen it. They're expected not to talk or text on the cellphone. They're expected to be sober. They're expected to remain awake.
I don't see why one particular thing that drivers might abuse should be the focus, especially when it is one of the few that has even a chance of reducing fatalities. That's not to say that we shouldn't learn from the problems, but not everything is properly tr
Re: (Score:3)
This can be fully in line with the driver thinking the autopilot was engaged. Potentially the driver was trained by the system, over time, to learn that he just has to briefly move the steering wheel in answer to a "hands on wheel!" request from the system to be allowed to take his hands off again for another minute or two. Only this time he did it too late, so the system did not re-engage.
This is the problem of allowing long stretches of hands-off with only short stretches of hands-on because one originally promised "comp
Re: (Score:2)
The Tesla logs were reported as saying:
Now, you can believe this or not, but it doesn't match up with your hypothesis.
It perfectly matches up with what I (and others) have been saying about partial autonomous driving: if the car drives perfectly for 50 minutes, and then requires the human to take over, the human may not be in a position to do so.
Driving should be fully autonomous or not at all - partial autonomy is no good. We'll have fully autonomous cars when we have perfect general purpose AI.
Re: (Score:2)
Actually, there is feedback that it has been disabled. The car makes a chime noise, and there are dash indicators.
Of course a driver oblivious enough to let a car drive into a guardrail is also likely oblivious enough to ignore said feedback.
Wasn't me! (Score:3, Insightful)
You can almost see this autopilot thing becoming the excuse whenever the car crashes. Me? Nah! It was the car! I swear it was the car! I am always a responsible driver! How dare you say that I am responsible for the crash... it was the car itself, I tell you! This AI is just bad, I swear!
So now that it is clear that *I* the driver am not at fault in the crash, could you please not raise my insurance?... or prosecute me for killing that pedestrian, or running over that biker?... it was clearly not the alcohol... but the car!
I'm glad they kept logs. Even if the logs are not 100% reliable, they are better than just the word of an honest driver who just happens to have someone else to blame.
its not musk's call (Score:1)
musk must be desperate to assert things that cannot be verified independently and that directly contradict the victim's claims.
this sort of thing, even if true, would be better coming from an independent source.
Re: (Score:3)
Thanks for your generalisation about a company whose CEO voluntarily disclosed to regulators, who weren't even investigating the Autopilot feature, that the feature had been involved in two prior crashes. Clearly they are not to be trusted.
Your comment shows us a great deal about your ability to use your brain.
NTSB investigation? (Score:2)
Is that incident being investigated by the NTSB? Because parties to an NTSB investigation who release information outside the framework of that process tend to get a not-so-nice letter from the Justice Dept.
sPh
The one true metric should be.... (Score:5, Insightful)
Re:The one true metric should be.... (Score:4, Interesting)
For fatal accidents, the per-mile rate is lower with Autopilot enabled.
But perhaps that doesn't tell the true story. Autopilot cannot be used in many situations, what if those situations are more dangerous? In other words, if the Autopilot can only be enabled on roads that are generally safer, then pure per-mile statistics are misleading.
Re: (Score:1)
Very true, and while it's difficult to adjust for this, the Tesla has also proven remarkably safe compared to other vehicles when accidents do occur. So how much do good safety characteristics decrease the reported number of fatal crashes?
Re: (Score:3)
No - it doesn't tell the whole story. The per-mile rate for non-Autopilot vehicles is based on multiple billions of miles driven, where a single 'extra' death tomorrow would change the fourth or fifth decimal place. The per-mile rate for Autopilot is based on a much smaller sample size, a single d
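The sample-size point can be made quantitative. Taking the figures reported at the time (roughly one fatality in 130 million Autopilot miles, against a U.S. average of about one per 94 million), a Poisson confidence interval shows how little a single event pins down the rate; the calculation is illustrative and treats both figures as rough:

```python
# With a count of one, the plausible range for the underlying fatality
# rate is enormous. Figures are the rough ones cited at the time.
from scipy import stats

miles, deaths = 130e6, 1

# Exact 95% confidence interval for a Poisson count of 1:
lo = stats.chi2.ppf(0.025, 2 * deaths) / 2          # ~0.025 events
hi = stats.chi2.ppf(0.975, 2 * (deaths + 1)) / 2    # ~5.57 events

print(f"point estimate: 1 per {miles / deaths / 1e6:.0f}M miles")
print(f"95% CI: 1 per {miles / hi / 1e6:.0f}M to 1 per {miles / lo / 1e6:.0f}M miles")
# => roughly 1-per-23M up to 1-per-5,100M miles, an interval that easily
#    contains the ~1-per-94M national average in both directions.
```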
Privacy? (Score:2)
Re: (Score:3)
Not me.
I look forward to the day when car crashes are treated the same as jet crashes. One of the first goals after a passenger jet crash is to get the data off the flight recorders to crash investigators. The FAA takes the position that crashes are simply not acceptable, not just "something that happens", and whatever caused a given crash doesn't just get marked "WILLNOTFIX" and closed. The latter is basically the situation with car accidents today. I strongly support, at least on an aspirational l
Re: (Score:2)
With aircraft crashes, there is usually a large organization/corporation at least partially at fault: the airplane manufacturer, or an airline that hired the pilot or scheduled a flight through dangerous conditions. If a plane goes down and everyone inside dies, the organization still exists, and can be fined for their negligence.
In contrast, if a drunk driver swerves their car into an oncoming lane and dies, punishing the driver for DUI isn't an option, and does sadly little to deter others from drunk driv
Re: (Score:1)
In the case of autonomous and semi-autonomous cars however there is an opportunity to fix certain forms of driver error, especially the type where the driver's error is inaction.
Take the fatal Tesla crash into a semi-trailer. The driver was supposed to be responsible for overseeing the safe operation of the car even with autopilot turned on, but that doesn't at all negate the fact that the autopilot shouldn't have crashed in the first place. The two solutions are to either ban autopilot (either industry wid
Re: (Score:2)
We're never going to get to that day. Automation is never getting into economy cars.
Why not? And why do you think it's something only the government can make happen?
Re: (Score:1)
Remember Moscow Rule # 4 ~ "Don't look back; you are never completely alone."
Turned off (Score:2, Funny)
Re: (Score:2)
I wonder how easy it is to tell if Autopilot is engaged? I mean, if it's just one small graphic maybe people are getting confused and thinking that it's on when it's really off.
The UI design could definitely be improved. Reduce the hands-off-wheel time limit to 5 seconds, and make it beep loudly and incessantly when AP is off and the driver isn't gripping the wheel at speed.
And there we go.... (Score:2)
One fact about humans.
They don't take responsibility for their actions, and they lie hard about accidents and try to place blame elsewhere.
User error again? I'm shocked. SHOCKED! (Score:2)
Well, not that shocked.
User error again, so the SEC investigation is unwarranted. Why are they not investigating GM for covering up the fatal ignition switch problems, or Toyota for their safety issues, or VW for cheating emissions tests across VW and Audi vehicle lines?
"First of three.. no fatalities" = FUD (Score:5, Interesting)
In response to the third reported Autopilot crash, which was the first of three where there were no fatalities
The first crash in Florida was the guy who got killed going under the truck while watching his DVD [time.com].
The second crash was a gallery owner in Detroit; he and his passenger survived without any injuries.
The third crash - the one apparently without autopilot - hit a guard rail in Montana [digitaltrends.com]. "The two occupants walked away without major injuries."
I don't know why this "fatalities in two crashes" myth is so pernicious. It was also falsely claimed in this Slashdot story [slashdot.org] on the third crash last Monday. But all of the linked articles are absolutely clear that there's been only one fatality, so it's not like the various submitters are just getting bad information from the media. Instead, the Subbys appear to be making up the second fatality out of nothing.
A more skeptical person than me would wonder if someone shorted TSLA.
Maybe Tesla SHOULD be blamed for this? (Score:2)
According to TFS, Elon Musk believes that if the autopilot had been active, it would have prevented this accident.
So let's just take his word for that. A driver makes an error and causes a crash that autopilot would have prevented, in a car that has the autopilot function installed and in good working order, but which the driver decided to operate fully manually.
We have cars with technologies like traction control, anti-lock braking, assisted braking/steering options, there are various collisi
Re: (Score:2)
This is interesting... although I don't think the technology is ready for something like this. Consider the thought experiment of having to select one of two options, where in one you die and in the other you kill somebody else. For instance: you're driving on the highway, and ahead of you, blocking all the lanes, are several people, and hitting any of them will kill them. What should the autopilot do?
Simple: slam on the brakes and come to a full stop as soon as possible. The autopilot will be at least a second faster than any human at this, lessening whatever impact occurs. Maybe the accident cannot be fully prevented, but that one extra second of braking can make a huge difference, including the difference between life and death.
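A back-of-the-envelope kinematics check of that "one extra second" claim, assuming 0.8 g of braking on dry pavement (the speed and distance are invented for the example):

```python
# How much does one second of reaction time matter at highway speed?
# Standard v^2 = v0^2 - 2*a*d kinematics; all numbers are illustrative.
v0 = 30.0          # initial speed, m/s (~108 km/h)
a = 0.8 * 9.81     # assumed braking deceleration, m/s^2
obstacle = 50.0    # distance to the obstacle, m

def impact_speed(reaction_s):
    d_react = v0 * reaction_s              # distance covered before braking
    d_brake = obstacle - d_react           # distance left to shed speed
    return max(v0**2 - 2 * a * d_brake, 0.0) ** 0.5

for t in (0.2, 1.2):                       # automated vs. human reaction time
    print(f"react in {t}s -> impact at {impact_speed(t):.1f} m/s")
# react in 0.2s -> impact at 14.5 m/s; react in 1.2s -> impact at 26.1 m/s
```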
Re: (Score:2)
If slamming on the brakes causes an accident (almost certainly a rear-end collision - something that appears to be far more prevalent in the US than in other parts of the world, judging by anecdotal evidence on /.), it simply means that the people behind were either keeping insufficient distance or not paying attention, or both. Both those issues would be solved with an autopilot, and even more so by a way for cars to broadcast an "I'm braking!" kind of warning to other nearby vehicles.
Backwards! (Score:2)
Consumer Reports is calling on Tesla to "disable hands-free operation until its system can be made safer."
I'm calling on some drivers to disable hands-on operation until they can be made safer drivers.
Re: (Score:1)
Also, the code that writes the log file entries is probably extremely simple and easy for forensics to independently verify.
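The "simple and verifiable" property being described is a standard pattern: append-only records chained together with an integrity check. A generic sketch follows; Tesla's real log format is proprietary, so the field names and hash chaining here are invented:

```python
# Generic tamper-evident event log: each entry hashes its predecessor,
# so any later edit breaks the chain. Illustrative only.
import hashlib, json, time

def append_event(log, event_type, **fields):
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {"ts": time.time(), "type": event_type,
             "fields": fields, "prev": prev_hash}
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    log.append(entry)

def verify(log):
    prev = "0" * 64
    for e in log:
        body = {k: e[k] for k in ("ts", "type", "fields", "prev")}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev"] != prev or digest != e["hash"]:
            return False                   # chain broken: entry was altered
        prev = e["hash"]
    return True

log = []
append_event(log, "AUTOPILOT_ENGAGED", speed_mph=62)
append_event(log, "HANDS_ON_WARNING")
assert verify(log)
```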
Re:Beyond a doubt (Score:4, Insightful)
So what. It's not an either/or situation. Both sides have an incentive to spin things, that's not by itself an argument for the other side.
Yes, there are dishonest people out there. But anyone with experience in mass production will also tell you that real life has a tendency to make flukes and quirks happen sooner or later. There are just too many factors in the production line and in the end-user environment to account for all possible defects. Cars are not simple apps running on Heroku; there's a physical and mechanical element to consider.
Ever heard of the problems with early MAP sensors, back when they were installed inside the cab? Temperature differences in winter tended to cause condensation in the line, but by the time the car was on the lift at the dealership, the condensation was gone. In hindsight the problem is obvious, but back then the "conclusive tests" made by the manufacturer made them treat honest people like crooks or idiots.
Re: (Score:3)
So what. It's not an either/or situation. Both sides have an incentive to spin things, that's not by itself an argument for the other side.
Yep, and that's when you look at history. On one side we have a driver, whom we have no history on, talking to the police.
On the other side you have a company that voluntarily reported to regulators that the Autopilot feature was active during a crash, and that, while it stands to lose a lot, has shown a history of honesty.
Who do you believe?
Re: (Score:1)
Yep, and that's when you look at history. On one side we have a driver, whom we have no history on, talking to the police.
On the other side you have a company that voluntarily reported to regulators that the Autopilot feature was active during a crash, and that, while it stands to lose a lot, has shown a history of honesty.
Who do you believe?
Your argument sounds a lot like the reasoning banks use when they think that someone who never had a debt in their lives is a higher liability than someone with debts.
Lack of data isn't a good argument for not trusting someone.
Re: (Score:2)
Your argument sounds a lot like the reasoning banks use when they think that someone who never had a debt in their lives is a higher liability than someone with debts.
Lack of data isn't a good argument for not trusting someone.
Of course it is. The very basis for the word trust implies that you know something about that person. Lack of data gives you every reason not to have any trust.
The banks are right too, but for the wrong reasons. It is a given that someone with a debt record is less of a liability, as their behaviour is well known. Where the system falls down most of all in the USA is that the banks don't look beyond the often-incorrect bank records. In much of the rest of the world, banks will also look at your history o
Re: (Score:2)
Having been rear-ended a few times in my life, there's one thing I've learnt: there's rarely good faith in a traffic accident. There's little faith if the person has an insurance premium that will go up. There's no faith if the person doesn't have insurance at all.
Re: (Score:1)
Unless there's a complete programming/logging failure at a nearly unbelievable level of incompetence, the claims can be verified. The average person doesn't realize that the claims can be verified even by third parties like the government. The company in question does realize that. Who do you think is going to be more likely to tell the truth?
That doesn't sound like a very tricky situation to me.
Re: (Score:2)
Yes, because the word of the driver who totaled his car is also entirely reliable. I mean, it's not like he'd be liable if he crashed it himself, but could get a lot of money if Autopilot was the problem.
The autopilot also limits speed to the legal limit. Which driver do you know who follows the posted limits? Therefore, to exceed the posted speed limit, you disable autopilot.
Re:Beyond a doubt (Score:5, Interesting)
The Tesla logging system is not under investigation for being unreliable.
Whether a hardware system is on or not is entirely different from any data that may be generated by it. There are a number of events required to enable autopilot, and all of them are logged.
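As a sketch of that point, engagement gated on a sequence of logged checks might look like the following; the specific preconditions are assumptions for illustration, not Tesla's actual criteria:

```python
# Engagement as a gated, logged sequence rather than a single flag.
# The preconditions are invented for the sketch.
from dataclasses import dataclass, field

@dataclass
class CarState:
    lane_confidence: float
    speed_mph: float
    seatbelt_latched: bool
    fault_codes: list = field(default_factory=list)
    events: list = field(default_factory=list)   # the audit trail

def try_engage_autopilot(car):
    checks = [
        ("LANE_LINES_DETECTED", car.lane_confidence > 0.9),
        ("SPEED_IN_RANGE",      0 < car.speed_mph <= 90),
        ("DRIVER_BELTED",       car.seatbelt_latched),
        ("NO_ACTIVE_FAULTS",    not car.fault_codes),
    ]
    for name, ok in checks:
        car.events.append((name, ok))             # every step leaves a record
        if not ok:
            car.events.append(("ENGAGE_REFUSED", name))
            return False
    car.events.append(("AUTOPILOT_ENGAGED", True))
    return True
```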
Re: (Score:1)
Says you. Here's what the WSJ article says:
The National Transportation Safety Board also is investigating the crash to determine whether it reveals systemic issues tied to development of driverless cars
See the part about "systemic issues"? That's pretty much the opposite of focusing an investigation on a single hardware component like you imply.
Re: (Score:3)
Yes, because the WSJ is the epitome of engineering information....
Got any quotes from Fox News as well?
Re: (Score:2)
No, the WSJ is not the epitome of engineering information; we've seen that with the bullshit story they published about Theranos. But they're still able to quote the NTSB, and if that's not up to your Tesla fanboi standards, just fucking google it: the same quote has been widely reported by other organizations.
Re: (Score:3)
I'm curious. Does the onboard driver assist computer phone home with log data to Tesla Motors? I'm pretty sure that in the case of an accident investigation, the car was impounded by the authorities. It would be pretty extraordinary if the highway patrol or accident investigators gave the information up to Tesla, considering Tesla didn't own the car.
Re: (Score:2)
I believe it does phone home, but I think I read something implying that Tesla also got physical access to the car. Since there is no standard for storing and accessing this type of data, and the formats are typically proprietary and secret, law enforcement's only way to get it is to ask the manufacturer.
Re: (Score:1)
But binary logs are better! Lennart Poettering says so.
Re: (Score:2)
Because it is a well-established practice. Airbus and Boeing routinely receive logs by satellite and can get a better idea of what happened even before they can lay their hands on the aircraft.
Re: (Score:2)
The logging software is not under investigation. The company that owns the software, which is under investigation, was the one that voluntarily disclosed its software's part in an earlier crash. And we compare this to a man who stands to lose his car and his insurance, and to get a fine from the police.
It may not be foolproof evidence, but on the balance of probability, given the history, I know which side I'd believe.
Re: I wouldn't be bragging quite so much... (Score:3, Funny)
Wait. I thought the problem was that it DIDN'T miss the truck...