
Elon Musk: Autopilot Feature Was Disabled In Pennsylvania Crash (latimes.com)

An anonymous reader writes: In response to the third reported Autopilot crash, which was the first of three where there were no fatalities, Tesla CEO Elon Musk says that the Model X's Autopilot feature was turned off. He tweeted Thursday afternoon that the onboard vehicle logs show the semi-autonomous driving feature was disabled at the time of the crash. "Moreover, crash would not have occurred if it was on," he added. The driver of the Model X told police he was using the Autopilot feature, according to the Detroit Free Press. The vehicle flipped over after hitting a freeway guardrail. U.S. auto-safety regulators have been investigating a prior crash that occurred while Tesla's Autopilot mode was activated. Late Thursday afternoon and into early Friday, Musk commented on improvements to Tesla's radar technology aimed at achieving full driving autonomy. "Working on using existing Tesla radar by itself (decoupled from camera) w temporal smoothing to create a coarse point cloud, like lidar," he tweeted. "Good thing about radar is that, unlike lidar (which is visible wavelength), it can see through rain, snow, fog and dust." Musk has rejected lidar technology in the past, saying it's unnecessary to achieve full driving autonomy. Consumer Reports is calling on Tesla to "disable hands-free operation until its system can be made safer."
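
Musk's tweet only sketches the radar idea at a high level. As a rough illustration of what "temporal smoothing to create a coarse point cloud" could mean in practice, here is a minimal, hypothetical Python sketch: accumulate noisy radar returns over several frames, place them in a common world frame using the car's own motion, and keep only regions that are detected repeatedly. Every name, unit and threshold here is assumed for illustration; none of it reflects Tesla's actual implementation.

import math
from collections import defaultdict

def radar_to_world(detection, ego_pose):
    # Convert one (range_m, azimuth_rad) radar return into world (x, y),
    # given the ego vehicle's (x, y, heading_rad) at that frame.
    r, az = detection
    ex, ey, heading = ego_pose
    angle = heading + az
    return (ex + r * math.cos(angle), ey + r * math.sin(angle))

def coarse_point_cloud(frames, cell_size=0.5, min_hits=3):
    # frames is a list of (ego_pose, detections) tuples collected over time.
    # Returns that fall into the same grid cell across the window are averaged;
    # cells hit fewer than min_hits times are treated as noise and dropped.
    cells = defaultdict(list)
    for ego_pose, detections in frames:
        for det in detections:
            x, y = radar_to_world(det, ego_pose)
            key = (int(x // cell_size), int(y // cell_size))
            cells[key].append((x, y))
    cloud = []
    for pts in cells.values():
        if len(pts) >= min_hits:
            cloud.append((sum(p[0] for p in pts) / len(pts),
                          sum(p[1] for p in pts) / len(pts)))
    return cloud

A real system would also carry Doppler velocity, elevation and per-point uncertainty, but keeping only repeatedly-hit cells is the essence of smoothing single-frame radar noise over time.
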
This discussion has been archived. No new comments can be posted.

  • Hands-free? (Score:4, Insightful)

    by Guspaz ( 556486 ) on Friday July 15, 2016 @07:51PM (#52521967)

Tesla cars don't support hands-free operation. You're supposed to keep your hands on the steering wheel while using Autopilot, and the car will disable Autopilot after a while if you take your hands off the wheel.

    Perhaps they should reduce that timeout to discourage people from taking their hands off the wheel entirely.

  • Face it, LIDAR is too pricey at the moment and all the car makers are trying to get a crap system out ahead of each other. Someone died because of it.

    • by Xenx ( 2211586 ) on Friday July 15, 2016 @08:11PM (#52522103)
If the accident was preventable, the driver should have prevented it. They should be paying attention to the road and be in a position to respond. If it wasn't preventable by the driver, then the system is working at least as well as the driver in that situation. Either way, the system isn't responsible.

      And for what it's worth, that doesn't mean the system couldn't/shouldn't be improved. It just means they didn't die because of the system.
If the accident was preventable, the driver should have prevented it. They should be paying attention to the road and be in a position to respond. If it wasn't preventable by the driver, then the system is working at least as well as the driver in that situation. Either way, the system isn't responsible.

How the hell does this point of view get upmodded? You're basically saying that if the driver fails and causes an accident, it's the driver's fault. If the system fails and causes an accident, it's still the driver's fault?

        You're saying that, no matter what happens, it can never be the system's fault.

        • by Xenx ( 2211586 )
          The point of my statement was that the driver is supposed to be able to respond in emergency situations and not rely upon the autopilot. I didn't say there couldn't be a problem with the autopilot, just that the onus is on the driver.
No. No one died because of technology. People died because they were not using the technology correctly. Tesla let the drivers know that the tech was in beta, Tesla let the drivers know to pay attention and still be alert and able to take control if the system fails to do exactly what it failed to do.

Tech fails all the time, and that's OK, because programmers find the holes and improve the technology. Like people, it's not foolproof. It gets better when people use it more. How many Windows patches would the
      • by ThosLives ( 686517 ) on Friday July 15, 2016 @09:29PM (#52522461) Journal

        ..the tech was in beta...

        This kind of thing - at the very least the finger pointing surrounding it - is why until now nobody put "beta" heavy machinery in the hands of the general public.

        The general public should never be assumed to use things as designed - not all product liability lawsuits are as frivolous as they are sometimes portrayed.

        I would probably find Tesla negligent just on the grounds that they are assuming people won't abuse (or even simply misuse!) the feature. Waivers notwithstanding - it would be interesting to see those in court, because I guarantee just about everyone who signed one would have to say "I just signed it to get the shiny, I don't know what it said" if they were being honest. This means there is no evidence of expectation in the general public that these things are "beta".

        Put another way: you can call something "beta" all you want in theory, but if you're selling it to the general public, it ain't in practice.

        • Isn't "autopilot" just a fancy version of cruise control, meaning there is an almost uncountable amout of situationans that a driver may encounter that this system would NOT be able to deal with? This reminds me of the Simpsons in a couple epiodes where a character though cruise control ment self driving and would crashbthe car shortly thereafter.
It's a little more than that. Autopilot combines adaptive cruise control (matches the speed of traffic ahead), lane-centering assist (automatically steers to keep you in the middle of the lane if you drift around), automatic emergency braking (if you get too close to something at speed, it will automatically hit the brakes if you don't), and automatic lane changing (hit the signal and it changes lanes for you).

            Except for the lane changing trick, none of these are new things. Adaptive cruise control has exist
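
            As a purely illustrative sketch of how those layers might combine in a single control loop (hypothetical function, names and gains; nothing here is Tesla-specific), consider:

            def control_step(ego_speed, lead_distance, lead_speed, lane_offset,
                             set_speed, time_gap=1.8, ttc_brake=1.5):
                # Return (acceleration, steering) commands for one tick.
                # lead_distance/lead_speed are None when nothing is detected ahead.
                target_speed = set_speed
                if lead_distance is not None:
                    # Automatic emergency braking: brake hard if time-to-collision is short.
                    closing_speed = ego_speed - lead_speed
                    if closing_speed > 0 and lead_distance / closing_speed < ttc_brake:
                        return -8.0, 0.0  # full brake, hold the wheel straight
                    # Adaptive cruise control: keep a time gap behind the lead vehicle.
                    desired_gap = ego_speed * time_gap
                    target_speed = min(set_speed,
                                       lead_speed + 0.5 * (lead_distance - desired_gap))
                accel = max(-3.0, min(2.0, 0.4 * (target_speed - ego_speed)))

                # Lane centering: steer against the lateral offset from lane center.
                steering = -0.1 * lane_offset
                return accel, steering

            The point is only that each layer is a fairly conventional controller; the automatic lane change is the one piece that goes beyond long-established driver aids.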

            • The latter is also going to become mandatory for all new cars in the USA and EU in 2022.

Ugh, really? I am concerned about all these things that just add failure modes (will the car only operate in limp-home mode if there is a problem with the auto-brake system?), raise barriers to entry for new vehicle companies, and remove incentives for people to have situational awareness (coupled with failure modes - if people are used to auto-brake, so don't pay attention, and then there is a problem with auto-brake, what h

              • by jabuzz ( 182671 )

I think the last statistic I saw for auto-brake was an estimated 14 billion euros a year in savings from crashes that no longer happen. That is a lot, and I mean a lot, of money to be saving, and there was a not insignificant number of lives to be saved as well. Basically the return on investment is significantly greater than one, so it is a no-brainer really.

                Of course it's not so good if you are an auto crash repair company or on the organ transplant waiting list, but that is the old buggy whip problem.

                • Any articles? I couldn't seem to find anything other than this one [euroncap.com] that says crashes are reduced by 38%, but there were no cost figures.

                  That said, I don't think it's a buggy-whip problem: 14B Euro per year might be correct, but I don't think it's a win for society in monetary terms - I'd question the assertion that the return on investment is greater than one.

                  For example, the mandate in the US for rear back-up camera costs society about $25 million per life saved, because it's a couple hundred bucks times s

      • by Agripa ( 139780 )

No. No one died because of technology. People died because they were not using the technology correctly. Tesla let the drivers know that the tech was in beta, Tesla let the drivers know to pay attention and still be alert and able to take control if the system fails to do exactly what it failed to do.

And *that* is a human factors engineering problem. Here you have a system which allows the driver to pay less attention *and* expects the driver to take over in an emergency? That combines the worst of two separate systems.

    • by kaybee ( 101750 )

This article has nothing to do with somebody dying. Are you talking about the semi truck incident in Florida? Could it have at least something to do with a driver using BETA software NOT paying attention to the road, even though they were instructed numerous times to do so?

  • turned off the Autopilot.
  • Autoland not installed on this aircraft.

  • If the driver believed the autopilot was on when it was off, then we have to ask "why did he think it was on when it was off?"

    Was the driver not paying attention to the system, and just assumed it was on, or did the system lie and tell the driver that it was on when it wasn't?

    • Maybe he disabled the annoying hands-are-off-the-wheel reminder, just not in the way he intended.

    • by Archfeld ( 6757 ) <treboreel@live.com> on Friday July 15, 2016 @08:16PM (#52522129) Journal

      Or maybe the driver just lied to cover up the fact that he was a poor driver who lost control and hit a guard rail. After all a car hitting a guard rail has never happened before driver assist was implemented. Nor has any driver attempted to cover their a$$ by lying to the cops about what they were doing that led up to the crash. I dropped the roach and was trying to find it when I slammed into the car ahead of me...err no that's not what happened

    • by robbak ( 775424 ) on Friday July 15, 2016 @08:36PM (#52522245) Homepage
      Possibly the driver, seeing the bridge or rail coming up and being uncomfortable with the approach speed, tapped the brake. This would have disabled the autopilot.

      Now, although disabling automatic systems on manual input has been the standard for as long as automatic systems have been available, I am beginning to wonder if it really is the right decision here. People seem to be turning it off without realising that they have done it.

      • by KavyBoy ( 35619 )

It's hard not to realize you've disabled it. First, there's a distinct two-tone chime. Second, if regenerative braking is set to maximum (which I think most people do), the car slows down noticeably unless you press the accelerator. And there's just a "feel" with the torque on the wheel or something. It's just hard to miss unless it's your first day using it or you're just not paying attention at all.

      • by WalksOnDirt ( 704461 ) on Friday July 15, 2016 @11:27PM (#52522791)

        The Tesla logs were reported as saying:

        Prior to the collision, Autosteer was in use periodically throughout the approximately 50-minute trip.

The most recent such use ended when, approximately 40 seconds prior to the collision, the vehicle did not detect the driver's hands on the wheel and began a rapidly escalating set of visual and audible alerts to ensure the driver took proper control.

        When the driver failed to respond to 15 seconds of visual warnings and audible tones, Autosteer began a graceful abort procedure in which the music is muted, the vehicle begins to slow and the driver is instructed both visually and audibly to place their hands on the wheel.

        Approximately 11 seconds prior to the collision, the driver responded and regained control by holding the steering wheel, applying leftward torque to turn it, and pressing the accelerator pedal to 42%. Over 10 seconds and approximately 300m later and while under manual steering control, the driver drifted out of the lane, collided with a barrier, overcorrected, crossed both lanes of the highway, struck a median barrier, and rolled the vehicle.

        Now, you can believe this or not, but it doesn't match up with your hypothesis.

        • by robbak ( 775424 )
          Thanks! I hadn't read that bit of information. That looks disturbingly like the driver fell asleep, and didn't wake up fully when they took control. Ouch - how do we fix that one???
          • Thanks! I hadn't read that bit of information.

            That looks disturbingly like the driver fell asleep, and didn't wake up fully when they took control. Ouch - how do we fix that one???

            By not having an autopilot that requires human intervention.

            The problem with the autopilot doing its own thing for a while and then handing control back to the user is that the user may not be in a state where they're able to safely drive.

            They might be fiddling with a DVD player, reaching into the glove compartment, or had fallen asleep because they weren't required to pay attention while the car was driving itself.

            There's no safe way to hand control back to the driver while the car is in motion, either the

          • by sjames ( 1099 )

That's a hard one. Notably, we have yet to fix the problem of drivers drifting off to sleep even when there isn't an autopilot to back them up.

        • by WoOS ( 28173 )

This can be fully in line with the driver thinking the autopilot was still engaged. Potentially the driver was trained by the system over time that he just has to briefly move the steering wheel in answer to a "hands on wheel!" request from the system to be allowed to take his hands off again for another minute or two. Only this time he did it too late, so the system did not re-engage.

          This is the problem of allowing long stretches of hands-off with only short stretches of hands-on because one originally promised "comp

        • More importantly, it suggests that the driver may have THOUGHT that the autopilot was on. Actually, it suggests a secondary point. The autopilot should NOT turn off until the car is either stopped or it detects that the driver is controlling the vehicle (hands on the wheel).
        • The Tesla logs were reported as saying:

          Prior to the collision, Autosteer was in use periodically throughout the approximately 50-minute trip.

The most recent such use ended when, approximately 40 seconds prior to the collision, the vehicle did not detect the driver's hands on the wheel and began a rapidly escalating set of visual and audible alerts to ensure the driver took proper control.

          When the driver failed to respond to 15 seconds of visual warnings and audible tones, Autosteer began a graceful abort procedure in which the music is muted, the vehicle begins to slow and the driver is instructed both visually and audibly to place their hands on the wheel.

          Approximately 11 seconds prior to the collision, the driver responded and regained control by holding the steering wheel, applying leftward torque to turn it, and pressing the accelerator pedal to 42%. Over 10 seconds and approximately 300m later and while under manual steering control, the driver drifted out of the lane, collided with a barrier, overcorrected, crossed both lanes of the highway, struck a median barrier, and rolled the vehicle.

          Now, you can believe this or not, but it doesn't match up with your hypothesis.

          It perfectly matches up with what I (and others) have been saying about partial autonomous driving: if the car drives perfectly for 50 minutes, and then requires the human to take over, the human may not be in a position to do so.

          Driving should be fully autonomous or not at all - partial autonomy is no good. We'll have fully autonomous cars when we have perfect general purpose AI.
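
          For what it's worth, the hand-off sequence in the quoted log can be sketched as a small state machine. The 15-second warning window is taken from the log; the state names, and the assumption that responding during the warning window keeps Autosteer engaged, are purely for illustration (Python):

          from enum import Enum, auto

          class HandOffState(Enum):
              ENGAGED = auto()          # Autosteer active
              WARNING = auto()          # no hands detected; visual/audible alerts escalate
              GRACEFUL_ABORT = auto()   # mute music, slow the car, demand takeover
              MANUAL = auto()           # driver has taken back control

          def next_state(state, hands_on, seconds_without_hands, warning_window=15.0):
              if state == HandOffState.ENGAGED:
                  return HandOffState.ENGAGED if hands_on else HandOffState.WARNING
              if state == HandOffState.WARNING:
                  if hands_on:
                      return HandOffState.ENGAGED        # responded in time (assumption)
                  if seconds_without_hands >= warning_window:
                      return HandOffState.GRACEFUL_ABORT  # 15 s of warnings ignored
                  return HandOffState.WARNING
              if state == HandOffState.GRACEFUL_ABORT:
                  # Once the abort has begun, grabbing the wheel hands control back.
                  return HandOffState.MANUAL if hands_on else HandOffState.GRACEFUL_ABORT
              return HandOffState.MANUAL

          The debate above is essentially about what should happen in that last transition: whether a barely-awake driver grabbing the wheel should really be treated as "control regained".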

  • Wasn't me! (Score:3, Insightful)

    by Anonymous Coward on Friday July 15, 2016 @07:59PM (#52522021)

You can almost see this autopilot thing becoming an excuse whenever the car crashes. Me? Nah! It was the car! I swear it was the car! I am always a responsible driver! How dare you say that I am responsible... it was the car itself, I tell you! This AI is just bad, I swear!

So now that it is clear that *I* the driver am not at fault for the crash, could you please not raise my insurance?... or prosecute me for killing that pedestrian, or running over that biker?... it was clearly not the alcohol... but the car!

I'm glad they kept logs. Even if the logs are not 100% reliable, they are better than just the word of an honest driver who happens to have something else to blame it on.

Musk must be desperate, to assert things that cannot be verified independently and that directly contradict the victim's claims.
    This sort of thing, even if true, would be better coming from an independent source.

    • by GerryGilmore ( 663905 ) on Friday July 15, 2016 @08:13PM (#52522111)
      So "victim's claims" are more accurate than "actual vehicle logs", eh? Your paranoia/hate is duly noted....
      • So "manufacturer's claims" are more accurate than "person who was actually there's claims", eh? After all, you haven't seen these supposed logs, and nor has anyone else. All we have is the word of the one person who has the most self-interest of all in this matter, and who has shown himself to be utterly disingenuous in the past to boot. Your fanboyism is duly noted....
  • Is that incident being investigated by the NTSB? Because parties to an NTSB investigation who release information outside the framework of that process tend to get a not-so-nice letter from the Justice Dept.

    sPh

  • by GerryGilmore ( 663905 ) on Friday July 15, 2016 @08:11PM (#52522097)
    ....is the per-mile-driven accident rate greater or less with Autopilot (or equivalent) enabled? Basically, it's a "perfect is the enemy of the good" situation whereby some folks seem to want to limit autonomous driving until it is 100% perfect when we all know that humans are far, far less reliable.
    • by whoever57 ( 658626 ) on Friday July 15, 2016 @09:32PM (#52522469) Journal

      ....is the per-mile-driven accident rate greater or less with Autopilot (or equivalent) enabled?

      For fatal accidents, the per-mile rate is lower with Autopilot enabled.

But perhaps that doesn't tell the true story. Autopilot cannot be used in many situations; what if those situations are more dangerous? In other words, if the Autopilot can only be enabled on roads that are generally safer, then pure per-mile statistics are misleading.

      • by maorb ( 2578043 )

Very true, and while it's difficult to adjust for this, the Tesla has also proven remarkably safe when accidents do occur compared to other vehicles, so how much do its good safety characteristics decrease the reported number of fatal crashes?

      • ....is the per-mile-driven accident rate greater or less with Autopilot (or equivalent) enabled?

        For fatal accidents, the per-mile rate is lower with Autopilot enabled.

        But perhaps that doesn't tell the true story.

        No - it doesn't tell the whole story. The per-mile rate for non Autopilot vehicles is based on multiple billions of miles driven, where a single 'extra' death tomorrow would change the fourth or fifth decimal place. The per-mile rate for Autopilot is based on a much smaller sample size, a single d
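
          To put a rough number on that, using the figures Tesla itself quoted at the time (about one fatality in roughly 130 million Autopilot miles, versus a U.S. average of about one per 94 million miles) purely for illustration: a one-event Poisson confidence interval is enormous (Python sketch, assumed figures):

          from scipy import stats

          def fatalities_per_million_miles(events, miles, conf=0.95):
              # Exact (Garwood) Poisson confidence interval for the fatality rate.
              alpha = 1.0 - conf
              lower = stats.chi2.ppf(alpha / 2, 2 * events) / 2 if events > 0 else 0.0
              upper = stats.chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2
              return lower / (miles / 1e6), upper / (miles / 1e6)

          lo, hi = fatalities_per_million_miles(events=1, miles=130e6)
          print(f"Autopilot 95% interval: {lo:.4f} to {hi:.4f} per million miles")
          print(f"US average (approx.):   {1 / 94:.4f} per million miles")
          # The interval (~0.0002 to ~0.04 per million miles) easily contains the
          # national average: with one event, the data cannot distinguish "safer"
          # from "about the same", or even somewhat worse.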

    • Comment removed based on user account deletion
  • Is anyone else not the least bit concerned that Tesla has this level of access to vehicle logs and is free to blab about them publicly?
    • by Rei ( 128717 )

      Not me.

I look forward to the day when car crashes are treated the same way as jet crashes. One of the first goals after a passenger jet crash is to get the data off the flight recorders to crash investigators. The FAA takes the position that crashes are simply not acceptable, not just "something that happens", and whatever caused a given crash doesn't just get marked "WILLNOTFIX" and closed. The latter is basically the situation with car accidents today. I strongly support, at least on an aspirational l

      • by mentil ( 1748130 )

        With aircraft crashes, there is usually a large organization/corporation at least partially at fault: the airplane manufacturer, or an airline that hired the pilot or scheduled a flight through dangerous conditions. If a plane goes down and everyone inside dies, the organization still exists, and can be fined for their negligence.

        In contrast, if a drunk driver swerves their car into an oncoming lane and dies, punishing the driver for DUI isn't an option, and does sadly little to deter others from drunk driv

        • by maorb ( 2578043 )

          In the case of autonomous and semi-autonomous cars however there is an opportunity to fix certain forms of driver error, especially the type where the driver's error is inaction.

Take the fatal Tesla crash into a semi-trailer. The driver was supposed to be responsible for overseeing the safe operation of the car even with autopilot turned on, but that doesn't at all negate the fact that the autopilot shouldn't have crashed in the first place. The two solutions are to either ban autopilot (either industry wid

          • I don't want people on the road with me to have such a vehicle. It's not natural for people to pay attention to something they are not interacting with.
We're never going to get to that day. Automation is never getting into economy cars. Maybe if the government invests in automated driving and subsidizes it, but not otherwise. I'm really tired of people using it as an excuse. We don't even test cures for diseases on humans before they are proven safe, and those will definitely save lives. This is a first-world fancy in the short to medium term, nothing else. Heck, get everyone to eat healthy and work out; that alone would save more lives than automated c
        • We're never going to get to that day. Automation is never getting into economy cars.

          Why not? And why do you think it's something only the government can make happen?

We have all kinds of automation now, and it is only in luxury cars because automakers use it to sell luxury cars. Automatic headlights, windshield wipers, etc. These have been around for 10 years or more with no filtering down to economy cars... it's like the myth of the trickle-down economy. Automated cars will be expensive, and unless the government helps people buy them, they won't have them.
    • by fred133 ( 449698 )

      Remember Moscow Rule # 4 ~ "Don't look back; you are never completely alone."

  • by SmaryJerry ( 2759091 ) on Friday July 15, 2016 @10:49PM (#52522673)
I love that the feature wasn't even on. Go ask any insurance adjuster and let them tell you whether people lie about accidents. But even if it was on, this feature is just a cruise control that also keeps you in your lane and might brake when an object is in your way. It is literally a far safer cruise control than in any other vehicle. This doesn't mean you can sleep while using it, same as any other cruise control. If I told you I had a helmet that made injuries to the brain 50% less likely, that wouldn't mean you can use it to dive off a building head first. Using products in ways other than intended is not the fault of Tesla.
  • It was turned off 0.001 seconds ago, which we all know is an eternity in computing speed.
    • by AmiMoJo ( 196126 )

      I wonder how easy it is to tell if Autopilot is engaged? I mean, if it's just one small graphic maybe people are getting confused and thinking that it's on when it's really off.

The UI design could definitely be improved. Reduce the hands-off-wheel time limit to 5 seconds, and make it beep loudly and incessantly when AP is off and the driver isn't gripping the wheel at speed.

  • One fact about humans.

    They don't take responsibility for their actions, and they lie hard about accidents and try to place blame elsewhere.

  • Well, not that shocked.

    User error again, so the SEC investigation is unwarranted. Why are they not investigating GM for covering up the fatal ignition switch problems, or Toyota for their safety issues, or VW for cheating emissions tests across VW and Audi vehicle lines?

  • In response to the third reported Autopilot crash, which was the first of three where there were no fatalities

    The first crash in Florida was the guy who got killed going under the truck while watching his DVD.
The second crash involved a gallery owner from Detroit; he and his passenger survived without any injuries [time.com].
    The third crash - the one apparently without autopilot - hit a guard rail in Montana [digitaltrends.com]. "The two occupants walked away without major injuries."

    I don't know why this "fatalities in two crashes" myth is so pernicious. It was also falsely claimed in this Slashdot story [slashdot.org] on the third crash last Monday. But all of the linked articles are absolutely clear that there's been only one fatality, so it's not like the various submitters are just getting bad information from the media. Instead, the Subbys appear to be making up the second fatality out of nothing.

    A more skeptical person than me would wonder if someone shorted TSLA.

  • According to TFS, Elon Musk believes if the autopilot was active it would have prevented this accident from happening.

    So let's just take his word for that. Driver makes an error and causes a crash that autopilot would have prevented, while driving a car that has the autopilot function installed and in good working order but the driver decided to operate the car fully manually.

    We have cars with technologies like traction control, anti-lock braking, assisted braking/steering options, there are various collisi

  • Consumer Reports is calling on Tesla to "disable hands-free operation until its system can be made safer."

    I'm calling on some drivers to disable hands-on operation until they can be made safer drivers.
