
DVD Player Found In Tesla Autopilot Crash, Say Florida Officials (reuters.com) 485

An anonymous reader quotes a report from Reuters: A digital video disc player was found in the Tesla car that was on autopilot when its driver was killed in a collision with a truck in May, Florida Highway Patrol officials said on Friday. "There was a portable DVD player in the vehicle," said Sergeant Kim Montes of the FHP in a telephone interview. She said there was no camera found, mounted on the dash or of any kind, in the wreckage. A lawyer for a truck driver involved in the accident with the Tesla told Reuters his investigators had spoken to a witness who said the DVD player was playing a "Harry Potter" video after the accident, but the lawyer was unable to verify that beyond the witness account. Lawyers for the family of the victim, 40-year-old Joshua Brown, released a statement Friday saying the family is cooperating with the investigations "and hopes that information learned from this tragedy will trigger further innovation which enhances the safety of everyone on the roadways." Tesla said in a statement Friday, "Autopilot is by far the most advanced driver assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility."
This discussion has been archived. No new comments can be posted.

DVD Player Found In Tesla Autopilot Crash, Say Florida Officials

Comments Filter:
  • by C0R1D4N ( 970153 ) on Friday July 01, 2016 @10:40PM (#52431319)
    What exactly is the point of it? To lull you into a false sense of comfort and security? I look forward to autonomous vehicles, but if it still requires me to keep my attention on the road and be ready to respond, I'd rather just be in control of the vehicle to begin with.
    • by Skot Nelson ( 4634285 ) on Friday July 01, 2016 @10:43PM (#52431331)
      The same thing cruise control does, to some extent: it allows you to pay attention to IMPORTANT things by removing the need to pay attention to SOME things. Cruise control means I don't have to monitor the speedometer to make sure I'm not speeding. I can focus on things more relevant to driving safely because my attention isn't diverted by THAT. Except I drive a manual Civic and don't have cruise control.
      • by RobinH ( 124750 )
        Manual transmission vehicles can still have cruise control. My 5-speed Focus had cruise.
      • I think it is now well known that cruise control does not do that; otherwise, reaction times in emergencies would not increase when cruise control is active. See for instance this study [vinci-autoroutes.com].
        The only thing cruise control provides is comfort.
      • by slew ( 2918 )

        Except I drive a manual Civic and don't have cruise control.

        Cruise control has been an option on MT (including Civics) for quite a while. Generally, on MT, CC works in gears 4-5, but won't engage in 3rd gear. The stock CC will disengage automatically if you touch the brake (same as on an automatic) or the clutch pedal (without actually engaging the clutch).

        My 1984 Nissan Maxima 5-speed MT had cruise control. It worked about the same as in my 1989 Honda Civic MT (but maybe CC only came with the power windows package, I can't remember, it's been too long). Given how loaded Civics hav

        • Generally, on MT, CC works in gears 4-5, but won't engage in 3rd gear.

          I've used cruise control in both my 2001 Civic EX and 2002 CR-V EX in 3rd. I think the limiting factor is a speed below which CC won't engage (like 20 mph), not the actual gear you're in - though that's certainly a practical factor. I remember one time being in CC and decelerating via the CC controls and at some low speed it disengaged.

      • Except I drive a manual Civic and don't have cruise control.

        I have a 2001 Honda Civic Coupe EX and a 2002 Honda CR-V EX, both with 5-speed manual transmissions and cruise control. Maybe your vehicle is older or a different trim line, but manual transmission and cruise control are not mutually exclusive.

    • by Ramze ( 640788 ) on Friday July 01, 2016 @10:59PM (#52431393)

      It's a safety and convenience feature that is being abused by treating it as a true AI chauffeur. The autopilot is really a minimal set of enhancements -- things like:

      intelligent cruise control (senses nearby cars and adjusts speed and braking based on their distance and speed)
      auto-parallel parking and perpendicular parking
      auto-lane change when you hit the turn signal
      auto-driving (including making turns) in some instances -- mostly 5 mph areas
      summoning (car backs out of the driveway and comes to you)

      Even the features used while driving are supposed to warn you and nag you if you take both hands off the wheel, and will slow the car down if you don't respond. It's not meant to be as full-featured as a Google self-driving car. Only someone watching a DVD player instead of driving the car would have hit that truck instead of slowing down -- assuming there's no massive glitch that disabled the driver's ability to hit the brake.

      • Re: (Score:2, Interesting)

        by wisnoskij ( 1206448 )

        No. Tesla is not staffed completely by idiots. They may be a new company, but I am sure they have many experienced driving experts working for them. You can be sure that Musk has read many reports showing that if you take away all need for driver input while driving down the highway (possibly for hours at a time), when an incident happens the "driver" will be completely unable to respond in time to be of any help. It's not rocket science; this is not a new field that has never been studied.

        This driver did what all d

        • by Ramze ( 640788 ) on Saturday July 02, 2016 @02:29AM (#52431983)

          The Tesla does not drive for you in autopilot mode. You still have to tell it when you want to change lanes (which this person supposedly did just before the crash). Whoever was driving was alert and attentive enough to decide to change lanes literally a moment before the crash, so they must have assessed the surrounding vehicles and determined it was safe to do so.

          As for your assumptions about driving, I have no idea where you're getting your data: all Google cars have drivers who are paid to be attentive, and Tesla explains that the features are there to assist driving, not to drive autonomously... and the car slows down and alerts you if you don't keep your hands on the wheel.

          I've regularly driven 5 to 7 hours at a time visiting family and friends every few weekends, and I almost always use my cruise control on the interstate. I have no idea why a Tesla, which has enhanced cruise control and little else other than a collision warning system, would make a human being so much more bored and inattentive that they'd drive straight into a truck after changing lanes. That's just nonsense. I keep the A/C on high and play music or podcasts to entertain me, but I never zone out, change lanes, and run into the back of trucks. Not sure who on earth would.

          The Tesla's enhancements don't ask the driver to "do nothing" any more than my cruise control does. They still have to physically tell the car to change lanes, watch the road for crazy drivers, note when and where to turn off the main road (even driving interstates, one can go through many off-ramps yet still be on the same interstate), etc. It's not like getting into a cab and telling the driver where you want to go.

          I've seen people doing their own make-up, reading newspapers, and even watching TV in their vehicles while driving on the interstate. Eyes completely off the road in front of them, vehicle on cruise control (I presume). Those are morons... and my money is on this guy watching Harry Potter instead of being a responsible driver. Don't blame the vehicle for human laziness. There's no excuse for it.

      • by wwalker ( 159341 )

        And? Nobody was expecting Tesla to calculate the trajectory of the trailer and take an intelligent detour around it via a side street. It is well within the autopilot's feature set to stop the car if there is an obstacle in the road in front of it. An obstacle the size of the fucking trailer, mind you. Yes, it's not meant to be a full-featured self-driving car. But stopping before hitting an obstacle is very much expected.

        Also, who tested autopilot at Tesla? It's not like tractor trailers are rare on the road. You just take

        • by dbIII ( 701233 ) on Saturday July 02, 2016 @01:49AM (#52431907)
          Are you being sarcastic or do you really think Tesla have managed to do what Boeing and all the rest have not managed to do?

          Are we going to learn next that Teslas don't brake for cyclists??

          I do not understand. Why would you think they do in the first place? Perfect SF movie artificial intelligence has not been invented and installed in a car. Are you being serious?

    • by Jeremi ( 14640 )

      What exactly is the point of it? To lull you into a false sense of comfort and security?

      I rather expect that Tesla will fix this particular problem quickly, if a fix is possible, so that the next time a white tractor-trailer with high ground clearance is crossing in front of a Tesla (whose driver is not paying attention) on a sunny day, the Tesla will notice it and slow or stop as necessary.

      Whether or not that fix will make the Tesla system "safe enough" is still debatable.

    • The goal of Tesla is to get people on the road so they can build a huge data set, and use that data set to improve self-driving cars to the point that the cars can be fully autonomous.
    • I don't have a Tesla, but the most useful thing I could imagine the Tesla autopilot for is actually stop-and-go traffic, where the car could do a great job of removing the tedium of constantly adjusting speed while you just watch the cars all around you.

      It would also allow you to pay somewhat more attention to what drivers are doing behind you so you could avoid an accident - I've avoided several rear-end collisions just because I saw something bad was happening behind me and if I didn't move out of the way someh

      • "I don't have a Tesla but the most useful thing I could imagine the Tesla autopilot for is actually stop and go traffic, where the car could do a great job of removing the tedium of constantly adjusting speed, you just watch the cars all around you."

        You don't need a Tesla for that. "Intelligent" cruise control that does exactly that has been on the market for quite a few years now.

    • No, the point is to test the system and do the mapping. Those of us driving these are basically mapping the roads, intersections, etc. Sadly, we are ALSO dealing with issues that were not figured out ahead of time. We got a different Tesla model as a loaner and it has AP. So I tested it just yesterday on a road with lots of hairpins and edges which drop 100'. Needless to say, I was on edge the whole time I was driving it. Then I found out later that this is really only ready for divided highways (the 3
  • Elon Musk's Terminators have claimed their first victim.
  • by Anonymous Coward on Friday July 01, 2016 @10:50PM (#52431363)

    The problem is that if it even slightly resembles a full-on AI-based driverless system, that's how people are going to treat it, no matter how many lawyeresque warnings you thrust in front of them and no matter how many forms they have to sign telling them it is just fancy lane assist.

    It's just human nature: if people aren't actively involved in the driving process, their attention is going to wander. It's how we as humans are wired up. For a long trip, I'm not sure I could stay focused at all times, even though I'd know perfectly well I was risking my life if my attention wandered. If I'm driving, that's one thing, but if the car is doing 99.9% of it, the other 0.1% is going to pose a real serious problem.

    If you build "almost an autopilot", that is a recipe for people treating it like what it resembles but isn't.

  • by RobinH ( 124750 ) on Friday July 01, 2016 @10:50PM (#52431367) Homepage

    The car was basically equipped with a stay-in-lane and slow-down-if-you-approach-the-car-in-front-of-you kind of system, which is not an autonomous vehicle, nor can you take your eyes off the road. At best it reacts a bit faster if someone in front of you hits the brakes. Google did a talk on this and said that in their tests, as soon as a car seemed to be working by itself, drivers stopped paying attention to the road, so half-way autonomous is a bad idea. People don't want to pay attention, and they won't if the car seems to be doing a good enough job.

    Only a fully autonomous car will be good enough.

    • Or not at all.

      • Yeah, no cars at all, better use horses. They aren't so stupid as to run into obstacles at full speed.

        • Horses are a good idea, but they're not up to the task of driving a car at freeway speeds. A better solution would be to outsource driving. Let your car be remotely piloted by a driver working for pennies via VR in India.

    • Actually, more than 130 million miles have been logged by drivers doing this, and one life has been lost.
      The NORMAL rate in the US is one death roughly every 96 million miles, so this works out to roughly a 26% lower fatality rate with Autopilot.
      In fact, Tesla has cited many instances of Autopilot already saving lives in situations where the occupants would have died in other vehicles, or been injured in the Tesla. IOW, this is already proving itself to be safer.
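      (A rough back-of-the-envelope check of those two figures; the snippet below is just illustrative arithmetic using the numbers quoted above, not anything published by Tesla:)

        # Fatality-rate comparison using the figures quoted in this comment.
        autopilot_miles_per_death = 130e6   # ~130 million Autopilot miles per fatality (figure quoted above)
        us_average_miles_per_death = 96e6   # US average: one death per ~96 million miles (figure quoted above)

        reduction = 1 - us_average_miles_per_death / autopilot_miles_per_death
        print(f"Autopilot fatality rate is about {reduction:.0%} lower than average")  # prints ~26%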

      Now, as to Mr. Brown watching TV while driving, that is sad. OTOH, millions of drivers text and drink EVERY DAY
      • Actually, more than 130 million miles have been logged by drivers doing this, and one life has been lost. The NORMAL rate in the US is one death roughly every 96 million miles, so this works out to roughly a 26% lower fatality rate with Autopilot.

        I don't know if "normal" is the word you're looking for, unless you're saying they need to start killing more people.

    • by sjames ( 1099 )

      It doesn't actually have to be fully autonomous to be useful. It doesn't need to know how to navigate, and it doesn't need to be able to turn. It *DOES* need to know how to stop to avoid accidents. It DOES need to be perfectly safe even if you fall asleep, even if you might not be in the right state when you wake up.

  • by Known Nutter ( 988758 ) on Friday July 01, 2016 @10:53PM (#52431379)
    Porn. Absolutely porn. No 40 year old man is driving down the road in his bad ass Tesla watching Harry Potter. No way, not happening. Porn.
  • by Guillermito ( 187510 ) on Friday July 01, 2016 @11:01PM (#52431399) Homepage
    Although in this particular case it is unclear whether the driver was actually watching a DVD at the moment of the crash, it is pretty obvious that an assisted driving technology that can handle 95% of driving situations will make users confident enough to be distracted while operating the vehicle, no matter how many warnings and disclaimers tell them they need to pay attention at all times in case they have to take control to handle the remaining 5% of traffic situations. This is clearly explained in this TED talk by the head of the Google driverless car program: https://www.youtube.com/watch?... [youtube.com] (this particular issue is discussed around 4:10, although the whole video is worth watching). This is why Google's approach to self-driving cars is to release their product only when the system can handle 100% of driving situations and never requires the user to take control, in contrast to Tesla's approach of releasing a system that can handle most situations and making incremental improvements over time.
    • The other main difference is that Tesla has logged 50 million miles of autopilot data from all over the world, while Google has logged about 1.5 million miles, mainly in the Bay Area.

      I think this gap will widen exponentially, and good-enough AI for driving will come only through masses of data, so Tesla has a huge advantage.

      • by WoOS ( 28173 )

        I am a bit surprised by the belief that AIs (or machine learning) will solve all problems given enough data.
        What do you think a neural net would have learned to do if trained to use VW's "AdBlue" as efficiently as possible but still to pass the NHTSA conformance test?
        Who would you blame then? After all, the constraints look reasonable. Would you want to be the engineer sued because he did not predict the neural net might learn something illegal?

        Plus, there is obviously a problem with the way Tesla gathers

    • Well, Google and Tesla differ. The question is, which approach makes more sense?
      In Tesla's case, they have 130+ million miles logged on the system with exactly 1 fatality and no injuries.
      However, the NORMAL case in America is a fatality and numerous injuries every 96 million miles. So, at this time, the fatality rate is roughly a quarter lower, along with a great deal fewer injuries.

      Right there, it says that Tesla has the right solutions by saving more lives and within another year, they will have the count down even fur
  • As far as I can see, the truck driver was at fault, so why is such a big deal being made about this? Of course automation is going to make drivers lose concentration. That's been understood for decades.

  • Apparently a lawyer for the family has a mental defect that causes them to repeat statements. Either that or the /. editors are once again showing their true dedication and attention to detail. Either way, things that were getting better following the most recent change of hands have begun to erode already.

  • by Cochonou ( 576531 ) on Friday July 01, 2016 @11:36PM (#52431529) Homepage
    "Autopilot is by far the most advanced driver assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility."

    Then maybe they should start by not using the misleading name "autopilot" for this functionality.
    • by jedidiah ( 1196 ) on Friday July 01, 2016 @11:45PM (#52431575) Homepage

      Most people understand "autopilot" to be something that keeps an airplane flying in a straight line. In that regard, the term isn't misleading.

      Even a modern autopilot won't help you in an unexpected situation. You still need a real pilot to handle interesting things.

      • Most people understand "autopilot" to be something that keeps an airplane flying in a straight line.

        I don't think most people understand that.

        • Most people understand "autopilot" to be something that keeps an airplane flying in a straight line.

          I don't think most people understand that.

          Most people don't know shit, and should disqualify themselves from making assumptions. When they don't, that's their fault, not anyone else's. Compare it to first aid: if you have first aid training and you help someone after an accident, you're basically protected from liability only as long as you stay within your training. If you attempt to exceed it, you can potentially be held liable. If a person knows enough to make a reasonable assumption, and that conclusion turns out to be false because of a deliberate at

      • Aircraft can do fully automated landings now. It's quite commonplace even on passenger airliners. Pilots are still better at handling difficult conditions like crosswinds, but if visibility is too poor for a pilot to land they just flip the switch for automatic landing.

        • Aircraft can do fully automated landings now. It's quite commonplace even on passenger airliners. Pilots are still better at handling difficult conditions like crosswinds, but if visibility is too poor for a pilot to land they just flip the switch for automatic landing.

          Sure, sure. But how often does a tractor-trailer cut in front of a plane while it's trying to land? Not sure even their AP is programmed for that.

      • Auto means self. Pilot means pilot.
        If you call something an autopilot and it can't pilot the vehicle autonomously in the vast majority of situations, you're misrepresenting it.

      • Most people understand "autopilot" to be something that keeps an airplane flying in a straight line.

        Rubbish. Most people probably think "autopilot" means that inflatable doll in the movie Airplane. I'd fully admit that I've got no idea precisely what a modern autopilot can and can't do or what the rules are for using them - what I do know is that (a) pilots are much more thoroughly trained and monitored than car drivers, and are more likely to follow the rules when flying on autopilot and (b) planes fly for thousands of miles on pre-set courses without passing within a mile of other traffic, and its proba

  • I thought that high-end consumer vehicles employed Lidar to detect physical objects in front of them?

    And doesn't Tesla require the cameras to be installed before you can install the autopilot software?

  • So Tesla says the autopilot actually detected the trailer, but thought it was an "overhead sign" hanging high enough to pass under. What?! So it appears the sensors on the Tesla are not precise enough to tell whether the car can safely pass under something hanging over the road? I mean, come on, I'd be fine if the autopilot couldn't tell whether the clearance is 10 feet or 12 feet. But a trailer? As far as I can find, the standard floor height of a tractor trailer is 48". That means the clearance underneath is even less. It d

    • It makes sense if the car was traveling uphill and the computer doesn't take that into account, or if the sensors are just fixed at a single point regardless of incline.
      And yes, we had a story about an auto-parking Tesla hitting a trailer.

  • A person died for the novelty of a car that seemed to drive itself.
    Who will be next?

  • 130 million miles have been logged by drivers using AP. This is the first fatality, and there have been zero injuries up to this point. In addition, a number of accidents have been avoided.
    So, how does this compare to the average?
    In America, somebody dies every 96 million miles. In addition, there are a large number of injuries, though to be fair, injuries should probably not be looked at as much as accident rate (Tesla is the safest car on the road, bar none; they make Volvo look dangerous). So, at this
  • The witness says a Harry Potter movie was playing. If he was making this up, then there's less than a one-in-a-thousand chance that the DVD player actually contains a disc with a Harry Potter movie. (The last disc of the series was released on DVD in 2011. A Tesla owner would be much more likely to be watching a more recent release.)

    Investigators know which disc was in the player, so they know if the witness is telling the truth.

  • He was confused as to where platform 9 3/4 was.
