Transportation Youtube Businesses Power Twitter

Self-Driving Tesla Owners Share Videos of Reckless Driving (nytimes.com) 440

An anonymous reader writes: The driver killed in a Tesla car accident "celebrated the Autopilot feature that made it possible for him to cruise the highways, making YouTube videos of himself driving hands-free," reports the New York Times, adding that one of his videos of a near-miss went viral just 11 weeks before his death -- after it was shared on Twitter by Elon Musk. But USA Today reports that Tesla drivers have also filmed themselves playing Jenga and Checkers or sleeping while using the autopilot feature. "Even though Tesla tells drivers to 'keep your hands on the wheel at all times and stay alert,' the temptation to test a no-hands drive is just too much."

In April, a Volvo driver had criticized Tesla for releasing a dangerous "wannabe" Autopilot system. But when Tesla introduced the self-driving feature in October, Elon Musk argued that "Long term, it'll be way better than a person. It never gets tired, never has something to drink, never argues with someone in the car." He had also said that within three years Tesla cars should be able to drive a sleeping driver into work -- but that functionality is not currently supported.

This discussion has been archived. No new comments can be posted.

Self-Driving Tesla Owners Share Videos of Reckless Driving

Comments Filter:
  • Assholes (Score:5, Insightful)

    by Hognoxious ( 631665 ) on Sunday July 03, 2016 @02:38AM (#52437355) Homepage Journal

    Assholes don't know they're assholes. Film at 11. Brought to you by frosty piss!

  • Wrong approach (Score:5, Insightful)

    by Anonymous Coward on Sunday July 03, 2016 @02:41AM (#52437365)

    We should make our public transportation better; then we can all just sleep on it on the way to work.

    • Pretty much every big city has adequate public transportation. It doesn't work well at all in less congested areas, because there will never be enough riders on all the different routes to make it useful.
  • by Pentium100 ( 1240090 ) on Sunday July 03, 2016 @02:47AM (#52437385)

    It is almost like having a self-driving car: you do not need to pay attention to the road most of the time, and the car will not go into a ditch or rear-end another car.

    So you get bored, having absolutely nothing to do, and the temptation to do something else is just too great. And it works most of the time: the autopilot keeps the car on the road and avoids danger.

    Except for that 0.01% when it fails and you have to react as quickly as if you had been driving all this time.

    When driving a regular car, you have to make frequent minor adjustments to keep the car on the road (the road isn't straight, after all), so it is harder to get as bored as when you have nothing to do.

    My grandfather worked as a bus driver for a while, driving between cities. He told me that the road from Kaunas to Vilnius (in Lithuania) was too straight and he had trouble not falling asleep at the wheel (so he would talk to the passengers, and never actually fell asleep), while driving to other cities was easier because the roads are not as straight.

    Using the autopilot most likely looks like driving on a completely straight road with a car that does not veer to any direction by itself.

    • by starless ( 60879 )

      I think perhaps some suggestion that this can happen also comes from the much higher traffic death rate (corrected for miles driven) for the US than the UK. In the UK most people drive manual/stick shift vehicles, whereas in the US most drive automatics. It seems to me (having lived/driven in both places), that driving a stick shift forces you to continually pay rather more attention to your driving environment.

      Although of course there are a huge host of other things that affect traffic death rates. It's in

      • AFAIK the long highways in the US are very straight compared to various inter-city roads in Europe; curving roads also force the driver to pay more attention to the road and give him something to do (correcting the trajectory), so he is less tempted to let go of the wheel and browse the net.

        • In Europe, most highways are curved even at places where they could be straight, for exactly the reason you mention. The curves are not easy to see, but if you pay attention to what you are actually doing while driving, you realize that you are driving in very long alternating curves.

      • I think the much higher standards for getting a driving license in Europe do a lot as well.

    • Re: (Score:3, Interesting)

      by Anonymous Coward

      This is why in the Netherlands, highways are designed to be not straight but with gentle curves. Just enough to keep the long-distance driver occupied, but not nearly enough for a twisty, turny stressfest.

      • Indeed - after they found out that the very straight roads of the Noordoostpolder gave rise to many casualties: straight roads invite speeding well above the 80 km/h speed limit.

    • I remember a quote from the Google self-driving car team that there is no point in having a steering wheel in a self-driving car, because the "driver" would never be able to take over in time to avoid any danger. Which makes sense. We should not switch to automated driving before we are ready to surrender all control.

      • by mysidia ( 191772 )

        Useful if you need to take over for navigational purposes, because the car does not have a clue how to get from Point A to Point B for locations off the map, or when GPS is down due to radio interference.

    • Re: (Score:3, Insightful)

      by buck-yar ( 164658 )

      How is this autopilot legal? It sounds like one of the most dangerous devices created, inviting distracted driving.

      Cell phones are banned in many states for causing distracted driving, how is this not treated the same?

      • Re: (Score:2, Insightful)

        by cfalcon ( 779563 )

        > How is this autopilot legal?

        Because no one has banned it yet. But honestly, I have no idea. My guess is that it is mostly rich people, and they are few in number. There's also the "sense of inevitability" that they have pressed hard for with articles and such.

        And it is true- eventually this sort of thing will be something that will be everywhere and it will be good.

        But let's go over some situations that will probably occur on this long journey:
        1- People will probably find that there's some situation

        • It may be more than low contrast in this accident, as Tesla also mentions using radar (where colour is not an issue), which filters out stuff that looks like overhead road signs (and a trailer does somewhat look like that) to prevent unnecessary braking. Obviously there is an issue with clearance measure in the latter case, though.

        • Not sure how Tesla can shield itself from liability. Gun manufacturers are being sued for their products' misuse. One could even make an argument that Tesla is encouraging users to break the law by making driving "hands free."

          The company I work for makes power equipment. There's literally a label for every possible safety scenario, and for any potential situation where a problem could arise, the company must design the product in such a way as to be safe. This seems the exact opposite: creating a device to aid in breaking the law (si

      • by Kjella ( 173770 )

        How is this autopilot legal? It sounds like one of the most dangerous devices created, inviting distracted driving. Cell phones are banned in many states for causing distracted driving, how is this not treated the same?

        Semi-automated cars aren't distracting the driver, they just reduce the burden of driving just like automatic transmissions, ABS brakes, automatic windshield wipers, cruise control, intelligent cruise control, lane keeping etc. and so far nobody has said that it is too much of a good thing.

        Google has taken the high road of "we'll release it when it's ready to drive itself, not before" while Tesla has taken the low road of just rolling it out, and I'm sure they knew this would happen. They're just hoping the le

        • None of the things you mentioned take the driving away from you and do it for you. It's silly to argue that wipers and the other things you mentioned have the same distracting effect as no longer having to pay attention to the road.

      • Because it has fewer fatalities per driven-mile than the statistical norm for HUMAN-driven cars, perhaps?

        The driver's main judgment error was using autopilot on a road that wasn't a freeway & had cars crossing the road.

        Realistically, we're already at the point where it's safe to have autopilot driving on a limited-access road in reasonable weather conditions. The problem is, automakers' lawyers won't allow them to openly say, "autopilot is safe under these specific conditions...". They pretend that safe

    • And if it gets out of that valley, perhaps it'll start to drink like us:
      - Morgan: Regarding last week's delivery, why did it take twice as long as usual?
      - Bender: Martini drinking contest with the autopilot. I would have beat him too this time, but we ran out of olives.
      - Leela: Look, I can explain.
      - Morgan: Do you really think you can explain why you left port without a full complement of olives? I think not.
    • The problem is that when we get these features, we get lazy. Now, I would love my car to have such a feature. On a long trip I could relax for a second when I see it is safe, letting the car keep a constant speed and stay in the lane, because fatigue makes even that easy task harder, causing more fatigue. Not to sleep or even take my eyes off the road, but to let my nerves relax a bit so I can continue driving safely.

    • And it works most of the time, the autopilot keeps the car on the road and avoids danger. Except for that 0.01% when it fails and you have to react as quickly as if you have been driving all this time.

      The quality of a system is always measured by how well it handles exceptions. (Control question: try to come up with a single example of a good system that handles exceptions badly. Hint: give up, because such systems do not exist.)

      So an autopilot-driven car will handle the normal case extremely well, but when something unexpected happens, a human driver is much more capable of performing a sensible action.

    • It's not that good (Score:5, Informative)

      by Anonymous Coward on Sunday July 03, 2016 @08:08AM (#52438127)

      I drove my Tesla with autopilot for a month. It works great in 2 situations: 1) wide open road, good lane markings, light traffic and 2) stop and go in traffic jams.

      It truly is horrible in the following situations:
      1) where lane markings are faded or confusing (road construction, lanes have been moved)
      2) where traffic is moderately dense, and there are curves - the system cannot see far enough ahead to anticipate the curves, so you wind up with abrupt maneuvers
      3) where someone is merging from either side - sometimes it works, sometimes it freaks out
      4) on twisting mountain roads, with good lane markings
      5) where you are right next to the median barrier (e.g. carpool lanes)

      In the latter case, small lateral changes in the barrier position (e.g. from an overpass column embedded in the median barrier) can trigger *exciting* steering wheel inputs as the car attempts to avoid running into the barrier. If one is commuting in the carpool lane at reasonable speeds (>40 mi/hr), one had best keep one's hands on the wheel and one's eyes open, waiting for the "ding dong" that tells you the autopilot gave up and it's "Jesus take the wheel" time.

      I found it more stressful to drive with autopilot on than with it off.

      The big flaw is that the forward looking system isn't smart enough - it does not anticipate turns far enough ahead - this is really obvious on a mountain road with lots of turns: it goes into the turn fast, realizes that the road bends, and tromps on the brakes to get the speed down so it can make the turn. Then it speeds up coming out of the turn. You're never sure if it's going to be able to do it.

      In traffic, it tends to "follow the car in front", which is good if the lane markings are poor, but if there's a bend in the freeway, it's not so good - again, it's the abrupt "I've got to turn now" action.

      The adaptive cruise control is awesome - smooth handling of speed in heavy traffic from 0 mi/hr all the way up to 70-75 mi/hr. The lane guidance not so much.

  • Car manufacturers are criticising Tesla for its Autopilot feature while at the same time their own autopilot features accomplish the same feat with fewer sensors. Their systems are identical, but they consider theirs safe because they disable if people take their hands off the wheel... except that is defeated by something as easy as strapping a can of Coke to the steering wheel.

    Tesla owners doing stupid shit with self driving cars was inevitable, and they are far from the first car makers to be plagued

    • Big difference between engaging an "autopilot" feature and taping a can of Coke to your steering wheel in order to defeat the dead man's switch in the driver-assist features. People have a great deal of trust in technology and the companies that provide it. And much of our technology is designed to prevent unsafe use, or at least warn us about it. I can see why people would think that "if it really is unsafe to let the Tesla do all the driving, they wouldn't have made the autopilot stay on if I let go of
      • I think it would help a lot if Tesla were to explain what the point is of putting a self-driving feature in a car that you can't leave alone for a second. It totally baffles my mind what the point of this even is.
      • You mean like the several warnings you have to click through to enable an auto-pilot feature on a Tesla?

        Call me crazy but I like it when the car doesn't automatically disengage autopilot when it is perfectly capable of controlling a situation due to some stupid thing like me changing hands on the steering wheel. At least the people who fell asleep at the wheel of a Tesla lived to tell about it. Good luck doing that on a competitor's "safe" system.

    • The other car manufacturers didn't put 'auto' in the name of the technology.
  • The problem isn't that they're limited; it's that the user can't know exactly where the limitations are. When you know exactly where the limits of a system are, you subconsciously plan its usage to stay within its perimeter of competence. When you can't fully rely on a system to perform in certain conditions, and not to perform in other conditions, 100% of the time, you have to stay alert all the time to take over in case it craps out.

    That's precisely what's self-defeating in today's fledgling autopilot systems. A rea

    • You can actually see that in one of the linked videos [youtube.com]. The man filming (who seems to be more familiar with the system) keeps telling the driver that if she taps the brakes, the entire system will shut off and she will have to steer again. He's afraid that she'll think tapping the brakes will only turn off the adaptive cruise control, while leaving the auto-steering operating.

      It reminds me of Asiana flight 214 [wikipedia.org], where the pilots changed the autopilot mode and thought the auto-throttle was still on, when in fact th
      • Afraid that she'll think tapping the brakes will only turn off the adaptive cruise control, while leaving the auto-steering operating.

        How did people get so fucking bad at thinking that this became a reasonable assumption? I share it with him, but it's sad.

    • Well, that's the problem with autopilot.

      "Oh, over there's a tractor-trailer combo crossing the road. Probably clears well before I'm there. Autopilot on, so car will slow down and stop if needed."

      "Mmm.... Trailer still there. Shouldn't it start slowing down by now?"

      "WTF it's not slowing down!"

      Slams on the brakes while crashing into trailer...

      • Slams on the brakes while crashing into trailer...

        I don't think it braked at all. The car went another 900 feet before ending up in a guy's front yard, going through two fences and taking down a utility pole on the way. At the stated 65 mph, that's almost 10 seconds, and I'm sure the car wasn't going that fast after hitting the truck. I'm kinda wondering why the car didn't notice a sudden deceleration (too sudden for anything but a collision) and didn't stop right then and there. Also of note - the [techinsider.io]
        • I was "role-playing" the situation Tesla wants: a driver at the wheel who is actually paying attention while relying on the autopilot to do the driving, then reacting way too late because of the too-late realisation that the autopilot does not react. That's the problem with an incomplete autopilot: is the driver expected to brake and steer to avoid obstacles? And if so, which obstacles?

          Very well possible that the victim in the actual case didn't even see it coming.

  • I believe self-driving cars will eventually take over. However, this accident does highlight one fallacy, namely the idea that a human driver can be expected to supervise a near-perfect self-driving car.

    Just think about it: if your car has been driving perfectly for a whole year, would you find it easy to keep your eyes glued to the road and your hands on the steering wheel, just in case the car’s computer has a nervous breakdown? Wouldn’t you start playing with your smartphone, eat a sandwich or ev

    • by Zumbs ( 1241138 )
      If you recall the cars in Demolition Man, they have two modes: Self-driving where the car does everything and Manual where the driver does everything. When switching to self-driving, the wheel is pulled into the dashboard and locked, so the driver has a very visual and tactile way of keeping track of the current mode.
  • This is BS (Score:5, Informative)

    by tomxor ( 2379126 ) on Sunday July 03, 2016 @05:38AM (#52437779)

    Go see the videos for yourself

    ...adding that one of his videos of a near-miss went viral.

    Context-mincing BS: the near miss was a truck haphazardly changing lanes without looking into the Tesla's lane... the Tesla avoided the accident, but this is phrased to be intentionally interpreted as the exact opposite.

    His other video showing "reckless" driving is cringeworthy, but it's actually on a private road. I think the autopilot is actually pretty dangerous and incorrectly interpreted as "self driving", as others here have stated, but that's no reason for this crude BS article that reads like it's been paid for by the defence lawyers.

  • This comment [slashdot.org] says it like it is - the Tesla autopilot lulls the driver enough that when he must intervene, he is unprepared (And intervene he must -- the Tesla autopilot does not use LIDARs [9to5google.com], as Google's cars do. Musk pooh-poohed that approach as unnecessary and went with a cheaper camera and computer vision based approach.).

    I hope he reverses course. Tesla needs inward-facing tech - cameras and FLIR sensors, gaze detection algorithms, steering-wheel grip sensors - to ensure the driver is 'driving'.

  • Elon Musk argued that "Long term, it'll be way better than a person. It never ..... argues with someone in the car."

    But then :-

    Tesla tells drivers to 'keep your hands on the wheel at all times and stay alert'

    Because it can argue with you then.

  • Remember when I said that? Remember when I got mocked for saying that? Are you going to remember I said that when it turns out I'm right? Of course you won't.

    ..but I'll say it again, now, anyway: Too much automation will make people dumber, less skilled, and lazier overall. Some of them are practically chomping at the bit for the opportunity to become lazier and stupider. Should we just call this Evolution in Action? Or should we do something to save people from themselves? Oh, wait, it's not just the stup
  • I almost got into an accident with a 2003 Subaru WRX some time ago. The antilock brakes, which you can't disable without some trivial modifications to the car, get activated by hitting even a single pothole (not even a deep one) on a clean, dry road, then stay activated for around 2 seconds. This had always bothered me, as you could then only brake at about 60% of normal, and it happened often. One day I hit a pothole just as the car in front of me quickly slowed down on a turn ramp for no apparent reason.
  • Face it. People are limited and stupid. This is not a great secret to life. If you design a system in a car, it has to work with the most limited and stupid of all people. Again, this is something that has been learned by the automotive industry over years and years, it's no secret. I personally believe Tesla was wrong to sell this system in their cars, because if you leave the door open for people to abuse anything then people will. They tried to design a system that augmented the ability of humans wi
  • This has to be updated: http://www.snopes.com/autos/te... [snopes.com]

  • by fred911 ( 83970 ) on Sunday July 03, 2016 @08:31AM (#52438201) Journal

    Did anyone bother to look at the graphic (top of page here: http://www.nytimes.com/interac... [nytimes.com]) ?

      I would assume that if there were traffic control at this intersection, it would show limit lines. Without traffic control, it looks like the truck clearly failed to yield the right of way. Many times, unsafe truck drivers use their size to intimidate passenger vehicles, failing to yield the right of way and just being unsafe. Try to drive 10 miles over the posted speed limit on the Garden State Pkwy (even in the "slow" lane) without having a 9-ton tractor 1 foot off your bumper, and you'll know exactly what I mean.

      Situations where opposing traffic makes a left-hand turn (stateside) kill many, many human drivers. No one can assume that even a human here could have avoided the collision, except for the truck driver, who clearly failed to yield the right of way.

      Smells like media bullshit to me.

    • I think you should try driving a truck and see how easy it is to 'break' into busy roads without getting in anyone's way.
  • Humans are great at circumventing safety. This has to be anticipated in the design of anything which has a safety critical function. And that includes autonomous vehicle activity.

    Tesla's self-drive mode must *force* the driver to hold the wheel and potentially do other actions to ensure their complete attention. And if they can't do that then they have no business putting such a feature into the car. And that's assuming the feature works as intended, which it clearly doesn't if it ends up hitting another
