
Police Release First Video From Inside the Uber Self-Driving Car That Killed a Pedestrian (recode.net) 698

An anonymous reader quotes a report from Recode: Three days after an Uber self-driving vehicle fatally struck a pedestrian in Tempe, Ariz., police have released video footage of what the vehicle's cameras saw moments before it ran the woman over, and of what happened inside the vehicle, where an operator was at the wheel. The footage does not conclusively show who is at fault. However, it seems to confirm initial reports from the Tempe police that the pedestrian, Elaine Herzberg, appeared suddenly. It also shows the vehicle operator behind the wheel intermittently looking down while the car was driving itself.
  • by Snotnose ( 212196 ) on Wednesday March 21, 2018 @08:08PM (#56301459)
    I don't understand people like this. It's dark, you aren't lit, you're crossing a road with large, fast-moving, well-lit, hard-to-miss cars, and you can't be bothered to look for oncoming traffic. I only saw the video twice on the news, but it looks like she never knew the car was there.

    That said, I'm glad I was correct in my knowledge of what those "safety drivers" actually do all day.
    • by Vermonter ( 2683811 ) on Wednesday March 21, 2018 @08:17PM (#56301517)

      The scariest part is that the pedestrian does not react to the car at all before being struck. It's as if she either made no effort to check for oncoming traffic, or simply had the mindset of "I have the right of way, the vehicle will stop for me."

      We'll never know what her motive was for crossing at such a poor time, and it's a tragedy that this happened, but her choice to cross there was baffling.

      Also, the driver was "intermittently" looking down? No, the driver looked up only twice, each time briefly, with very long periods of staring down in between. The crash may have been unavoidable regardless, but until self-driving cars are more reliable, taking your eyes off the road like this is not a good idea.

      • The scariest part is that the pedestrian does not react to the car at all before being struck.

        I think the woman must have been very confident that the car would have seen her and stopped.

        • The scariest part is that the pedestrian does not react to the car at all before being struck.

          I think the woman must have been very confident that the car would have seen her and stopped.

          That's because a human driver would have seen her and swerved. She was more than halfway across the lane, after all. I've missed many pedestrians in similar conditions and similar positions in the lane. Lots of people have.

    • by goombah99 ( 560566 ) on Wednesday March 21, 2018 @08:30PM (#56301611)

      I had heard reports that the video showed her popping out of nowhere. That is absolutely not what it shows. It shows her suddenly entering the region lit by the headlights, not appearing from behind a bush.

      What's the cardinal rule of driving at night or in a snowstorm? NEVER outdrive the range of your headlights. That is, your stopping distance absolutely, positively has to be within your range of sight. Anything else is completely irresponsible.

      That's clearly what happened here. The woman in the video appears as if by magic over a couple of frames, going from darkness into the headlight zone. That means the leading edge of the headlight zone was something less than half a second from impact. There is no way to stop in that time.

      This is de facto outdriving your headlights. Uber is guilty. Case closed.

      Now, moving on to the technical details this also shows: I think part of the problem is that the dynamic range of the camera sucks. I am fairly sure my own eyes would have been able to see further into the dark. Those black pixels are not just dark, they are completely saturated at the dark end; nothing is resolvable in them, which is why her appearance time is so short. This is a serious problem for all such systems, as the dynamic range of most cameras is very limited, especially when we're dealing with 1/R^4 light falloff (1/R^2 outbound to the target and then 1/R^2 for the reflected light). A sensor with 256 intensity levels (8 bits) thus delivers only a small fraction of that as usable dynamic range, and if you account for glints and such it's even less. No wonder she pops out of nowhere.
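      The falloff-plus-quantization point above can be sketched numerically. This is an illustrative toy model, not Uber's camera pipeline: the pure 1/R^4 law, the reference distance r0, and the 8-bit full scale are all assumptions chosen to match the comment's argument.

      ```python
      # Toy model: how 1/R^4 light falloff interacts with 8-bit quantization.
      # A target at distance r0 is assumed to saturate the sensor (level 255);
      # everything farther away falls off as (r0/r)^4.

      def quantized_level(r, r0=10.0, full_scale=255):
          """8-bit sensor reading for a target at distance r (same units as r0)."""
          signal = (r0 / r) ** 4                # relative irradiance, 1/R^4 falloff
          return round(min(signal, 1.0) * full_scale)

      for r in (10, 15, 20, 30, 40):
          print(r, quantized_level(r))
      # At 2x the reference distance the target is down to 16 counts out of 255,
      # and at 4x it is a single count -- indistinguishable from sensor noise.
      ```

      The point is not the specific numbers but the shape of the curve: a fourth-power falloff eats 8 bits of range very quickly, which is consistent with the pedestrian being invisible in the released footage until she entered the headlight beam.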

      Secondly, where the hell was the lidar here? Shouldn't that have spotted her?

      Uber is flagrantly at fault.

      • by willy_me ( 212994 ) on Wednesday March 21, 2018 @09:09PM (#56301951)

        The video appears to be deceiving. It is almost as if it was purposely dimmed before being released. A human behind the wheel would be able to see much more than what is shown in the video. Look at the buildings in the background, the ditch further down the road... it is all black. No more than 50' from a street light, and everything is black. The human eye is so much better than that. If the driver had been watching, he would have seen her. Any decent video system should also have been able to see her. Uber has no excuse: the cyclist was technically at fault, but the Uber car should never have hit her. The car never even slowed down.

        Deer are harder to see than a cyclist with reflective shoes, and most drivers would have avoided a deer in this situation.

        • I actually wanted to say the same thing too but I didn't feel like being told to put on my tin foil hat today. It's almost like the video was deliberately made bad. Or it came from a camera with a tiny lens.
        • yeah, the video only shows two white stripes ahead of the driver on the left but much more on the right.

          You also have to notice that Uber's plan for driver attentiveness appears to have allowed quite a bit of inattention.

      • by Jodka ( 520060 )

        What's the cardinal rule of driving at night or in a snowstorm? NEVER outdrive the range of your headlights. That is, your stopping distance absolutely, positively has to be within your range of sight. Anything else is completely irresponsible.

        Objects approaching a traveling vehicle from the side can intersect its path nearer than the furthest projection of the headlight beam. This is a case where your rule fails to protect against night-time collisions. In the linked video, the woman with the bicycle approaches the path of the oncoming Uber car from the side, not from the front.

        I once collided with a raccoon in my RX-7 because, though I was traveling at a safe speed, the raccoon ran in front of my car from the side.

    • by hey! ( 33014 ) on Wednesday March 21, 2018 @08:33PM (#56301633) Homepage Journal

      The cyclist should have had wheel reflectors and front and back lights at a minimum, as well as reflective clothing.

      This does not, however, mean the driver -- or should I say human attendant -- is not also at fault for (apparently) texting. This kind of road is where you need to be especially alert, because of the combination of poor lighting and high speed.

    • by Whiney Mac Fanboy ( 963289 ) <whineymacfanboy@gmail.com> on Wednesday March 21, 2018 @08:37PM (#56301665) Homepage Journal

      You're kidding right? It is impossible to tell from that video.

      WTF happened to LIDAR and sub millisecond braking reactions? The woman stepped out of a shadow at a point where a human would've struggled to brake hard enough to stop, but a machine should've been able to sense via lidar an object moving ACROSS ANOTHER LANE in a trajectory that would end in front of it fast enough to at least brake enough to turn a death into an injury.

      I don't think this is a problem with autonomous cars in general, but a problem with Uber's 'I got mine, fuck everyone else' mentality towards everything. I doubt they're prioritizing pedestrian safety whatsoever.

      • by angel'o'sphere ( 80593 ) on Wednesday March 21, 2018 @08:52PM (#56301799) Journal

        The car obviously had no lidar.

        The lady was about 15 yards away ... total distance to brake from 35 mph is about 20 yards in perfect conditions (not counting reaction time, which alone would eat another 10 yards), and 40 yards in general.

        I don't think this is a problem with autonomous cars in general, but a problem with Uber's 'I got mine, fuck everyone else' mentality towards everything.
        True. A car without LIDAR and various radars and ultrasonic sensors for road texture is not really self-driving ready. This car basically had only an autopilot, lane detection and sign detection. Pedestrian detection failed due to bad light/camera conditions.

    • by infernalC ( 51228 ) <matthew DOT mellon AT google DOT com> on Wednesday March 21, 2018 @08:56PM (#56301835) Homepage Journal

      The pedestrian didn't seem to notice the car. The car appears to be a Ford Fusion (probably hybrid). If it was in charge-sustaining mode, the car might have been very difficult to hear.

      The pedestrian was wearing a yellow hat and pushing a pink bicycle. Her shirt was dark, though. She was visible to the camera for only about 0.77 seconds prior to impact. A human being would take 0.5-2 seconds to react to an object in the road once it became visible. Depending on the human, the pedestrian might have been visible for a couple of seconds longer than we see her in the footage, but the safety driver appeared to be distracted.

      The reaction time of the autonomous car should be milliseconds. Assuming that the dashed lane markers are fairly evenly spaced, the car doesn't appear to have decelerated at all from my perspective. According to the police, the car was traveling 38 MPH, or roughly 61 km/h. On dry pavement with decent tires, the stopping distance in meters, without accounting for any reaction time, should be about (s^2)/(250*0.8) with s = speed in km/h... so about 18 meters, or, to be generous, 60 feet.

      See https://korkortonline.se/en/th... [korkortonline.se] .

      Judging from the aerial layer on Google maps, the distance between the beginning of a lane marker and the beginning of a subsequent lane marker is 30 feet or so. From this, I think the first time you see the victim in the video she's about 43 feet away (.77 seconds at 38 MPH).
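      The arithmetic above can be checked in a few lines. All the figures (38 mph, 0.77 s of visibility, friction coefficient 0.8) come from the comment itself; the formula is the same rule of thumb the comment links to, not a measurement from the crash.

      ```python
      # Check the parent's numbers: braking distance at 38 mph via the
      # d = s^2 / (250 * f) rule of thumb (s in km/h, f = friction ~0.8),
      # and the distance covered during the 0.77 s the victim is visible.

      def braking_distance_m(speed_kmh, friction=0.8):
          """Approximate braking distance in metres; reaction time NOT included."""
          return speed_kmh ** 2 / (250 * friction)

      MPH_TO_KMH = 1.609344

      speed_kmh = 38 * MPH_TO_KMH              # ~61.2 km/h
      d_brake = braking_distance_m(speed_kmh)  # pure braking distance

      speed_ms = speed_kmh / 3.6               # ~17 m/s
      d_visible = speed_ms * 0.77              # distance covered while visible

      print(round(d_brake, 1), "m to brake;", round(d_visible, 1), "m of visibility")
      ```

      The braking distance (~18.7 m, about 61 ft) comes out longer than the ~13 m (about 43 ft) of camera visibility, which matches the comment's conclusion: by the time the camera could see her, stopping was already impossible without earlier sensor input.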

      Here's the thing though... the LIDAR should have seen this in time to at least swerve to avoid. The LIDAR should also have seen the victim before the victim was visible in the headlights. In my state, the driver has the responsibility to swerve to avoid even if there isn't enough time to stop. It's obvious that there was nobody in the left lane (even in the blind spot, which isn't blind with LIDAR).

      This really seems like an example of where an autonomous car could have saved a life that would have been lost due to a human driver's natural limitations, but it failed to do so. The car should have been able to see hundreds of feet, and the car should have had practically zero reaction time. Just as you would be lenient in judging an older driver for longer reaction times, I think we should hold the autonomous car to a higher standard.

      This thing was a test vehicle. The debug-level logging of the incident should be made public so that if there was a bug that killed this woman, the truth will be known.

  • LIDAR (Score:5, Insightful)

    by Aero77 ( 1242364 ) on Wednesday March 21, 2018 @08:11PM (#56301477)
    This is a good example of why visual sensors are insufficient for autonomous driving.
    • This is a good example of why visual sensors are insufficient for autonomous driving.

      Uber uses LIDAR. Of the major SDC companies, only Tesla does not. Tesla is camera-only.

      I have no idea why the LIDAR didn't work to detect this woman. From the video, it looks like the car didn't brake at all.

      • Lidar is specifically supposed to work in the dark, is it not? It's almost as if there was something wrong with the sensor.
      • by AaronW ( 33736 )

        Tesla has video (8 cameras), radar (up to 160 m), and long-distance (8 m) ultrasonic sensors.

    • Why?

      Humans use visual sensors exclusively for driving, humans would have had the exact same accident.

      Trying to say that autonomous cars cannot have any accidents is silly and is an arbitrarily high bar.
      I'll be plenty happy if they are just 2 - 3x less likely to have an accident compared to human drivers. That would be a massive improvement to society.

      • Humans use visual sensors exclusively for driving

        Worldwide, human drivers kill 3500 people per day.

        Trying to say that autonomous cars cannot have any accidents is silly

        Saying that SDCs should do no better than humans is silly too.

        I'll be plenty happy if they are just 2 - 3x less likely to have an accident compared to human drivers.

        We should be aiming higher than that. This accident should have been preventable by a properly implemented Radar or Lidar system.

        • Re:LIDAR (Score:4, Insightful)

          by Rei ( 128717 ) on Wednesday March 21, 2018 @09:18PM (#56302019) Homepage

          That's part of the crazy thing.... she was pushing a bike, radar should have seen her too.

          LIDAR should always see pedestrians, easy. But when you're pushing a bike - large object made of interconnecting angular metal structures - across the road, it should be a glowing beacon to radar.

          I don't know what sort of junk system Uber has implemented, but it clearly should not be allowed on the road without an audit.

          • by geoskd ( 321194 )

            LIDAR should always see pedestrians, easy. But when you're pushing a bike - large object made of interconnecting angular metal structures - across the road, it should be a glowing beacon to radar.

            It probably did appear and was likely mis-classified as something that could be safely ignored. I give odds that this is a software bug of some kind.

            The test of the thing will be if the engineers can properly identify the root of the software problem and fix the system so that it properly identifies the hazard when the saved sensor data is replayed through the controller software.

      • How can you tell? The video quality is terrible. Things would have been much clearer if you were in the car. Even in the video you can see the shadow moving a couple seconds before the accident.
      • by Cederic ( 9623 )

        Humans use visual sensors exclusively for driving,

        Hmm, no. They use sound too, and you'd be amazed how much you use whatever the fuck the sense is that tells you whether you're moving or not - and not just in the direction of the car.

        I sure as shit can't see that my rear tyres have no grip, I can, erm, sense it. I can't see that my brakes have locked, I sense it. I can't see that I just hit a woman with a bike, I was looking down at my phone and only sensed it.

        humans would have had the exact same accident.

        Possibly. The video footage shared is very inconclusive on that front. It doesn't sufficiently show what a human eye would have seen.

    • by bartle ( 447377 )

      I've been in this exact situation twice, where someone dressed in black decided to cross a darkened road directly in front of me. In both situations, I had to brake hard to prevent hitting them.

      The tip-off was that I noticed lights blinking out ahead, due to something occluding them. It was an extremely subtle effect, one I would have missed if I hadn't been paying full attention, and one which I do not think AI is capable of recognizing.

      Simply put, I doubt that computer-based vision will meet the capabilities of the human eye.

      • by novakyu ( 636495 )

        The truth is, a good driver would not have hit that pedestrian.

        If autonomous cars are not as good as a good driver (but only good as an average or below-average driver), the whole safety argument for autonomous cars goes out the window. Uber must pay the consequences for this accident, as the video clearly shows that Uber's self-driving car did not meet the safety standards that all self-driving cars ought to meet.

        • by geoskd ( 321194 )

          The truth is, a good driver would not have hit that pedestrian.

          That is only half of the truth. The whole truth is that only a small percentage of humans are good drivers. As a professional driver, I can tell you that even professional drivers are only good drivers for part of the time they are on the road. There are many times when they are not at their peak. There are a large set of drivers who can only be classified as "crappy" when they are on their game, and "drunken lemurs" when they are not.

      • That is what I have been trying to say. Humans use so many more visual cues, especially to sense danger.
  • by shess ( 31691 ) on Wednesday March 21, 2018 @08:12PM (#56301479) Homepage

    Yes, this was hard to see using passive techniques with visible light (i.e., your eyes), but WTF, the person wasn't sprinting or jumping off the curb; something active like LIDAR should have had no trouble spotting this.

    • by haruchai ( 17472 )

      Yes, this was hard to see using passive techniques with visible light (i.e., your eyes), but WTF, the person wasn't sprinting or jumping off the curb; something active like LIDAR should have had no trouble spotting this.

      I would think radar should have spotted her & the bike from a ways off, too

    • by NaCh0 ( 6124 ) on Wednesday March 21, 2018 @11:43PM (#56302963)

      The dynamic range of human eyes is much greater than a camera's. It was not pitch black outside to a human. I have lived in Tempe, and the ambient light of the city would be enough for at least minimal night vision to apply. This is the reason you drive without your dome lights at night: so that your night vision can be effective. Texting on your phone in a part of town where a lot of people roam the streets (such as south Scottsdale Rd) is simply a negligent thing to do.

  • I think we will make progress on these issues when we collectively stop pretending that "operator inattention" is an unwanted by-product of automated cars rather than the intended result.
    • Operator inattention did not cause this accident. Although he looked down several times, he was looking at the road when the woman appeared in the headlights. There was not enough time to react.

      • Keep in mind that you would have seen this much more clearly in reality; the video quality is terrible, as if it was taken with a '90s camera. Even in this video you can see a silhouette of her head in the shadow with enough time to stop.
    • by Jeremi ( 14640 ) on Wednesday March 21, 2018 @08:48PM (#56301773) Homepage

      "Operator inattention" is absolutely the ultimate goal of automated cars.

      If the human has to pay attention, then the human might as well drive, and the automation is pointless.

  • by PeeAitchPee ( 712652 ) on Wednesday March 21, 2018 @08:21PM (#56301543)
    I've seen this happen on huge college campuses as well. Legions of kids crossing streets while paying zero attention to the potential for oncoming traffic. Usually it's because their face is buried in their phone, but sometimes it's not, and they literally step right off the curb into traffic for seemingly no reason. It might make me sound like an old guy but my generation had a healthy fear of death by car instilled into it (by our parents and guardians) which seems to be sadly lacking these days. It's amazing that more people aren't routinely run down.
    • by novakyu ( 636495 )

      I drive through Berkeley all the time where pedestrians routinely step into the street when I have the green light.

      But that doesn't mean I can actually run them over and not suffer any consequences as Uber is apparently about to.

      P.S. And a real, human driver learns to drive more defensively in areas where they suspect pedestrians might behave more unpredictably. Apparently AI isn't "intelligent" enough to have that sense.

    • This I can agree with. My kids' elementary school had a large sign in the front door that asked people NOT to walk directly across the street from the parking lot. Everyone ignored it.
  • I stand by my post in the last topic :
    https://slashdot.org/comments.... [slashdot.org]
    (You people are CRAZY with your laws and cars; pedestrians should NOT have right of way at all times, as it seems they do where you are, or at least that's how most of you behaved while I was there.)

    That being said, someone speculated this was right near a late-night club in a boring area, where the only fun thing to do is drink. So maybe she was drunk.

    I'll tell you a few things: they crossed in the WORST spot, just IN the dark part after some light. Holy crap, does she come out of nowhere.

    • You are quite correct - the pedestrian should have been on the lookout, for their own safety at least, even if the law might be on their side (though I am not sure of the state and local laws there). Moreover, they could have been wearing a reflective vest or helmet, had reflective strips on the bike, not been wearing a black shirt, had a headlight on the bike, etc.: things that may not have made a difference here, but are generally good ideas (and in some places legally required) when using a bicycle at night.

    • by g01d4 ( 888748 )

      video != eyes

      I think that's part of the issue. The unlit street portion in the video was too dark (zero information) and I'm guessing not likely what a typical driver would've seen. To be driving at that speed with zero information of what's in front of you at that short distance is a recipe for trouble. If the victim wasn't inebriated into total indifference it's possible she assumed she was visible to the driver who wouldn't dare hit her regardless of who had right of way.

  • I am sure there are many instances where self-driving cars have braked and saved lives that would otherwise have ended in injury or death. Such incidents never make it to the front pages.

  • Expected (Score:5, Insightful)

    by markdavis ( 642305 ) on Wednesday March 21, 2018 @08:31PM (#56301623)

    Exactly what I expected to see....
    Someone walking a bike.
    At night.
    No streetlights.
    No backlighting at all.
    Wearing black top and dark pants.
    With no lights at all on the bike.
    No lights on the person.
    Not in a crosswalk.
    Apparently not looking.
    About 2 seconds of visibility.

    The pedestrian is almost 100% wrong in every possible way. I don't see how this could be ANY human driver's fault, had a human been driving. As for the autonomous car, I guess it depends on its sensors. Could their system have had an infrared camera or other sensor that could have seen the reckless pedestrian sooner than was evident in [human-]visible light? That would have been nice. But does that make the pedestrian less at fault? I think not.

    • Yet you can see a person on a bike. So should have the car.
      • Re:Expected (Score:4, Insightful)

        by markdavis ( 642305 ) on Wednesday March 21, 2018 @09:04PM (#56301905)

        >"Yet you can see a person on a bike. So should have the car."

        Yes, I saw the person for maybe 2 seconds in the video. Had that been the second I was checking my speed or a mirror, it would have been less. I might have had time to brake or swerve, and swerving might have made it worse. But just seeing the bike for 2 seconds doesn't make it the vehicle's fault.

        Oh, you might think "well, if it were a car in front of you, and you followed the 2 second [following distance] rule, you should be able to stop in time". And I would agree... BUT the car would have tail lights AND probably brake lights and I would have already known it was there and from far, far away. AND it would be in a fairly predictable location with fairly predictable actions. In such a case, yes, I would be at fault as the rear-ender. And yet, same scenario- if that car in front at night had NO lights and NO brake lights, it would immediately shift to being their fault. And that is without that unlit car coming into view at the last few seconds FROM ACROSS A MEDIAN!

        But I *do* agree that an autonomous car with lots of high-tech sensors should have been able to "see" what was happening [beyond human visible light] sooner and at least tried to brake. Still doesn't mean the car is at fault.

        • Re:Expected (Score:4, Interesting)

          by fluffernutter ( 1411889 ) on Wednesday March 21, 2018 @09:09PM (#56301955)
          First of all, the woman would have been much more visible than the video captured. The video was very dark; way darker than adjusted human vision. Secondly, as you said, where are all these nice sensors that are going to make automated driving safer? What happened to 'these cars can see way better than you can'?
    • Re:Expected (Score:5, Insightful)

      by SlaveToTheGrind ( 546262 ) on Wednesday March 21, 2018 @09:12PM (#56301967)

      The difference, of course, being that an actual human driver would have actually been watching the road (imagine that) and would have, when finally seeing the pedestrian, (a) swerved; (b) slammed on the brakes, and/or (c) most likely, both, rather than plowing into her at full speed while mouthing "oh shit" after having finally looked up from staring at a smartphone in their lap. That difference might well have left her just seriously injured rather than dead.

      It's not a perfect world. "SHE DIDN'T HAVE THE RIGHT OF WAY" doesn't come even close to excusing (a) an insufficiently designed guidance system paired with (b) an unbelievably irresponsible "safety driver."

  • by angel'o'sphere ( 80593 ) on Wednesday March 21, 2018 @08:39PM (#56301695) Journal

    ... however the test driver did not really pay attention.

    Being a test driver is obviously a fucked-up job: 99% is killing time and 1% is killing.

    In Germany there is not one test driver but three: one who would react if something goes wrong, and two to log notable events.

    In this case it is notable that the lights are configured incorrectly. They barely shine 15 yards ahead, which is definitely wrong, and a driver or the automated driving system should have reduced speed to about a third of what it was doing.

  • I'm sure several automation fanbois got a hard reality check today. Tough love.
  • Yeah, he’s there babysitting the new tech, but assuming the vehicle didn’t slow down, could he have intervened if he wasn’t distracted? He appears to be reading something, since he smiles after looking at whatever he’s holding below the camera view. My money is on that guy getting an NTSB finger pointed at him.

  • Doesn't look good (Score:5, Insightful)

    by quantaman ( 517394 ) on Wednesday March 21, 2018 @08:48PM (#56301767)

    True, she comes out of nowhere in the video, but that's a really crappy video. She was walking slowly and was already in the car's lane when the headlights hit her; even if she had been stationary, the result would have been the same.

    Of course a human driver could have hit her as well, but I suspect that more often than not a human driver would have seen her far enough ahead to stop, or at least to swerve enough to avoid her (and of course most Ubers might have as well).

    I'm curious if that's the only video available since decent cameras are not that expensive, and I'd expect the car to have several cameras at different contrast levels.

    • True she comes out of nowhere on the video, but that's a really crappy video.

      The video was released in a way that makes Uber look as good as possible.

  • I maintain that a 2-mile stretch of road with a limit of 35 (previously 45 mph) that's eight lanes across should have more than one crosswalk, and probably shouldn't exist in this form at all. I hate Uber and the empire they've built on the backs of the working poor, but city planning has to modernise with our tech. The real wonder here is that people aren't killed on that road CONSTANTLY.

    • by Kaenneth ( 82978 )

      Yeah, they should build a pedestrian bridge across it, that way no one would ever get killed.

      oh, wait.

  • by MobyDisk ( 75490 ) on Wednesday March 21, 2018 @09:36PM (#56302147) Homepage

    Suppose that this was not a self-driving car. You see a video of a driver spending 50% of their time looking down at a (phone, book, video game, etc.) and 50% looking ahead. They look ahead, and suddenly get an OH SH*T look and plow someone down. What would the law say?

    1) The pedestrian was negligent.
    2) The driver was negligent.

    This is contributory negligence, and I don't think the driver would get off with no penalty just because the pedestrian was negligent. This cannot be allowed to continue.

    So back to the self-driving part: either the driver thought "Oh, it's a self driving car, I'll play a video game" or Uber said "Monitor this status console here on your lap and just look up every now and then to make sure that you don't plow over someone." The police need to figure that out. If it is the former, the law should do whatever they normally do in cases of contributory negligence. But if it is the latter, then Uber needs to lose their license for testing these cars, and face a big fine.

  • by viperidaenz ( 2515578 ) on Wednesday March 21, 2018 @09:51PM (#56302229)

    That's at least enough time for a real human to begin applying the brakes.
    Slowing down by just 5mph would have given the woman a 30% higher chance of surviving.

    With real eyes looking and not a camera, you'd be able to see more detail in the shadows. There are street lights there and a human eye has a much greater dynamic range than a camera.

    The person behind the wheel looked up and to the left before the impact, suggesting he saw the woman in the other lane. The camera couldn't see the woman until she was directly in front of the driver's side of the lane, which suggests a person could have seen her in the shadow where the camera, with its limited dynamic range, couldn't.

    Perhaps Uber should have forked out for HDR cameras.
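    The parent's point about even brief braking mattering can be roughed out. The ~7 m/s² hard-braking deceleration is an assumed typical value for dry asphalt, not a figure from the case, and the 38 mph starting speed is the police figure quoted elsewhere in this thread.

    ```python
    # Rough sketch: how much speed even last-moment braking sheds.
    # Assumes a constant hard-braking deceleration of ~7 m/s^2 (dry asphalt).

    MPH_TO_MS = 0.44704  # mph -> m/s

    def impact_speed_mph(initial_mph, brake_time_s, decel_ms2=7.0):
        """Impact speed (mph) after braking for brake_time_s seconds."""
        ms = initial_mph * MPH_TO_MS
        remaining = max(ms - decel_ms2 * brake_time_s, 0.0)
        return remaining / MPH_TO_MS

    # Braking for only the final half second before impact:
    print(round(impact_speed_mph(38, 0.5), 1))
    ```

    Under these assumptions, half a second of hard braking drops 38 mph to roughly 30 mph, a reduction of nearly 8 mph; by the parent's reasoning about survivability, even that much would have meaningfully improved the pedestrian's odds.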
