United States Transportation

Fully Self-Driving Cars May Hit US Roads in Pilot Program: NHTSA (reuters.com) 82

Fully self-driving cars may be in the fast lane to U.S. roads under a pilot program the Trump administration said on Tuesday it was considering, which would allow real-world road testing for a limited number of the vehicles. Reuters: Self-driving cars used in the program would potentially need to have technology disabling the vehicle if a sensor fails or barring vehicles from traveling above safe speeds, the National Highway Traffic Safety Administration (NHTSA) said in a document made public Tuesday. NHTSA said it was considering whether it would have to be notified of any accident within 24 hours, and was seeking public input on what other data should be disclosed, including near misses. The U.S. House of Representatives passed legislation in 2017 to speed the adoption of self-driving cars, but the Senate has not approved it. Several safety groups oppose the bill, which is backed by carmakers. It has only a slender chance of being approved in 2018, congressional aides said.
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • Dumb (Score:2, Funny)

    by Anonymous Coward

    If they already know the cars are going to hit the roads, why are they launching them anyway?

  • by Anonymous Coward

    Bet that's not the only thing they hit.

  • I've lived and visited some places where many of the locals really shouldn't be licensed to drive, ever (Upstate New York in particular). It would be a great place to test self-driving cars as it couldn't possibly make their situation worse.
    • Safe out-of-state driver: "Make sure your seatbelts are on and sit quietly."

      Local driver: "Hold my beer and watch this."

    • I've lived and visited some places where many of the locals really shouldn't be licensed to drive, ever ... It would be a great place to test self-driving cars

      Yeah, right, let's put automated cars on the roads where you think the people would be less able to react to them properly. NIMBY?

      I like this from the summary: "or barring vehicles from traveling above safe speeds,". Sense of deja vu. Who was it who wrote "the book" on the Corvair? "Unsafe at Any Speed". Yes, I know, Google is my alleged friend, but we're currently not on speaking terms over privacy violations.

    • by igny ( 716218 )
      Yes, let us take the most expensive(*) vehicles on the road and then let some "licensed" New Yorkers attack them from all angles.

      (*) By R&D expenses.
      • Well, you know, if I applied "drone" logic to these AV, I'd be gettin' my shotgun out and blastin' away at any of them durn things that gets anywhere near me or my property. Stinkin' perverts lookin' in the window of my 16 year old daughter's bedroom...
    • I have to disagree. I spent time in Upstate New York, and the drivers seemed courteous & respectful... a dramatic contrast from Southern California, where even the cops say "I give up" as people speed by at 85.

      - BTW, a note for visitors from Baja California and residents of SoCal:

      - The left lane is not the slow lane. If you're driving 55, that is just fine, but please move to the far right. (Yes, that's why everyone is blowing their horn at you.)

      • by dcw3 ( 649211 )

        Yes, even if you're going the speed limit, there's a reason that the signs tell you that slower traffic should keep to the right.

      • I have to disagree. I spent time in Upstate New York, and the drivers seemed courteous & respectful

        My hypothesis is that all communities of drivers lie on a continuum between courteous and skilled. Upstate NY drivers are very courteous but woefully unskilled. Drivers in other parts of the country are on the other end and many are somewhere in the middle.

        Spend some more time Upstate (particularly in the Syracuse area) and you'll see just how unskilled they are. On beautiful, clear, dry summer days you can count on seeing several vehicle roll-overs (often single-vehicle roll-overs ...)

  • by AlanBDee ( 2261976 ) on Tuesday October 09, 2018 @06:28PM (#57453208)

    After any accident, all sensor data should be made public so that it can then be used to further train AI systems. If it's not a law, then companies will keep it to themselves so that they can improve only their own AI and not their competitors'. The net result is that different companies' AIs will have to "learn the same lesson" multiple times instead of once.

    • Not only that, but I would think that it would be advantageous to have multiple teams looking at a problem.

    • by Chrisq ( 894406 )

      After any accident, all sensor data should be made public so that it can then be used to further train AI systems. If it's not a law, then companies will keep it to themselves so that they can improve only their own AI and not their competitors'. The net result is that different companies' AIs will have to "learn the same lesson" multiple times instead of once.

      That seems like a good move and I think it's the way that air accidents and incidents are dealt with.

  • I know that people will quote a number of accidents (including two fatalities) with autonomous vehicles but the rate at which current technology has accidents is many times less than with humans behind the wheel in non-safety critical situations.

    The ironic thing is, safety critical situations are generally caused by humans. Somebody driving erratically, an accident taking place in front of the vehicle, somebody running a red light because they are distracted by a text. I would think that the more autonomous ...

    • The ironic thing is, safety critical situations are generally caused by humans.

      Yeah, that stupid, arrogant woman crossing the street where she shouldn't have been. It's all her fault. She was deliberately trying to ruin the perfect safety record of AV. She got what she deserved.

      I would think that if the weather is too bad for autonomous vehicles, it's also too bad for human drivers...

      Well, you might think that. You might be wrong. Humans have a lot of experience driving in snow and stuff. Yes, there are a lot of really funny videos of what happens on icy roads, but I don't think an AV can deal with zero traction on a hill any better than a human could.

      • by Anonymous Coward

        Yeah, that stupid, arrogant woman crossing the street where she shouldn't have been. It's all her fault. She was deliberately trying to ruin the perfect safety record of AV. She got what she deserved.

        To be clear, there were 4 overlapping causes of that accident (this is the type of data we never get in a traditional accident, which by itself is already justification for me to keep going):

        - The woman stepped into the road in an unusual place without looking.
        - The safety driver was not paying attention to the road, with strong evidence that she was watching the conclusion of a show on hulu
        - The emergency breaking system was deactivated at a software level, with the intention that emergency breaking be handled ...

        • by Anonymous Coward
          * brakes The brakes were broken. Them's the breaks. The emergency braking system was the module that was breaking by not braking...
        • by dcw3 ( 649211 )

          I would call the management which allowed such a situation to occur a 4th cause.

          I would call the management which allowed this to happen evil SOBs who should be sent to jail for pushing their schedule over safety.

        • - The emergency breaking system was deactivated at a software level,

          The software saw the woman, and applied the breaks.

          How does software that has been deactivated apply the brakes?

          Using the logic in your comment, EVERY AV accident will be the fault of a human, so there will never be an AV accident where we can blame the AV. AV are perfect.

      • Well, you might think that. You might be wrong. Humans have a lot of experience driving in snow and stuff. Yes, there are a lot of really funny videos of what happens on icy roads, but I don't think an AV can deal with zero traction on a hill any better than a human could.

        Damned thing will probably just park itself and call its 'remote human operator', or just call 911 or something. Too stupid to figure it out because it has no ability to think.

      • > It's all her fault.

        She was jaywalking in the middle of a highway, so yes, it was her fault. Plus she stepped in front of the car when it was only feet away. Even with instant braking, that car would not have stopped in time to miss the impact. SHE caused her own damn death.

        • Plus she stepped in front of the car when it was only feet away.

          No. She was visible for a full 8 seconds or so on the road before the impact.

          The poor quality video released makes it seem like she appeared out of nowhere, but an old lady crossing the road pushing a bicycle does not cover 2.5 lanes in 2 seconds.
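          A quick back-of-the-envelope check on that claim, assuming a standard 12-foot US lane width (my assumption, not a figure from the reports):

```python
# Sanity check: could a pedestrian pushing a bicycle cross 2.5 lanes in 2 seconds?
LANE_WIDTH_FT = 12.0        # typical US lane width (assumption)
lanes_crossed = 2.5
seconds = 2.0

speed_fps = lanes_crossed * LANE_WIDTH_FT / seconds   # required speed, ft/s
speed_mph = speed_fps * 3600 / 5280                   # convert ft/s to mph

walking_pace_fps = 4.5      # brisk walking pace, roughly 3 mph

print(f"Required speed: {speed_fps:.0f} ft/s ({speed_mph:.1f} mph)")
print(f"That is {speed_fps / walking_pace_fps:.1f}x a brisk walking pace")
```

          Covering 2.5 lanes in 2 seconds would require about 15 ft/s (roughly 10 mph), several times a walking pace, which is why the longer visibility window is the more plausible reading.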

          • If she's not visible in the video, then she's not visible to human eyes either. She did not become visible until the headlights were on her. (And also: Why in hell was she crossing the road? SHE has eyes too. She should have seen the headlights coming & avoided the car.)

            • If she's not visible in the video, then she's not visible to human eyes either.

              Incorrect. Lots of things are visible to human eyes that are invisible to the camera. The fact is that she was visible for a good 8 seconds or so - you can look up the findings in the official reports and what Uber had to say about it themselves.

              Uber themselves say that she was visible for a long time. Why are you disputing what they say?

  • by dkman ( 863999 ) on Tuesday October 09, 2018 @06:40PM (#57453258)

    Self-driving cars used in the program would potentially need to have technology disabling the vehicle if a sensor fails or barring vehicles from traveling above safe speeds

    Why is this necessary? Half the point of self driving cars is that they can go slower because I don't need to focus. Go 40 mph (64 kph) for all I care. I can be doing something else. I don't need to "hurry" at 70, just get me there.

    Though I suppose I do see why it legally "needs to be said". During the introductory phase it would be best to "flow with traffic", but once the majority are self driving they could lower the speed limits so any accidents that do happen are less dangerous.

    • by Kjella ( 173770 )

      Why is this necessary? Half the point of self driving cars is that they can go slower because I don't need to focus. Go 40 mph (64 kph) for all I care. I can be doing something else. I don't need to "hurry" at 70, just get me there.

      But you don't want half an hour's trip to become one or two hours. If the autonomous car is crippled because it's lost its long-range sensors, it's totally reasonable to force it to stop rather than slow down everyone on the road to 10 mph. Though I hope they don't do anything so silly as to ban vehicles with redundant sensors from operating in degraded mode - that's kinda the point of redundancy. I'm not sure why they have to make explicit rules about this; it sounds like the AI version of the "self-integrity" ...

    • Half the point of self driving cars is that they can go slower because I don't need to focus. Go 40 mph (64 kph) for all I care.

      Yes, that would be remarkably safer than the current situation. Imagine a freeway where 20% of the cars are AV, and 20% of the cars are going 40MPH instead of the 65MPH speed limit.

      No, I contend that half the point of AV is NOT that they can go slow because the passenger doesn't care how soon he gets to the destination. I think that's absolute nonsense.

  • by BrendaEM ( 871664 ) on Tuesday October 09, 2018 @06:41PM (#57453266) Homepage
    Ask any elevator operator.
    • What was the elevator accident rate before they became automated versus afterwards?

      • Here's the automated rate: U.S. elevators make 18 billion passenger trips per year. Those trips result in about 27 deaths annually.

        - I can easily imagine the pre-automated elevators had accidents due to operator stupidity or carelessness.... like closing the door on a passenger & killing him. Or moving the elevator up a floor as someone is trying to exit, and then they plunge to their death.

        Automated elevators don't do stupid stuff.
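        Taking the comment's figures at face value (they are the commenter's numbers, not independently verified here), the implied per-trip fatality rate is easy to compute:

```python
# Per-trip fatality rate implied by the figures in the comment above.
trips_per_year = 18e9       # US elevator passenger trips per year (from the comment)
deaths_per_year = 27        # annual elevator deaths (from the comment)

rate_per_trip = deaths_per_year / trips_per_year
print(f"{rate_per_trip:.1e} deaths per trip "
      f"({rate_per_trip * 1e9:.1f} per billion trips)")
```

        That works out to roughly 1.5 deaths per billion trips, the kind of baseline any automated-vehicle safety comparison would be measured against.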

        • by dcw3 ( 649211 )

          Automated elevators don't do stupid stuff.

          I have 3 of them just down the hall, and each of them has quirks, such as doors that close and reopen repeatedly w/o any reason. In my previous (government) building, I had a coworker get stuck inside one for several hours on a weekend when nobody was around.

        Here's the automated rate: U.S. elevators make 18 billion passenger trips per year. Those trips result in about 27 deaths annually.

          I have never seen, nor have I heard of, an elevator accident that happened because one elevator detected someone in the shaft that it had to avoid so it swerved into the next shaft and was hit by a passing elevator. Nor have I seen or heard of an elevator accident where one elevator slammed on the brakes to keep from hitting someone in the shaft and was run into by a following elevator.

          Perhaps comparing elevator automation to automobile automation is a bit of a stretch?

  • All I can say is: if they approve this, invest in body-bag companies, you'll literally make a killing.
  • by Joe_Dragon ( 2206452 ) on Tuesday October 09, 2018 @07:00PM (#57453350)

    What about Liability?

    • What about it? Are you asking whether the owner of the vehicle will be liable, or the manufacturer?

      Both. The manufacturer will ultimately pay the bill, but if I buy a device and send my device out on the road, where it injures you, your claim is against me.

      Just as I as a driver have an agreement with an insurance company to cover my liability, the owner of an autonomous vehicle would have coverage from the manufacturer. Essentially the manufacturer serves the same role as an insurance company as far as how a suit ...

      • Why would it be your fault? Did you tell your automated car to hit the pedestrian? Did you tell your automated car to do anything that would cause you to believe the trip would not be safe? If the answer is no, then it's not your fault.
        • Technically it's not your fault, but legally it is. Doesn't matter much, because the insurance picks up the tab anyway. They don't send a driver to prison for causing a deadly accident.

          Where I live, it's the same when there's a collision between a car and a bicycle or pedestrian. Legally, the car is always at fault, even if it did nothing wrong.

          • "Doesn't matter much"... sure it does. Don't know how accidents work where you are from, but where I am from you pay larger premiums and it goes against your driving record. No way am I taking that penalty for entering directions in my car that I had no reason to believe were dangerous.
        • First let's be clear it's not about fault, it's about liability.
          It's a question of who needs to pay the bill to get the damage fixed, not who is a bad boy.

          If my dog bit your kid, causing damage, you could expect me to pay for at least the medical bills, because it's my dog. I'd like to not because I did anything wrong, but because it's my dog that did the damage.

          Just by getting a dog I took on the risk that the dog would cause damage. (You and your kid didn't choose for me to get the dog, and so didn't assume ...)

          • I have a typo above. Instead of:

            I'd like to not because I did anything wrong, but because it's my dog that did the damage.

            That should be

            I'm liable not because I did anything wrong, but because it's my dog that did the damage.

          • The problem with your reasoning is, you are the only person who is able to keep your dog from biting a kid so yes of course you will get sued if it happens. This is not so with a self driving car. The owner is just entering directions that are assumed to be within safe parameters. The liability to make sure it doesn't hit a person rests entirely with the manufacturer.
            • The term to Google is "privity of contract".

              See also Winterbottom v. Wright (1842). Winterbottom, a postal service wagon driver, was injured due to a defective wagon wheel. Winterbottom sued.
              Held:
              The wagon was provided to Winterbottom by the postmaster. Winterbottom can file a claim only against the postmaster, with whom he has dealings.

              It is the postmaster who received assurances from Wright, so the postmaster can sue Wright. Winterbottom cannot "skip a step" and sue Wright.

              Later cases clarified that if ...

    • What about Liability?

      I'm sure self-driving cars would have to be insured for liability just like any other car.

    • by Erastus ( 520136 )
      Before long, car insurance to drive manually will cost _more_ than fully automated driving until, eventually, manual driving is a special privilege or is relegated to private driving tracks.
  • This pilot program will end with the first lawsuit filed for the death of an American child by the hands of a foreign robot vehicle.

  • Never understood why they don't just say "fine, you can put them on the roads, but they can't go faster than 15MPH," which pretty much puts them at crawling down side streets, across parking lots, on private property, and back roads. Then bump it up by 5 mph every year or two after a DOT safety review.

    Naw, it seems like a much better idea to just legalize them at highway speeds day 1.

    • Because highways are actually the easy part for self-driving cars, and disorganized alleys and dirt roads and parking lots are the hard part. Perhaps we should instead pass a law saying they must travel at least 65 MPH, then decrease it by 5 MPH a year.

      • True, it is easier. But kinetic energy is 1/2*mass*velocity squared.

        Your car going 15mph crashes with 1/19 the energy it has at 65mph. Basically a strong nudge compared to a high-velocity explosion.

        Nothing like being the auto-drive operator who finds system bug #12,943, which causes the car to suddenly swerve right when the road is slightly banked left, the sun is low and blinding the cameras, and a crow takes off from a bit of road kill at the last second. I think I'd rather have that exp ...
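        The 1/19 figure in the comment above checks out; since mass cancels when comparing the same car at two speeds, only the speed ratio matters:

```python
# Kinetic energy KE = (1/2) * m * v^2, so for the same car the crash-energy
# ratio between two speeds is just the speed ratio squared.
def energy_ratio(v_low, v_high):
    """Ratio of kinetic energy at v_low to kinetic energy at v_high."""
    return (v_low / v_high) ** 2

ratio = energy_ratio(15, 65)
print(f"A 15 mph crash carries {ratio:.3f} of the energy of a 65 mph crash")
print(f"That is roughly 1/{round(1 / ratio)}")
```

        (15/65)^2 is about 0.053, i.e. roughly 1/19 of the energy.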

  • killing 20 people for no good reason.

    Me? Pay-per-view of the execution of the person(s) responsible, with proceeds going to the victims' families.

    Seriously, someone needs to at least go to prison for 10+ years on this. And don't go blaming the dead driver, sounds like he wasn't happy either.

    Oh, you're hiding in Pakistan? Guess what? We have people trained for exactly that, shitstick.
