Fully Self-Driving Cars May Hit US Roads in Pilot Program: NHTSA (reuters.com) 82
Fully self-driving cars may be in the fast lane to U.S. roads under a pilot program the Trump administration said on Tuesday it was considering, which would allow real-world road testing for a limited number of the vehicles. Reuters: Self-driving cars used in the program would potentially need to have technology disabling the vehicle if a sensor fails or barring vehicles from traveling above safe speeds, the National Highway Traffic Safety Administration (NHTSA) said in a document made public Tuesday. NHTSA said it was considering whether it would have to be notified of any accident within 24 hours and was seeking public input on what other data should be disclosed, including near misses. The U.S. House of Representatives passed legislation in 2017 to speed the adoption of self-driving cars, but the Senate has not approved it. Several safety groups oppose the bill, which is backed by carmakers. It has only a slender chance of being approved in 2018, congressional aides said.
Dumb (Score:2, Funny)
If they already know the cars are going to hit the roads, why are they launching them anyway?
Re:Dumb (Score:4, Insightful)
Because without a trebuchet, how else would you make them hit the roads?
Re: (Score:2)
Some sort of punching device maybe?
Re: (Score:1)
If it's an Uber car it's likely to hit anything and everything.
Re: (Score:2)
The pilot, of course. It's right in the title!
HIT THE ROAD (Score:1)
Bet that's not the only thing they hit.
Can we suggest test markets? (Score:3)
"Local" approach to driving (Score:2)
Safe out-of-state driver: "Make sure your seatbelts are on and sit quietly."
Local driver: "Hold my beer and watch this."
Re: (Score:2)
I've lived in and visited some places where many of the locals really shouldn't be licensed to drive, ever... They would be great places to test self-driving cars.
Yeah, right, let's put automated cars on the roads where you think the people would be less able to react to them properly. NIMBY?
I like this from the summary: "or barring vehicles from traveling above safe speeds". Sense of deja vu. Who was it that wrote "the book" on the Corvair? "Unsafe at Any Speed", wasn't it? Yes, I know, Google is my alleged friend, but we're currently not on speaking terms over privacy violations.
Re: (Score:2)
Former presidential candidate Ralph Nader.
Re: (Score:2)
(*) By R&D expenses.
Re: (Score:3)
I have to disagree. I spent time in Upstate New York, and the drivers seemed courteous & respectful... a dramatic contrast from Southern California, where even the cops say "I give up" as people speed by at 85.
- BTW, for visitors from Baja California and residents of SoCal:
- The left lane is not the slow lane. If you're driving 55, that is just fine, but please move to the far right. (Yes, that's why everyone is blowing their horn at you.)
Re: (Score:2)
Yes, even if you're going the speed limit, there's a reason that the signs tell you that slower traffic should keep to the right.
Re: (Score:2)
I have to disagree. I spent time in Upstate New York, and the drivers seemed courteous & respectful
My hypothesis is that all communities of drivers lie on a continuum between courteous and skilled. Upstate NY drivers are very courteous but woefully unskilled. Drivers in other parts of the country are on the other end and many are somewhere in the middle.
Spend some more time Upstate (particularly in the Syracuse area) and you'll see just how unskilled they are. On beautiful, clear, dry summer days you can count on seeing several vehicle roll-overs nearly every day (often single-vehicle roll-overs with no other vehicles involved).
All sensor data should be made public (Score:5, Insightful)
After any accident, all sensor data should be made public so that it can then be used to further train AI systems. If it's not a law, then companies will keep it to themselves so that they can only improve their own AI and not their competitors'. The net result is that different companies' AIs will have to "learn the same lesson" multiple times instead of once.
Re: (Score:2)
Not only that, but I would think that it would be advantageous to have multiple teams looking at a problem.
Re: (Score:2)
After any accident, all sensor data should be made public so that it can then be used to further train AI systems. If it's not a law, then companies will keep it to themselves so that they can only improve their own AI and not their competitors'. The net result is that different companies' AIs will have to "learn the same lesson" multiple times instead of once.
That seems like a good move, and I think it's the way that air accidents and incidents are dealt with.
It's time for a trial & make roads safer (Score:2)
I know that people will quote a number of accidents (including two fatalities) involving autonomous vehicles, but in non-safety-critical situations the rate at which the current technology has accidents is many times lower than with humans behind the wheel.
The ironic thing is, safety-critical situations are generally caused by humans: somebody driving erratically, an accident taking place in front of the vehicle, somebody running a red light because they are distracted by a text. I would think that the more autonomous vehicles there are on the road, the fewer of those situations there would be.
Re: (Score:2)
Obligatory xkcd [xkcd.com].
Re: (Score:2)
The ironic thing is, safety critical situations are generally caused by humans.
Yeah, that stupid, arrogant woman crossing the street where she shouldn't have been. It's all her fault. She was deliberately trying to ruin the perfect safety record of AV. She got what she deserved.
I would think that if the weather is too bad for autonomous vehicles, it's also too bad for human drivers...
Well, you might think that. You might be wrong. Humans have a lot of experience driving in snow and stuff. Yes, there are a lot of really funny videos of what happens on icy roads, but I don't think an AV can deal with zero traction on a hill any better than a human could.
Re: (Score:1)
Yeah, that stupid, arrogant woman crossing the street where she shouldn't have been. It's all her fault. She was deliberately trying to ruin the perfect safety record of AV. She got what she deserved.
To be clear, there were 4 overlapping causes of that accident (this is the type of data we never get in a traditional accident, which by itself is already justification for me to keep going):
- The woman stepped into the road in an unusual place without looking.
- The safety driver was not paying attention to the road, with strong evidence that she was watching the conclusion of a show on Hulu.
- The emergency braking system was deactivated at a software level, with the intention that emergency braking be handled by the human safety driver.
Re: (Score:1)
I would call the management which allowed such a situation to occur a 4th cause.
I would call the management which allowed this to happen evil SOBs who should be sent to jail for pushing their schedule over safety.
Re: (Score:2)
- The emergency braking system was deactivated at a software level,
The software saw the woman, and applied the brakes.
How does software that has been deactivated apply the brakes?
Using the logic in your comment, EVERY AV accident will be the fault of a human, so there will never be an AV accident where we can blame the AV. AV are perfect.
Re: (Score:2)
I should say "tried to apply the breaks." It's easy:
I know how easy it is to comment out a line of code. But, if the line of code that applies the brakes is commented out, the software is not even trying, and will never try, to apply the brakes. It may "detect obstacle" but it, by design, will do nothing about that obstacle. Saying it "tried to apply the brakes" gives the software some anthropomorphic qualities. It "wanted to" apply the brakes, it "thought about" applying them, it "tried to". "Try not. Do. Or do not. There is no try." The executable code has no intent; it either applies the brakes or it does not.
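To put that concretely, here is a hypothetical sketch (the names, threshold, and numbers are invented for illustration, not Uber's actual code) of what a software-level disable flag means: the perception path still detects the obstacle, but no brake command is ever issued.

# Hypothetical sketch -- names and numbers invented, not Uber's code.
EMERGENCY_BRAKING_ENABLED = False  # "deactivated at a software level"

def on_obstacle_detected(distance_m, closing_speed_mps):
    """Decide what to do when the perception stack reports an obstacle."""
    time_to_impact = distance_m / closing_speed_mps if closing_speed_mps > 0 else float("inf")
    if time_to_impact < 1.3:  # arbitrary example threshold
        if EMERGENCY_BRAKING_ENABLED:
            return "APPLY_BRAKES"
        # Flag off: the obstacle is still detected, but no brake command is issued.
        return "ALERT_SAFETY_DRIVER_ONLY"
    return "CONTINUE"

# Detection happens either way; only the flag decides whether braking occurs.
print(on_obstacle_detected(distance_m=20.0, closing_speed_mps=17.0))  # ALERT_SAFETY_DRIVER_ONLY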
Re: (Score:2)
Well, you might think that. You might be wrong. Humans have a lot of experience driving in snow and stuff. Yes, there are a lot of really funny videos of what happens on icy roads, but I don't think an AV can deal with zero traction on a hill any better than a human could.
Damned thing will probably just park itself and call its "remote human operator", or just call 911 or something. Too stupid to figure it out because it has no ability to think.
Re: (Score:3)
> It's all her fault.
She was jaywalking in the middle of a highway, so yes, it was her fault. Plus she stepped in front of the car when it was only feet away. Even with instant braking, that car would not have stopped in time to miss the impact. SHE caused her own damn death.
Re: (Score:2)
Plus she stepped in front of the car when it was only feet away.
No. She was visible for a full 8 seconds or so on the road before the impact.
The poor quality video released makes it seem like she appeared out of nowhere, but an old lady crossing the road pushing a bicycle does not cover 2.5 lanes in 2 seconds.
Re: (Score:2)
If she's not visible in the video, then she's not visible to human eyes either. She did not become visible until the headlights were on her. (And also: Why in hell was she crossing the road? SHE has eyes too. She should have seen the headlights coming & avoided the car.)
Re: (Score:2)
If she's not visible in the video, then she's not visible to human eyes either.
Incorrect. Lots of things are visible to human eyes that are invisible to the camera. The fact is that she was visible for a good 8 seconds or so - you can look up the findings in the official reports and what Uber had to say about it themselves.
Uber themselves say that she was visible for a long time. Why are you disputing what they say?
speeds (Score:3)
Self-driving cars used in the program would potentially need to have technology disabling the vehicle if a sensor fails or barring vehicles from traveling above safe speeds
Why is this necessary? Half the point of self driving cars is that they can go slower because I don't need to focus. Go 40 mph (64 kph) for all I care. I can be doing something else. I don't need to "hurry" at 70, just get me there.
Though I suppose I do see why it legally "needs to be said". During the introductory phase it would be best to "flow with traffic", but once the majority are self driving they could lower the speed limits so any accidents that do happen are less dangerous.
Re: (Score:2)
Why is this necessary? Half the point of self driving cars is that they can go slower because I don't need to focus. Go 40 mph (64 kph) for all I care. I can be doing something else. I don't need to "hurry" at 70, just get me there.
But you don't want half an hour's trip to become one or two hours. If the autonomous car is crippled because it's lost its long-range sensors, it's totally reasonable to force it to stop rather than slow down everyone on the road to 10 mph. Though I hope they don't do anything so silly as to ban vehicles with redundant sensors from operating in degraded mode - that's kinda the point of redundancy. I'm not sure why they have to make explicit rules about this; it sounds like the AI version of the "self-integrity"
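For what it's worth, the kind of rule being described could be as simple as the sketch below - a hypothetical degraded-mode policy with made-up sensor names and speed caps, not anything NHTSA has actually specified: lose one redundant sensor and you get a speed cap, lose too much coverage and you pull over.

# Hypothetical degraded-mode policy -- sensor names and caps invented for
# illustration; this is not an NHTSA rule or any vendor's implementation.
from dataclasses import dataclass

@dataclass
class SensorStatus:
    lidar_ok: bool
    radar_ok: bool
    camera_ok: bool

def allowed_top_speed_mph(s: SensorStatus) -> float:
    """Return the speed cap for the current sensor state."""
    healthy = [s.lidar_ok, s.radar_ok, s.camera_ok]
    if all(healthy):
        return 65.0   # full capability: normal highway speed
    if sum(healthy) >= 2:
        return 40.0   # one redundant sensor down: degraded mode
    return 0.0        # too little coverage left: pull over and stop

print(allowed_top_speed_mph(SensorStatus(lidar_ok=False, radar_ok=True, camera_ok=True)))   # 40.0
print(allowed_top_speed_mph(SensorStatus(lidar_ok=False, radar_ok=False, camera_ok=True)))  # 0.0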
Re: (Score:2)
Half the point of self driving cars is that they can go slower because I don't need to focus. Go 40 mph (64 kph) for all I care.
Yes, that would be remarkably safer than the current situation. Imagine a freeway where 20% of the cars are AV, and 20% of the cars are going 40 MPH instead of the 65 MPH speed limit.
No, I contend that half the point of AV is NOT that they can go slow because the passenger doesn't care how soon he gets to the destination. I think that's absolute nonsense.
Driverless Cars Cost Jobs, Decrease Safety (Score:3)
Specious argument (Score:2)
What was the elevator accident rate before they became automated versus afterwards?
Re: (Score:3)
Here's the automated rate: U.S. elevators make 18 billion passenger trips per year. Those trips result in about 27 deaths annually.
- I can easily imagine the pre-automated elevators had accidents due to operator stupidity or carelessness... like closing the door on a passenger & killing him. Or moving the elevator up a floor as someone is trying to exit, and then they plunge to their death.
Automated elevators don't do stupid stuff.
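Taking the figures quoted above at face value, the per-trip rate works out like this (a quick back-of-the-envelope check, nothing more):

# Back-of-the-envelope rate from the figures quoted above (assumed accurate).
trips_per_year = 18e9    # U.S. elevator passenger trips per year
deaths_per_year = 27     # elevator-related deaths per year

deaths_per_billion_trips = deaths_per_year / (trips_per_year / 1e9)
print(f"{deaths_per_billion_trips:.1f} deaths per billion trips")  # 1.5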
Re: (Score:1)
Automated elevators don't do stupid stuff.
I have 3 of them just down the hall, and each of them has quirks, such as doors that close and reopen repeatedly without any reason. In my previous (government) building, I had a coworker get stuck inside one for several hours on a weekend when nobody was around.
Re: (Score:2)
Here's the automated rate: U.S. elevators make 18 billion passenger trips per year. Those trips result in about 27 deaths annually,
I have never seen, nor have I heard of, an elevator accident that happened because one elevator detected someone in the shaft that it had to avoid so it swerved into the next shaft and was hit by a passing elevator. Nor have I seen or heard of an elevator accident where one elevator slammed on the brakes to keep from hitting someone in the shaft and was run into by a following elevator.
Perhaps comparing elevator automation to automobile automation is a bit of a stretch?
NIMBY here.. (Score:2)
What about Liability? (Score:4)
What about Liability?
What about it? Owner vs manufacturer? (Score:2)
What about it? Are you asking whether the owner of the vehicle will be liable, or the manufacturer?
Both. The manufacturer will ultimately pay the bill, but if I buy a device and send my device out on the road, where it injures you, your claim is against me.
Just as I as a driver have an agreement with an insurance company to cover my liability, the owner of an autonomous vehicle would have coverage from the manufacturer. Essentially, the manufacturer serves the same role as an insurance company as far as how a suit would be handled.
Re: (Score:2)
Technically it's not your fault, but legally it is. Doesn't matter much, because the insurance picks up the tab anyway. They don't send a driver to prison for causing a deadly accident.
Where I live, it's the same when there's a collision between a car and a bicycle or pedestrian. Legally, the car is always at fault, even if it did nothing wrong.
I'm liable because mine did the damage (Score:2)
First, let's be clear: it's not about fault, it's about liability.
It's a question of who needs to pay the bill to get the damage fixed, not who is a bad boy.
If my dog bit your kid, causing damage, you could expect me to pay for at least the medical bills, because it's my dog. I'd like to not because I did anything wrong, but because it's my dog that did the damage.
Just by getting a dog I took on the risk that the dog would cause damage. (You and your kid didn't choose for me to get the dog, and so didn't assume that risk.)
Typo: I'm liable, not "I'd like to" (Score:2)
I have a typo above. Instead of:
I'd like to not because I did anything wrong, but because it's my dog that did the damage.
That should be
I'm liable not because I did anything wrong, but because it's my dog that did the damage.
Winterbottom v. Wright (1842) (Score:2)
The term to Google is "privity of contract".
See also Winterbottom v. Wright (1842). Winterbottom, a postal service wagon driver, was injured due to a defective wagon wheel. Winterbottom sued.
Held:
The wagon was provided to Winterbottom by the postmaster. Winterbottom can file a claim only against the postmaster, with whom he has dealings.
It is the postmaster who received assurances from Wright, so the postmaster can sue Wright. Winterbottom cannot "skip a step" and sue Wright.
Later cases clarified that if
Re: (Score:2)
What about Liability?
I'm sure self-driving cars would have to be insured for liability just like any other car.
May Hit US Children is what it should say (Score:2)
This pilot program will end with the first lawsuit filed for the death of an American child at the hands of a foreign robot vehicle.
What are they thinking? (Score:2)
Naw, it seems like a much better idea to just legalize them at highway speeds day 1.
Re: (Score:2)
Because highways are actually the easy part for self-driving cars, and disorganized alleys and dirt roads and parking lots are the hard part. Perhaps we should instead pass a law saying they must travel at least 65 MPH, then decrease it by 5 MPH a year.
Re: (Score:2)
Your car going 15 mph crashes with about 1/19 the energy it does at 65 mph. Basically a strong nudge compared to a high-velocity explosion.
Nothing like being the auto-drive operator who finds system bug #12,943, which causes the car to suddenly swerve right when the road is slightly banked left, the sun is low and blinding the cameras, and a crow takes off from a bit of roadkill at the last second. I think I'd rather have that experience at 15 mph than at 65.
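The 1/19 figure checks out, since crash energy scales with the square of speed; a quick sanity check using the speeds from the comment above:

# Kinetic energy scales with v^2, so the ratio of crash energies is (v_slow / v_fast)^2.
v_slow_mph = 15
v_fast_mph = 65

ratio = (v_slow_mph / v_fast_mph) ** 2
print(f"Energy at {v_slow_mph} mph is {ratio:.3f} of the energy at {v_fast_mph} mph "
      f"(about 1/{1 / ratio:.0f})")  # about 1/19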
I smell a limo (Score:2)
Me? Pay-per-view of the execution of the person(s) responsible, with proceeds going to the victims' families.
Seriously, someone needs to at least go to prison for 10+ years over this. And don't go blaming the dead driver; it sounds like he wasn't happy either.
Oh, you're hiding in Pakistan? Guess what? We have people trained for exactly that, shitstick.
Not to mention pedestrians, telephone poles, (Score:1)
buildings, trees