California May Soon Allow Passengers In Driverless Cars (reuters.com) 165
According to Reuters, California's public utility regulator on Friday signaled it would allow passengers to ride in self-driving cars without a backup driver in the vehicle. It is a big step forward for autonomous car developers, especially as the industry faces heightened scrutiny over safety concerns. From the report: The California Public Utilities Commission, the body that regulates utilities including transportation companies such as ride-hailing apps, issued a proposal that could clear the way for companies such as Alphabet's Waymo and General Motors to give members of the public a ride in a self-driving car without any backup driver present, which has been the practice of most companies so far. The California Department of Motor Vehicles had already issued rules allowing for autonomous vehicle testing without drivers, which took effect this week. The commission said its proposed rules complement the existing DMV rules but provide additional protections for passengers. The proposal, which is set to be voted on at the commission's meeting next month, would clear the way for autonomous vehicle companies to do more testing and get the public more closely acquainted with driverless cars in a state that has closely regulated the industry. It also comes as regulators across the country are taking a harder look at self-driving cars in the aftermath of a crash in Arizona that killed a pedestrian.
Madness - Far Too Soon For This (Score:1)
Re: (Score:2)
Re: (Score:3)
Tesla's autopilot isn't meant to be autonomous, and Uber's technology was laughably far behind. Citing their accidents is almost as irrelevant as citing someone driving into a wall on cruise control. I don't know if self-driving cars are ready or not, but you haven't cited any relevant evidence.
Re: (Score:2)
Tesla's autopilot isn't meant to be autonomous
It doesn't give Tesla the right to put people in a vehicle that still makes very stupid and clumsy mistakes.
Re: (Score:1)
Tesla's autopilot isn't meant to be autonomous, [...].
If it's not meant to be "auto"-nomous, don't fucking call it "autopilot", you moron...
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
"Autopilot" is about right -- aircraft autopilots are designed to be used under human management.
Not according to Hollywood. 70% of the population's first thought when "autopilot" is mentioned will be the inflatable doll in Airplane! For that matter, I've seen documentaries where the pilot is doing paperwork while the plane is on autopilot. The people who named Tesla's self-drive system "autopilot" knew exactly what the majority of non-pilot's-license-holding customers would understand by the term.
Anyway, the "autopilot" concept in aviation relies on highly trained pilots backed up by air-traffic con
Re: (Score:1)
Re: (Score:1)
After the fatalities that just happened with Uber and Tesla's malfunctioning autopilots, putting passengers in self-driving cars this soon is just crazy.
After the 100+ fatalities that happened yesterday, putting drivers behind the wheel of ton-plus machines moving over a mile a minute is just crazy.
Tesla's autopilot is not a fully autonomous driving system, so it's not relevant when talking about autonomous vehicles. That's not the kind of system that's going to be allowed to drive people around. Authorities say that a human driver would likely have hit the pedestrian who walked suddenly out into the street in the case of the Uber collision, so it's not cle
Re: (Score:2)
Tesla's autopilot is not a fully autonomous driving system, so it's not relevant when talking about autonomous vehicles.
It's kind of funny how you self-driving proponents will swear up and down how lousy people are at driving. I have read the word 'meatbag' more times in the last year than I ever have. Yet you will support a system that expects them to remain alert while sitting still and doing nothing. This is the most unnatural thing for humans and you are ready to get behind a system that almost ensures their distraction.
Re: (Score:1)
It's kind of funny how you self-driving proponents will swear up and down how lousy people are at driving. I have read the word 'meatbag' more times in the last year than I ever have. Yet you will support a system that expects them to remain alert while sitting still and doing nothing.
I don't actually support that. That's not what autopilot is. You don't "do nothing". You use the time that the car gives you to be a better driver. You scrutinize the people around you, and look at the background. Otherwise, you're using it wrong. I do also think that the kind of fault it experienced recently is pathetic and unacceptable. I don't think the Uber car accident is in that category, but I'm willing to be convinced otherwise. However, I am sure that many people who are driving are crap at driving.
Re: (Score:2)
Re: (Score:1)
automatic: /ˌôdəˈmadik/, adjective, 1. (of a device or process) working by itself with little or no direct human control. noun 1. an automatic machine or device, in particular.
Moron.
Re: (Score:1)
autopilot: /ˈôdōˌpīlət/, noun, short for automatic pilot.
Autopilot will fly you straight into a hill, or sail you straight into a rock. You really want to make that comparison? Because it completely deflates your argument.
automatic: /ˌôdəˈmadik/, adjective, 1. (of a device or process) working by itself with little or no direct human control.
Yep. Teslas with autopilot work with little direct human control. Fits the description perfectly. Thanks for saving me the trouble of pasting the definition.
Re: (Score:1)
Tesla's autopilot is not a fully autonomous driving system, [...].
autopilot: /ˈôdōˌpīlət/, noun, short for automatic pilot.
if something is not meant to match its dictionary definition, don't fucking call it that. Even a 2 year old understands that; I'm sure Elon has enough brain cells to understand that as well, and I'm sure you do too.
Re: (Score:2)
Such a gross over-simplification. Tesla's autopilot should not be called autopilot; it's simply a lane-assist function. Uber's cars shouldn't be allowed on the roads because of how bad their self-driving systems are. Other car companies have massively superior autonomous systems, like Waymo, which can go over 400 times further than an Uber car on average before the driver has to take over.
Comment removed (Score:5, Insightful)
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
I've never really understood the notion that being "pelted with ads" was a major problem. Possibly because I'm quite capable of tuning ads out, and don't necessarily feel an incredible urge to buy something (or vote for someone) based on any ads I might pay attention to.
Re: (Score:2)
In the UK we have the Highway Code. Testing a car's response to every part of the Highway Code would serve as a good test; it certainly wouldn't be a short one, and it would need a detailed test track and multiple participants, but I don't think anything less would suffice.
Re: (Score:2)
The media should stop saying that this was a death caused by self-driving cars.
And yet it was. You see, fundamentally the problem here is that the race to self-driving technology is one that involves keeping your trade secrets and technology locked away for yourself. Uber may have owned the car that killed a person, but the fact that someone else's technology, locked behind patents and IP, could have prevented the death is THE problem with self-driving cars.
At least Volvo had the decency to patent the seatbelt for the express purpose of opening it up to everyone and preventing any
Testing? (Score:2)
Is there any testing and certification done for those cars or do they trust the companies to handle that by themselves?
Re: (Score:2)
THIS IS A GUESS!!
I think it's probably too new for standard tests to have emerged. There will be the normal "road worthy" tests, and smog tests (electric car, pass), but appropriate "driving skill" tests for autonomous vehicles haven't yet been formalized.
So, yes, there is testing and certification, but it's based on the existing standards. It doesn't yet test the autonomous skill level. But liability regulations haven't been waived, and without a "designated driver" there's no intermediate to end up wit
Re: (Score:2)
With the possible exception of Uber, most autonomous vehicle companies can already beat those death rates.
The problem is that we don't know how those cars are tested. It's easy not to kill anybody when you have a safety driver on board and decide when you let the car on the road, as that allows you to avoid all the situations that the car can't currently handle. But what happens when it runs into an unexpected blizzard, rain, hail, sandstorm or whatever? What about those situations where just stopping the car could get you killed, like for example in a forest fire or in the middle of nowhere in cold weather? Will it be
It's a big step for Greed N. Corruption (Score:1)
"...It is a big step forward for autonomous car developers, especially as the industry faces heightened scrutiny over safety concerns."
Scrutiny? Don't assume for one fucking second this change has fuck-all to do with validating how safe driverless cars are, especially with no backup driver. This is Greed N. Corruption pushing forward with legislation that best supports maximizing profits at any cost.
Re: (Score:2)
I really think the "backup driver" who's supposed to take over in an emergency makes things less safe. It's one thing to have someone who should take over when, e.g., leaving the freeway, but taking over in an emergency is a horrible idea.
I've definitely heard of experiments where people can't maintain attention, and where there was a lapse of time before they could effectively take control. I haven't heard of *any* where it was shown to be a good idea.
Re: (Score:2)
I really think the "backup driver" who's supposed to take over in an emergency makes things less safe. It's one thing to have someone who should take over when, e.g., leaving the freeway, but taking over in an emergency is a horrible idea.
I've definitely heard of experiments where people can't maintain attention, and where there was a lapse of time before they could effectively take control. I haven't heard of *any* where it was shown to be a good idea.
Other than those profiting from pushing autonomous solutions to market as fast as greed will possibly allow, I haven't heard of *any* adult armed with common sense showing that this is a good idea.
Deploying it in the most populated state in the nation is merely icing on the Cake of Grand Stupidity.
Re: (Score:2)
I have no idea whether the driverless car is ready to put on the roads or not. There's some indication that it is, and other indications that it isn't. And they aren't all the same.
That said, the "emergency takeover driver" idea is worse than useless. It's not merely useless, it's even worse. People need several seconds to get up to speed in that kind of activity, and if you have that much time, it's not an emergency. Plan on needing at least 30 seconds for a take-over, or someone won't put down their crossword puzzle in time.
Re: (Score:2)
I have no idea whether the driverless car is ready to put on the roads or not. There's some indication that it is, and other indications that it isn't. And they aren't all the same.
That said, the "emergency takeover driver" idea is worse than useless. It's not merely useless, it's even worse. People need several seconds to get up to speed in that kind of activity, and if you have that much time, it's not an emergency. Plan on needing at least 30 seconds for a take-over, or someone won't put down their crossword puzzle in time.
If an autonomous solution starts to malfunction causing it to start drifting off the road, it would be nice to have some kind of manual override. Perhaps you can stop assuming that every emergency that happens in an autonomous car is going to require faster-than-human reflexes to do anything to avoid disaster. The next step in driverless cars with no one behind the steering wheel is the removal of said wheel, along with the requirement to license drivers. Greed will ensure to use the excuse of "less deat
Caring (Score:2)
Re: (Score:2)
Re: (Score:1)
The question is not are "Automated cars are baking VERY STUPID MISTAKES still." The question is, are humans "BAKING" more very stupid mistakes than the cars.
And the answer to that is obvious, from the way you baked the question.
In fact, I am willing to bet that you personally bake more stupid mistakes than the average automated car.
That is not an insult, I bake more stupid mistakes than a computer does all the time.
Re:Caring (Score:4, Informative)
Re: (Score:2)
Re: (Score:2)
Makes sense (Score:2)
California just discovered another source of revenue [slashdot.org].
Dear Uber Drivers: (Score:2)
It's going away.
No, thanks. (Score:2)
Sounds like a no-brainer (Score:2)
Self driving cars have an abundance of caution, and are generally going slower than normal traffic. The people INSIDE a self driving car are plenty safe.
I maintain the ones outside are as well, as self-driving cars are already far safer than the average driver. However, I can see why some people might still not understand that... which isn't the case for the safety of passengers, who are obviously safe.
I don't know how but this is already going on (Score:3)
My partner is from California, and friends have already sent her footage of themselves sitting in the driver's seat of a self-driving vehicle that took them home.
I believe it was an Uber, and I think they needed to sign up in order to do this, but it's definitely occurring. The person in the driver's seat is simply a friend of my girlfriend, not Uber staff or any kind of technician, trainer, or vehicle monitor, just a regular passenger. This was about 3 or 4 weeks ago.
How come each and every one of these cars... (Score:1)
Doesn't have to go through a driving test at a randomly assigned DMV to prove it is at least as competent as a teenaged driver at navigating traffic, residential streets, and vocal instructions from a human?
When they can do that I will consider them acceptable to be driving on the same streets as me. In the meantime I will take the kid in the ricer zigzagging between cars and generally acting stupid. At least there I know there is some primal instinct not to die baked into it. These souless machines are jus
Re:How come each and every one of these cars... (Score:5, Interesting)
Re: (Score:2)
"At least there I know there is some primal instinct not to die baked into it."
You wish! The urge to impress the other sex is much greater than the survival drive.
Re: (Score:2, Interesting)
I don't think so Tim.
Self-driving is actually safer than a human with less than 5 years of driving experience. However there is a lot of missing context.
Teens who learned on a standard-transmission car in BFN (e.g., any rural area or city under 10,000) often know the roads like the back of their hands and know exactly what to be aware of, and most of their accidents are intoxication related. Not weather, animals, or speed. Most intersections are stop signs and the occasional red light, and little traffic flo
Re: (Score:2)
I was born and raised in NYC, still live there, and virtually no one drives a manual. I don't anymore. Traffic kills the joy of driving a manual.
I propose to you that people driving a manual transmission are paying more attention to their car and how it performs than those who use an automatic. I also propose to you (as someone who knows NYC and lived in rural NH for 5
Re: (Score:2)
I believe you are correct. I know for certain that my elderly neighbor can't hear a conversation directed at him in a loud voice, though he catches a word or two, and he drives.
Re: (Score:2)
I thought surely you were mistaken. You're not [ca.gov].
Drivers that are deaf or hard of hearing can adjust their driver safety habits by relying more on their seeing sense to compensate for the loss of hearing.
Re:Are this motherfuckers... (Score:4, Insightful)
How can you do this after what just happened?
Nothing will happen to the passengers in the cars . . .
. . . it's the pedestrians that will have the problems.
Re: (Score:2)
Re: (Score:2)
Logic and rationality, apparently (Score:5, Insightful)
Completely fucking crazy? How can you do this after what just happened? What is wrong with these animals?
Logic and rationality, apparently.
They note an enormous increase in safety when cars are autonomous, want to be on the forefront of a developing technology that has benefits to society, and aren't swayed by the daily panic dished out in the media.
Or in other words, they take a measured, considered approach instead of running around panicky with quick fixes.
Re: Logic and rationality, apparently (Score:1)
Or in other words, they take a measured, considered approach
Just to make sure we're on the same page... you are referring to bureaucrats?? I want to make sure before I laugh in your logical and rational face.
Re: (Score:2)
Re: (Score:2)
Once you factor in manual interventions, "autopilot" systems become much less safe
Once you factor in experience and development they become continuously improving systems. ... Unlike, say, the squishy mass of mostly water held together with a bit of protein that kills people continuously because it fundamentally is a fallible, non-deterministic distraction machine.
Re: (Score:1)
Re: (Score:2)
Typical "meatbag" response from a techbro full of hubris and propaganda.
Nope, just a thought-out response weighing a deterministic, mistake-free system against human nature.
Re: (Score:2)
Once you factor in experience and development they [SD systems] become continuously improving systems. ...
You cannot factor in possible future improvements as an argument for implementation now. I'm inclined to wait for that further development and improvement you promise actually to take place, especially after recent fuck-ups including the very elementary one of a Tesla in SD mode (I don't care what else you call it) failing to decide which side of a fork it should take and opting for the concrete centre.
Re: (Score:2)
You cannot factor in possible future improvements as an argument for implementation now.
Sure you can and for three reasons:
1) If you train something in a lab it will be very good in a lab and will never be able to leave a lab. Therefore it makes far more sense to implement and train something in the field.
2) Recent fuckups are minor in comparison to the number of people who died on the road and also quite statistically insignificant due to a lack of data. But locking something in a lab means we never gather the data and round and round we go forever.
3) We already do just that. Not all vehicles
Hype and PR, apparently (Score:2)
Looks like you've bought into the hype.
We can fully be "on the forefront of a developing technology" and not buy into hype and bullshit...both can exist simultaneously.
The tech isn't ready and won't be for a while...companies are scrambling to be the first on the road and they really don't care about anything else.
Re: (Score:2)
IIUC, being an autonomous vehicle doesn't exempt you from liability. It may get them off murder charges, but not off liability, and without a "designated driver" they won't have a fall guy. So I expect they'll be rather cautious. Otherwise it could get a bit expensive.
Re: (Score:3)
They note an enormous increase in safety when cars are autonomous
I'm unconvinced at the present state of development. The test cars behind their safety statistics have qualified drivers taking over when things are looking pear-shaped. These test cars actually have two parallel safety systems (the software and the human supervisor), so it's hardly surprising if they score better than one system alone. It's likely to be a different story when carrying people who are not ready or capable (non-drivers, drunks, sleepers) to override the SD system. Moreover, as I expect that
Re: (Score:2)
Agree. I can't think of a better case of "you don't rely on Revision 1.0 (and certainly not anything earlier!) of any software. You expect it to glitch, lock up, and crash outright."
It is annoying enough to lose a document or some data records. It is a bit beyond annoying to ride a car into a concrete abutment at 60 MPH.
Re: (Score:2)
Are you completely brain dead? You are concerned about the safety of driverless cars which have killed few enough people to count on the fingers of one hand. Where is your concern for the moronic human drivers who kill dozens every single day? Replacing these idiots with not-quite-perfect autonomous cars will save many many lives. You'd already realize this if you could actually interpret facts rather than being guided by media hysteria like the simple-minded sheep you are.
Re: (Score:3)
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
All more or less true. Self-driving cars really haven't put in enough km of testing or gone through enough rounds of refinement to be at the level where they can be trusted by the general public, or really even by a community of experts whose motives are unbiased.
But self-driving technology is exhibiting a ratchet effect. It's not getting worse or regressing to a mean, it's always getting better, and at a pretty good rate. It seems that a competent company with deep pockets, like Alphabet/Waymo, will choose
Re: (Score:2)
No, it really isn't. Self-driving tech doesn't have to be better than the best drivers to be useful. It just has to be better than the worst drivers before requiring people to use it when drunk or fatigued will save lives.
Re: (Score:2)
Re: (Score:2)
Ha! You Fell Victim to one of the Classic Blunders, the most famous of which is "Never get involved in a land war in Asia," but only slightly less well known is this: "Never assume that the worst drivers are the same people from one day to the next!"
To be fair, the drunk drivers tend to be consistent night after night, but taking away their licenses doesn't usually keep them from driving drunk, from what I've seen, making that a fairly ineffectual ap
Re:Are this motherfuckers... (Score:4, Interesting)
Depending on whose numbers you believe, the National Safety Council says that the U.S. average is 1.25 deaths per 100 million miles. So that number is in the ballpark.
However, your estimate of the number of autopilot miles is probably about an order of magnitude low. There are news articles from late 2016 claiming over 300 million miles traveled with autopilot/autosteer active. If it's not at least half a billion by now, I'd be surprised, and I wouldn't be shocked if it hit a solid billion already.
So if you ignore Waymo (too small a sample size) and Uber (trying to deploy FSD before their tech was ready), and concentrate only on Tesla, that's a pretty sizable drop in fatalities — around a factor of 5–10.
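A back-of-the-envelope sketch of that arithmetic (the 1.25 deaths per 100 million miles and the 0.5-1 billion Autopilot-mile estimates are the figures quoted above; the single Autopilot fatality used as the divisor is an assumption for illustration, not an official count):

```python
# Back-of-the-envelope check of the "factor of 5-10" claim above.
# The national rate and mileage estimates come from the comment itself;
# the one assumed Autopilot fatality is illustrative, not an official count.
US_RATE = 1.25 / 100e6          # deaths per mile at the quoted NSC average

for miles in (500e6, 1e9):      # the comment's low and high mileage estimates
    expected = US_RATE * miles  # deaths expected at the average national rate
    ratio = expected / 1        # versus one assumed Autopilot fatality
    print(f"{miles / 1e6:.0f}M miles: {expected:.2f} expected deaths, "
          f"ratio roughly {ratio:.1f}x")
```

Under those assumptions the ratio comes out between about 6x and 12x, so the factor of 5-10 in the parent is at least internally consistent; with two or three fatalities in the divisor it drops accordingly.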
Re: (Score:2)
In the case of Tesla "autopilot" miles (which isn't "self driving"), one needs to consider the driver and vehicle demographics.
The cost of a Tesla eliminates a huge number of the riskiest drivers, such as young males and old folks. Mostly, drivers of Teslas are middle-aged wealthy folks -- and many of them are tech oriented (telling, since some insurance companies give a discount to engineers because they seem to be safer drivers). Few of them are in the "Here, hold my beer while I show you how to do a four wh
Re: (Score:2)
On the flip side, a decent percentage of people drive them because they like powerful cars. Some like powerful cars for safety reasons, but others like powerful cars because they like to drive like a bat out of you-know-where. So you shouldn't necessarily assume that the cost of the car makes people better drivers. :-)
Re: (Score:2)
You are concerned about the safety of driverless cars which have killed few enough people to count on the fingers of one hand. Where is your concern for the moronic human drivers who kill dozens every single day?
Those moronic drivers are easily identifiable. I'd ban them.
There have been no "driverless" cars on public roads so far. They have all had humans present ready (in theory) to intervene when the SD software did not cope; I don't know what the statistics are for interventions but I suspect that many if not most go un-reported. The accidents that SD cars have had are the result of double failures (software and supervisor). An analogy is a twin-engined aircraft having both engines fail at the same time. Look
Re: (Score:2)
There have been no "driverless" cars on public roads so far. They have all had humans present ready (in theory) to intervene when the SD software did not cope; I don't know what the statistics are for interventions but I suspect that many if not most go un-reported.
Waymo seems to be operating driverless cars in Arizona right now. The public launch is supposed to be "real soon now", but news articles have claimed that they have been running in a private test for a few months. No issues so far.
The human "supervisor" will never be reliable. If you expect people to take over in time to save a life, you'll be disappointed. Humans are good at many things, but paying attention during long watches of boring inactivity is something we are terrible at. If the car cannot dr
Re: (Score:2)
How can you do this after what just happened?
What? One person died? There's 325699999 others to contribute to the economy. America has gotten on just fine killing close to 5500 pedestrians every year, one more won't make a difference.
Re: (Score:2)
Re: Most posts here... (Score:1)
Re: (Score:2)
There were 4 independent sensor systems in that car (LiDAR, radar, camera, 'safety driver'). NONE of them registered the appearance of a human sized target walking in a straight line in time to apply the brakes before the collision. OK the safety driver was playing on their phone, but the three automatic systems failed. The obfuscated video supplied by Uber may have hoodwinked the general populace but they have not revealed the LiDAR or radar 'footage', neither of which rely on ambient light.
Re: (Score:2)
An animal for one thing. Ever had venison through your windshield?
Also children.
Heavy objects falling from trucks.
Re: (Score:1)
An animal for one thing. Ever had venison through your windshield?
No, but the time I saw a truck hit a deer, I totally would have seen the deer coming. He was driving too fast for conditions, because I was behind him in a vehicle which could go that fast in poor weather — he was in a 1970s pickup, I was in a 1990s Subaru — and he was too dumb to pull over and let me go by, and decided to drive too fast instead. So he was watching the road, and not the surroundings.
A car watching the side of the road would have seen it coming, too.
Also children.
That's why it's considered end
Re: (Score:2)
Re: (Score:1)
Except the Goober car that hit the pedestrian did not even respond as well as an attentive human driver would have.
Sure, that's true. But the question isn't whether it responds as well as an attentive human driver, but the average human driver. If it's at least that good, then it should be allowed — because putting it on the road now is part of the research that will make it better.
It doesn't matter whether you get killed by a human or a robot, you're dead either way and you won't have any feelings about it. But if the robot is less likely to kill you, or if letting the robot drive now makes it less likely to kill
Re: (Score:2)
Re: (Score:1)
If we mandate autonomous cars, there could be other negative consequences -- like loss of privacy due to all trips being via "rented" cars tied to a credit card and trip database.
That ship has already sailed. Most rental companies won't rent to you without a credit card, even if you're paying cash for the rental.
Re: (Score:2)
Re: (Score:1)
But not all cars are rental cars at present.
No, but people are giving up vehicle ownership voluntarily in many cases, and I'm afraid that mandatory V2V beacons are a foregone conclusion — the only question is how long before they get here, and how much resistance the public actually puts up. My guess is, not very much.
Re: (Score:2)
Heavy objects falling from trucks.
One day we were driving on the highway and the pickup in front of us hadn't secured the mesh ramp in its bed. It went rolling end over end down the highway. My buddy swerved around it, but I'm really wondering how many automated cars are ready for something like that. It's not even like the wide face was toward us; to an automated car it would have looked like the thickness of a branch.
Re: (Score:2)
Nope. Traffic laws invariably define situations in which that is not true, such as pedestrians stepping out in front of a car that is too close to stop. And most states also say that the pedestrian no longer has the right of way after the pedestrian has finished crossing your lane.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Well, Tesla never claimed to have an autonomous vehicle, though the marketspeak "autopilot" did/does confuse some people. So take them out of the list. That's more a fancy cruise control with some lane following built in. And a bit of collision avoidance, that often works.
That said, in the recent event where a cop ticketed an autonomous car, the "driver" was charged because that's the way the laws are written. If he'd been a passenger that wouldn't have happened. (I'm not sure *who* would get the ticket
Re: (Score:2)
Rewriting history. Elon said the problem was easy to solve in 2015 [fortune.com], and that he would let his car drive him coast to coast before the end of 2017 [theverge.com].
Basically he was saying Tesla had an autonomous vehicle technology that was just around the corner.
I'm with you that they never said that this technology was implemented in their current cars.
Re: (Score:3)
Re: (Score:2)