California May Soon Allow Passengers In Driverless Cars (reuters.com) 165

According to Reuters, California's public utility regulator on Friday signaled it would allow passengers to ride in self-driving cars without a backup driver in the vehicle. It is a big step forward for autonomous car developers, especially as the industry faces heightened scrutiny over safety concerns. From the report: The California Public Utilities Commission, the body that regulates utilities including transportation companies such as ride-hailing apps, issued a proposal that could clear the way for companies such as Alphabet's Waymo and General Motors to give members of the public a ride in a self-driving car without any backup driver present, which has been the practice of most companies so far. The California Department of Motor Vehicles had already issued rules allowing for autonomous vehicle testing without drivers, which took effect this week. The commission said its proposed rules complement the existing DMV rules but provide additional protections for passengers. The proposal, which is set to be voted on at the commission's meeting next month, would clear the way for autonomous vehicle companies to do more testing and get the public more closely acquainted with driverless cars in a state that has closely regulated the industry. It also comes as regulators across the country are taking a harder look at self-driving cars in the aftermath of a crash in Arizona that killed a pedestrian.

Comments Filter:
  • After the fatalities that just happened with Uber and Tesla's malfunctioning autopilots, putting passengers in self-driving cars this soon is just crazy. The tech needs to go through far more tests before that should be allowed.
    • Proposal was likely written before those deaths.
    • Tesla's autopilot isn't meant to be autonomous, and Uber's technology was laughably far behind. Citing their accidents is almost as irrelevant as citing someone driving into a wall on cruise control. I don't know if self-driving cars are ready or not, but you haven't cited any relevant evidence.

      • Tesla's autopilot isn't meant to be autonomous

        It doesn't give Tesla the right to put people in a vehicle that still makes very stupid and clumsy mistakes.

      • by x0ra ( 1249540 )

        Tesla's autopilot isn't meant to be autonomous, [...].

If it's not meant to be "auto"-nomous, don't fucking call it "autopilot", you moron...

    • After the fatalities that just happened with Uber and Tesla's malfunctioning autopilots, putting passengers in self-driving cars this soon is just crazy.

      After the 100+ fatalities that happened yesterday, putting drivers behind the wheel of ton-plus machines moving over a mile a minute is just crazy.

Tesla's autopilot is not a fully autonomous driving system, so it's not relevant when talking about autonomous vehicles. That's not the kind of system that's going to be allowed to drive people around. Authorities say that a human driver would likely have hit the pedestrian who walked suddenly out into the street in the case of the Uber collision, so it's not clear the autonomous system performed any worse than a human would have.

      • Tesla's autopilot is not a fully autonomous driving system, so it's not relevant when talking about autonomous vehicles.

        It's kind of funny how you self-driving proponents will swear up and down how lousy people are at driving. I have read the word 'meatbag' more times in the last year than I ever have. Yet you will support a system that expects them to remain alert while sitting still and doing nothing. This is the most unnatural thing for humans and you are ready to get behind a system that almost ensures their distraction.

        • It's kind of funny how you self-driving proponents will swear up and down how lousy people are at driving. I have read the word 'meatbag' more times in the last year than I ever have. Yet you will support a system that expects them to remain alert while sitting still and doing nothing.

I don't actually support that. That's not what autopilot is. You don't "do nothing". You use the time the car gives you to be a better driver. You scrutinize the people around you, and watch the background. Otherwise, you're using it wrong. I do also think that the kind of fault it experienced recently is pathetic and unacceptable. I don't think the Uber car accident is in that category, but I'm willing to be convinced otherwise. However, I am sure that many people who are driving are crap at driving.

We use cars because they take us where we need to go very safely. A lot of people focus on the fatality count without putting it into perspective against the enormous number of hours Americans put on the road every year. Another nice thing about a regular car is that you have the control to slow down if you see a dangerous situation, but you had better hope a level 4 or 5 system senses the same thing you do. The second mistake people make is to compare the number of accidents self-driving cars get into today without weighing it against the small number of miles they have actually driven.
          • by x0ra ( 1249540 )
autopilot: /ˈôdōˌpīlət/, noun, short for automatic pilot.

automatic: /ˌôdəˈmadik/, adjective, 1. (of a device or process) working by itself with little or no direct human control. noun 1. an automatic machine or device, in particular.

            Moron.
autopilot: /ˈôdōˌpīlət/, noun, short for automatic pilot.

              Autopilot will fly you straight into a hill, or sail you straight into a rock. You really want to make that comparison? Because it completely deflates your argument.

automatic: /ˌôdəˈmadik/, adjective, 1. (of a device or process) working by itself with little or no direct human control.

              Yep. Teslas with autopilot work with little direct human control. Fits the description perfectly. Thanks for saving me the trouble of pasting the definition.

      • by x0ra ( 1249540 )

        Tesla's autopilot is not a fully autonomous driving system, [...].

autopilot: /ˈôdōˌpīlət/, noun, short for automatic pilot.

If something is not meant to match its dictionary definition, don't fucking call it that. Even a two year old understands that. I'm sure Elon has enough brain cells to understand it as well, and I'm sure you do too.

    • by MrL0G1C ( 867445 )

Such a gross over-simplification. Tesla's autopilot should not be called autopilot; it's simply a lane-assist function. Uber's cars shouldn't be allowed on the roads because of how bad their self-driving systems are. Other car companies have massively superior autonomous systems - like Waymo, which can go over 400 times further than an Uber car on average before the driver has to take over.

  • by yayoubetcha ( 893774 ) on Sunday April 08, 2018 @10:45AM (#56401545)

    " It also comes as regulators across the country are taking a harder look at self-driving cars in the aftermath of a crash in Arizona that killed a pedestrian. "

    An autonomous driving vehicle did not kill that pedestrian in AZ. Uber did.

In order for Uber to show their investors that they were making progress on autonomous vehicles, they rushed their poorly developed prototype onto roads it had no business being on. This was not a self-driving car incident. This was a lab experiment that was allowed onto public roads.

    The media should stop saying that this was a death caused by self-driving cars. It was Uber and their idiotic mission to show, on so many fronts, that they are a more advanced technical company than they actually are.

You know what? Uber is certainly a steaming pile of turd of a company, but I don't even blame them. We all know companies will do anything to make a profit, especially a company like Uber. The problem is no one seems to care what these companies do. Everyone rolls their eyes and says, "Oh there goes Uber again" to a laugh track like on an 80's sitcom. I fault the government for not establishing quantifiable and measurable standards to *ensure* a vehicle is safe enough just to get through the testing it needs to do without killing anyone.
      • ALL of the companies that want to roll out autonomous cars as a service are "steaming turds." They'll all be the same -- trips in a database, tied to an identity and bank card number for life. They may think they're doing good by saving lives through safer cars. In reality, they'll be destroying people's lives by robbing their privacy if their product becomes mandatory.
        • I agree. It's going to be terrible for freedom. Right now if I want to go to a museum, I pay the admission of the museum. Down the road, you may need to pay a car company to go to the part of the city with the museum and then pay for the museum. Then they know you're at the museum so you get pelted with 'personal' ads while you are there AND in the car.
          • so you get pelted with 'personal' ads

            I've never really understood the notion that being "pelted with ads" was a major problem. Possibly because I'm quite capable of tuning ads out, and don't necessarily feel an incredible urge to buy something (or vote for someone) based on any ads I might pay attention to.

      • by MrL0G1C ( 867445 )

        I fault the government for not establishing quantifiable and measurable standards to *ensure* a vehicle is safe enough just to get through the testing it needs to do without killing anyone.

In the UK we have the Highway Code. It'd serve as a good basis: test the car's response to every part of the Highway Code. It certainly wouldn't be a short test, and it'd need a detailed test track and multiple participants, but I don't think anything less would suffice.

    • The media should stop saying that this was a death caused by self-driving cars.

      And yet it was. You see fundamentally the problem here is that the race to self-driving technology is one that involves keeping your trade secrets and technology close locked away for yourself. Uber may have owned the car that killed a person, but the fact that someone else's technology locked behind patents and IP could have prevented the death is THE problem with self driving cars.

At least Volvo had the decency to patent the seatbelt for the express purpose of opening it up to everyone and preventing any avoidable deaths.

  • Is there any testing and certification done for those cars or do they trust the companies to handle that by themselves?

    • by HiThere ( 15173 )

      THIS IS A GUESS!!

      I think it's probably too new for standard tests to have emerged. There will be the normal "road worthy" tests, and smog tests (electric car, pass), but appropriate "driving skill" tests for autonomous vehicles haven't yet been formalized.

So, yes, there is testing and certification, but it's based on the existing standards. It doesn't yet test the autonomous skill level. But liability regulations haven't been waived, and without a "designated driver" there's no intermediary to end up with the liability.

  • "...It is a big step forward for autonomous car developers, especially as the industry faces heightened scrutiny over safety concerns."

Scrutiny? Don't assume for one fucking second this change has fuck-all to do with validating how safe driverless cars are, especially with no backup driver. This is Greed N. Corruption pushing forward with legislation that best supports maximizing profits at any cost.

    • by HiThere ( 15173 )

      I really think the "backup driver" who's supposed to take over in an emergency makes things less safe. It's one thing to have someone who should take over when, e.g., leaving the freeway, but taking over in an emergency is a horrible idea.

      I've definitely heard of experiments where people can't maintain attention, and where there was a lapse of time before they could effectively take control. I haven't heard of *any* where it was shown to be a good idea.

      • I really think the "backup driver" who's supposed to take over in an emergency makes things less safe. It's one thing to have someone who should take over when, e.g., leaving the freeway, but taking over in an emergency is a horrible idea.

        I've definitely heard of experiments where people can't maintain attention, and where there was a lapse of time before they could effectively take control. I haven't heard of *any* where it was shown to be a good idea.

        Other than those profiting from pushing autonomous solutions to market as fast as greed will possibly allow, I haven't heard of *any* adult armed with common sense showing that this is a good idea.

        Deploying it in the most populated state in the nation is merely icing on the Cake of Grand Stupidity.

        • by HiThere ( 15173 )

          I have no idea whether the driverless car is ready to put on the roads or not. There's some indication that it is, and other indications that it isn't. And they aren't all the same.

That said, the "emergency takeover driver" idea is worse than useless. It's not merely useless, it's even worse. People need several seconds to get up to speed in that kind of activity, and if you have that much time, it's not an emergency. Plan on needing at least 30 seconds for a take-over, or someone won't put down their crossword puzzle in time.

          • I have no idea whether the driverless car is ready to put on the roads or not. There's some indication that it is, and other indications that it isn't. And they aren't all the same.

            That said, the "emergency takeover driver" idea is worse than useless. It's not merely useless, it's even worse. People need several seconds to get up to speed in that kind of activity, and if you have that much time, it's not an emergency. Plan on needing at least 30 seconds for a take-over, or someone won't put down their crossword puzzle in time.

If an autonomous solution starts to malfunction, causing it to drift off the road, it would be nice to have some kind of manual override. Perhaps you can stop assuming that every emergency that happens in an autonomous car is going to require faster-than-human reflexes to avoid disaster. The next step in driverless cars with no one behind the steering wheel is the removal of said wheel, along with the requirement to license drivers. Greed will use the excuse of "fewer deaths" to push it through.

  • Automated cars are baking VERY STUPID MISTAKES still. Apparently life isn't important enough to just have the stupid mistakes fixed before we ante up human lives. Nice to know our governments care about us.
    • making*
    • The question is not are "Automated cars are baking VERY STUPID MISTAKES still." The question is, are humans "BAKING" more very stupid mistakes than the cars.

      And the answer to that is obvious, from the way you baked the question.

      In fact, I am willing to bet that you personally bake more stupid mistakes than the average automated car.

      That is not an insult, I bake more stupid mistakes than a computer does all the time.

      • Re:Caring (Score:4, Informative)

        by fluffernutter ( 1411889 ) on Sunday April 08, 2018 @01:42PM (#56402369)
I've probably driven as many miles in my life as Uber has with all their cars. Probably much more, and absolutely in many more conditions than these companies would ever think of driving in. I've never driven at full speed into a pedestrian on a well lit street. I'm sure many people here could say the same. This applies to the other car companies as well. The people on Slashdot have probably driven equal to Tesla's 100 million miles in their lives and none have killed themselves on a concrete divider. Humans drive 3.22 trillion miles a year. That is a very big number compared to self-driving 'experience' to date, and an important part of understanding how safe humans really are. Three deaths at this point are enough to make self-driving very much more dangerous than humans when put into the perspective of miles driven. Especially when you add to that perspective that self-driving companies get to pick when and where they drive. Even Autopilot only activates in conditions it deems 'safe'.
        • by houghi ( 78078 )

          http://www.iihs.org/iihs/topic... [iihs.org] for some numbers.
Neither the Uber nor the Tesla are self-driving cars. In both, a human did not take the action they should have taken. This is not even the 'I tried to brake, but the brakes failed' scenario. That would be MORE like a self-driving car incident.
Yes, looking at the number of miles that people drive, people are safe drivers: 1.16 deaths per 100,000,000 miles driven in the US.

          So yes, 3 deaths would be a lot, but they have not yet happened.
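The per-mile comparison the two posters above are making can be sketched as a back-of-the-envelope calculation. This is purely illustrative: the 1.16-per-100M-miles human rate and the 3.22-trillion-miles figure come from the thread, while the 100-million-mile autonomous total and the attribution of 3 deaths to it are the thread's rough, disputed assumptions, not official statistics.

```python
# Back-of-the-envelope fatality-rate comparison using figures cited
# in the thread. The autonomous mileage and death count are the
# thread's assumptions, not verified data.

human_deaths_per_100m_miles = 1.16   # IIHS-style figure cited above
human_miles_per_year = 3.22e12       # ~3.22 trillion miles/year in the US

autonomous_miles = 100e6             # assumed: ~100M miles to date
autonomous_deaths = 3                # as claimed in the thread

human_rate = human_deaths_per_100m_miles / 100e6       # deaths per mile
autonomous_rate = autonomous_deaths / autonomous_miles  # deaths per mile

print(f"Human fatality rate:      {human_rate:.2e} deaths/mile")
print(f"Autonomous fatality rate: {autonomous_rate:.2e} deaths/mile")
print(f"Ratio (autonomous/human): {autonomous_rate / human_rate:.2f}x")
```

Under those assumptions the autonomous rate comes out roughly 2.6 times the human rate, which is why the "3 deaths" framing and the "they weren't really self-driving deaths" rebuttal lead to such different conclusions.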

  • California just discovered another source of revenue [slashdot.org].

  • Remember that side gig you took on to try to make ends meet? IE pay off that onerous student loan, or put a dent in the wife's cancer treatment bills, or buy the kids clothes for school?

    It's going away.
  • Call me old fashioned (and it wouldn't be the first time) but I prefer to be carless, thank you.
Self-driving cars operate with an abundance of caution, and are generally going slower than normal traffic. The people INSIDE a self-driving car are plenty safe.

    I maintain the ones outside are as well, as self-driving cars are already safer by far than the average driver. However, I can see why some people might still not accept that - which is not the case for the safety of passengers, who are obviously safe.

  • My partner is from California and friends have already sent her footage of them sitting in the driver seat of a self driving vehicle which took someone home.

I believe it was an Uber, and I think they needed to sign up in order to do this, but it's definitely occurring. The person in the driver seat is just a friend of my partner's - not Uber staff or any kind of technician, trainer, or vehicle monitor, just a regular passenger. This was about 3 or 4 weeks ago.
