Transportation United States Government

US Consumer Groups Warn 'Robot Car Bill' Threatens Safety (consumerreports.org) 139

"If you don't place a Capable Engineering crew to oversee a project that involves lives, you're asking for trouble," writes Slashdot reader Neuronwelder. Consumer Reports writes: Congress is moving ahead with plans to let self-driving cars be tested on U.S. roads without having to comply with the same safety rules as regular vehicles... The House passed its version of the legislation earlier this month with little opposition. The Senate is expected to vote on its bill in the coming weeks... "Federal law shouldn't leave consumers as guinea pigs," said William Wallace, policy analyst for Consumers Union. "We were hopeful that this bill would include much stronger measures to protect consumers against known emerging safety risks. Unfortunately, in the bill's current form, it doesn't."

The legislation, which would take effect in 18 months, would allow the deployment of up to 50,000 self-driving vehicles per company in the first year of its application, rising to 100,000 vehicles annually by the third year, exempt from essential federal safety standards... Automakers might be able to go beyond the limits by getting exemptions for more than one model. The bill also creates a means to go beyond 100,000 cars for each company, by allowing automakers to petition the NHTSA after five years for more vehicles.

"The bill pre-empts any state safety standards," argues the group Consumer Watchdog, "but there are none yet in place at the national level."
This discussion has been archived. No new comments can be posted.

  • by fluffernutter ( 1411889 ) on Saturday September 30, 2017 @04:41PM (#55285179)
    Not only does it leave consumers as guinea pigs, it makes every non-participating driver, cyclist, and pedestrian a guinea pig. When someone dies from a flaw in self-driving technology, will they consider it a good trade-off if maybe fifty years down the road we start to see a decrease in road deaths from the technology? Will they understand why their family paid the price? Full liability on the part of the vendor introducing a self-driving technology should be a minimum requirement.
    • Re: (Score:2, Interesting)

      by Anonymous Coward

      Like those same drivers, cyclists, and pedestrians I see every day who don't obey traffic laws or have much common sense?

      I'd say they are their own worst enemies.

      • US Consumer Groups Warn 'Robot Car Bill' Threatens Safety

        Wait 'til it hits my 5-tonne 'robot pedestrian'...

      • Like those same drivers, cyclists, and pedestrians I see every day who don't obey traffic laws or have much common sense?

        This is one of the biggest challenges with completely autonomous vehicles. In the real world, even if you play by the rules and act totally logically, you can't safely assume that everyone else will. A human driver will naturally learn to deal with this variability and adapt. Software doesn't do that unless its programmers make it.

        It's also worth keeping in mind that there are many legitimate reasons that normal traffic rules might not be followed. Emergency vehicles might be travelling faster than a normal

        • by DedTV ( 1652495 )

          A human driver will naturally learn to deal with this variability and adapt.

          Not well. There are around 80,000 pedestrians injured [dot.gov] in vehicle crashes each year. We as a society "adapt" with things like low speed limits, well-defined crosswalks, and a multitude of signals and signage in places with high pedestrian traffic. But our main method of adapting is to simply ignore how poorly we adapt and instead adopt an illusion of our own superiority.

          Software doesn't do that unless its programmers make it.

          Every piece of self driving car software I've ever seen demoed already has many, many systems in place to monitor and attempt to avoid ped

          • Every piece of self driving car software I've ever seen demoed already has many, many systems in place to monitor and attempt to avoid pedestrians and are much, much more sophisticated than human adaptability is.

            Really? What have you seen demoed? I haven't seen a lot of detailed technical information (I'm a geek who follows this area out of interest, but it's not my field professionally) but what I have seen suggests that recent generations of these systems still rely on signals and markings far more than they will be able to in an entirely realistic and open world, and have frequently been forced to transfer control back to their human drivers when coming up against situations they didn't know how to handle, which

    • Re:Liability (Score:5, Insightful)

      by Art Challenor ( 2621733 ) on Saturday September 30, 2017 @05:01PM (#55285273)
      The US road transportation system kills ~40k people per year and maims many more. To put that in perspective, that's the equivalent of about 300 airplane crashes per year, and yet it doesn't really make the news. Every car that is currently sold is a threat to those same groups - humans as drivers are absolutely the worst. Some worse than others, but none are perfect. I suspect that any technology that will be deployed would be, statistically, safer than human drivers. So deploying the technology when it has matured a little more has the immediate prospect of reducing the overall death rate; however, that doesn't help the individual. It's a difficult problem because statistics don't matter if it's your loved one who has been killed, and yet we accept this of human drivers. Expecting to go from 40k to zero deaths just by deploying autonomous technology is unrealistic. Where's the cut-off: 10k deaths (saving 30k lives per year), 1,000 deaths, 100?
      • Don't just "expect that" automation will save lives, especially since nothing has been proven. Any automation on the road today either A) is used in a controlled environment, B) is severely limited where it can be used, backed up by the 'human still has to pay attention' bull, and/or C) has a dedicated human driver taking control when it gets into trouble. Let them prove that they *can* save lives, then maybe let them on the road.
        • I know it's difficult on /. to expect even the person writing the article summary to read the article, but here's what it says:

          Both the House and Senate bills would let automakers test and eventually sell self-driving cars as long as they prove to federal regulators that the level of safety is "at least equal" to current requirements for regular cars.

          Note the word "prove".

          Also, the bill does:

          Require companies selling self-driving cars to submit “safety evaluation reports,” spelling out how the vehi

      • Every car that is currently sold is a threat to those same groups - humans as drivers are absolutely the worst. Some worse than others, but none are perfect. I suspect that any technology that will be deployed would be, statistically, safer than human drivers.

        You might suspect that, but is there any real evidence to show that we've reached anywhere near that stage of maturity yet? The only statistics I've seen so far suggest that autonomous vehicles even under relatively favourable and semi-controlled conditions still don't outperform good human drivers statistically, even with all their advantages in terms of never losing "concentration", having full 360 degree "vision" the whole time, having near-instant physical response to sensor inputs, and so on.

        So deploying the technology when it has matured a little more has the immediate prospect of reducing overall death rate, however that doesn't help the individual.

        There are

        • You might suspect that, but is there any real evidence to show that we've reached anywhere near that stage of maturity yet? The only statistics I've seen so far suggest that autonomous vehicles even under relatively favourable and semi-controlled conditions still don't outperform good human drivers statistically, even with all their advantages in terms of never losing "concentration", having full 360 degree "vision" the whole time, having near-instant physical response to sensor inputs, and so on.

          I don't think that anyone, certainly not me and certainly not the manufacturers, is claiming that the technology is ready yet. Proving negligence in a wreck with the current technology and claiming damages would be easy. The technology is maturing rapidly; my guess is five years before we start to see serious test cars without human controls. We have to solve the ethical quandary before then.

          • It would be nice to think you're right, though I fear your suggested timescales are optimistic.

            What worries me is that a lot of the talk from the auto and tech industry execs does seem to be pitching this as a technology that's ready to go on real roads for real world testing in the very near future. Maybe that's partly for the investors, the media and the politicians, but still, I've detected more than a hint of arrogance in some of those public statements in recent years.

            In reality, what I see today is th

            • 2-3 are useless. If the car is mostly autonomous the driver will not be paying enough attention to take charge if there's a problem. We've already seen this with lane guidance and collision avoidance - including a recent Tesla crash.

              We can argue timeframe but it is, to me, inevitable that autonomous cars will become better drivers than humans. I think sooner rather than later, but, assuming civilization survives intact, 100 years from now the technology requirements will be trivial, so somewhere in th
              • I think we're probably in agreement on most of this. In particular, I agree that levels 2-3 are the trouble spot. I suppose my immediate concern is that exactly the safety issue you mention, if a driver suddenly has to take charge, will be the cause of some high profile accidents, and that will then cause the kind of paranoia you mention later on and result in delaying the move to properly autonomous transportation. Right now I don't see much evidence that anyone has anything as high as level 4 ready to go,

    • 1) it won't take 50 years, it will take 5 or less once we see wide adoption of the technologies. 2) Liability lies with the OEM just like with any manufacturing defect.
      • We will only see wide adoption of the technologies if almost all of the worst drivers can afford it. That isn't going to happen any time soon.
        • You don't think this will be in the lease market and in wide adoption within 5 years of the technology being released? Look at emergency braking and backup cameras... they are everywhere... 30% adoption would be wide adoption, and with 30% of the cars on the roads being autonomous you will see a significant decline in deaths.
          • These are expensive sensors. Manufacturers put in gizmos if they cost pennies and if they will sell more cars. They had to be forced to put in seat belts at $5 a vehicle and air bags at who knows how much. How come my 12 year old Lexus had auto-levelling head lights and windshield wiper sensors but still most economy vehicles don't have these features?
            • by Whibla ( 210729 )

              How come my 12 year old Lexus had auto-levelling head lights and windshield wiper sensors but still most economy vehicles don't have these features?

              Because these are not safety features*, they are 'bragging rights', hence they have not been legally mandated, unlike seat belts and air bags.

              As an aside, and I'm sure it's just me, whenever I read about windshield wiper sensors I have to wonder at the sheer laziness implicit in a technology that does away with the need to reach out with a single finger, with no additional need to move your hand, in order to move a lever approximately 1 cm. I idly wonder what other wiping functions we can replace next...

              *OK

        • We will only see wide adoption of the technologies if almost all of the worst drivers can afford it. That isn't going to happen any time soon.

          Why not? The sensors and actuators add very little to the cost of a car. The software has a marginal cost of zero. Once you factor in insurance, SDCs will likely be cheaper than conventional cars.

        • This this this! I saw an ad for a Kia something that was 9918 + TTL on TV. I've heard one of the less expensive lidar systems is 8K. The top-line Velodyne is 80K. So exactly how does one build the rest of a car for 1918 bucks? The Valley seems to think everyone is buying a 100K car and they just are not. Worse, if the nirvana is achieved (driverless cars that are not owned by anyone except transit companies) then theoretically the number of cars produced drops to maybe 1/3 of current production, so volumes

    • by Kjella ( 173770 )

      Ugg the caveman: I have discovered how to make fire.
      fluffernutter: Fire dangerous, you maybe burn down village. Me ban making fire.

      Defective cars (and other products) have killed people before; that's standard liability law, it's not going to change, and all the traffic laws still apply too. The regulation they're exempted from sets requirements to divide responsibility between the car and the driver, which doesn't really make much sense when they're one and the same. Everyone in the car is a passenger,

      • So we should all have open fires in drums in our living rooms because people tell us it is safe?
        • by meglon ( 1001833 )
          Depends. If your bass player can't keep time, your lead is faking it, and your singer's favorite two words are "lip sync," setting the drums on fire might be enough of a distraction to make people think you're at least entertaining. Hell, if it was still the '90s you might even get a Grammy.
    • Like most things reduced to a quick paragraph summary or soundbite, the details are missing. The "exemption" isn't a blanket "if your car is self-driving, it can break all the rules" exemption. Instead, it allows the manufacturer to apply for permission to not meet a given safety rule; they must demonstrate that they are at least as safe as the rule requires. This exemption process already exists, but the bills will modify it so that it explicitly applies to the development of autonomous vehicles.

      As an exam

  • So long as the companies behind the self-driving cars are wholly liable for any and all injuries, deaths, and emotional distress to the tune of $10 million plus.

    Doubt the law actually places liability on the companies testing these and we're just expected to take the deaths as the inevitable cost of progress!

    • by Jason1729 ( 561790 ) on Saturday September 30, 2017 @04:55PM (#55285243)
      The thing is, currently 50,000 people a year die on American roads. Even if self-driving cars could reduce that number by 99%, rather than getting credit for saving 49,500 people a year, these car companies would be ripped apart for "murdering" 500 people a year. Rather than winning accolades for saving tens of thousands of lives, they'd be sued for hundreds of millions a year for those hundreds of deaths.

      You need legislation to prevent that kind of liability, and it will save many, many lives. It just won't save everyone.
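      The arithmetic in the parent comment can be checked with a quick back-of-the-envelope sketch. Note that the 99% reduction is the parent's illustrative hypothetical, not a measured safety result:

```python
# Back-of-the-envelope check of the parent comment's hypothetical.
# 50,000 is the parent's figure for annual US road deaths; the 99%
# reduction is purely illustrative, not a measured safety claim.
current_deaths = 50_000
hypothetical_reduction = 0.99

remaining_deaths = current_deaths * (1 - hypothetical_reduction)
lives_saved = current_deaths - remaining_deaths

print(f"Deaths under the hypothetical: {remaining_deaths:.0f}")
print(f"Lives saved per year: {lives_saved:.0f}")
```

      The asymmetry the parent describes is that the 500 remaining deaths would be attributable to specific companies, while the 49,500 avoided deaths are statistical and invisible.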
      • 50,000 people a year die on American roads, yet people still use them. Imagine that! They must be happy with the risk they incur by using American roads. It doesn't make it right to change the game on them by allowing companies with a clear history of prioritizing cost reduction over safety to put automation on the roads.
        • 50,000 people a year die on American roads, yet people still use them.

          Everyone that has ever died was an habitual breather, yet people still keep on breathing - the fools. :-)

        • You're right, people are satisfied with the risk. So why try to reduce it?

          A couple of hundred years ago, infant mortality was around 40% and maternal mortality around 10%. Getting pregnant was literally fatal 10% of the time. Yet most women still did it. I guess they were satisfied with the risk. What fools people are for developing medical technology to reduce that risk. And now all the foolish doctors have taken on malpractice liability for nothing.
          • Except every step that led to the reduction of infant mortality had a demonstrable benefit and was proven to do no further harm. I see no such discretion with automated cars.
            • So if we can reduce the mortality rate 99% while also doing some small amount of harm, we shouldn't do it? Because if you don't think sheer human stupidity kills tens of thousands of people a year on the roads, I just don't know what you look at when you drive.

              And if you think every step along the way in medical advances for birth did no further harm, you are sadly ignorant of medicine and history.

              http://mentalfloss.com/article/50513/historical-horror-childbirth

              Doctors wanted little to do with women'
              • So now you want to introduce a similar horror but with cars. I would like to think we have grown wiser since the 15th century.
        • 50,000 people a year die on American roads, yet people still use them. Imagine that! They must be happy with the risk they incur by using American roads.

          People die breathing pollution as well, yet people still breathe. Imagine that! America is not the Netherlands. You can't just give up your car and go about your life, and technically most people in the Netherlands can't do that either.

          People still use them because they have to regardless of the risks, not because they are happy or accepting about them.

      • by MaskedSlacker ( 911878 ) on Saturday September 30, 2017 @05:04PM (#55285289)

        You need legislation to prevent that kind of liability, and it will save many, many lives. It just won't save everyone.

        No you don't. You just need insurance and actuaries to calculate and charge for the risks--which is exactly how we handle car accident deaths already.

        Nothing new is needed to deal with self-driving car liabilities. It's a solved problem. I will never understand why people cling to this idea that it's not.
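        As a minimal sketch of what "insurance and actuaries" means in practice, here is a toy expected-loss premium calculation; every rate and dollar figure below is an invented placeholder, not real actuarial data:

```python
# Toy actuarial pricing: premium = expected loss * loading factor.
# All numbers are hypothetical placeholders for illustration only.
fatal_crash_rate = 1e-5      # assumed per-vehicle annual rate of a liable crash
average_payout = 2_000_000   # assumed liability payout per incident, in dollars
loading_factor = 1.25        # insurer's margin for overhead and profit

expected_loss = fatal_crash_rate * average_payout
annual_premium = expected_loss * loading_factor

print(f"Expected annual loss per vehicle: ${expected_loss:.2f}")
print(f"Annual liability premium: ${annual_premium:.2f}")
```

        The point is only that once a per-vehicle risk rate can be estimated, pricing the liability is routine, whoever ends up holding it.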

        • If I am in a self driving car, I cannot be liable for anything since I am not controlling it. How is it not new that I shouldn't need insurance for a car that I own? I don't care if it is a 0.001% chance that it gets in an accident, it has nothing to do with me since I'm not driving.
          • You need insurance as the owner of the car.

            No different than a business owner needs insurance on their company cars, even if they're not driving it.

            THIS IS NOT NEW.

            • Nope. In that case, the business controls the drivers who control the car so of course there needs to be insurance for liability. A business buying self driving cars from Google should not be liable either, since there is nothing they can do to control the car. Yes you may want to insure against external forces such as vandalism or fire, but why would anyone pay for liability for something they do not control?
              • In that case you control the AI that controls the car just as much as the business seems to control the human drivers.
                  • How does anyone 'control AI'? A business can threaten penalties for employees who don't drive safely, it can put numbers on its vehicles to call if the driver is reckless, it can give its employees defensive driving training; none of these can be done with AI.
                  • People actually do whatever they want to do anyways no matter what you threaten them with. You can only train a person or an AI.
                    • So your argument is that laws and threat of penalization or incarceration don't work on a general basis? If there were no punishment for stealing from a corner store, no more people would do it? That's a fairly difficult stance to defend.
              • You are inventing a problem where none exists. While you're whinging about "should" and "should not" the rest of us will be over here in reality.

                Reality: It doesn't matter who "pays" the liability because the owner of the car pays it anyway. The cost is passed on. Since the system we have already does this, why change it?

                • It's fine if the cost is passed on equally through the cost of the car. I just don't want to be the poor schmuck who pays twice the premium as anyone else because his personal automated vehicle thought a trailer crossing the road was a bridge in the distance. If the cost of that gets passed equally to all owners of the same vehicle then fine.
                    • That's why a Corvette owner pays more for liability than if they drive a Camry. Insurance companies and actuaries are a lot brighter than you think...
                      • I'm not following how this applies to the argument; which vehicle in your example is controlled by a neural network that no one really understands?
                      • If you're referring to humans, then yes, their behavior is consistent and well understood. If it weren't, then insurance wouldn't work. On the contrary, a blip in an AI, a sudden mishandling of one odd case, could one day create 100 accidents out of the blue before it could be stopped.
                    • because if you screw around on a bike, you will be the one that dies?
          • If you didn't "take" the self-driving car to where it's going, the event would not have happened. Same argument as "guns don't kill people, bullets kill people".
        • by lordlod ( 458156 )

          You need legislation to prevent that kind of liability, and it will save many, many lives. It just won't save everyone.

          No you don't. You just need insurance and actuaries to calculate and charge for the risks--which is exactly how we handle car accident deaths already.

          The difference is that if one person kills somebody in a car crash, we use phrases like accident, treat each incident individually and insurance is based around actual damages.

          When one programming mistake or design decision kills 50 people we use phrases like tragedy, negligence, blame, the previously diffused anger is focused on a single company, possibly a single individual. Insurance starts including punitive damages, companies are targeted rather than individuals which historically leads to massive a

        • It's these same "actuaries" who determine that the cost of paying off the few families of those killed by a defective part is less expensive than a recall.
      • by cob666 ( 656740 )
        That's because as it is now, people driving cars kill people on the road. With driverless vehicles they will be killed by either hardware failures, software bugs, software logic glitches, hacking PLUS all of the environmental issues that currently plague human drivers. There's a difference there and until they are proven to be safer than human driven cars, the companies that are putting them on the road should be liable and will most likely be found liable in a court of law, unless of course the jury is c
      • And in 50 years the Anti-Automation crowd will claim they are not safer than humans, because if they were, there wouldn't be a government court that doles out money for people who are killed by autonomous vehicles. #AVerLogic
      • by gijoel ( 628142 )
        FTFY
        Firstly you'll have to prove that self-driving cars would reduce road fatalities. So far I haven't seen any evidence that robocars are better than humans at driving, and I doubt I will in my lifetime.

        Secondly you're asking for self-driving car manufacturers to get a break that no other industry gets. Airline safety is far better than it was fifty years ago, but if you tried to use that to lobby for a reduction in manufacturers' liability they'd look at you as if your head were on fire.

        Just be
      • by Bongo ( 13261 )

        Yes, and it's an ongoing problem: how to balance the power of corporations with the power of individuals and with the power of majority opinions.

        I think a significant percentage of people want self driving cars.

  • by Anonymous Coward

    Proof positive - as if further proof were needed - that the Republican party has its collective tongue so far up the backside of big business that they no longer care who knows. 1+ tons of metal - carrying fuel to intensify any resulting explosion, lest we forget - allowed to use the same roads as J. Random Driver without even a basic roadworthiness test? The only other logical explanation is that they're all high.

  • by Anonymous Coward

    What safety standards?

    Why do they think those standards shouldn't apply?

    They call this reporting?

  • Make clear that federal requirements necessary for human operation, like steering wheels, won’t be required for self-driving cars

    That seems normal. Where's the beef?

  • We do what we must because we can. For the good of all of us... except the ones who are dead. For there's no use crying over every mistake.

    The world has always been just one big Aperture Science.
  • Of course there are none yet. Congress is kicking the can to the administrative agencies to figure out how to review submissions to determine whether the vehicles are at least as safe as cars are now. Whether or not the federal agency succeeds in carrying out its mission, it will be in a better position to figure that out than congressmen are. It sounds like the "watchdogs" are asking for the self-driving vehicle equivalent of the Clipper chip: something that will be an antique before it is even passed, and then wi

  • It is fine as long as it puts safety standards in place itself and sets up a regulatory body to monitor compliance.
  • Does whatever their corporate masters decide.

  • >> 'Robot Car Bill' Threatens Safety

    Is this Bill Gates? Who is this Bill person and why hasn't he already been arrested?

  • We can't expect technology to be fully viable and 100% safe from the start. By imposing such expectations we are making a lot of things commercially not viable. This is why we don't have flying cars, not because we can't build them, but because we can't make them perfectly safe.
  • You're going to need laws like this. Because it's just not going to be possible to make self-driving cars under a regulatory system designed for human-driven cars. Or indeed under any "mature" regulatory system (that is, one that has managed to fix the industry in question in place). It would be like taking today's regulatory system and insisting Ford follow it for his first Model A.

    Personally, I'm not a fan of self-driving cars, so by all means oppose this law.

  • by Anonymous Coward

    TENS OF THOUSANDS of people die in car accidents every year in the United States.

    Look, like the rest of you, I'm arrogant enough to think myself a great driver. But the numbers are irrefutable, we fucking suck as a species, at driving. Humans have FAILED.

    Given the length of development thus far, if every car were immediately switched out for a self-driving car, there is no way they'd produce even a substantial fraction of the current number of deaths.

    ANY delay will cost MORE lives than those being saved. It's absol

  • by jfdavis668 ( 1414919 ) on Sunday October 01, 2017 @06:44AM (#55286953)
    This is a poorly written article. There is no mention, except for the lack of a steering wheel, of the rules self-driving cars won't have to follow. If you are getting up in arms over this, at least point out the problem.
  • Does anyone have a good car analogy?
  • This William Wallace guy. I knew he was just a stickler for the man. Freedom my ass. Boo, I say. Boo!
  • Self-driving cars are going to have a very different set of safety concerns and will need very different safety regulations. We don't know what those are yet. We won't know what they are until development of the tech gets to the point where the needs become clear.

    Until then, regs for human directed cars are more likely to get in the way of development than they are to do any good.

  • 'U.S. Consumer Groups' apparently have some intelligent people in them who see, as do I and many others, that so-called 'self-driving car' technology is not even close to ready for prime time, and that legislators, who are notably technologically ignorant and incompetent, have been taken in by all the hype and actually believe that they're going to have K.I.T.T. holding a delightful conversation with them, in an English accent, on the way to their destinations. So-called 'self-driving cars' are not going to do much
  • Given the amount of money these companies are [probably] throwing at federal, state, and local governments to make this happen, safety be damned as long as the money is rolling in.
  • The cited article didn't differentiate between self driving vehicles designed for human beings and self driving vehicles designed for freight only. I think it makes a difference. Who needs air bags and passenger roll bars if there are no living passengers?

    Self driving cars lacking the safety requirements of a steering wheel, brake pedal, emergency brake, etc. might be acceptable for a car that can't be moved under manual control. But, it will be a while before I would trust a vehicle with no way to ta
