
House Passes Bill To Speed Deployment of Self-driving Cars (go.com) 176

The House voted Wednesday to speed the introduction of self-driving cars by giving the federal government authority to exempt automakers from safety standards not applicable to the technology, and to permit deployment of up to 100,000 of the vehicles annually over the next several years. From a report: The bill was passed by a voice vote. State and local officials have said it usurps their authority by giving the federal government sole power to regulate the vehicles' design and performance. States would still decide whether to permit self-driving cars on their roads. Automakers have complained that the patchwork of laws states have passed in recent years would hamper deployment of the vehicles, which they see as the future of the industry. Self-driving cars are forecast to dramatically lower traffic fatalities once they are on roads in significant numbers, among other benefits. Early estimates indicate there were more than 40,000 traffic fatalities last year. The National Highway Traffic Safety Administration says 94 percent of crashes involve human error.

House Passes Bill To Speed Deployment of Self-driving Cars

  • by Joe_Dragon ( 2206452 ) on Wednesday September 06, 2017 @01:27PM (#55148511)

    To what level????

    I want to see some CEO hauled in front of a small-town hard-ass judge after a bad crash, and tossed into the local jail when they try to pull the NDA / EULA / third-party BS to get out of talking about the code. A very bad car accident can become a criminal trial.

    • by DickBreath ( 207180 ) on Wednesday September 06, 2017 @01:41PM (#55148621) Homepage
      If the self driving car is not at fault in the accident (the vast majority of present day cases), then the self driving car has tons of data both in visible light and other parts of the spectrum to show everything that happened prior to the crash.

      It is an inevitability, once statistics catch up with it, that a self driving car will be the cause of a major accident. I doubt that this can ever be a criminal trial, because no criminal intent is involved at any level of the design or implementation of the self driving car. It's an accident.

      As more self driving car accidents occur, the self driving cars will get better and better at avoiding them (unlike puny humans). If for no other reason than the designers will make improvements based on all of the data from each accident.

      In court, the lawyers can argue about how the self driving car came to the decision to run over a group of people whose skin color it did not like. There won't be any NDAs. The owner of the technology will file a motion to keep the technology under seal. It will be discussed in court, but in a closed courtroom, with court members bound to secrecy about the technology. This is nothing new.

      BTW, I'm all for requiring safety standards of automakers. (OMG! regulation!) As long as you can quantify it in a way that is clear in the law. You can't have laws that are so vague that you can unintentionally violate them. There needs to be a bright line.

      The line cannot be that no accidents can occur -- because self driving cars are already safer than cars driven by puny humans.
      • self driving cars are already safer than cars driven by puny humans.

        I don't think you can really say that. Road conditions vary day to day and by region, and I doubt the self-driving cars have been tested in all those varied conditions.

        I could tell you some anecdotes about drivers from Southern Texas and other places that don't get much if any snow driving on snow for the first time, or about places where cellular and GPS don't work, or bridges with wind gusts that catch even a seasoned driver off guard. Most of the self-driving cars I hear about are being tested in none of those conditions.

      • by geekmux ( 1040042 ) on Wednesday September 06, 2017 @02:13PM (#55148873)

        You can't have laws that are so vague that you can unintentionally violate them.

        No, but you can certainly have punishments so weak that manufacturers will find it worth it to intentionally violate them. You know, like bypassing security in order to be first to market. Not that we've ever seen that shit happen before...(cough, IoT, cough)

        The line cannot be that no accidents can occur -- because self driving cars are already safer than cars driven by puny humans.

        Let's see how the masses feel when they find out a loved one was one of 100,000 people killed after a DDoS-style mass attack against autonomous vehicles takes place in a major city. Watch as the manufacturer demands closed-door legal proceedings and produces redacted shit detailing their fault, negotiating death caused by an insecure product down to a free cup of fucking coffee for the next of kin.

        If companies are already looking to push this technology by requesting a pass on current regulation, then it will probably go to market like damn near every other mass-produced thing we make, meaning shit for security. And I've already explained why they will do this; because it will be worth it.

      • Re: (Score:3, Interesting)

        The line cannot be that no accidents can occur -- because self driving cars are already safer than cars driven by puny humans.

        This is the point that I hope gets understood sooner rather than later. Accidents currently happen to the tune of ~3,000 people dying a month in the US, and many times that number injured. If self-driving cars reduce this, then progress has been made. The data I've seen is that self-driving / driver-assist reduces car accidents by ~20% and injuries by ~25%. That's huge. As long as some jackass lawyer doesn't get punitive damages in excess of what a regular driver would face, it should be fine. Meaning damages should be assessed as if a competent human driver had been at the wheel.
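        For what it's worth, here is a back-of-envelope sketch of what those percentages would mean in absolute terms. The inputs are the rough figures quoted above plus a guessed 10x injury multiplier, not authoritative data:

        ```python
        # Back-of-envelope: absolute impact of the reduction rates cited above.
        # Inputs are the rough figures from this comment plus an assumed 10x
        # injury-to-death ratio -- illustrative, not authoritative.

        monthly_deaths = 3000      # ~US traffic deaths per month (cited above)
        monthly_injuries = 30000   # "many times that number injured" -- assume 10x
        accident_reduction = 0.20  # ~20% fewer accidents (cited above)
        injury_reduction = 0.25    # ~25% fewer injuries (cited above)

        print(f"Deaths avoided per month:   {monthly_deaths * accident_reduction:,.0f}")       # ~600
        print(f"Injuries avoided per month: {monthly_injuries * injury_reduction:,.0f}")       # ~7,500
        print(f"Deaths avoided per year:    {monthly_deaths * accident_reduction * 12:,.0f}")  # ~7,200
        ```

        Even with these rough inputs, that is thousands of deaths avoided per year, which is why the liability question matters so much.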

        • Re: (Score:3, Flamebait)

          You have only seen data from self-driving systems that either A) have a human to correct them, or B) drive only where driving is simple and straightforward. We don't know if self-driving will ever be adequate for all conditions. And self-driving counts as progress on deaths and injuries only as long as it hasn't injured or killed anyone who wouldn't have died otherwise.
      • by Kjella ( 173770 )

        I doubt that this can ever be a criminal trial, because no criminal intent is involved at any level of the design or implementation of the self driving car. It's an accident.

        The two highest levels of culpability, acting purposely or knowingly, probably not. The two lowest levels, recklessness and negligence, most certainly can apply. The former would be where the court finds that a programming decision was made to ignore a potentially dangerous condition; the latter, where the system failed to recognize the condition or respond to it the way a reasonably careful design would.

      • The line cannot be that no accidents can occur -- because self driving cars are already safer than cars driven by puny humans.

        And what's your evidence for this rather bold assertion?

        I have no doubt that self-driving cars *can* be safer than human-driven cars, but I'm quite dubious that they are there already. I'm even less certain that the current state of partially self-driving cars is safer than purely human-driven cars in the long run.

        • I'll restate it to say that it WON'T be, rather than it CAN'T be. That is, the bright line will not be that no accidents can occur. The benefits of self-driving cars vastly outweigh the fact that they are imperfect. Just like aviation: it's too valuable not to have, even though planes do sometimes have spectacular accidents. Either way, planes or self-driving cars, it's safer than cars driven by puny, unreliable, distractable, masturbating humans talking on a cell phone with one hand while using the other to text.
    • by Hadlock ( 143607 )

      I think what we're going to find is that when cars don't have to deal with fussy/loud children, not getting enough sleep, being too drunk/high, thinking about their ex breaking up with them last week, getting or not getting that raise/promotion, etc., and can just concentrate on driving, self-driving cars are already way, way safer than slow meat-based human drivers. And they'll only continue to improve. Yes, there will be the inevitable fatality where the automated car kills a human, but working through each of those failures is how the systems get better.

      • The automated cars are so wary of me as a pedestrian that they stop a full 20-30 feet away even before I leave the curb.

        This is a certain recipe for gridlock in many places, such as college towns. Any vehicle that stops for you before you leave the curb will find itself sitting stock still in any situation where there is anyone on the sidewalk, whether they intend to cross or not.

        Note that the law (at least in Oregon) does not require anyone to stop for a pedestrian before they leave the curb, and thus every AV that follows your "stop" rule will be a traffic hazard to every human-operated vehicle that does not expect sudden stops.

    • The better question is: how do states'-rights Republicans justify pushing federal regulation of something that is normally handled at the state level?

      • ICC. Auto manufacturers rarely have production facilities in every state where they sell cars, so automobile sales clearly involve interstate commerce. Further, automobiles, as in "mobiles", often cross state lines; therefore legislation that reduces the patchwork of different laws in different areas is justified. E.g., there is a reason why there is a standard for stop signs.
    • They are trying to do things like exempt them from having to have mirrors, window washers, etc.

    • by Ultra64 ( 318705 )
      Way to not read the whole sentence:

      to exempt automakers from safety standards not applicable to the technology

      • Way to not read the whole sentence:

        to exempt automakers from safety standards not applicable to the technology

        Since "the technology" will require the ability of a human to take control of the vehicle and operate it safely when the AV fails or enters conditions that it is not designed to handle, then this means exempting automakers from existing safety standards such that it makes human operation of the vehicle less, if not completely, unsafe.

  • by queazocotal ( 915608 ) on Wednesday September 06, 2017 @01:29PM (#55148533)
    And a sizeable fraction of the remainder would be eliminated by automatic monitoring of car performance (brake failure due to neglect, say).
    Then there are those accidents that are "unavoidable" (debris falling on the road, say).

    But they become avoidable if you have an AI with reflexes beyond those of a trained stunt/rally driver who had a week to prepare.
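    A minimal sketch of why reflexes matter, assuming a ~1.5 s human perception-reaction time, ~0.1 s for a computer, and ~7 m/s^2 of braking on dry pavement (round illustrative numbers, not measurements):

    ```python
    # Stopping distance = reaction distance + braking distance.
    # Assumptions: ~1.5 s human perception-reaction time, ~0.1 s for a
    # computer, ~7 m/s^2 deceleration on dry pavement. Illustrative only.

    def stopping_distance(speed_ms, reaction_s, decel_ms2=7.0):
        reaction_dist = speed_ms * reaction_s           # ground covered before braking starts
        braking_dist = speed_ms ** 2 / (2 * decel_ms2)  # v^2 / 2a
        return reaction_dist + braking_dist

    speed = 30.0  # m/s, roughly 67 mph
    human = stopping_distance(speed, reaction_s=1.5)
    computer = stopping_distance(speed, reaction_s=0.1)

    print(f"Human:    {human:.0f} m")                 # ~109 m
    print(f"Computer: {computer:.0f} m")              # ~67 m
    print(f"Extra margin: {human - computer:.0f} m")  # ~42 m
    ```

    Forty-odd meters of extra margin at highway speed is the difference between a near miss and a fatality in a lot of scenarios.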
    • Dealer-only service, with oil changes every 3,000 miles, will drive profits up.

      • Dealer-only service, with oil changes every 3,000 miles, will drive profits up.

        So far most SDCs are electrics. There is no oil to change.

        • Oil is a lubricant, not a fuel.

          Yes, Virginia, even electrics need lubrication.

          • by tomhath ( 637240 )
            No, Virginia. There is no motor oil in an electric motor. Maybe some grease in sealed bearings, but motor oil in an internal combustion engine needs to be changed because it gets contaminated with byproducts of combustion.
      • by amiga3D ( 567632 )

        No more oil changes with electric motors.

    • Sure, you'll cut down on 99% of accidents, but it's still useful to test the hell out of automated cars to make sure we know what they do, if for no other reason than to be able to put it in writing so that owners are aware. To illustrate, what should the expected behavior be if the vehicle suddenly finds itself in a situation where it must choose between one action that will almost certainly kill a pedestrian and another that will kill the passengers?
      • it's still useful to test the hell out of automated cars to make sure we know what they do

        No, that is not useful. Considering that 3000 people per day die worldwide in HDC accidents, any delay in the adoption of SDCs is unconscionable.

        • by tsstahl ( 812393 )

          Considering that 3000 people per day die worldwide in HDC accidents, any delay in the adoption of SDCs is unconscionable.

          Self-driving vehicles are such a first-world problem. The vast majority of the world where those deaths occur features unnavigable "roads" for a self-driving vehicle.
          I would far sooner trust a llama to get me up a Chilean mountain road, if only because the llama has a stake in the outcome.

          Hot Wheels work best on little orange tracks.

          • The vast majority of the world where those deaths occur features unnavigable "roads" for a self-driving vehicle.

            The country with the most traffic fatalities is China, with about 260,000 deaths annually. China's road infrastructure in many areas is better/newer than America's.

    • But automated cars are controlled by computers that make decisions in fractions of a second! They should absolutely be able to avoid debris falling on the road. If they don't have the sensors for it, then that's a problem.
      • But automated cars are controlled by computers that make decisions in fractions of a second! They should absolutely be able to avoid debris falling on the road.

        Let's test that hypothesis. You get your AV going down I-5 at the posted legal speed and I'll drop a rock on it from an overpass. I predict your AV will not avoid "debris falling on the road" when the laws of physics say it cannot stop in time to avoid hitting/being hit by it.

        If they don't have the sensors for it, then that's a problem.

        I have three significant chips in my windshield that came from small rocks being kicked up by trucks in front of me. I challenge you to have a camera with sufficient resolution, and a computer with sufficient processing speed, to detect, track, and dodge something that small at highway speed.
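        For what it's worth, both points can be roughed out in a few lines. Every figure here (overpass height, speed, rock size, camera field of view) is an assumption for illustration:

        ```python
        # Two rough checks on the "AV should dodge falling debris" claim.
        # All figures are illustrative assumptions, not measurements.
        import math

        # 1) A rock dropped from a ~6 m overpass reaches the road in about a second.
        h, g = 6.0, 9.81                 # overpass height (assumed), gravity
        t_fall = math.sqrt(2 * h / g)
        print(f"Fall time: {t_fall:.2f} s")                          # ~1.1 s

        v = 29.0                         # ~65 mph in m/s
        print(f"Ground covered while it falls: {v * t_fall:.0f} m")  # ~32 m
        print(f"Braking distance at ~7 m/s^2: {v**2 / 14:.0f} m")    # ~60 m -- can't stop

        # 2) A 5 cm rock at 50 m, seen by a camera with a 60-degree field of
        #    view spread over 2000 horizontal pixels (assumed specs):
        angle_deg = math.degrees(2 * math.atan(0.05 / (2 * 50.0)))
        print(f"Rock covers about {angle_deg / (60.0 / 2000):.1f} pixels")  # ~2 px
        ```

        At roughly two pixels, a small rock is indistinguishable from sensor noise until it is far too close to matter.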

        • Well we're talking about debris falling on the road sizable enough to cause an accident, not rock chips. They should absolutely be able to detect a kid on a bridge about to drop a rock.
          • Well we're talking about debris falling on the road sizable enough to cause an accident, not rock chips.

            A common concept in aviation is that every accident begins as a sequence of events that aren't necessarily individually fatal. In this case, as I pointed out, that "rock chip" may be taking one of your sensors out of operation. That's the starting link in the chain.

            They should absolutely be able to detect a kid on a bridge about to drop a rock.

            "Hopeless optimism" is not a very good way to design safety systems. That you think an AV computer can detect a "kid on a bridge holding a rock" (when the kid may be on the downstream side of the bridge and completely hidden from the AV until he'

  • by PPH ( 736903 )

    Automobile design and performance standards are pretty much set at the federal level anyway. A few states (California, for example) have stricter emissions standards. But it's time to put a stop to that B.S.

    I can take practically any vehicle legally operable in one state and drive it across the border into another anyway. So state by state laws really don't accomplish much other than to protect local market channels.

    • I can take practically any vehicle legally operable in one state and drive it across the border into another anyway

      For a relatively brief period of time.

      If the vehicle is going to remain in that new state, you'll have to register it in the new state. And it's at the registration step where the state's standards come into play.

      • If you have multiple legal addresses, you can maintain legal residency in the state of your choice, more or less. The car is registered at your home address; it's just visiting with you when in the other state.

        CA has fucked up car laws. I drove an out of state registered car for a few years when I first moved here. Now I just cheat on smog, but that's another thread.

        CA is still better than some states. IIRC there are some where you can't put so much as a Flowmaster on your car. Car asthma is required by law.

        • If you have multiple legal addresses, you can maintain legal residency in the state of your choice, more or less.

          Nope. At least, not in many states. The car has to be registered where it is primarily used, not where you claim legal residency.

          CA has fucked up car laws. I drove an out of state registered car for a few years when I first moved here.

          Same thing happens in all states. Fortunately by being in CA you were much less likely to have the locals turn to tickets as a primary revenue stream, so you got away with it.

          And as someone who got to experience the joys of the air in Los Angeles before CA emissions rules really clamped down, the "ha ha! I broke the law" angle is not exactly appealing. Might as well brag about eating rancid meat.

  • I guess a self-driving taxi or freight truck would go without driver airbags, a driver steering wheel, and driver pedals, since the idea is to not pay a driver; otherwise you're still paying someone to lounge around in your car all day when he could be at home lounging in front of the TV or digging in the garden.

    As for "speed deployment", that's not quite it.

    If we lose a few thousand or tens of thousands of jobs a month to new technologies, that's just business as usual: the .01% nudge in unemployment gets lost in the noise.

  • I have no doubt, based on the behavior I see on the road every day in the large American metropolitan area I live and work in, that once self-driving cars become ubiquitous, somebody is going to figure out how to hack the AI to make it more aggressive. I see people all the time who take crazy chances on the road to get in front of other drivers. Human beings are really good at being jerks and ruining a good thing for everybody else by exploiting it first. So I expect somebody to figure out how to make the AI make the car it's controlling go as fast as possible after a light goes green, and do other perhaps risky behaviors, under the assumption that the other cars will have AI that will let them. Once that happens, it probably will get very unsafe, with large numbers of hacked cars jockeying for position all the time on the road under the assumption that the other guy will obey the rules so they don't have to.
    • I have no doubt, based on the behavior I see on the road every day in the large American metropolitan area I live and work in, that once self-driving cars become ubiquitous, somebody is going to figure out how to hack the AI to make it more aggressive. I see people all the time who take crazy chances on the road to get in front of other drivers. Human beings are really good at being jerks and ruining a good thing for everybody else by exploiting it first. So I expect somebody to figure out how to make the AI make the car it's controlling go as fast as possible after a light goes green, and do other perhaps risky behaviors, under the assumption that the other cars will have AI that will let them. Once that happens, it probably will get very unsafe, with large numbers of hacked cars jockeying for position all the time on the road under the assumption that the other guy will obey the rules so they don't have to.

      Maybe. I see it largely going the other way where people care less about getting a few car lengths ahead because they are watching a movie or playing on their phone.

      • Maybe. I see it largely going the other way where people care less about getting a few car lengths ahead because they are watching a movie or playing on their phone.

        Nope. I watch movies and play on the phone now, and I know it is much safer to do that when I am several car lengths ahead of someone else than when following behind him.

        People will hack the AI because it exists. Owners will do so because it will make their car cooler. Others will do it because it is a challenge and they can get their leet haxor creds by causing damage.

        • by Whibla ( 210729 )

          ... I know it is much safer to do that when I am several car lengths ahead of someone else than when following behind him.

          It's cute that you can simplify the problem down to you and one other car and pretend that the 'solution' holds for all other problems in the class.

          People will hack the AI because it exists. Owners will do so because it will make their car cooler. Others will do it because it is a challenge and they can get their leet haxor creds by causing damage.

          And that would pretty much settle the question of where to start looking to 'apportion blame' in the event of an accident.

          (On the other hand, I offer no opinion on whether people should be allowed to tinker with the programming or firmware of their cars; that argument would take all night...)

  • When all cars are automated, all crashes will be 100% computer error.

    Seems like people still have the statistical edge.

    What is it Samuel Clemens never said about statistics?

  • by GerryGilmore ( 663905 ) on Wednesday September 06, 2017 @02:34PM (#55148997)
    Damn! Will wonders never cease? Look, all of you self-driving car Luddites can just stay the hell away from them. For the rest of us, though, it makes PERFECT sense to have National standards for these cars in the same way that we currently have National standards for all current vehicles. Otherwise, you'd have an amalgam of incompatible state-based standards that would severely hinder development and deployment. (California's stricter emissions standards being the only exception I'm aware of.) What's wrong with that?
    • Not only that, but imagine the economic consequences when the states finally do get their shit together [slashdot.org].

    • by cdwiegand ( 2267 ) <chris@wiegandfamily.com> on Wednesday September 06, 2017 @03:37PM (#55149371) Homepage

      Nice try, but several states already have emissions standards, and had the Feds used their unconstitutional "right" to pass a law permitting only federal jurisdiction over emissions standards, CA's rules (which some 10 other states follow) would NEVER have seen the light of day. And those standards have helped push electric vehicles, even self-driving cars (which arguably would not exist if alternative-fuel cars weren't as big a market as they are; it sparked innovation in a previously dead market).

      In the end, the Feds don't really have the authority to do this, if the States would finally stand up and remind the federal government of their rights under the 10th Amendment. Case in point: I may not be a leftie or a rightie (I'm actually a centrist), but how would you feel if the Feds also demanded concealed-carry reciprocity nationwide? Or blocked LGBT marriage nationwide? Most liberals squirm, at best, at the thought, yet IT'S THE SAME IDEA: the feds taking away the right of a State to determine its own laws entirely within that state. It's only when commerce crosses state boundaries that the feds should be doing anything like this, and several states (esp. out west) are larger than many European countries, so I have a hard time believing anyone is having a hard time doing business in California (practically its own country already).

  • by Scroatzilla ( 672804 ) on Wednesday September 06, 2017 @04:38PM (#55149767) Homepage Journal

    I'm always curious when self-driving discussions appear. I'm "somewhat" informed on this topic, and am relatively neutral; but, I can't help but believe that tech folks are a bit too optimistic about the benefits of "eliminating human error." For example, I see in these types of discussions the example of debris on the road. Theoretically, most human drivers have the ability to see such debris and determine a course of action, and most of the time they choose correctly and avoid disaster.

    On the other hand, it would only take a single bug in an AI "debris subroutine" running in a whole bunch of self-driving cars to choose the wrong course of action 100% of the time. Such a bug would *probably* only be identified after enough failures were accurately recorded to piece together a pattern that could point to it (i.e., an incomplete test plan didn't catch it, a code review didn't catch it, differences between virtual test worlds vs. the real world hid the defect, etc.).

    I guess if someone could convince me that it is possible to write 100% bug-free code, I would feel better about this. However, what I perceive as the somewhat naive optimism of technical folks is somewhat terrifying in this context.
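    To put the fleet-wide-bug worry in concrete terms, here is a toy model; the encounter count and error rate are invented for illustration:

    ```python
    # Toy model of the correlated-failure worry: independent human errors
    # produce scattered failures, while one shared software bug fails in
    # every car that hits the triggering condition. Numbers are invented.
    import random

    random.seed(42)

    N_ENCOUNTERS = 100_000   # cars encountering the tricky scenario (assumed)
    HUMAN_ERROR_RATE = 0.01  # chance a given human mishandles it (assumed)

    # Independent human errors: failures scattered thinly across the fleet.
    human_failures = sum(random.random() < HUMAN_ERROR_RATE for _ in range(N_ENCOUNTERS))
    print(f"Humans:           ~{human_failures:,} scattered failures")  # ~1,000

    # Shared bug: the same wrong decision every single time the condition occurs.
    print(f"Fleet shared bug: {N_ENCOUNTERS:,} identical failures")
    ```

    The totals aren't the point; the correlation is. A thousand uncorrelated mistakes look like noise, while a hundred thousand identical ones look like a recall.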

    • by Kjella ( 173770 )

      On the other hand, it would only take a single bug in an AI "debris subroutine" running in a whole bunch of self-driving cars to choose the wrong course of action 100% of the time. Such a bug would *probably* only be identified after enough failures were accurately recorded to piece together a pattern that could point to it

      What makes you say that? I would expect any SDC accident or near-accident where the car is potentially at fault to be given a thorough hearing, FAA-style. I expect the "black box" in SDCs to give you all the raw sensor data from the last 30 seconds, which will be put into simulators and run not just on what did happen but on a ton of variations to see what could happen. And I'd expect any change to the programming to be run through a bunch of regression tests to check that you don't have any unexpected behavior changes.
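      A hypothetical sketch of that replay-and-perturb loop. The Simulator class, log format, and function names are all invented for illustration; no real SDC vendor API is implied:

      ```python
      # Hypothetical replay-and-perturb loop for a recorded "black box" log.
      # Simulator, the log format, and all names here are invented for
      # illustration -- no real SDC vendor API is implied.
      import copy
      import random
      from dataclasses import dataclass

      @dataclass
      class Outcome:
          collision: bool

      class Simulator:
          """Stand-in for a physics/behavior simulator (stubbed for the sketch)."""
          def run(self, log):
              # A real simulator would re-execute the driving stack against the
              # log; here we just flag scenarios where any object gets too close.
              return Outcome(collision=any(o["gap_m"] < 1.0 for o in log["objects"]))

      def perturb(log, seed):
          """Return a jittered variation of the recorded scenario."""
          rng = random.Random(seed)
          varied = copy.deepcopy(log)
          for obj in varied["objects"]:
              obj["gap_m"] += rng.uniform(-0.5, 0.5)  # vary closing distances
          return varied

      def collision_rate(log, n_variations=1000):
          """Replay the recorded log plus many variations; report collision fraction."""
          sim = Simulator()
          outcomes = [sim.run(log)] + [sim.run(perturb(log, s)) for s in range(n_variations)]
          return sum(o.collision for o in outcomes) / len(outcomes)

      # Example: a recorded near miss, one other vehicle 1.2 m away at closest point.
      log = {"objects": [{"gap_m": 1.2}]}
      print(f"Collision rate across variations: {collision_rate(log):.1%}")
      ```

      The point is that one recorded incident becomes a thousand test cases, and every one of them can go into the regression suite for the next software release.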
