
Uber Will Not Re-Apply For Self-Driving Car Permit In California (techcrunch.com) 101

An anonymous reader quotes a report from TechCrunch: Uber, after suspending its self-driving car operations in all markets following a fatal crash, has decided not to re-apply for its self-driving car permit in California. Uber's current permit in California expires March 31. "We proactively suspended our self-driving operations, including in California, immediately following the Tempe incident," an Uber spokesperson told TechCrunch. "Given this, we decided to not reapply for a California permit with the understanding that our self-driving vehicles would not operate in the state in the immediate future."

Uber's decision not to reapply comes in tandem with a letter the DMV sent to Uber's head of public affairs, Austin Heyworth, today. The letter pertains to the fatal self-driving car crash that happened in Tempe, Arizona last week. "In addition to this decision to suspend testing throughout the country, Uber has indicated that it will not renew its current permit to test autonomous vehicles in California," DMV Deputy Director/Chief Counsel Brian Soublet wrote in the letter. "By the terms of its current permit, Uber's authority to test autonomous vehicles on California public roads will end on March 31, 2018." This comes following Arizona's decision to block Uber's self-driving cars in the state.

This discussion has been archived. No new comments can be posted.

  • No kidding (Score:5, Insightful)

    by Izuzan ( 2620111 ) on Wednesday March 28, 2018 @08:41AM (#56340459)

    With the reported problems their self-driving car had, I'm amazed it was allowed off a test track. Barely averaging 13 miles before needing human intervention is piss poor, and it reportedly didn't function properly while next to a large object (like a transport).

    Yeah. Definitely should not have been off a test track.

    • Can I have some citation on this? It isn't that I don't believe you, I just hadn't heard that particular statistic before and would like to understand the details behind it.
      From my knowledge, most of the previous accidents were from driver error, not the automated car. And for this reported death, while improvement could be made, the person who got killed was doing something dangerous and crossed the street without trying to make eye contact with the driver, so the guy monitoring the automated

      • by Anonymous Coward

        Assuming you mean the 13 miles figure:
        https://www.nytimes.com/2018/0... [nytimes.com]

        As of March, Uber was struggling to meet its target of 13 miles per “intervention” in Arizona, according to 100 pages of company documents obtained by The New York Times and two people familiar with the company’s operations in the Phoenix area but not permitted to speak publicly about it.

        And OF COURSE "the guy monitoring the automated car didn't know she was crossing" (or was it a girl? I thought guy when I saw the video, but I've heard it reported that it was a female). HE/SHE WAS LOOKING AT HIS/HER FUCKING PHONE!

        • by Uberbah ( 647458 )

          And OF COURSE "the guy monitoring the automated car didn't know she was crossing" (or was it a girl? I thought guy when I saw the video, but I've heard it reported that it was a female). HE/SHE WAS LOOKING AT HIS/HER FUCKING PHONE!

          As you would be, if you were driving for hours and hours without having to do anything, and on a road with little traffic. The hating on the human minder is entirely misplaced.

          • by taustin ( 171655 )

            If it takes "hours and hours" for their self-driving car to go 13 miles (the average distance between driver interventions), there are far bigger problems than identifying a pedestrian pushing a bicycle.

            • by Uberbah ( 647458 )

              Uber's self-driving division is a hot mess (like the rest of the company) but that's not the fault of the prole trying to make a few bucks an hour. There are large driving courses free of pedestrians that Uber could be using, but it must cost more than the prole, so they decided not to.

        • Re:No kidding (Score:4, Interesting)

          by jellomizer ( 103300 ) on Wednesday March 28, 2018 @12:51PM (#56342271)

          OK, I found it and read it.
          So here is some information that should be explained further:

          "Waymo, formerly the self-driving car project of Google, said that in tests on roads in California last year, its cars went an average of nearly 5,600 miles before the driver had to take control from the computer to steer out of trouble. As of March, Uber was struggling to meet its target of 13 miles per “intervention” in Arizona, according to 100 pages of company documents obtained by The New York Times and two people familiar with the company’s operations in the Phoenix area but not permitted to speak publicly about it."

          Now, this is the far more important part of the article. The problem isn't self-driving car technology as it currently stands, but Uber's implementation of it, its failure to keep its cars off the road when it hadn't perfected them to the level its competitors are achieving, and its attempts to stop people from alerting the public about it.

          This should be considered a "shame on you" to Uber, not a reason to be scared of autonomous automobiles, as ones that are properly implemented have a rather good safety record. (Still, 5,600 miles isn't that great for a release; it should probably be over 100,000 miles.)
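          The gap between the two figures quoted above can be worked out directly (a toy calculation using only the numbers in the quoted NYT passage; the 30-mile daily drive is an arbitrary illustration):

```python
# Disengagement figures quoted in the NYT passage above.
waymo_miles_per_intervention = 5600
uber_miles_per_intervention = 13

# How many times more often Uber's system needed a human to step in:
ratio = waymo_miles_per_intervention / uber_miles_per_intervention
print(round(ratio))  # 431 -- interventions ~431x more frequent per mile

# Expected interventions over an (arbitrary) 30-mile daily drive:
daily_miles = 30
print(round(daily_miles / uber_miles_per_intervention, 1))   # 2.3 for Uber
print(round(daily_miles / waymo_miles_per_intervention, 3))  # 0.005 for Waymo
```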

      • Re:No kidding (Score:4, Informative)

        by hazardPPP ( 4914555 ) on Wednesday March 28, 2018 @09:03AM (#56340585)

        Can I have some citation on this. It isn't that I don't believe you, I just hadn't heard that particular statistic before and would like to understand the details on such information.

        The statistic is from a NYT article [nytimes.com] (there was a Slashdot story about it a few days ago), that is, from leaked internal Uber company data obtained by the NYT.

        Also, not all self-driving cars are the same. Google's have a good safety record thus far; Uber seems to be trying to piggyback on this (like they tried to "piggyback" on Google's technology, too...) to assure everyone that testing self-driving cars is safe, while refusing to release their own testing data. The information obtained by the NYT suggests the performance of Uber's cars is terrible compared to Google's.

        Uber deserves no trust and no benefit of the doubt, I mean they were kicked out of California because they didn't apply for a $150 licence for autonomous car testing. If they can't be bothered to fill out some paperwork, I wonder where else they are cutting corners.

        • “With autonomy, the edge cases kill you, so you’ve got to build out for all the edge cases,” Mr. Khosrowshahi said at a conference in November.

          That is, the edge cases kill you, my listeners, and the public.

      • by jrumney ( 197329 )

        I just hadn't heard that particular statistic before

        You must be new here. The 13-miles-between-human-interventions statistic has appeared in every Uber or autonomous-driving thread on Slashdot since Uber first started its testing.

    • Exactly this. If I were teaching someone to drive and I needed to intervene every 13 miles to avoid an accident, well, no license for you. 13 miles is like every half hour in suburban traffic. How can it be that bad and they still think it's usable? I do understand: it comes from the software industry, where software is generally shipped and the customer does the testing. But hey, they met the schedule and shipped the car out on time.

      • by Anonymous Coward

        You left out one thing.

        1. They met the schedule.
        2. The car was shipped on time.

        Was the EULA on the dash? Did we highlight the part that says, we will not be responsible for any accidents during the testing phase?

        Now two worlds collide -- car analogy is actually the submitted item -- and Slashdot is the scene of the crime.

        April 1 is next.

      • by ColdSam ( 884768 )
        Except that nobody is asking you to give them a license. They are asking to have a learner's permit, which is a far different standard.
        • With a student with that track record, would you be sitting there looking at your phone, or ready to hit the brakes when they fucked up again?

          • by ColdSam ( 884768 )

            Have you ever taught a kid to drive? If that kid drove 13 miles without any guidance or instruction, he/she would be off the charts skill-wise.

            Any normal person would teach a kid to drive the same way as these backup drivers seem to. They would pay close attention on any new or tricky situations, but would relax when the driver (student or AV) was doing the same thing they've done successfully for the last 20 times.

            • by Izuzan ( 2620111 )

              1) You don't teach a teenager to drive at night. In fact, up here in Canada they are not allowed to drive from 1 hour before sundown to 1 hour after sunup. 2) You don't take a person learning to drive on a highway (freeway, for you Yanks). 3) If the person you are teaching to drive needs to be corrected that much in the area you are driving... YOU DON'T TAKE YOUR EYES OFF THE ROAD!

              • by ColdSam ( 884768 )

                You really have no idea what you are talking about. Clearly you have never taught anyone to drive, and it is unclear whether you have ever learned to drive yourself, in Canada or anywhere else.

                1) you don't teach a teenager to drive at night.

                Of course, you do!

                That is in fact the only way they can learn to drive at night. You expect them to just become excellent nighttime drivers overnight, with no training? You do this after they've shown proficiency in driving in the daytime.

                2) you don't take a person learning to drive on a highway (freeway for you yanks)

                Of course, you do!

                That is in fact the only way they can learn to drive on the freeway.

    • https://www.axios.com/humans-c... [axios.com]

      Ah facts. What a pain.

      It's important to keep in mind how long the cars are on the road. Waymo, for example, filed 13 accident reports in 2016, but its cars also drove 635,868 miles in autonomous mode during that period, or just about 1 for every 50,000 miles
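      The "about 1 for every 50,000 miles" figure follows directly from the two numbers quoted (a quick arithmetic check, nothing more):

```python
# Waymo's 2016 figures as quoted above.
miles_driven = 635_868
accident_reports = 13

print(round(miles_driven / accident_reports))  # 48913 miles per report, roughly 1 per 50,000
```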

      • Is that for Tesla, Google, Uber, or all combined?

        As reported by the NY Times, Uber was struggling to meet 13 miles before human intervention was needed.

        If you were training a human who couldn't navigate a construction zone or drive with a transport beside them, or you had to intervene on average once every 13 miles, would you be sitting there looking at your phone?

        Google's cars are going hundreds of miles before needing human intervention. 13 miles is ridiculous to be on a main road and not a cl

        • Is that for Tesla, Google, Uber, or all combined?

          You don't mention Uber in your OP so I can only assume you are talking about self driving cars in general, as was I.

          • by Izuzan ( 2620111 )

            Since the main article is about Uber, it is only logical I was referring to them, not to self-driving cars in general. That is just idiotic.

    • What are they going to do with the 24,000 Volvo cars they ordered in November? And Uber's GM in Spain said last week that the program will continue: https://citiesofthefuture.eu/u... [citiesofthefuture.eu]
  • Not a feature, just a bug. I think the relentless push for autonomous self-driving cars is moving too fast right now, but this incident will make those companies think twice, especially when a few ambulance chasers come calling.

    • by Yannic ( 609749 )

      Very true. Sometimes history has to repeat itself. Maybe the self-driving car industry needed a Therac-25 moment. Hopefully it doesn't need another.

      • Worse situation (Score:5, Interesting)

        by DrYak ( 748999 ) on Wednesday March 28, 2018 @11:15AM (#56341499) Homepage

        The situation is actually worse than with Therac-25.

        Back then it was a standard type of software (a classical imperative program written in assembler, supposed to perform a series of determined steps), where further analysis could find bugs (mostly due to bad practices that gave rise to race conditions, overflows, etc.) and you could clearly show that the software didn't work as planned.

        Nowadays, with autonomous cars, it's entirely different: they mostly rely on modern-day AI with deep neural nets.

        I'm exaggerating (though not necessarily that much, in the specific case of Nvidia's platform):
        but basically you have on one side a bunch of sensor inputs (camera, radar, lidar, sonar),
        on the other side a bunch of controls (steering wheel, brakes, gas pedal),
        and in the middle a giant AI black box with "here magic happens" written on it (but it could just as well be hallucinating sheep [aiweirdness.com]).

        There's no clear software bug here. In fact, the software ran exactly as expected: it correctly ran the neural net.
        It's what the neural net decided to do with the sensor data that is problematic.
        It's not so much a software bug as a general design oversight,
        with problematic black boxes whose logic is hard to model and understand in an exact, logical fashion (DNNs aren't a long list of logic rules like "if conditions A and B are detected, then apply decision C").

        Of course, in practice that's a big simplification (except, again, for Nvidia's platform). In practice a good design should also include hard logic that implements some simpler safeguards (a big object in front detected by the sensors should cause direct slowing/braking, not waiting to see what weird decision the neural net might come up with). So there is an actual "classical imperative program" somewhere that needs fixing.

        But you get the general gist of the direction things are going with AI: we're reaching the point where it's hard to understand what's going on under the hood, because what goes on under the hood is closer to "intuition" than "hard logic".
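        The hybrid design described above (a hard, rule-based safeguard wrapped around the neural net's output) could be sketched roughly like this. All names and thresholds are hypothetical, and the "neural net" is just a stand-in; this is a toy illustration, not any vendor's actual stack:

```python
def neural_net_policy(sensors: dict) -> dict:
    """Stand-in for the DNN black box: sensor readings in, controls out.
    A real system would run a trained network here."""
    return {"steering": 0.0, "throttle": 0.5, "brake": 0.0}

def safety_override(sensors: dict, controls: dict) -> dict:
    """Classical imperative safeguard: a big object detected close in
    front forces braking, regardless of what the net decided."""
    if sensors.get("front_distance_m", float("inf")) < 10.0:
        return {**controls, "throttle": 0.0, "brake": 1.0}
    return controls

sensors = {"front_distance_m": 4.2}  # object 4.2 m ahead
controls = safety_override(sensors, neural_net_policy(sensors))
print(controls["brake"])  # 1.0 -- the hard rule overrides the net's throttle request
```

        The point of the split is that the `safety_override` branch is ordinary code that can be audited and shown to work (or not), unlike the black box it wraps.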

        • by jbengt ( 874751 )
          No mod points today, but the above post makes excellent points.
        • Self driving cars are neither run by AIs nor by Deep Learning Neural Networks.
          Recognizing a street sign might be so ...

          • Self driving cars are neither run by AIs nor by Deep Learning Neural Networks.

            Nvidia made a clear point that their platform was indeed pure DNN [nvidia.com] (aka "end-to-end deep learning self driving cars").
            (Because it was more a demo of the application of their tensor acceleration rather than a practical implementation of driving).

            But it has also given rise to criticism of a technology that we can't completely understand (simply because it is not based on a set of rules).

            Recognizing a street sign might be so ...

            Yes, in most cars (except Nvidia's), the final decision is rules-based (as in "if there is an object that's too near, hit the brakes").


    • You mean morticians?
  • by Anonymous Coward

    Welcome to Arizona, we have the city!

    • by GoTeam ( 5042081 )
      Exactly what I was thinking. So Arizona has just one city?
      • by Anonymous Coward
        Barely one. More of an overgrown small town surrounded by endless suburban wastes.
  • Given Uber's history of flouting taxi and labor laws just about anywhere they operate, I expect them to test in California anyway, without a permit.

    Uber is a high tech innovative company, with an App . . . who needs a permit . . . ?

    "We don't pay taxes; only the little people pay taxes." -- Leona Helmsley

    "We don't obey laws; only the little people obey laws." -- Uber

    • who needs a permit . . . ?

      and another quote.

      If it moves, tax it. If it keeps moving, regulate it. And if it stops moving, subsidize it.

      Ronald Reagan

      It's about money. California will do anything to protect its stream of revenue to dump into any libtard project moonbeam chooses; that's why Uber needs a permit.

      • by hazardPPP ( 4914555 ) on Wednesday March 28, 2018 @09:23AM (#56340697)

        It's about money. California will do anything to protect its stream of revenue to dump into any libtard project moonbeam chooses; that's why Uber needs a permit.

        Uhm, no, it's not about money, it's about safety.

        An autonomous vehicle testing permit [ca.gov] has an annual fee of $3600 and includes 10 test vehicles. Additional batches of 10 vehicles can be added at $50 a batch.

        Currently there are about 50 companies testing about 300 cars total. [theverge.com] 50 licences would mean 500 cars if everyone were testing 10 cars. This is not the case (since there are fewer than 500 cars on the road): some companies (e.g. Waymo) are testing way more than 10, while others are probably testing one or two. So that's 50 * $3,600 = $180,000 in application fees, plus a few hundred, maybe a thousand, dollars more from the companies testing more than 10 vehicles.

        This is hardly some sort of windfall for the government of California. The money probably doesn't cover the cost of developing the legislation, processing the applications, and monitoring the results.
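        For reference, the fee arithmetic above can be written out (figures as quoted in the comment: $3,600/year covering 10 vehicles, plus $50 per additional batch of 10; the helper name is made up):

```python
BASE_FEE = 3600   # annual permit fee, covers 10 test vehicles
BATCH_FEE = 50    # each additional batch of up to 10 vehicles

def permit_cost(num_vehicles: int) -> int:
    # Hypothetical helper: base permit plus extra batches beyond the first 10.
    extra_batches = max(0, -(-(num_vehicles - 10) // 10))  # ceiling division
    return BASE_FEE + extra_batches * BATCH_FEE

print(50 * BASE_FEE)     # 180000 -- 50 companies at the base fee alone
print(permit_cost(25))   # 3700 -- e.g. one company testing 25 cars
```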

        • Thanks for validating my point.

          If a company has liability coverage and their vehicles are compliant with existing vehicle use laws, why do they need the extra layer of bureaucracy? Wait till they're put into use on an everyday basis and you'll see the cash roll in, along with the extra bureaucratic nuances of taxing per mile/minute, etc.

          The bureaucracy is expanding to meet the needs of the expanding bureaucracy.

            Oscar Wilde

          • Re: (Score:3, Interesting)

            by Kierthos ( 225954 )

            So they have a paper trail showing that it was supposed to be a self-driving car?

            Some of these self-driving cars still have a steering wheel and other controls so the passenger in the driver's seat can take control if necessary. Let's say one of those cars gets in an accident. Do you expect the officer on the scene to immediately recognize it's a self-driving car?

          • I thought there was going to be some rigorous certification process to demonstrate the car is at least close to humans in safety. I was wrong.
            • Yes, I always trust a company that has never built a vehicle, done any sophisticated hardware engineering/integration, or done any complex software development to test the safety of a vehicle weighing thousands of pounds moving at city-street velocities.

              Uber: Let's grab an old Volvo and strap some cameras on it and test in live situations with unwitting test subjects on the highways.
              Lawyers: Take our card, we'll be calling you shortly.

          • by Anonymous Coward
            I like how you're arguing that it should be legal for any dipshit with an RPi and a couple cameras to put a 2000lb toy car on the road, with zero additional barriers to entry. And you think this is some sort of "gotcha" that proves your point. You're a fucking idiot.
          • Thanks for validating my point.

            Reading comprehension much?

    • Given Uber's history of flouting taxi and labor laws just about anywhere they operate...

      ...Uber should not be allowed to do anything that can jeopardize the safety of people, period. Flouting taxi regulations doesn't kill anyone. Flouting car safety regulations does. If Uber were a car company, they'd sell you something that fails a crash test and tell you you'll be fine; who needs crash tests?

    • by jrumney ( 197329 )
      Probably they will go back to claiming that their cars do not meet the definition of autonomous [theverge.com]. But this time Arizona won't be so keen to offer them a safe space from those regulation-loving Californians.
  • Without self-driving cars Uber is dead. They are not sufficiently profitable with regular rides to service their debt.
    • by Oswald McWeany ( 2428506 ) on Wednesday March 28, 2018 @09:15AM (#56340643)

      Without self-driving cars Uber is dead.

      I'm not convinced that that is a bad thing. Uber is like a parasite. They put taxi drivers out of work, and after the cost of car maintenance/depreciation is considered, you actually make negative money driving for Uber (it's not a smart business proposition for the driver; people don't think it through long term). All the while, Uber has been doing some pretty unethical things.

      There are few companies I WANT to see fail more than Uber.

      • by sinij ( 911942 )

        Without self-driving cars Uber is dead.

        I'm not convinced that that is a bad thing. Uber is like a parasite.

        Yes, but the traditional taxi industry is lupus. So having parasites is the lesser evil.
        As a frequent business traveler, Uber made my life much better. I still remember dumpster-like cabs, waiting for more than an hour for one to arrive, and sightseeing detours that were common before Uber.

        • If you don't like taxi regulations, then seek to have them changed. Just don't sacrifice the benefits to the regulations because all you can see is the negative.
          • by sinij ( 911942 )

            If you don't like taxi regulations, then seek to have them changed. Just don't sacrifice the benefits to the regulations because all you can see is the negative.

            As an individual, I have no practical means to change taxi regulations. However, as a consumer I now have a choice that allows me to avoid traditional cabs. As a result, traditional cabs have drastically improved, because they have to compete with Uber.

            Uber vs. the taxi industry is techno-thugs fighting the traditional mafia. There are no good guys here.

        • by Uberbah ( 647458 )

          I still remember dumpster-like cabs, waiting for more than an hour for one to arrive, and sightseeing detours that were common before Uber.

          All of which you can get in spades with Uber as well. The hate on taxis always seems to be based on anecdotes, confirmation bias, and ignoring the fact that Uber is burning through billions in cash every year. If you were running a taxi business, how easy would you find it to compete with an unlicensed competitor that can afford to lose money for a decade or more? A

          • by sinij ( 911942 )
            If I were running a taxi "business" I would be a cab plate owner who rents it to mule drivers at predatory rates. I would also likely be involved with or on the city council, working hard to prevent more plates from being issued, ensuring artificial scarcity. Yes, I would be losing money to Uber, but I would be completely unqualified for any kind of sympathy.
    • Without self-driving cars Uber is dead.

      There is no reason to believe that even with self driving cars Uber will ever achieve profitability. Honestly I'm kind of astonished they have managed to get the funding they have because they haven't shown any credible path to profitability that I am aware of.

      They are not sufficiently profitable with regular rides to service their debt.

      And somehow we are to believe that they will beat their well financed competition in developing them? Self driving cars will require tens of billions at minimum to develop (probably hundreds of billions) and decades to get to a state where they can

  • What are the states doing to determine whether Waymo has the same flaws that Uber does? Maybe they're just being more careful with testing and are a ticking time bomb like Uber was.
    • What are the states doing to determine whether Waymo has the same flaws that Uber does? Maybe they're just being more careful with testing and are a ticking time bomb like Uber was.

      I guess you don't know who Waymo is, because they were previously the "Google Car," aka the people who spurred commercial autonomous car development. Waymo is an autonomous car development company and subsidiary of Google's parent company, Alphabet Inc. [wikipedia.org]

      You are now required to turn in your geek card as you have depleted your geek cred.

        • I know who Waymo is; I don't understand what this has to do with the question. Companies are pushing this technology out half-baked because they want to make money. Since we can't really trust any company, what is being done to ensure the technology poses a minimum risk to the public? This case has shown that 'safety drivers' are not good enough, since they will lose focus after some period of time.
          Since we can't really trust any company, what is being done to ensure the technology poses a minimum risk to the public? This case has shown that 'safety drivers' are not good enough, since they will lose focus after some period of time.

          Seems like something you should talk about with your state representative.

      • True, Waymo is in a different class and Way more security conscious, but still their 'miles driven between interventions' numbers are restricted to selected itineraries and destinations. This of course inflates the number of miles driven without intervention they can report, and that number also doesn't tell us about how badly the car would have failed without intervention, or what the consequences would have been.

        • Humans drive an average of 551,370 miles between accidents. So current 'miles without an intervention' targets seem ridiculously short.
          • Not to mention that those 551,370 miles include all kinds of conditions, not just those conditions most easily navigated by automated vehicles.
  • Stinky (Score:5, Insightful)

    by tungstencoil ( 1016227 ) on Wednesday March 28, 2018 @09:18AM (#56340665)
    I smell a (public relations/marketing) rat. Uber has sunk serious cash into self-driving vehicles, with a vision of a future where there are no drivers and no vehicle ownership: the public at large will simply dial up (automated) transport services from their mobile, transport services Uber will provide. I can't imagine one (admittedly tragic) setback is all it takes to cause them to abandon it.

    They are (disingenuously, I suspect) attempting to appear humane, cautious, and considerate... to mitigate the "we stomp on laws! we stomp on people! we harass! progress at any cost!" image that they have. When the dust settles, they'll - at best - issue a "we're super duper happy we took this pause to focus on safety blah blah".

    Writing applications that impact public safety (as someone else pointed out, Therac anyone?) is an entirely different ballgame than "app development". Does anyone rationally believe that Uber has the governance, structure, and discipline in place to create self-driving systems? What about critical airplane systems? Radiology?

    Transportation is no joke. Realtime systems are not 'just another kind of app'. They demand a very different approach and discipline, and a willingness to follow a particular governance, a particular path. Nothing I've read about Uber, nothing they've ever said publicly, and nothing from the three engineers I've known who worked for them indicates that they acknowledge that at all. They believe they're somehow super-special, somehow smarter than the thousands of people who've come before them working on critical systems, who utilized such safeguards and discipline.

    NopeNopeNope.
  • "We proactively suspended our self-driving operations, including in California, immediately following the Tempe incident," an Uber spokesperson told TechCrunch"
    The word this spokesperson wanted was "REACTIVELY".
    You don't proactively do things in immediate response to events. That's called reacting.
  • Automotive deaths are only justified when undertrained and underqualified human drivers have somewhere to be on a Friday night.

  • "We proactively suspended our self-driving operations, including in California, immediately following the Tempe incident," an Uber spokesperson told TechCrunch.

    ...when you do a thing in response to another thing, that's the definition of reactive.

  • The Register is reporting that Uber cut the number of LIDAR sensors from 5 to 1 on the Volvos. They also cut the number of cameras. Worse, it looks like the AZ governor was playing footsies with Uber. I just get more and more disgusted by this as more comes out. Much like the water slide thing in TX, someone needs to go to jail.
