
Waymo's Driverless Cars Have Logged 10 Million Miles On Public Roads (qz.com)

An anonymous reader quotes a report from Quartz: Alphabet's driverless-car company Waymo announced a new milestone today (Oct. 10): its vehicles have driven a collective 10 million miles on U.S. roads. With cars in six states, Waymo has really been racking up the miles since April 2017, when it launched a program giving rides to passengers around the Phoenix, Arizona area. At that point, Waymo cars had driven not quite 3 million miles since the company's earliest days as a research project within Google in 2009. But in the last 18 months, the company more than tripled its road mileage.

Competing with other companies with autonomous-vehicle programs like Uber, Tesla, Apple, and GM's Cruise, Waymo is leading the pack in terms of road miles driven. [...] The company's next 10 million miles, CEO John Krafcik said in today's announcement, will focus on "striking the balance" between its safety-first algorithms and assertive everyday maneuvers like merging, as well as on navigating bad weather. But it's worth keeping things in perspective: U.S. drivers rack up some 3 trillion miles each year, so Waymo still has some ground to cover.

  • by Austerity Empowers ( 669817 ) on Thursday October 11, 2018 @08:04AM (#57460410)

    Even if they put 3 trillion miles on their system, if they confine it to just a few geographical areas, I don't trust it very much. I'd like to see them driving in NYC, Boston, Chicago, New Jersey (even humans can't figure this one out), etc. Places where public investment in the roadways has either been compromised (i.e. stolen by politicians for other bullshit), minimal, or there simply wasn't enough space to put proper roads in, so they did something else instead...

    • by XXongo ( 3986865 )

      Even if they put 3 trillion miles on their system, if they confine it to just a few geographical areas, I don't trust it very much.

      Not just geographical areas, I wonder if they try it out on multiple different types of streets at multiple different times of day. An automated driving system that works fine on freeways and on wide, relatively untrafficked suburban roads may fare far worse in complex city interchanges at rush hour.

      • Maybe this information should be shared with the Google people. There's a chance they've never considered any of these ideas.

        • by religionofpeas ( 4511805 ) on Thursday October 11, 2018 @08:28AM (#57460504)

          Don't worry. Every Slashdot comment is framed and hung up in the board room. True goldmine here.

        • by arth1 ( 260657 )

          Maybe this information should be shared with the Google people. There's a chance they've never considered any of these ideas.

          That they have considered it does not imply that they give a fuck.
          If the goal is to sell to the majority who will buy a product, how it will affect minorities is not going to be a showstopper. It is, unfortunately, up to the government to ensure that the interests of those who will be negatively affected are protected and that manufacturers address issues.

          In other words, until a senator gets severely delayed or his dog gets run over by a driverless car, nothing will happen. Until then, the promises of

      • Google has done that.

        Waymo's limits are tightly geofenced, high-resolution-mapped areas in decent weather.

        Which I would point out are 1,000 times more open than everyone else's.

        Next up, as of last year, was repeating Phoenix's setup in Detroit.

        Which should cover bad roads and bad weather nicely.

        • by Anonymous Coward

          This is good, they must be prepping for best in class detection of burning barrels.

      • Not just geographical areas, I wonder if they try it out on multiple different types of streets at multiple different times of day.

        Yes, after all, nobody could have ever come up with the idea of testing an SDC on different types of road conditions. Google will be sending your bonus over immediately.

    • Even if they put 3 trillion miles on their system, if they confine it to just a few geographical areas, I don't trust it very much

      Outside those areas, you wouldn't have to trust it, because they won't be driving there.

    • First, you are right in what you point out. Additionally, what they have been doing is a clinical trial on non-volunteers-- everyone who intersects their roadways. That's really really bad. They should have racked up a million miles on test tracks before moving to anything with even limited public exposure.

      However, now that they have done this clinical trial, unethically/illegally or not, they do have a body of evidence that may be worthy of a phase 2 clinical trial on less constrained public roads.

      that is,

      • They should have racked up a million miles on test tracks before moving to anything with even limited public exposure.

        Why? How many accidents did they cause?

      • by sjbe ( 173966 )

        Additionally, what they have been doing is a clinical trial on non-volunteers-- everyone who intersects their roadways. That's really really bad.

        Disagree. One only has to look at the accident record of human drivers versus autonomous vehicles to see who is currently leading the standings as the most dangerous. (Spoiler: humans have the bigger body count by a wide margin.) Quite honestly, I trust Waymo more than I trust you (or any other human - it's nothing personal) to operate a vehicle safely based on the available data. My chances of getting killed by a human driver are FAR higher. Anyway it doesn't matter to a dead person whether the drive

        • Waymo drives 0.0003 percent of the miles humans drive a year, in hand-picked conditions, and you talk like they deserve a safety award or something.
        • by Anonymous Coward

          First off, these cars are not safer than the ones being driven by people. I'm not sure where you got that idea, but it's not true. These are cars that are being driven in good conditions; they're not being driven in conditions which lead to crashes. Waymo has fewer total miles driven than what regular drivers have every day. Meaning that the figure is likely less representative of the actual safety than it might seem.

          And you're a fucking moron about the informed consent. Uber's car murdered one person, yo

          • Something to think about: Sure, autonomous vehicle testing has been done almost exclusively in good weather conditions and relatively low traffic loads. But doing so is a necessary step in reducing the risk. Any good test involves reducing the variables as much as possible, after all. The engineers who are building these vehicles have passed the point where closed track testing gives meaningful results. They have to start putting these vehicles on the open road in order to further refine the designs. They have
      • Self-driving cars have been racking up test track miles since the 1990s. I remember watching shows about it back then. The only option to satisfy you, apparently, is to stop every driver to get them to sign a consent form. Which is absurd.

    • by arth1 ( 260657 )

      I'd like to see them driving in NYC, Boston, Chicago, New Jersey (even humans can't figure this one out), etc.

      Please add things like
      - Twisty mountain passes during winter, with loaded trailers barrelling down doing the standard 9 mph above the speed limit.
      - Deserts with tumbleweed.
      - Forests with deer crossings.
      - Areas with bikers who like to ride abreast.

    • by phantomfive ( 622387 ) on Thursday October 11, 2018 @08:51AM (#57460598) Journal
      Waymo's system can't operate in an area where they haven't built a highly detailed 3D map. NYC isn't dramatically worse than San Francisco (which has plenty of bizarre traffic things, but that doesn't matter, because the AI has a really good map and knows what those things are), and Waymo has been operating in SF. If they can build the map, they can handle NY or Boston OK.
      • How much does the environment need to change before the maps are no longer good? I feel like needing really good maps is a huge limitation on overall usability - if I need good maps, I potentially couldn't self-drive cross-country. Granted, regions in the middle of nowhere might need less frequent mapping compared to a major city.

        • by religionofpeas ( 4511805 ) on Thursday October 11, 2018 @09:05AM (#57460650)

          Small changes in the environment shouldn't matter. In the future, they could automatically make updates to the map using the 3D scans from all the cars passing points that show discrepancies in the old map. Maybe they're already doing that.
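
          One plausible way to flag such discrepancies -- a sketch of the general idea only, not Waymo's actual pipeline, with all names made up for illustration -- is to voxelize the prior map and the live scan and diff the occupancy:

          import numpy as np

          def voxelize(points, voxel=0.5):
              """Map an Nx3 point cloud to a set of occupied voxel indices."""
              return set(map(tuple, np.floor(points / voxel).astype(int)))

          def map_discrepancies(prior_scan, live_scan, voxel=0.5):
              """Voxels occupied in one scan but not the other -- candidate
              map changes to flag for review."""
              prior = voxelize(prior_scan, voxel)
              live = voxelize(live_scan, voxel)
              return {"appeared": live - prior, "disappeared": prior - live}

          # Toy usage: a "new obstacle" shows up only in the live scan.
          prior = np.array([[0.1, 0.1, 0.0], [5.2, 1.0, 0.0]])
          live = np.vstack([prior, [2.6, 2.6, 0.4]])
          print(map_discrepancies(prior, live))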

          • That makes sense. I was curious about thresholds since I'm less familiar with Waymo's approach. Say a city put up portions of protected bike lanes one night (as they did in my area): would that disrupt things for a morning commute? Or if a foot of snow fell (presumably they aren't there for testing yet), could the car handle that change?

            I imagine that would be a logical solution for updating maps since I imagine the sensors for driving and mapping are similar. Still a bit concerned as it is likely the

          • So the other year, when they replaced the normal intersection by my house with a roundabout, you're saying Google will be there with their cars ready to scan that intersection the moment it opens? Keep in mind we are but one small city. There is no way the city will have the expertise for this.
            • No, I said "small changes to the environment".

            • SDC cameras surely must be capable of detecting the absence of stop signs / stoplights by now, and deducing from the shape of the road that this must be a roundabout.

              • Plus, changing a road from intersection to roundabout is virtually always heralded by construction for at least a few days prior. Vehicles passing through would have ample chances to update the common shared map, noting that this area is likely to be subject to further change.

                In addition, in the North American and European road signage standards, drivers are supposed to be alerted to the presence of a round-about ahead by signs. (there is some variation in what roundabout signs look like though) All the au

                • All good points.

                  But it still isn't "self driving" if you need to hold the car's hand all the time.

                  Where just one small unexpected input can lead to catastrophic failure.

                  You can always add more and more outside input to help the car appear to be "self driving", but the more you do the less it is.

                  Electronic "train tracks", even very flexible and adaptive train tracks, won't make a car self driving.

                  • Thank you, this is what I was trying to say. How much are we willing to roll the dice on an unexpected situation? If the car is that good then why do we need the map? It's the gap in which how safe the car is when it goes off the map that is key here. 10M miles doesn't prove anything. How many times was it off the map?
        • You'd also need free data and free data roaming if the car needs to download new maps as you make that trip.

        • Then you need to learn the differences in self-driving systems.

          Waymo is the only level 4 company, and has tight geofenced areas and high-res maps.

          Level 0 is regular cruise control
          Level 1 is the newer speed-changing-in-traffic cruise control
          Tesla and Super Cruise are level 2

          Fully self-driving starts at level 4, but limited
          Level 5 is a car that drives like humans can. No one has even started this yet.

          It is why I laugh when people say self-driving cars are almost here. Nope, 20-30 years away at best.
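
          For reference, the SAE J3016 levels that the list above paraphrases are commonly summarized as in the sketch below (a minimal, illustrative Python enum; the mapping of specific products to levels above is the poster's claim, not SAE's):

          from enum import IntEnum

          class SAELevel(IntEnum):
              """SAE J3016 driving-automation levels, roughly summarized."""
              NO_AUTOMATION = 0   # human does all driving; warnings at most
              DRIVER_ASSIST = 1   # steering OR speed assist (e.g. adaptive cruise)
              PARTIAL = 2         # steering AND speed assist; human supervises
              CONDITIONAL = 3     # system drives in its domain; human is fallback
              HIGH = 4            # no human needed within a geofenced domain
              FULL = 5            # drives anywhere a human could

          def needs_constant_supervision(level: SAELevel) -> bool:
              """Below level 3, the human must monitor at all times."""
              return level <= SAELevel.PARTIAL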

    • Even if they put 3 trillion miles on their system, if they confine it to just a few geographical areas, I don't trust it very much.

      Why? Human drivers are demonstrably dangerous and the body count to date heavily favors the computers as the likely safer option. While I'm not suggesting autonomous driving vehicles are ready for prime time yet or that it's a slam dunk that they are safer, I think people like yourself are not really doing a very good job of evaluating the actual risk data. Honestly I don't really trust YOU as a driver either. Nothing personal - you shouldn't trust me either or any other human driver. But the point is

      • by arth1 ( 260657 )

        Why? Human drivers are demonstrably dangerous and the body count to date heavily favors the computers as the likely safer option.

        So? Driving is a calculated risk, where people see the benefits as outweighing the risks. When reducing risks causes the benefits to go down, this changes the equation.
        If people really were interested in safety above everything else, no one would ever buy sporty cars or drive above 35 mph.

        A small risk of accidents and fatalities is an acceptable price for the freedom of driving, as it was for the freedom of riding for those before us. Reducing the risk by taking away the freedom is just not acceptable for many of us.

        • A small risk of accidents and fatalities is an acceptable price for the freedom of driving, as it was for the freedom of riding for those before us. Reducing the risk by taking away the freedom is just not acceptable for many of us.

          First off, your argument that autonomous cars somehow reduce your freedom is nonsense. The freedom that cars provide is freedom of mobility, which is in no way being threatened. If you enjoy driving, that's fine, but your freedom to drive isn't being infringed by autonomous vehicles also being on the road. Even if computers replaced all human drivers, your freedom of mobility isn't being affected at all. The ONLY point is that YOU as a human driver are very likely a bigger threat to me (and vice versa) than

          • by arth1 ( 260657 )

            First off, your argument that autonomous cars somehow reduce your freedom is nonsense.

            How do you go on a joyride in an autonomous car?

            The freedom to take your car (or motorbike or horse) "out there" is a part of American life. If you never enjoy that, I feel truly sorry for you.

            • by djinn6 ( 1868030 )

              How do you go on a joyride in an autonomous car?

              You get in the car and tell it where to go. Then you enjoy the ride.

              Oh, did you mean the joy of driving it? Last I checked, go-karts are pretty cheap and racing tracks still have public days. You can still drive off-road in many places where roads don't exist. You can also visit another country where autonomous cars aren't mandated.

              • by arth1 ( 260657 )

                How do you go on a joyride in an autonomous car?

                You get in the car and tell it where to go.

                In other words, you have no idea what a joyride is.

                The whole point being that there is no destination.
                You go where the road and whim take you. Discover new places you didn't know about. Enjoy the freedom of not having to go to any particular place, on any particular schedule. Make impromptu decisions when hitting crossroads. Stop at a kid's lemonade stand or ice cream store you didn't know existed. Or where there turns out to be a good view. Or not stop at a

                • by djinn6 ( 1868030 )

                  I see you're easily entertained. But you know, you can tell the car to go somewhere and then tell it to stop anywhere along the way. They might even make cars that take directions from you, turn-by-turn.

                  Or you can do what I do, which is to explore a place online: maybe read about its history, learn about local politics and the economy, look at photos or streetview, read some reviews, watch some videos other people took, then decide whether I need to visit in person. I can visit way more places virtually and learn ab

                  • by arth1 ( 260657 )

                    They might even make cars that take directions from you, turn-by-turn.

                    Oh, they do that. The technology is called a steering wheel.

          • I think you misunderstand the basis of the poster's fears about losing freedom. To be fair, he wasn't explicit in why he thinks that way.

            The problem is, once autonomous driving has a proven track record of being safer than human drivers in an overwhelming majority of situations, it is likely and perhaps even inevitable that legislation will be passed restricting humans' right to drive on public roads. Right now, the big concern is that humans may be placed at undue risk by robotic use of the public roads. O

          • by arth1 ( 260657 )

            40,000 people died last year in the US alone from human driven vehicles and I'm pretty sure some of them and their loved ones might prefer a different outcome if we had the technology. If we replace the human drivers with computers and drop that number to a smaller number (maybe even zero) then you are going to have a VERY hard time arguing that huge body count is a worthwhile price to pay.

            40,000 is a small number compared to number of drivers, passengers and miles driven. Heck, it's less than the number of suicides in a year (and even includes a number of suicides that weren't classified as such).

            And if the end justifying the means is your argument, why not apply a technological solution to bigger problems too?

            Deaths in the US in 2016:
            Heart disease: 635,260
            Cancer: 598,038
            Stroke: 142,142
            Diabetes: 80,058

            Most of these were likely preventable. Self-administering food appears to be incredibly d

        • by Kjella ( 173770 )

          So? Driving is a calculated risk, where people see the benefits as outweighing the risks. (...) You can live in a padded room if you like, but don't impose it on others.

          To most people, most of the time, the benefit is getting from A to B, and driving is only a means to an end. And stop acting like we can't die in your crash. That's why we have laws on speeding and drunk driving; what you think is acceptable risk is not the final answer. I think we're extremely far from a ban on human driving, but your "maybe I'm high risk but I don't care because freedom" reminds me of half-blind elderly who refuse to turn in their license. I'm not going to hold on to it at all costs if it's ob

  • by Rei ( 128717 )

    Competing with other companies with autonomous-vehicle programs like Uber, Tesla, Apple, and GM's Cruise, Waymo is leading the pack in terms of road miles driven.

    Um, huh? Tesla's Autopilot had driven 1.2 billion miles [electrek.co] as of July. Two orders of magnitude more than Waymo.

    10 million miles is really nothing. In the US, there's only one fatal accident per 86 million miles on average.
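
    A quick sanity check of that point, using only the figures quoted in this comment (the fatality rate varies by year and road type):

    waymo_miles = 10e6        # Waymo's announced total
    fatal_rate = 1 / 86e6     # ~1 fatal accident per 86 million miles (US average)

    expected_fatalities = waymo_miles * fatal_rate
    print(f"Expected at the human rate: {expected_fatalities:.2f}")
    # ~0.12 -- so zero fatalities over 10M miles is statistically
    # uninformative about whether the system beats the human average.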

    • Re:Huh? (Score:5, Informative)

      by XXongo ( 3986865 ) on Thursday October 11, 2018 @08:14AM (#57460454) Homepage

      Um, huh? Tesla's Autopilot had driven 1.2 billion miles [electrek.co] as of July. Two orders of magnitude more than Waymo.

      Uh, Tesla's "autopilot" is a driver assist, not a self-driving vehicle. And it racks up the miles on expressways-- that's the easy kind of driving.

      So, no, not the same thing.

      10 million miles is really nothing. In the US, there's only one fatal accident per 86 million miles on average.

      Indeed, that's the metric to compare to. But not all miles driven are the same.

      • by Rei ( 128717 )

        Uh, Tesla's "autopilot" is a driver assist, not a self-driving vehicle

        Semantics and legalities. It's still collecting data and allowing for the refinement of algorithms. Just over a hundred times more data.

        If the article is going to claim that Waymo is "leading the pack in terms of road miles driven", they shouldn't explicitly list Tesla as a company that they've driven more miles than. Period. Because that's simply not a valid claim. You can argue that Waymo and Tesla's goals are different, but which

        • by XXongo ( 3986865 )
          Tesla's autopilot is not self-driving because Tesla specifically states that "autopilot" is not self-driving, and drivers should not consider it self-driving.

          It is a different thing.

          Staying in lane, staying a constant distance from the car ahead of you, and occasionally changing lanes on a straight expressway-- these are all useful as driver assists, but it's not self-driving.

          If you want to compare miles driven on self-driving to things that are not self-driving, then a lot of cars have put in more mile

          • by Rei ( 128717 )

            Nothing you wrote changes the fact that what was written in the article is erroneous.

            * If one considers Tesla's Autopilot to be in same category as Waymo, then the claim that Waymo has driven more miles than Tesla's system is simply false
            * If one considers Tesla to not have a system in the same category as Waymo, then the claim that Waymo has driven more miles than Tesla's system is nonsensical.

            • by XXongo ( 3986865 )
              What if you consider that Tesla has been testing self-driving, but this is not the same as their autopilot, and hasn't accumulated millions of miles?
              • by Rei ( 128717 )

                Tesla only has one system. Just different revisions of it. When they complete one revision, they deploy it and move on to the next.

        • You can argue that Waymo and Tesla's goals are different, but which one has driven more miles is not up for debate.

          Depends on what you mean by "driving". The Tesla system is a glorified passenger.

        • by AmiMoJo ( 196126 )

          Semantics and legalities. It's still collecting data and allowing for the refinement of algorithms. Just over a hundred times more data.

          Teslas have far fewer sensors than Waymo vehicles, so they collect vastly less data. No lidar, for example.

          And collected data is not a very good metric. Is the quality/utility of Tesla's data as good as Waymo's? Considering that they don't even look for many types of events, and don't collect a constant feed from every camera, it's very likely that they are missing lots of stuff that will be vital to reaching full self-driving.

          There is very little reason to think that Tesla is anywhere near Waymo on self driv

        • Semantics and legalities.

          No, technicalities. Tesla's autopilot does not have the capabilities that this thing has, regardless of what you want to call it.

        • by DRJlaw ( 946416 )

          Uh, Tesla's "autopilot" is a driver assist, not a self-driving vehicle

          Semantics and legalities.

          It wasn't "semantics and legalities" when "autopilot" steered a vehicle into a highway divider [popularmechanics.com].

          It wasn't "semantics and legalities" when "autopilot" drove into the side of a tractor trailer [washingtonpost.com].

          Then it was the stupid driver who mistakenly used autopilot as a substitute for paying attention, because "autopilot" is not a self-driving vehicle system. Now, when it's convenient for you to argue so, it suddenly is equ

        • "And it racks up the miles on expressways-- that's the easy kind of driving."

          Some people here think this isn't a feat. I'm pretty sure thousands of human-driven vehicles end up in fatal wrecks every year in ideal conditions. Not every accident happens when it's icy/raining/tornadoing/etc.

          Honestly, those wrecks that happen in bad conditions could often be avoided if humans weren't so over-confident in their abilities and unable to wait (often less than an hour) for more ideal conditions.

          • As far as I know, collisions that occur in ideal or even close to ideal conditions are virtually always caused by one human doing something stupid. A classic example is of a driver realizing he needs to get over right now or he will miss his exit. So he darts over quickly, maybe forgetting to signal, certainly not signalling far enough in advance. He suddenly appears in the path of a semi, and brakes hard to make his exit. Forcing the transport to really stand on his brakes to avoid turning the idiot into a
      • Uh, Tesla's "autopilot" is a driver assist, not a self-driving vehicle. And it racks up the miles on expressways-- that's the easy kind of driving.

        It's also used on surface streets. It can follow cars, stay in complex lanes, and react to changes around it.

        I would argue that Tesla has on balance as much important experience as Waymo does, because Tesla has a lot more info on the basics of driving determined in a general purpose way, with no prior knowledge of the road you are on. Waymo's approach is more a

      • by Rolgar ( 556636 )

        Wouldn't it be better to compare fatalities/accidents broken down by the type of driving (highway or city), since they are a whole different beast? Highway driving is easier, but likely to result in harder collisions, while city driving is more complicated but much lower speed, and hopefully less likely to result in death.

    • by Ogive17 ( 691899 )
      Autopilot is a glorified cruise control. It's not quite the same thing.
    • by ledow ( 319597 )

      Yeah...

      When Tesla call it a self-driving vehicle, there will be more than a few people who will be interested in that.

      Because, for a start, they don't have a licence for that. And a lot of courts will sit up and take notice of, say, all those claims they made that a Tesla ABSOLUTELY 100% ISN'T a self-driving vehicle, and didn't kill that nice Apple engineer that time.

      You can't denounce a claim on one hand, and then try to win on that same claim somewhere else.

  • by Artem S. Tashkinov ( 764309 ) on Thursday October 11, 2018 @08:12AM (#57460446) Homepage

    Every time I hear about these crazy miles, I've got just one question to ask: has Google finally solved image recognition? And I'm not talking about simple cases - I'm talking about deliberate fakes, bad weather conditions, etc. 1 [bleepingcomputer.com], 2 [theregister.co.uk], 3 [evolvingai.org].

    These issues can easily make your car's software make life-threatening decisions.

    • The deliberate fakes are carefully tuned to a particular network. If you don't know the network, you can't just make a fake.

      Besides, is this really a problem? You could replace a 35 mph speed limit sign with a deliberate fake that says 65, if you wanted, but that's not really a major issue, it seems.
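
      For context, the "deliberate fakes" under discussion are adversarial examples. Below is a minimal sketch of the classic fast-gradient-sign attack on a toy logistic-regression "classifier" (illustrative only; real attacks target deep networks, and, per the comment above, typically need access to the target model's gradients):

      import numpy as np

      rng = np.random.default_rng(0)
      w = rng.normal(size=64)   # fixed weights of the toy model
      b = 0.1

      def predict_prob(x):
          """P(class 1 | x) for the toy logistic model."""
          return 1.0 / (1.0 + np.exp(-(x @ w + b)))

      def fgsm(x, y, eps):
          """Perturb x in the direction that increases the loss.
          For logistic loss, d(loss)/dx = (p - y) * w."""
          grad = (predict_prob(x) - y) * w
          return x + eps * np.sign(grad)

      x = rng.normal(size=64)          # a "clean" input
      x_adv = fgsm(x, y=1.0, eps=0.1)  # small crafted perturbation
      print("clean:", predict_prob(x), "adversarial:", predict_prob(x_adv))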

    • I've got just one question to ask: has Google finally solved image recognition? And I'm not talking about simple cases - I'm talking about deliberate fakes, bad weather conditions, etc.

      If they had completely solved such problems, one would assume they would be bringing the technology to market for sale. So the answer is obviously that they have solved some problems but not all the problems. Human drivers have problems with bad weather and deliberate fakes too. Although, let's be honest: when was the last time you saw an actual deliberate fake sign? Computers actually could be less susceptible to these, since they can reference map databases about what speed limits etc. should be for a given loc

      • Uber's self-driving car has already killed a person, quite deliberately, because the software contained a major bug.

        Now, the question is: are we ready to trust the driving software, knowing that it's nigh impossible to program in all the possible choices and situations? Say a self-driving car is driving on a freeway at 55 MPH and suddenly encounters a group of people, a stalled car, and a cliff, and there's no time to apply the brake properly. Where will it go? What will the decision be? You can think of hundre

        • Pretty easy answer. Apply the brake. If there's a safe and legal way to swerve around the obstacle, do that, otherwise just stay in the lane and keep braking.

          Of course, realistically speaking, the chance of suddenly encountering a stalled car, a group of people, and a cliff on the freeway is pretty slim. The lidar/radar systems would have seen the obstacles much earlier.
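
          The policy described here boils down to a few lines; a toy sketch (hypothetical function and names, nothing like a real planner, which weighs far more factors):

          def emergency_maneuver(obstacle_ahead: bool,
                                 safe_legal_gap: bool) -> str:
              """Brake first; swerve only if a safe, legal gap exists."""
              if not obstacle_ahead:
                  return "continue"
              if safe_legal_gap:
                  return "brake_and_swerve"
              return "brake_in_lane"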

  • Any snow in Phoenix lately? How about heavy rain?
    • I would really like to know what rain does to the radars. Once I tried to create a simple burglar alarm system that could see through walls using a small radar element. It seemed to work well. Until it started raining, when suddenly the system was seeing burglars everywhere.
      • I would think it depends on the wavelength. The longer the wavelength, the less it will be disturbed by precipitation. Of course, longer wavelengths generally also mean a far lower resolution. You would want a wavelength long enough to ignore rain, sleet and snow, but short enough to make sure you still see road signs, parts of children sticking out past parked cars, etc.

        If I were an engineer designing the sensor suite for an autonomous anything (which I don't even remotely have the skills to do) I would
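
        Rough numbers behind that intuition (illustrative figures: 76-81 GHz is the common automotive radar band, and ~905 nm a common lidar wavelength):

        c = 3.0e8                     # speed of light, m/s
        radar_wavelength = c / 77e9   # ~3.9 mm at 77 GHz
        lidar_wavelength = 905e-9     # ~905 nm

        print(f"radar: {radar_wavelength * 1e3:.1f} mm")
        print(f"lidar: {lidar_wavelength * 1e9:.0f} nm")
        # Raindrops are roughly 0.5-3 mm across: comparable to the radar
        # wavelength, but thousands of times larger than the lidar's, which
        # is one reason rain degrades lidar returns more readily than radar.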

  • Perspective? (Score:4, Informative)

    by sjbe ( 173966 ) on Thursday October 11, 2018 @08:52AM (#57460602)

    But it's worth keeping things in perspective: U.S. drivers rack up some 3 trillion miles each year, so Waymo still has some ground to cover.

    Umm, WTF does this have to do with "keeping perspective"? It isn't a competition between Waymo and the rest of us human drivers to see who can rack up the most miles driven.

    • by ledow ( 319597 )

      It means that non-human-driven cars have collectively accounted for 0.000333% of all the miles travelled this year.

      This means that... pretty much... way over 99.99% of the human race has never been in, with or near a driverless Waymo car.

      This means that, pretty much, Waymo hasn't even covered a thousandth of a percent of the things it needs to cope with when driving.

      And also... that if there is a single accident, it would scale up 300,000-fold in terms of their overall average accident rate if we all jumped on board them. And Waymo's had quite a few.

      • way over 99.99% of the human race has never been in, with or near a driverless Waymo car.

        They let me drive legally on the road after a single 1 hour test, where I also didn't see 99.99% of the human race, maybe even more.

        Waymo's had quite a few.

        In how many cases was their driverless vehicle to blame?

      • This means that, pretty much, Waymo hasn't even covered a thousandth of a percent of the things it needs to cope with when driving.

        Not even remotely true. Clearly they haven't come close to every corner case they could run into (they are still testing after all) but if their tech could handle as little as you claim then they couldn't be operated at all on roads of any description. Furthermore most humans are given a license to drive unsupervised well before they have seen all the problems they can run into. We don't even require that they be full grown adults and the "test" is frankly something of a joke.

        And also... that if there is a single accident, it would scale up 300,000-fold in terms of their overall average accident rate if we all jumped on board them. And Waymo's had quite a few.

        That's not how statistical a

      • To be fair though, places like the Hanger Lane gyratory or Swindon's Magic Roundabout are 1) rare and 2) really tough for humans to navigate. The first time any human driver is confronted with a situation like that, they panic and cause confusion in the traffic flow, because they struggle to understand the flow while simultaneously operating a motor vehicle. After a few times through them, though, they get better at it. An autonomous vehicle would handle the environment decently its first time through, since it
  • I think I read somewhere that the number of miles driven in one single morning commute is something like 130 million miles. So, while 10 million miles is certainly impressive, we aren't yet equivalent to 0.4% of a year's commuter driving (2 commutes per day, 5 days a week, 52 weeks per year). And, as previously pointed out, the Waymo miles are not driven in every set of road conditions in the US and are very controlled.

    The real question we and they need to discuss is what is the amount of testing and user
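
    Running the poster's own numbers (taking the 130-million-miles-per-commute figure at face value):

    miles_per_commute = 130e6            # the poster's figure
    commutes_per_year = 2 * 5 * 52       # 2/day, 5 days/week, 52 weeks
    yearly = miles_per_commute * commutes_per_year

    print(f"{yearly:.3g} commuter miles per year")   # ~6.76e10
    print(f"Waymo share: {10e6 / yearly:.4%}")       # ~0.0148%
    # Comfortably below the 0.4% ceiling the poster mentions.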

  • First minutes of a real war, and our GPS systems will be down.
    • by Anonymous Coward
      First minutes of a real war and we will all be dead
  • Yaay, Waymo drives 0.0003 percent of the miles that humans do in a year.
    • It's not really fair to add up the miles of all humans, when they are all independent.

      By the way, did you know that me and Usain Bolt together have won 8 Olympic gold medals?

      • Regardless, because the 10 million miles were hand-picked and a safety driver likely had to intervene many times, the point is that ten million miles doesn't mean a thing.
  • None of this matters. So-called 'self driving cars' still have no capacity to actually think, and so-called 'deep learning algorithms' and so on are not a substitute for actual cognition. I still maintain that these are being rushed to market as fast as they possibly can with the only real goal in mind being to start getting ROI as fast as possible, and their legal departments have assured them that the financial risk of settling lawsuits out of court is acceptable compared to not pushing these out the door
    • The technology is improving every day, and while they'll have problems with some difficult corner cases, they'll make up for that with much better performance in other situations, like not getting distracted by a phone call or a dropped cigarette lighter.

      It's only a matter of time before the overall accident rate is lower than human drivers.

      • None of that means shit until they have AI that can actually think, and we won't have that until we figure out how our brains do that. You can hope and dream all you want and it won't change that fact. They're shit and they'll continue to be shit because it's the wrong approach.
        • They don't need to think. They just need to drive. In fact, we don't want them to think like humans, because we don't want them to make our mistakes.

          we won't have that until we figure out how our brains do that

          Nope. We just need to see what mistakes they make, and then come up with a fix. Just fix one mistake at a time, and keep doing that until it's good enough.

    • Deep Learning algorithms are arguably thinking, just in a limited and alien form of it. When it comes to sensory processing and manipulating the environment, there is a scale starting at the biochemical reactions between a virus and a cell and ending (so far) at human cognition. At what point along that line do you consider actual thought and learning to occur? (Note that sentience or self-awareness is a related but separate concept.) We can condition simple insects to respond to artificial cues, which seems to
  • driven during inclement weather?

    on snow-covered roads?

    in construction zones?

    in parking lots?

    • Even if they can't do that, I'd be happy to have my car chauffeur me around when the weather is nice and the road conditions are perfect. That would likely cover 95% of the time and take an hour of driving off my daily routine.

  • Waymo's cars can all share the experience they learned from those 10 million miles. Human methods of sharing memories are far, far weaker. So you can roughly divide the human miles by the number of drivers. Very few humans have driven 10M miles in their lifetime.
  • Do the math and you'd need a whole lot of cars running around all the time at highway speeds. So how many cars are really operating? Something doesn't sound quite right here.
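
    Doing that math with some stated guesses (the per-car utilization is an assumption, not a Waymo figure):

    miles = 7e6              # roughly 7M of the 10M miles came in ~18 months
    days = 18 * 30
    miles_per_day = miles / days          # ~13,000 miles/day fleet-wide

    per_car_per_day = 25 * 8              # assume 25 mph average, 8 h/day
    print(f"~{miles_per_day / per_car_per_day:.0f} cars driving steadily")
    # ~65 cars would suffice; contemporary reports put Waymo's fleet in the
    # hundreds of vehicles, so the announced total is quite plausible.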
