
Waymo Cars Honk at Each Other Throughout the Night, Disturbing SF Neighbors 74

Driverless Waymo vehicles in a San Francisco parking lot have been repeatedly honking at each other, disrupting nearby residents' sleep and daily lives, according to a local media report. The incidents, occurring at random times over the past two weeks, have prompted complaints from multiple condo dwellers. Randol White, a local resident, first noticed the problem when he was awakened at 4 a.m. by the cacophony. Another resident, Russell Pofsky, reported being woken up more times in the past fortnight than in the previous 20 years combined.

Waymo acknowledged the issue in a statement, saying they have identified the cause and are implementing a fix. The company's response comes after affected residents reached out to report the problem.

Comments Filter:
  • Brain (Score:4, Insightful)

    by JBMcB ( 73720 ) on Tuesday August 13, 2024 @11:09AM (#64702420)

    If a company could issue a fix for the brains of the motorcycle riders who like racing at one in the morning near my house, that would be great.

    • Miles Hudson just entered the chat.
    • Nature has already found a remedy. The fatality rate for motorcyclists is, I forget, either 5 or 10 times that of car drivers. Nature has found a way to separate the brains from the bodies.

  • by Dave Emami ( 237460 ) on Tuesday August 13, 2024 @11:14AM (#64702438) Homepage
    Is it car mating season, maybe?
  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Tuesday August 13, 2024 @11:18AM (#64702458)
    Comment removed based on user account deletion
    • by flink ( 18449 ) on Tuesday August 13, 2024 @11:26AM (#64702496)

      If I as an individual got a couple dozen cars together and honked their horns all night and refused to stop, I'd probably be arrested and fined for disturbing the peace and my cars would be impounded. I don't see why Waymo should be able to get away with anything less. If anything they should face harsher penalties since they are a large company with a legal department who should know better. They have enough resources to send someone out to manually disconnect the batteries on the cars if they have a software bug they can't fix quickly.

      • If I as an individual got a couple dozen cars together and honked their horns all night and refused to stop, I'd probably be arrested and fined for disturbing the peace and my cars would be impounded.

        A lot of criminal law and procedure is based on the concept of intent. In the scenario you described, you intended to do what you did. In the Waymo scenario, the behavior is not the intended behavior. Since it is a corporation and there is no actual intent, the criminal system never kicks in.

        Is that the way it SHOULD be? Hell no. The honking is just as undesirable regardless of intent.

        • by flink ( 18449 )

          Negligence is also a crime. If I left a pile of buggy electronics out in my yard that periodically caused a 120 dB siren to go off, you can bet I'd get a visit from the cops and be forced to do something about it, even if I pleaded incompetence in building a homemade burglar alarm. Waymo could definitely manually intervene if they don't have an immediate software fix for the problem. They chose not to and are allowed to get away with it.

          You generally don't get off the hook for violating noise ordinance just b

            Negligence is also a crime. If I left a pile of buggy electronics out in my yard that periodically caused a 120 dB siren to go off, you can bet I'd get a visit from the cops and be forced to do something about it ...

            You, as an individual, do not get the benefit of the doubt. You, as part of a business, get some leeway when it comes to doubt. You, as part of a corporation, are automatically granted leeway when it comes to doubt.

            You can see it in action here and all around you in other situations.

    • I'm inclined to agree. No double standard. If they want their cars on the roads, they should face every penalty that a typical driver would face, including when refusing to comply with orders given by the police. If they don't have the proper mechanisms in place to comply with lawful orders, they aren't fit to be on the road, any more than you or I would be if we were incapacitated or incompetent. It's fine if the car needs some reasonable accommodations—a deaf person might need to have things written

      • I'm inclined to agree. No double standard. If they want their cars on the roads, they should face every penalty that a typical driver would face, including when refusing to comply with orders given by the police. If they don't have the proper mechanisms in place to comply with lawful orders, they aren't fit to be on the road, any more than you or I would be if we were incapacitated or incompetent. It's fine if the car needs some reasonable accommodations—a deaf person might need to have things written out before they could understand what's being asked of them—but the car itself needs to be capable of complying, independently of a Waymo call center being involved.

        To an extent, but law isn't like a software program; it's supposed to be applied reasonably to the circumstances. Self-driving cars get leeway that human drivers don't (don't throw the book at them when the honking goes a little wonky), and human drivers can often get away with a rolling stop when a self-driving car should not be allowed to do a rolling stop [consumerreports.org].

        Now, for safety issues there are some hard lines you need to enforce, but for this honking it's more about making sure they fix the issue.

      • by Cyberax ( 705495 )

        I'm inclined to agree. No double standard. If they want their cars on the roads, they should face every penalty that a typical driver would face

        It's SF. The police don't care if you drive your fart-mobile with an ultra-loud exhaust. So it's completely consistent, no double standards.

      • If they want their cars on the roads, they should face every penalty that a typical driver would face, including when refusing to comply with orders given by the police.

        So all the white cars get a pass to drive however they want but the police just beat the living shit out of the black, brown and red cars because of something minor like they didn't know someone stole their rear license plate or that a brakelight burned out? I don't see how that's particularly useful.

    • by King_TJ ( 85913 )

      This article claims this has been going on for about 2 weeks. Honestly? While I'm sure that's annoying, it also seems like an unsurprising outcome from putting a bunch of driverless vehicles in one parking area for the first time. (I can see how they'd have some proximity detection capability so one will honk if it detects another at a close distance and moving towards it.) I doubt this is something you can really expect a corporation to get resolved with the snap of a finger...

      Probably took at least a w

      • Comment removed based on user account deletion
        • Yes, but in your example there were drivers in trucks who could've been instructed to stop blowing their horns after hours (and, yes, most municipalities have laws on the subject).

          And in this case the driverless cars have been instructed to stop blowing their horns, and have apparently done so.

          AI systems do odd and surprising things sometimes. People do odd and surprising things sometimes, too. I have a Tesla with FSD and it's very interesting to watch the sorts of strange things it does. The oddest one, which has only been happening for the last couple of months, is that at some intersections it will have plotted a navigation path that includes, say, a right turn, but the car wil

          • Comment removed based on user account deletion
            • Yes, self-driving cars don't have language models and can't understand verbal instructions from police. Instead, the report has to go to the company, who has to get engineers to change the software, which takes longer. I addressed that in my post.
            • I suppose the robot could be arrested - I wonder how much the horn would blow while they were towing it to the pokey?

              Who's responsible for keeping those things on the road? No, not some lackey in a NOC-like facility. The c-suite guy over the project. That's who you're arresting.

      • Comment removed based on user account deletion
        • While we're on the subject, where's the enforcement on livery laws? These things are taxis. In San Francisco, that means it needs to be recognizable as a taxi, and have a meter with fixed rates, among other things. And livery laws exist to prevent taxis (and robotaxis) from just DDOSing streets with too many cars for no fucking reason, kidnapping people and passenger safety in general. Do the robot black market cabs drive better than the nonrobot black market cabs? Yes, but that's... a low bar to clear
      • Fining them is pretty reasonable. Do it on a sliding scale, though. Waymo's part of a multibillion dollar company so what would be a $400 fine for us better be a $400,000,000 fine for them.
    • But that's not how fascism works.

      Unfortunately people have come to the point where if Waymo hadn't stopped, their parking lot would have been fire-bombed.

      Hopefully law and order can be restored soon so things don't continue on this path.

    • by dbialac ( 320955 )
      If they went this route, you'd have to find an excuse as to why the same law doesn't apply to taxi drivers.
    • Yeah, let's go thermonuclear on a car company that rapidly fixed a problem when identified. That'll make all the difference to their behaviour in response to bugs!

      What?

    • by Tablizer ( 95088 )

      Such fines are peanuts to Waymo. Being first to market is more important to them, in a market potentially worth hundreds of billions, and a few grand in traffic fines won't dissuade them.

    • Charging Waymo for the misbehaviour would help create a precedent that the software provider is 100% responsible when things go wrong on self-driving cars. We need this determined very clearly and unambiguously.

    • by Lehk228 ( 705449 )
      That's not how laws work; they can only be fined under the law as it's already written, and nobody wants 5-figure noise violation tickets.
  • Who can blame these vehicles for acknowledging each other's presence?

  • by Smonster ( 2884001 ) on Tuesday August 13, 2024 @11:21AM (#64702476)
    Yeah...I am pretty sure the cars were plotting something. It is when they get real quiet that you have to really start worrying.
    • by dbialac ( 320955 )
      Waymo is working on developing a mating call.
    • Didn't Stephen King write a horror story / movie about that topic? "Maximum Overdrive", I think?

      • The story was called "Trucks", which became the movie "Maximum Overdrive". He also wrote Christine and Duel.

        I think it's interesting that the master of horror is less terrified of clowns and demons than he is of ... traffic. It does make sense though since in the US cars maul 50,000 people a year and demons and clowns kill about 100 people max.

  • to the question, do it honk [youtube.com]?
  • I have seen a few that drive near a bush, lift a rear wheel, and then release some gasoline onto the bush.

    Then, when another Waymo comes, it first sniffs the bush with its sensors and then releases its own gasoline onto the same bush...

  • by echo123 ( 1266692 ) on Tuesday August 13, 2024 @11:33AM (#64702524)

    This article explains all. [theverge.com]

    Check out the livestream. [youtube.com] If you scroll back to 1.33am for example, the next 10 minutes are very revealing.

    • by tragedy ( 27079 ) on Tuesday August 13, 2024 @12:25PM (#64702682)

      So, looking at that video, it appears that what is happening is that when a car backs up in front of one of the Waymo cars, some sort of collision prediction system kicks in and it automatically honks to let the car in front know that it's there. It also looks like it may automatically back up to avoid a potential collision. So, if there's a line of Waymo cars coming in to park, and the front one backs up, the car behind it honks and backs up, then the car behind that honks and backs up, then the car behind that honks and backs up and so on.

      So, first of all, honking your horn that readily is obnoxious. The bigger problem though is that, if the Waymo cars are following so closely that cars in front have no room to maneuver without tripping collision detection, then they're clearly following too closely. There clearly should be some awareness by the cars that they are in a parking lot and not the open road. Not to mention that this quite clearly demonstrates that these driverless cars are completely unaware of whether other cars on the road are also driverless. Overall, this is not confidence inspiring.
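
      As a rough illustration (my own toy model based on that description, not anything from Waymo's actual code), the chain reaction down a line of closely spaced cars would look something like this in Python:

          # Toy model of the honk-and-back-up cascade described above.
          # Purely illustrative; the real trigger logic is unknown to me.
          def simulate_cascade(num_cars: int) -> int:
              """Return how many honks one reversing lead car can trigger."""
              honks = 0
              backed_up = [0] * num_cars   # how far each car has backed up; index 0 is the lead car
              backed_up[0] = 1             # lead car reverses to maneuver into a spot
              for i in range(1, num_cars):
                  if backed_up[i - 1] > backed_up[i]:     # car ahead has reversed into my buffer zone
                      honks += 1                          # collision warning: honk
                      backed_up[i] = backed_up[i - 1]     # ...and back up to make room, passing the problem down
              return honks

          print(simulate_cascade(20))   # one parking maneuver -> 19 honks down the line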

      • Overall, this is not confidence inspiring.

        They identified the problem and are rolling out a fix, one that solves an issue across an entire fleet. In the meantime we've been asking people for 60 years not to act like arseholes and there's still no fix in place.

        What about this isn't confidence inspiring? That human drivers still exist?

        • by Tablizer ( 95088 )

          They never showed the guinea pigs (like SF) on The Jetsons.

        • by tragedy ( 27079 )

          What about this isn't confidence inspiring? That human drivers still exist?

          One of the things that's not confidence inspiring is that this is an emergent behavior problem based on simple rules. Essentially, we have a state machine here. Now, picture a highway, filled with Waymo cars in bumper to bumper traffic. They leave a buffer space between themselves, but it's obviously just a little bit beyond what they consider a safety zone. The traffic is slowed because of an accident ahead on the road or some other blockage and the lead car is stopped. Part of the blockage is cleared, and

        • It shows they are using an ad-hoc, Agile-style development methodology, which is not appropriate for self-driving cars.
      • Watching the stream for a while I think there is also some bug in the parameters for the lot, like they've assigned the wrong number of cars or blocked off spaces and then not updated the max number of cars. So the cars are just over-constrained. They are also super aggressive to the point where they will steal each other's spaces. Also they should not be backing into these spaces because one will pull up behind another that is trying to back in and block it out. That is exactly one of the reasons that
      • Adding the dimension of "Is it another Waymo?" adds complexity to the decision making, requires a new data exchange mechanism that could have bugs, and would likely increase errors in all other dimensions. We will get there, and eventually transportation will be largely autonomous, but first these things need to get better at handling unexpected situations with as little additional complexity as possible.
      • So, looking at that video, it appears that what is happening is that when a car backs up in front of one of the Waymo cars, some sort of collision prediction system kicks in and it automatically honks to let the car in front know that it's there. It also looks like it may automatically back up to avoid a potential collision. So, if there's a line of Waymo cars coming in to park, and the front one backs up

        So, the programmers have not considered, for even the simplest scenario imaginable, how traffic will work with lots of robocars on the road? Or just a few parking next to each other? Transit planners think about this for human-driven traffic; it's a rich field of study and very interesting. Mathematically characterizing how everything flows, from stoplights to freeways. But the people who can actually totally control these new robocars, who are programming the heuristic algorithms, didn't even have a clue

      • The bigger problem though is that, if the Waymo cars are following so closely that cars in front have no room to maneuver without tripping collision detection, then they're clearly following too closely.

        That wouldn't surprise me at all. Most amateur drivers follow too close as it is, and then they expect robots to drive even harder for no reason. Seriously folks, it's 2 seconds in clear and dry weather, add a second for wet, add a second for every 10t of vehicle you have (round down), double it for frozen. No X car lengths, that's crap. It's how many seconds from the vehicle ahead. And the fastest you should be driving is no faster than you can see ahead to stop, so that's pretty unlimited at midday b
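
        Spelled out as a quick back-of-the-envelope sketch (my own reading of that rule of thumb, not any official standard):

            # Rough sketch of the following-gap rule of thumb described above.
            def following_gap_seconds(wet: bool, frozen: bool, vehicle_tonnes: float) -> float:
                gap = 2.0                          # clear and dry baseline
                if wet:
                    gap += 1.0                     # add a second for wet roads
                gap += int(vehicle_tonnes // 10)   # add a second per 10 t of vehicle, rounded down
                if frozen:
                    gap *= 2                       # double it for frozen roads
                return gap

            print(following_gap_seconds(wet=True, frozen=False, vehicle_tonnes=25))  # 2 + 1 + 2 = 5.0 seconds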

    • This article explains all. [theverge.com]

      Check out the livestream. [youtube.com] If you scroll back to 1.33am for example, the next 10 minutes are very revealing.

      However, during that livestream from August 5th, there are comments about Waymo saying they had it fixed and it wouldn't happen any more... and there are no more recent videos, so it seems like it has been fixed.

      • ...However, during that livestream from August 5th...

        With all due respect I believe you are mistaken. The livestream is current, and you can scroll back at least 24 hours.

        https://www.youtube.com/live/h... [youtube.com]

        • ...However, during that livestream from August 5th...

          With all due respect I believe you are mistaken. The livestream is current, and you can scroll back at least 24 hours.

          https://www.youtube.com/live/h... [youtube.com]

          Ah, you're correct. I was fooled by the start date and the fact that scrollback is only 24 hours. So the comments about it being fixed were more recent, and we'll see if it is actually fixed tonight.

          Thank you for the correction.

    • On the video. Fascinating.

  • Sally (Score:3, Interesting)

    by Iamthecheese ( 1264298 ) on Tuesday August 13, 2024 @11:40AM (#64702540)
    I have to link this [bestreadables.com] relevant story.
  • "First the Puerto Ricans, now the robots! I can't flip this dump and move to the suburbs soon enough!"
    • Sorry to say but the suburbs aren't far enough. City cancer is spreading there too. Just visit their local box stores and witness the cages and customers being treated like enemies of the retailer. Stop at all the nearby major intersections and look for beggars; if they exist you haven't gone out from the city far enough.
  • I wonder if the problem would be fixed faster if they skipped talking to the PR flacks and just arranged random nocturnal honking for waymo leadership [waymo.com]?
    • My thought exactly. A few of those Execs need a late night horn storm, night-after-night-after-night. And probably some disabled vehicles blocking their driveway, with no one inside of course.

  • The cars aren't being driven by AI. It's some Mechanical Turk workers in India.

  • Contrary to techbro delusions, this shit will never be ready for prime time, be safer than human drivers, or more cost effective than actually running traditional trains and trolleybuses.
  • by Tablizer ( 95088 ) on Tuesday August 13, 2024 @03:14PM (#64703264) Journal

    ...recursively dumb bots? I'm shocked! This is Slashdot!

  • ... of Vita Carnis "Trimmings". (warning: pretty gruesome horror)
