Security News

Top Security Researchers Ask The Guardian To Retract Its WhatsApp Backdoor Report (technosociology.org) 70

Earlier this month The Guardian reported what it called a "backdoor" in WhatsApp, a Facebook-owned instant messaging app. Some security researchers were quick to call out The Guardian for what they concluded was irresponsible journalism and a misleading story. Now, a group of over three dozen security researchers, including Matthew Green and Bruce Schneier (as well as some from companies and organizations such as Google, Mozilla, Cloudflare, and the EFF), have signed a long editorial post pointing out where The Guardian's report fell short and asking the publication to retract the story. From the post: The WhatsApp behavior described is not a backdoor, but a defensible user-interface trade-off. A debate on this trade-off is fine, but calling this a "loophole" or a "backdoor" is not productive or accurate. The threat is remote, quite limited in scope, applicability (requiring a server or phone-number compromise) and stealthiness (users who have the setting enabled still see a warning, even if after the fact). The fact that warnings exist means that such attacks would almost certainly be quickly detected by security-aware users. This limits this method. Telling people to switch away from WhatsApp is very concretely endangering people. Signal is not an option for many people. These concerns are concrete, and my alarm comes from observing what's actually been happening since the publication of this story and from years of experience in these areas. You should never have reported on such a crucial issue without interviewing a wide range of experts. The vaccine metaphor is apt: you effectively ran a "vaccines can kill you" story without interviewing doctors, and your defense seems to be, "but vaccines do kill people [through extremely rare side effects]."
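To make the trade-off concrete: as described above and in the comments below, when a recipient's key changes (new phone, reinstall), pending messages are re-encrypted to the new key and delivered, and senders who have enabled security notifications see a warning only afterwards. The following is a minimal illustrative sketch of that deliver-then-warn flow, not WhatsApp's actual code; every class, function, and the encryption stub are invented for illustration.

```kotlin
// Hypothetical sketch of the "deliver, then warn" trade-off described above.
// None of these types or functions come from WhatsApp; they only illustrate the flow.

data class Contact(val phoneNumber: String, var knownKeyFingerprint: String)

class MessagingClient(
    private val securityNotificationsEnabled: Boolean  // the opt-in setting the letter mentions
) {
    private val contacts = mutableMapOf<String, Contact>()

    fun register(contact: Contact) {
        contacts[contact.phoneNumber] = contact
    }

    /** Called with whatever key the server currently reports for the recipient. */
    fun send(phoneNumber: String, plaintext: String, currentKeyFingerprint: String) {
        val contact = contacts.getValue(phoneNumber)
        val keyChanged = contact.knownKeyFingerprint != currentKeyFingerprint

        // The behavior under discussion: encrypt to whatever key the server reports
        // and deliver immediately, even if the key just changed (new phone, reinstall).
        val ciphertext = encryptFor(currentKeyFingerprint, plaintext)
        deliver(phoneNumber, ciphertext)

        if (keyChanged) {
            contact.knownKeyFingerprint = currentKeyFingerprint
            if (securityNotificationsEnabled) {
                // The warning is shown only after the message has already gone out.
                println("Security notice: $phoneNumber's security code changed.")
            }
        }
    }

    private fun encryptFor(keyFingerprint: String, plaintext: String): String =
        "enc[$keyFingerprint]($plaintext)"          // stand-in for real encryption

    private fun deliver(phoneNumber: String, ciphertext: String) =
        println("-> $phoneNumber: $ciphertext")
}

fun main() {
    val client = MessagingClient(securityNotificationsEnabled = true)
    client.register(Contact("+15551234", knownKeyFingerprint = "AAAA"))

    client.send("+15551234", "hello", currentKeyFingerprint = "AAAA")        // normal case
    client.send("+15551234", "hello again", currentKeyFingerprint = "BBBB")  // key changed: delivered, warned afterwards
}
```

The comment threads below argue over exactly this point: whether the key-change warning should come after delivery, as sketched here, or before anything is sent.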
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Anonymous Coward on Friday January 20, 2017 @11:07AM (#53703599)

    http://technosociology.org/?page_id=1687

    Rather than recursive links to other Slashdot articles on the subject.

    • Looks like the link to the original report (not in the Guardian article, but posted a couple times in the comments) might be Slashdotted. I found an archived copy [archive.org] at Internet Archive. It was posted last April and updated last May.

  • by Anonymous Coward on Friday January 20, 2017 @11:08AM (#53703601)

    Why the heck would they retract the truth?
    If your threat model includes government spying, WhatsApp is not secure, since the government can force WhatsApp to reissue your key and then scoop up the resulting messages.
    The editorial spin on this story from Slashdot is very disappointing.

    • by ledow ( 319597 )

      If WhatsApp wants to sniff your messages, they can. They just update the app to not encrypt.

      If a government forces them to do that, they can.

      In and of itself, that's an entirely different threat model.

      What this says is not "WhatsApp is 100% secure to use" (because security experts are not stupid enough to ever say that).

      They are saying "This compromise that you claim lets anyone open your encrypted messages? Yeah, it's rubbish unless you literally take over WhatsApp servers."

      There is no service in the world

      • The point of the "compromise" is not to let "anyone" open your encrypted messages, it is exactly for letting WhatsApp (the people that already control their servers) open your encrypted messages.

        And while this design flaw is being touted as a convenience feature, there's no telling what other flaws can be used along with this one for additional exploitation.

        And warning the user of a possible compromise AFTER the message has been sent? Yea that's real good security right there.

    • Why the heck would they retract the truth? If your threat model includes government spying, WhatsApp is not secure, since the government can force WhatsApp to reissue your key and then scoop up the resulting messages. The editorial spin on this story from Slashdot is very disappointing.

      There is no back door. The security issue that started all of this is that WhatsApp will deliver messages that were sent while a user moves from one device to another. So, if I send you a message while your phone is busted and you reinstall on a new phone, you still get it. The recipient key changes, and the sender is notified of this.

      The security angle is that with SMS verification you could intentionally intercept someone else's messages. Well, message (singular), because, as stated, it notifies the sender.

      • by arth1 ( 260657 )

        There is no back door. The security issue that started all of this is that WhatsApp will deliver messages that were sent while a user moves from one device to another. So, if I send you a message while your phone is busted and you reinstall on a new phone, you still get it. The recipient key changes, and the sender is notified of this.

        The problem, if I understand this correctly, is that the sender is notified only after the message has been re-encrypted and sent to the recipient.
        If it alerted and required an accept before the message was sent to the new key, I don't think anyone would have a problem with it.

        • The problem, if I understand this correctly, is that the sender is notified only after the message has been re-encrypted and sent to the recipient. If it alerted and required an accept before the message was sent to the new key, I don't think anyone would have a problem with it.

          But it is not a back door. It's a very limited channel for obtaining a few messages, and it requires you to have some way of verifying the account (SMS interception). If you were going to build a back door into something, this is about the worst way possible.

          • by arth1 ( 260657 ) on Friday January 20, 2017 @12:03PM (#53704001) Homepage Journal

            I think back door is a completely wrong description, but I still think it is a security concern.
            If a notification that the recipient key has changed only occurs after delivering the message anyhow, it kind of defeats having key verification in the first place.

            It's as if your bank re-routed your money transfer to a different recipient account than the one you initially specified and notified you after the fact, instead of asking you whether it's okay before doing so. (A minimal sketch of the notify-after vs. ask-first behaviors follows this thread.)

    • Comment removed based on user account deletion
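The disagreement in the thread above comes down to when the sender learns about a key change: after the message has already been re-encrypted and delivered, or before anything goes out (the "ask first" behavior arth1 describes). Below is a minimal, hypothetical contrast of the two policies; the names are invented and this is not the real code of WhatsApp, Signal, or any other messenger.

```kotlin
// Hypothetical contrast of the two key-change policies argued about above.
// NOTIFY_AFTER matches the deliver-then-warn behavior; ASK_FIRST is the
// alternative of blocking delivery until the sender approves the new key.

enum class KeyChangePolicy { NOTIFY_AFTER, ASK_FIRST }

data class OutgoingMessage(val to: String, val plaintext: String)

fun handleKeyChange(
    msg: OutgoingMessage,
    policy: KeyChangePolicy,
    senderApproves: () -> Boolean,          // e.g. a dialog: "Security code changed. Send anyway?"
    deliverWith: (newKey: String) -> Unit,  // re-encrypts msg.plaintext to the new key and sends it
    newKey: String
) {
    when (policy) {
        KeyChangePolicy.NOTIFY_AFTER -> {
            deliverWith(newKey)                                     // message goes out first...
            println("Notice: ${msg.to}'s security code changed.")   // ...warning arrives after
        }
        KeyChangePolicy.ASK_FIRST -> {
            if (senderApproves()) deliverWith(newKey)               // nothing leaves until approved
            else println("Message to ${msg.to} held; new key not verified.")
        }
    }
}

fun main() {
    val msg = OutgoingMessage(to = "+15551234", plaintext = "meet at 6")
    handleKeyChange(
        msg, KeyChangePolicy.ASK_FIRST,
        senderApproves = { false },                        // sender declines the unverified key
        deliverWith = { key -> println("sent under $key") },
        newKey = "BBBB"
    )
}
```

The usability cost of ASK_FIRST is that a message to someone who just reinstalled the app sits undelivered until the sender reacts, which is the trade-off the open letter calls defensible.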
  • Remember (Score:5, Insightful)

    by GeekWithAKnife ( 2717871 ) on Friday January 20, 2017 @11:17AM (#53703661)

    WhatsApp is big money... and combined with the fact that it's hard to prove a vulnerability was intentional (and thus a "back door"), it's hard for Joe Average to tell who's right.

    Don't worry about this stuff. Just keep using WhatsApp. It's just as secure as everything else, honest.

    Telling people not to use WhatsApp is apparently "endangering people"...as it is a "crucial issue".

    Summary: do not use Signal, ChatSecure, OTR or Telegram. Use WhatsApp, it's clearly safer #because_danger (??).


    Personally, I never thought WhatsApp was secure even with this (maybe backdoored) end-to-end encryption. Consider how many people use WhatsApp: it's the number-one target among IM apps. If it ever was secure, it won't be so tomorrow.
    • Why would I use Telegram if I were concerned about security? It has a closed-source, roll-your-own crypto system. WhatsApp and Signal use Open Whisper Systems' Signal Protocol.

      Anyway, WhatsApp might have security vulnerabilities or backdoors but the reported "backdoor" isn't a backdoor. It's a design choice, and there is an option for security-conscious people to see when a new crypto key is generated.

    • by Agripa ( 139780 )

      Telling people not to use WhatsApp is apparently "endangering people"...as it is a "crucial issue".

      I do not know if that is happening here, but there is actually precedent for security agencies doing this. The next best thing to compromising a secure system is to make its users believe that you have, so that they switch to something less secure.

  • Comment removed (Score:4, Interesting)

    by account_deleted ( 4530225 ) on Friday January 20, 2017 @11:17AM (#53703663)
    Comment removed based on user account deletion
    • I guess because it is .001% harder to use...

      I was going to say "because it isn't integrated into your FB contacts" but that might not be true... depending on how you sync your contacts.

    • by cryptizard ( 2629853 ) on Friday January 20, 2017 @11:47AM (#53703883)
      Read the article. The people they are concerned about are journalists and activists in repressive countries who use WhatsApp because it provides encrypted messaging. If they switch to Signal, which almost no one uses, just being observed using it may be enough cause for the government to pick them up. If they are able to use WhatsApp, however, they are hiding among the millions of other people that use it for no special reason other than it is a good messaging app.
    • by sl3xd ( 111641 )

      The story might be different if Signal were a federated protocol with entirely decentralized servers (like email).

      However, it's not, and there's a single point of failure that can be blocked.

      WhatsApp became popular and widespread before many repressive governments realized what it could do, so they can't block it without widespread outcry.

      Not so with Signal, which is blocked, and therefore not an option.

      • The point is that if WhatsApp is not blocked and Signal is, using WhatsApp is better than the other options. You say yourself that the single blockable route is not the difference; it's that one is blocked and the other isn't. As for the article, I would say that if someone's life or freedom depends on whether WhatsApp is secure, they had better understand how this vulnerability applies to their specific usage pattern, not rely on some generalization from a newspaper article.
    • What's the point of being on an Instant Message service if none of the people you actually want to message are on it?
  • by Nidi62 ( 1525137 ) on Friday January 20, 2017 @11:22AM (#53703695)
    In these days of 24-hour news cycles and online publication, journalists and editors don't have time to do basic things like fact-check with experts or even spell/grammar check. With no print deadlines they can throw anything up online at any time and easily edit it later, preferably under a nice clickbait title. It's the race to be first that journalism has always had, taken to an extreme and combined with the fact that many journalists have neither background nor interest in the field they are writing about.
    • I agree with your assessment but would suggest you remove the word "journalists."

      There aren't any.

      That shit died when advertisers, CEOs and shareholders grabbed "news" by the fucking balls.

  • "Telling people to switch away from WhatsApp is very concretely endangering people." -- err, what?!? How in the world is that "concretely endangering people?!?"
  • ... including the comment section, is like using a fucking elephant gun to kill a piss ant.

  • From Schneier [schneier.com]:

    How serious this is depends on your threat model. If you are worried about the US government -- or any other government that can pressure Facebook -- snooping on your messages, then this is a small vulnerability. If not, then it's nothing to worry about.

  • Educating the public about privacy and security issues is a worthwhile exercise. Maybe it isn't a backdoor, but people seem to be increasingly concerned when it is suggested that their messages can be intercepted and read by third parties. This can only be a good thing. Our privacy has been eroded by several large corporations and a weird fascination with social media. Several companies want access to all of our data, but the number of high-profile breaches illustrates a significant risk in trusting others wi

  • Honestly, why would anyone use Facebook software and not be concerned? I think Mark Z is in trouble from all ends at the moment and is butt buddies with those he shouldn't be. They even said in the post not to encourage people to stop using WhatsApp because Signal isn't available to everyone. That right there should tell you something: if that's the best argument they can give the average nontechnical person, then Signal should be the preferred choice anyway. If a country is blocking Signal then they are blocking Wh
  • ...we need the ability to disable permissions right at installation of the app. When Android says the app requires your Wi-Fi password, camera, SD card access, your firstborn, address book access and more, there should be a box next to each permission to disable it right then. I know there are apps that let you do that, but you need to remember to run them afterwards, you need root, and you need to redo it after every upgrade.
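For what it's worth, Android 6.0 and later already moves "dangerous" permissions (camera, contacts, and so on) to runtime prompts that the user can deny, which covers part of what the parent asks for. Below is a minimal sketch of that flow using the pre-AndroidX support library that was current in early 2017; the activity name and request code are invented.

```kotlin
// Minimal sketch of Android 6.0+ runtime permissions: the app must ask at runtime
// and the user can refuse. CameraActivity and the request code are invented.

import android.Manifest
import android.content.pm.PackageManager
import android.os.Bundle
import android.support.v4.app.ActivityCompat
import android.support.v4.content.ContextCompat
import android.support.v7.app.AppCompatActivity
import android.widget.Toast

class CameraActivity : AppCompatActivity() {

    private val requestCamera = 42

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        val granted = ContextCompat.checkSelfPermission(
            this, Manifest.permission.CAMERA
        ) == PackageManager.PERMISSION_GRANTED

        if (!granted) {
            // Asks the user at runtime; they can say no and the app keeps running.
            ActivityCompat.requestPermissions(
                this, arrayOf(Manifest.permission.CAMERA), requestCamera
            )
        }
    }

    override fun onRequestPermissionsResult(
        requestCode: Int, permissions: Array<out String>, grantResults: IntArray
    ) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults)
        if (requestCode == requestCamera) {
            val ok = grantResults.isNotEmpty() &&
                grantResults[0] == PackageManager.PERMISSION_GRANTED
            Toast.makeText(this, if (ok) "Camera allowed" else "Camera denied", Toast.LENGTH_SHORT).show()
        }
    }
}
```

Apps that target an SDK version older than 23 still get the all-or-nothing install-time grants the parent complains about, which is why the root-based permission managers were needed in the first place.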
  • What WhatsApp does is reduce its end-to-end security to roughly the security level of TLS. In both cases nobody can read the content except the server: with TLS because the data is plaintext there, with WhatsApp because the server can change the crypto keys and nobody cares (and most people never even see the notification).
    Once you accept that it's only transport security and not end-to-end anymore, you can use a lot of other messengers, since most use TLS (e.g. because Apple forces them to).
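To illustrate the parent's claim that unverified key changes collapse end-to-end encryption to transport-level security: if the server can hand the sender a substitute key and the client encrypts to it without asking, the server can read the message, just as it can read plaintext inside a TLS tunnel. A toy sketch follows, with labelled wrappers standing in for real cryptography; nothing here is any messenger's actual code.

```kotlin
// Toy illustration: if the server can silently swap keys, "end-to-end" is only as
// strong as the server's honesty, i.e. roughly transport-level security.

data class KeyPair(val owner: String) {
    fun decrypt(c: Ciphertext): String? = if (c.forOwner == owner) c.plaintext else null
}

data class Ciphertext(val forOwner: String, val plaintext: String)

fun encryptTo(recipientKey: KeyPair, plaintext: String) =
    Ciphertext(forOwner = recipientKey.owner, plaintext = plaintext)

fun main() {
    val bobKey = KeyPair("bob")
    val serverKey = KeyPair("server")

    // Honest E2E: sender encrypts to Bob's real key; the server relays but cannot read.
    val honest = encryptTo(bobKey, "secret")
    println("server reads honest msg: ${serverKey.decrypt(honest)}")        // null

    // Key substitution: the server hands the sender its own key as "Bob's new key".
    // If the client encrypts and delivers without asking, the server can read it,
    // much as it could read plaintext terminating a TLS connection.
    val intercepted = encryptTo(serverKey, "secret")
    println("server reads intercepted msg: ${serverKey.decrypt(intercepted)}")  // secret
}
```

The defense against this is exactly the key-change warning discussed in the threads above, which is why its timing and default setting matter.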

  • "WhatsApp has enough security for those who don't need any."
  • Yeah, sure. I can’t for the life of me understand who could get worried about this.
