Bug The Almighty Buck

Encrypted Messaging Startup Wickr Offers $100K Bug Bounty

Posted by samzenpus
from the getting-paid dept.
alphadogg writes "Two-year-old startup Wickr is offering a reward of up to $100,000 to anyone who can find a serious vulnerability in its mobile encrypted messaging application, which is designed to thwart spying by hackers and governments. The reward puts the small company in the same league as Google, Facebook and Microsoft, all of which offer substantial payouts to security researchers for finding dangerous bugs that could compromise their users' data. Wickr has already closely vetted its application so the challenge could be tough. Veracode, an application security testing company, and Stroz Friedberg, a computer forensics firm, have reviewed the software, in addition to independent security researchers."
This discussion has been archived. No new comments can be posted.

  • by mfwitten (1906728) on Friday January 17, 2014 @03:21AM (#45983335)

    You'll get better regulation from this than from anything that could possibly be concocted by government bureaucrats.

    Note: This requires the real threat of economic loss, so an organization that can demand payment regardless of its performance—i.e., the government—cannot implement something similar.

    • Re:Real Regulation (Score:4, Insightful)

      by Rosco P. Coltrane (209368) on Friday January 17, 2014 @03:39AM (#45983407)

      Government bureaucrats don't concoct regulations anymore. At least none that don't serve their interests. In case you haven't noticed, it's pretty much we-the-people against them nowadays.

      • by mspohr (589790)

        Most government regulations these days are written by industry in order to reduce competition and make their life easier and more profitable. They pay good money to bribe politicians to get these laws and regulations established. No surprise then that the regulations end up being against the interests of most people.

    • by mspohr (589790)

      You'll get better regulation from this than from anything that could possibly be concocted by government bureaucrats.

      Note: This requires the real threat of economic loss, so an organization that can demand payment regardless of its performance—i.e., the government—cannot implement something similar.

      Freedom Industries just announced that it was declaring bankruptcy after contaminating drinking water for 300,000 people with a leak from its unregulated toxic chemical storage. ... so much for accountability... taxpayers will be left paying for the cleanup as well as suffering the toxic effects...
      (I'll bet the bosses of this company got their money out early.)

      • by mfwitten (1906728)

        Bankruptcy is defined by the government.

        Corporate liability firewalls are defined by the government.

        Taxpayer cleanup is established by the government.

        Competition in regulation was destroyed when a monopoly on regulation was declared by the government.

        The existing regulation was established by that government.

        I see one common element throughout all of the details you dislike. Can you spot it?

        • by mspohr (589790)

          Did you read my message?
          The government is a tool of business. Corporations buy politicians to get the laws they want.
          FTFY:
          Bankruptcy is defined by the government... in response to corporate requests and bribes.

          Corporate liability firewalls are defined by the government.... in response to corporate requests and bribes.

          Taxpayer cleanup is established by the government.... in response to corporate requests and bribes.

          Competition in regulation was destroyed when a monopoly on regulation was declared by the government... in response to corporate requests and bribes.

          • by mfwitten (1906728)

            Corporations couldn't buy so much power if the government didn't have so much power to sell in the first place.

            In other words, either the problem is economic success through voluntary interaction, or the problem is a centralized monopoly on involuntary interaction for hire to the highest bidder. Which one is it?

  • That's a nice publicity stunt though
    • by Pi1grim (1956208)

      My thoughts exactly. Even if third-party researchers cannot find any vulnerability in the protocol itself, who says there isn't a backdoor in the server part that will reduce security to zero? Pretty sure they won't open the server part to scrutiny (and even if they do, how do we verify that it's the same version running on the actual server?)

    • It's a sliding scale: more serious bugs get more money, with the typical payout being about $10K, per TFA.
  • by Rosco P. Coltrane (209368) on Friday January 17, 2014 @03:35AM (#45983395)

    Wouldn't it be funny if the NSA came forward and claimed the prize money many times until the company went under? Because surely they have backdoors all over the place to walk right through these guys' security measures.

    • by Pi1grim (1956208) on Friday January 17, 2014 @03:42AM (#45983413)

      Maybe it would, but those backdoors are worth much more to the NSA unpublished, as is all the data that passes through the vulnerable services. So should your scenario come to life, it would be a huge success for end users, as many vulnerabilities would be closed.

      Regarding the article: talk is cheap, show me the code. And let me host this server myself, with inter-server communication. Otherwise it's no better than Hangouts, iMessage, WhatsApp, Viber and whatever else is now trying to be the one and only messaging service. You can't even begin speaking of security if a) you can't audit the code, b) you can't control the data.

  • by Anonymous Coward on Friday January 17, 2014 @03:47AM (#45983429)

    I'd bet it's susceptible to:
    The phone you run it on is tracked, and the company that does so shares that data.
    Timing attacks: if you send data at some time, and someone else gets a message then, that implies you communicated with them.
    Visual surveillance. Camera sees you type, camera sees your message.
    They claim "sender-based control over who can read messages, where and for how long". This is impossible. If the receiver can see the message, they can record it.
    Border patrol requesting access.
    Torturing you as an "enemy combatant".

    And some likely others:
    How do they handle key distribution? If you set up communication with someone via email, text or whatever, that can be compromised before you even start.

    Looking through the tech they claim to be using, it seems like they lack defenses against Rubber-hose cryptanalysis [wikipedia.org]. Is there any effort in the area of deniable encryption, or maintaining plausible deniability about having messages or particular contacts? I suspect not.

    It's rather impractically expensive to provide sufficient random cover traffic on a phone to blind against timing-correlation attacks on video messages. Given that we know the cell networks are heavily watched, even if the messages were routed through Tor that wouldn't be enough to reliably disassociate sender and receiver (you would want the ageing options planned for I2P for that). Then just get a warrant, and compel them to disclose the contacts and any pending messages. There are [partial] defenses that can be employed here (like TrueCrypt does with hidden volumes, for example); it's not unsolvable, just often ignored.

    Security is hard. Security against a large scale threat such as governments is very hard. Securing the message contents is easy, securing that there was a message is the real challenge.

    All that said, it looks like they likely do a pretty good job of making end-to-end encryption accessible. While that's not all one might want, it's more than most of us get, so it's still a good thing. It's progress, not a solution.
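The timing-correlation attack mentioned above can be sketched in a few lines. This is a toy illustration; all names, timestamps, and the 2-second delay window are invented for the example (Python):

```python
# Toy timing-correlation attack: a network observer who sees only
# *when* encrypted packets leave sender A and arrive at candidate
# receivers can link A to the true receiver by correlating timestamps,
# without ever reading any message content.

def correlation_score(send_times, recv_times, max_delay=2.0):
    """Fraction of send events followed by a receive event within max_delay seconds."""
    hits = 0
    for s in send_times:
        if any(0 <= r - s <= max_delay for r in recv_times):
            hits += 1
    return hits / len(send_times)

# Observed departure times from sender A (seconds, hypothetical)
sends = [10.0, 55.3, 120.7, 300.2]

# Arrival times at two candidate receivers (hypothetical)
candidates = {
    "receiver_B": [10.4, 55.9, 121.1, 300.6],  # consistently ~0.4s after A sends
    "receiver_C": [42.0, 90.5, 250.0, 400.0],  # unrelated traffic
}

for name, recvs in candidates.items():
    print(name, correlation_score(sends, recvs))
# receiver_B scores 1.0, receiver_C scores 0.0: B is almost certainly
# A's correspondent, which is why cover traffic or mixing is needed.
```

Random cover traffic works by breaking exactly this correlation, which is what makes it expensive on a bandwidth-constrained phone.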

    • by Pi1grim (1956208)

      I'm pretty sure they omitted the part where users have to exchange keys over a trusted channel (or at least a channel that prevents tampering, or makes it really hard). And this allows for a MITM attack, so all that fancy encryption is absolutely useless, since the attacker will have both keys and total control. What we need right now is not a gazillion apps that create the illusion of privacy, but a protocol and a set of standards for a federated communication channel (pretty much what XMPP is).
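The fix for the MITM risk described here is out-of-band fingerprint verification. A minimal sketch, using made-up byte strings in place of real public keys (Python):

```python
# Why out-of-band key verification matters: if Alice accepts whatever
# public key the server hands her, a man-in-the-middle can substitute
# his own. Comparing short key fingerprints over a trusted channel
# (in person, over the phone) detects the swap. The keys below are
# fake placeholder bytes for illustration.
import hashlib

def fingerprint(public_key_bytes):
    """Short, human-comparable digest of a public key."""
    return hashlib.sha256(public_key_bytes).hexdigest()[:16]

bob_real_key = b"bob-genuine-public-key"
mitm_key     = b"attacker-substituted-key"

# What Alice actually received through the (untrusted) server:
received = mitm_key

# Bob reads fingerprint(bob_real_key) aloud over a trusted channel;
# Alice compares it with the fingerprint of the key she received.
if fingerprint(received) != fingerprint(bob_real_key):
    print("fingerprint mismatch: possible man-in-the-middle")
```

Without that comparison step, the attacker holds both session keys and the encryption protects nothing, which is the commenter's point.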

      • by Anonymous Coward

        Sadly email, no matter what you do to the body, has unencrypted headers. There is no hope of hiding who messages whom, when, and with how much content.

        I agree that we need to use a standard protocol over a federated network. Make easy-to-use apps that route it over Tor and use hidden services to locate each other (at least for text, and non-realtime voice and video). I want an easy-to-install personal server (say, one that runs on my RaspberryPi) that lets me securely access it (via a Tor hidden service, for example).

        • by tramp (68773)
          Any Linux distro has several mail servers to choose from, which can be combined with Roundcube webmail.
      • I assume the secure channel goes over the service provider... so unless he himself is the MITM or is "already cracked"... you get the picture.

  • A serious vulnerability? The people using it, of course; always the most serious vulnerability.
  • by spacefight (577141) on Friday January 17, 2014 @04:01AM (#45983487)
    It's 2014, after all.
  • by Anonymous Coward on Friday January 17, 2014 @04:19AM (#45983561)

    What other vulnerability do you need ?

    • Agreed. The NSA can use a National Security Letter to demand that Wickr release an update to their software that forwards all of the plaintext to the NSA. Wickr will be unable to challenge that directive in court or make public that it was received.

      There are many good arguments for allowing proprietary software in the public sphere, but when it comes to privacy and encryption, I think we have no choice but to accept open source as the only way to go.
    • What other vulnerability do you need ?

      That's an excellent (and sad) point. Just to reinforce it, and to perhaps defend the _intentions_ of the company's founders, they've already made public that their leader was approached, after giving a conference talk, by a man claiming to be from the FBI who asked nicely for her to cooperate on installing a back door. Apparently her microphone was still hot from giving the speech.

  • I support the sentiment of these guys, but your code is going to be running on a platform that is largely exploitable by most English-speaking foreign governments and possibly well-funded crooks.

    What this means is that no matter how good your software is it will be ultimately rendered useless by going after the host platform and memory.

    Also anything that uses a public key exchange is only secure because certain reversals of transformation are 'hard'. There is no universality to hard; what is hard for me may not be hard for you.
    • by Pi1grim (1956208)

      >> Also anything that uses a public key exchange is only secure because certain reversals of transformation are 'hard'.
      Nope, it's also only secure as long as you verify that the key you have in front of your eyes corresponds to the person you want it to correspond to.

      >> There is no universality to hard, what is hard for me may not be hard for you..

      Actually you might want to refresh your memory a little bit about cryptography. To crack a decent asymmetric cipher would take longer than the age of the visible universe.
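The "longer than the universe" claim is easy to check with back-of-the-envelope arithmetic. The sketch below assumes an optimistic guess rate of one trillion per second and uses a 128-bit symmetric keyspace as a stand-in for a comparable asymmetric security level (Python):

```python
# How long would exhaustively searching a 128-bit keyspace take at
# one trillion guesses per second? Both figures are assumptions
# chosen for illustration.
guesses_per_second = 10**12
keyspace = 2**128

seconds = keyspace / guesses_per_second
years = seconds / (60 * 60 * 24 * 365)

print(f"{years:.2e} years")
# On the order of 1e19 years -- roughly a billion times the
# ~1.4e10-year age of the universe.
```

Even granting an attacker a million times that guess rate only shaves six orders of magnitude off a nineteen-order-of-magnitude number, which is why practical attacks target key exchange and endpoints instead.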

  • Would the $100K cover lawyers' expenses if you used this method?

    XKCD:Security [xkcd.com]

  • I guess it wouldn't count to run their app on a rooted phone that presents compromised APIs to the apps? Or crack it open, inject logging code, repackage and resign it then submit it to a third-party marketplace? That is to say, the standard security problems all apps face as opposed to a flaw specific to the Wickr app?
  • Why not hire some QA to do stuff like that full time? They'd also get some in-house beta testing that isn't just the coders testing their own code.

    • by Shados (741919)

      Yeah, because the average QA is a master of cryptography. You need to hire security specialists for this....and they did.

      Now, after all of that, they want to make sure nothing slipped.

  • by Anonymous Coward

    1. American company
    2. Software is not open source
    3. Reliance on the company's servers, not peer to peer
    4. I can't run my own server
    5. Susceptible to traffic analysis
    6. Runs on a mobile platform I cannot fully control

    What more do you want?

    There is no point discussing details like buffer overflows when the whole premise is, quoting Doge: such flaw, much hot air.

    The whole thing revolves around an illusion of trust. The company wants to create an image of trustworthiness so that you trust their ability to offer privacy.

  • by Anonymous Coward

    A competing app did just that, and a guy from Russia won the $100k [telegram.org]. Now they're offering $200k.

    Still no article on Telegram in /.
