Facebook Bug Security The Almighty Buck IT

Facebook To Pay Hackers For Bugs 54

alphadogg writes "Facebook is going to pay hackers to find problems with its website, just so long as they report them to Facebook's security team first. The company is following Google and Mozilla in launching a Web 'Bug Bounty' program. For security-related bugs (cross-site scripting flaws, for example) the company will pay a base rate of $500. For truly significant flaws Facebook will pay more, though company executives won't say how much. 'In the past we've focused on name recognition by putting their name up on our page, sending schwag out and using this as an avenue for interviews and the recruiting process,' said Alex Rice, Facebook's product security lead. 'We're extending that now to start paying out monetary rewards.'"
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Friday July 29, 2011 @10:55PM (#36930166)

    No, you're the one that's delusional. Believe it or not, people responsibly reported these bugs even before today. Why? Because it's the right thing to do. The monetary incentive is there to encourage people to spend a bit more time looking.

  • by lgarner ( 694957 ) on Friday July 29, 2011 @11:31PM (#36930322)
    No assurance they aren't doing that already.
  • by dbIII ( 701233 ) on Saturday July 30, 2011 @01:22AM (#36930724)
    Here are some big ones:
    The domain name's time-to-live is only 30 fucking seconds! That means anything on the net looking for Facebook rechecks twice a minute to see if it really is where it says it is. That's a lot of extra traffic, but more importantly latency: a waste of everyone's time as their browser checks whether Facebook is still there and waits patiently for the news that it hasn't moved anywhere in the last 30 seconds. Because such stupid settings waste time and traffic, RFC 1035 requires a minimum of 300 seconds for TTL. And because nobody thought anybody would be so stupid, Facebook stopped working through a lot of web proxy software a few years ago until that software was patched especially for Facebook.
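    A quick sketch of the arithmetic behind the TTL complaint above, assuming a single client that re-resolves the name as soon as the cached record expires (the numbers are illustrative, not measured):

```python
# Rough cost of a 30-second DNS TTL versus the 300-second floor the
# comment above cites, for one client that re-resolves on every expiry.
SECONDS_PER_DAY = 86400

def lookups_per_day(ttl_seconds):
    # One fresh lookup each time the cached record times out.
    return SECONDS_PER_DAY // ttl_seconds

short = lookups_per_day(30)    # 2880 lookups per day
longer = lookups_per_day(300)  # 288 lookups per day
print(short // longer)         # the 30-second TTL costs 10x the queries
```

    Multiply that by every browser, proxy, and resolver on the net and the extra query load adds up fast.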

    Content is marked as being from the year 2000! That's a nasty hack that forces web browsers to re-download it as fast as they can: a big waste of bandwidth that is truly antisocial, since a lot of broadband plans worldwide have download limits.
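    To illustrate the trick described above: an Expires header dated in the past marks a response as already stale, so caches refetch it on every request. A minimal sketch (the header value here is hypothetical, not captured from Facebook):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

# Hypothetical response header of the kind described above: an Expires
# date far in the past, which tells every cache the content is stale.
expires = "Sat, 01 Jan 2000 00:00:00 GMT"
already_stale = parsedate_to_datetime(expires) < datetime.now(timezone.utc)
print(already_stale)  # True, so the object is refetched every time
```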

    Content that should be cacheable is marked as non-cacheable! Maybe the page has changed, but have the Facebook logo and a pile of other static content been redesigned in the last minute? Who cares: let's force the user to download it all over again and make it tricky for their ISP or company proxy server to cache any of it! Let's make them pay more for their internet connection (download limits, remember), add a lot of entirely useless repeat traffic that reduces the available bandwidth, and increase latency with a pile of pointless host lookups.
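    For contrast, a sketch of the cache-friendly headers the comment above is asking for, assuming static assets like logos are versioned in their URLs so they can safely be cached for a long time. The helper and the one-year lifetime are illustrative choices, not anything Facebook actually serves:

```python
from email.utils import formatdate

def static_asset_headers(max_age=86400 * 365):
    # Long-lived, shared-cache-friendly headers for versioned static files:
    # "public" lets ISP and company proxies cache them too.
    return {
        "Cache-Control": f"public, max-age={max_age}",
        "Date": formatdate(usegmt=True),  # current time as an HTTP GMT date
    }

headers = static_asset_headers()
print(headers["Cache-Control"])  # public, max-age=31536000
```

    With headers like these, the logo is fetched once and then served from cache until the URL itself changes.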

    Draconian workplace policies that ban Facebook are not always there to stop people wasting time; sometimes they exist because Facebook wastes a lot of network resources, so it comes down to a choice between blocking a site that is buggy by design or paying for a better connection and still having to limit staff Facebook use at busy times.
