Security | The Internet | News

Web App Scanners Miss Half of Vulnerabilities

seek3r sends news of a recent test of six web application security scanning products, in which the scanners missed an average of 49% of the vulnerabilities known to be on the test sites. Here is a PDF of the report. The irony is that the test pitted each scanner against the public test files of all the scanners. The submitter adds, "Is it any wonder that being PCI compliant is meaningless from a security point of view? You can perform a Web app scan, check the box on your PCI audit, and still have the security posture of Swiss cheese on your Web app!" The submitter continues: "NTOSpider found over twice as many vulnerabilities as the average competitor, with a 94% accuracy rating. Hailstorm had the second-best rating, 62%, but only after extensive training by an expert. AppScan had the second-best 'point and shoot' rating, 55%, and the rest averaged 39%."
  • by nuckfuts ( 690967 ) on Saturday February 06, 2010 @05:23PM (#31047996)

    No vulnerability scanner will ever detect 100% of the vulnerabilities possible. They're still very useful, however, because no website is going to have 100% of all the vulnerabilities possible.

    Think of it another way. If your website has only 1 vulnerability and the scanner detects it, then it's 100% effective.

    If your website has only 1 vulnerability and no scanner detects it, score one for the bad guys. The cat-and-mouse game continues.

  • PCI Still Important (Score:4, Interesting)

    by savanik ( 1090193 ) on Saturday February 06, 2010 @05:34PM (#31048060)

    The key message here is that simply testing your web site with a vulnerability scanner doesn't make it secure. Well, duh.

    PCI is still important because, before the guidelines, most people weren't scanning their web sites at all. Even when they knew how, they couldn't convince management it was worth the trouble, time, dollars, and so on. And without scans, the number of discovered web vulnerabilities approaches zero.

    PCI isn't just about scanning your website, either. There are hundreds of things you have to do to secure everything from the physical layer up to the application layer. And making PCI compliance a requirement for processing credit cards makes everything much more secure. I'm talking about small businesses so cheap they don't want to put LOCKS on the doors between the outside world and the servers holding your plain-text, unencrypted credit card numbers, and who don't have the expertise to set up a web camera on their own building.

    You might not like PCI, and it might be inconvenient, but it's necessary to protect the general public.

    Disclaimer: I am an information security professional.

  • by trentblase ( 717954 ) on Saturday February 06, 2010 @06:06PM (#31048236)

    You might not like PCI

    The only thing I don't like about PCI is the acronym they chose.

  • by Anonymous Coward on Sunday February 07, 2010 @12:45AM (#31050400)

    Scanning vendors are quite good at discovering non-issues such as the availability of "weak" SSL ciphers and known problems in technology stacks. Unfortunately, they are useless when it comes to discovering application-level security problems.

    It's all just a big scam where people pay lots of money to companies that misrepresent actual PCI compliance requirements, hit their servers with Nessus, and print out a security-audit pass certificate the company can hang on their wall. It's about as useful and seedy as the SSL market has become in recent years.

    Look at the nonsense in the PCI-DSS and you'll see that it was written by individuals without strong security backgrounds.

    For example, they explicitly suggest the use of "secure" one-way hash algorithms for storage of card data. It doesn't matter how good a fricking hash algorithm is when the entropy of the entire possible card space is less than 10 trillion!
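    To make that concrete, here is a minimal sketch of the attack the poster is alluding to. The BIN, the test PAN, and the choice of SHA-256 are illustrative assumptions, not anything the PCI DSS specifies. With a known 6-digit issuer prefix and the Luhn check digit, a hashed 16-digit card number has roughly a billion remaining candidates, which a single core can grind through in minutes to hours:

    ```csharp
    using System;
    using System.Security.Cryptography;
    using System.Text;

    class CardHashBruteForce
    {
        static readonly SHA256 Sha = SHA256.Create();

        static string Hash(string s) =>
            BitConverter.ToString(Sha.ComputeHash(Encoding.ASCII.GetBytes(s)))
                        .Replace("-", "");

        // Luhn mod-10 check; every real PAN passes it, which prunes
        // 90% of the candidate space for free.
        static bool LuhnValid(string pan)
        {
            int sum = 0;
            bool dbl = false;
            for (int i = pan.Length - 1; i >= 0; i--)
            {
                int d = pan[i] - '0';
                if (dbl) { d *= 2; if (d > 9) d -= 9; }
                sum += d;
                dbl = !dbl;
            }
            return sum % 10 == 0;
        }

        static void Main()
        {
            // Pretend this hash was lifted from a breached database.
            // 4111111111111111 is the well-known Visa test number.
            string target = Hash("4111111111111111");

            const string bin = "411111"; // known 6-digit issuer prefix
            for (long body = 0; body < 10_000_000_000L; body++)
            {
                string pan = bin + body.ToString("D10");
                if (!LuhnValid(pan)) continue;
                if (Hash(pan) == target)
                {
                    Console.WriteLine("Recovered PAN: " + pan);
                    return;
                }
            }
        }
    }
    ```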

    They provide password complexity guidelines, including changing passwords often. In practice, we see all the time that all this does is increase the chances of people writing passwords on sticky notes and pasting them to their monitors.

    Finally, we have the omnipresent virus scanning and firewall checklists. These systems do nothing to protect the application and are incapable of providing security guarantees, but management loves them because they get to tick off the firewall and virus checkboxes, and then it's just off to continue behaving stupidly as usual with sensitive information.

    It's better than nothing, but it really needs to be reviewed by a disinterested third party with a real security background.

  • by RzUpAnmsCwrds ( 262647 ) on Sunday February 07, 2010 @01:17AM (#31050552)

    But at least when it comes to hardware, we're willing to throw everything away and start from scratch.

    Is that why I'm using a pipelined, out-of-order implementation of a 64-bit extension to a 32-bit extension of a 16-bit ISA?

    I mean, shit, my Core 2 Duo supports everything from 128-bit vector instructions to segmented addressing. I have USB and PCI Express busses on my ThinkPad, but also CardBus/PCMCIA and a modem. I have Gigabit Ethernet but it is still compatible with 10Base-T. I have a DVI port (through the dock) but also a VGA port. My DVD-RW will read CDs which are 30 years old.

  • by b4dc0d3r ( 1268512 ) on Sunday February 07, 2010 @01:33PM (#31053272)

    I'm going to take issue with this and say the problem is with the internet itself, RAD applications, businesses, and self-taught coders. Allow me to explain.

    Half of the .NET code I write is copy/pasted from some other source, because the entire CLR is too complicated for a single person to understand. If I want to do a lookup table, there are a dozen ways to accomplish it, just using the objects provided by the runtime. I don't care how fast it is, unless it's called every page view, so I just google "C# lookup" and get piles of examples. Copy/paste, I'm done. Doesn't matter if it's from MSDN or a Microsoft blog or a random coder blog or wherever else, the code looks good and it works. I have no idea if the example failed to initialize some critical component.
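    To illustrate the point (a sketch of my own, not lifted from any particular blog): here are three of the dozen ways the runtime gives you to build the same lookup table, and nothing tells the copy/paste coder which one to pick or what each costs:

    ```csharp
    using System;
    using System.Collections;
    using System.Collections.Generic;
    using System.Linq;

    class LookupWays
    {
        static void Main()
        {
            // 1. The generic Dictionary, the usual modern choice.
            var dict = new Dictionary<string, int> { { "low", 1 }, { "high", 3 } };

            // 2. The pre-generics Hashtable, still in the CLR for
            //    compatibility; keys and values are boxed as object.
            var table = new Hashtable { { "low", 1 }, { "high", 3 } };

            // 3. LINQ's ToDictionary over any sequence of pairs.
            var fromPairs = new[] { ("low", 1), ("high", 3) }
                .ToDictionary(p => p.Item1, p => p.Item2);

            Console.WriteLine(dict["high"]);      // 3
            Console.WriteLine(table["high"]);     // 3
            Console.WriteLine(fromPairs["high"]); // 3
        }
    }
    ```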

    My employer doesn't want to pay me to read; I'm supposed to be providing output they can sell to clients and customers, so I don't get a lot of time set aside for training. The way I learned .NET was that our tech lead opened a team meeting and said, "I think .NET is the way of the future. Is anyone opposed to going this way?" The only real objection was that it would take longer to produce the next version of our deliverables. Management was fine with that, so we took the leap.

    We didn't sit down in a classroom and learn how things are supposed to be done. We didn't get a copy of something like Petzold's Windows bible or Prosise's MFC bible, which go into depth about what you're doing and what things mean when the IDE puts junk in places for you. Visual Studio 2003 and above make it very easy to have no idea what you're doing and still accomplish something. A quick Google search can fill in all of the gaps so you have something functional.

    It's the same with 'Learn X in 24 Hours' or 'X for Dummies': lots of code samples exclude error checking and handling. Oh yes, MSDN is full of these examples. Sometimes they note that "error handling has been omitted for clarity"; sometimes it's just assumed. Other times the author has no idea they should be handling errors, because it works for them.

    So you have piles of coders learning on-the-fly, either because they can't afford the big book or because they have deadlines to meet. Copy/paste something without taking the time to fully understand what's happening, and you get potential problems. In short, easy access to code snippets makes you think you're able to do lots of cool stuff in a new language. Unless you take the time to understand everything you're running, every line of code, you're going to have problems at some point.

    Why do you think people still make mistakes like putting form variables directly into SQL? The code snippets are out there, either in the corporate source control or on random blogs. Copy, paste, pwned.
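    For the record, here are the mistake and the fix side by side (a minimal sketch; the table and variable names are made up):

    ```csharp
    using System.Data.SqlClient;

    class SqlInjectionDemo
    {
        // Vulnerable: the form variable is concatenated straight into the
        // SQL, so input like  x' OR '1'='1  rewrites the query.
        // Copy, paste, pwned.
        static SqlCommand BadQuery(SqlConnection conn, string userInput)
        {
            return new SqlCommand(
                "SELECT * FROM Users WHERE Name = '" + userInput + "'", conn);
        }

        // Safe: a parameterized query hands the input to the driver as
        // data, never as SQL text.
        static SqlCommand GoodQuery(SqlConnection conn, string userInput)
        {
            var cmd = new SqlCommand(
                "SELECT * FROM Users WHERE Name = @name", conn);
            cmd.Parameters.AddWithValue("@name", userInput);
            return cmd;
        }
    }
    ```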

    Here's an example; those of you who wish to tl;dr me can stop now.

    I used MyGeneration templates to generate database calls for our SQL database, which used Data Access Blocks or some kind of MS best practice to write functions that called stored procedures, so you could essentially call stored procs exactly like any other function. It generates a call for every stored proc in the database, so you can make fundamental changes to the data structure, re-generate the data access library in a few seconds, and then fix the few calls where the parameters changed.

    Very handy, except that the 'execute non-query' template had a bug in it: the data connection was never closed. We never had any problems with this app in production for three years. Then, suddenly, in testing we got a connection-pool timeout. Turns out the bug only shows up when the call happens on most page views, in this case when logging user visits. Other non-query calls happened infrequently enough that they never exceeded the 100-connection default limit, live and in production for three years.
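    A minimal sketch of that class of bug (hypothetical names; the actual MyGeneration template differed): the generated method opens a connection and never disposes it, so each call pins a pooled connection until the default limit of 100 is hit:

    ```csharp
    using System.Data.SqlClient;

    class GeneratedDataAccess
    {
        const string ConnStr = "Server=.;Database=App;Integrated Security=true";

        // The buggy generated shape: the connection is opened but never
        // closed, so it returns to the pool only when the GC gets to it.
        // Rare calls survive this for years; a call on every page view
        // exhausts the 100-connection default pool and times out.
        public static int LogVisitLeaky(string page)
        {
            var conn = new SqlConnection(ConnStr);
            conn.Open();
            var cmd = new SqlCommand(
                "INSERT INTO Visits (Page) VALUES (@p)", conn);
            cmd.Parameters.AddWithValue("@p", page);
            return cmd.ExecuteNonQuery(); // returns without closing conn
        }

        // The fix: 'using' guarantees Dispose(), which returns the
        // connection to the pool even if ExecuteNonQuery throws.
        public static int LogVisit(string page)
        {
            using (var conn = new SqlConnection(ConnStr))
            using (var cmd = new SqlCommand(
                       "INSERT INTO Visits (Page) VALUES (@p)", conn))
            {
                cmd.Parameters.AddWithValue("@p", page);
                conn.Open();
                return cmd.ExecuteNonQuery();
            }
        }
    }
    ```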

    Our tech lead found MyGeneration, recommended it, and we've used it ever since. Not until last month did we discover the bug.

"Everything should be made as simple as possible, but not simpler." -- Albert Einstein

Working...