How much emphasis does your organization place on the Open Web Application Security Project (OWASP) Top 10 list?
The 2013 Top 10 list of Web application security flaws was recently finalized, and it will likely become the de facto yardstick that many developers use to test the security of their applications. Even the Payment Card Industry’s data security standards defer to the Top 10 list.
But there’s one problem: OWASP’s list of security flaws doesn’t map cleanly to software security requirements. After all, the Top 10 list includes a number of broad categories of threats with large subsets that are easy to miss.
While the Top 10 list is a useful awareness tool for developers, it should not be viewed as a prescriptive list of how to build secure software. Think of all the breaches in popular Web applications that have occurred over the past few years—specifically those involving applications that didn’t follow basic best practices like hashing and salting passwords. Is it really possible that none of these applications was PCI compliant or had been checked against the Top 10?
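To make that "basic best practice" concrete, here is a minimal sketch of salted password hashing using only Python's standard library. It uses PBKDF2 with a per-password random salt; the function names and the iteration count are illustrative choices, not a reference to any particular breached application's code.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash; a unique random salt defeats precomputed rainbow tables."""
    if salt is None:
        salt = os.urandom(16)  # fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, expected):
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)  # constant-time comparison

salt, stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, stored)
assert not verify_password("guess", salt, stored)
```

Storing the salt alongside the digest (rather than a bare, unsalted hash) is exactly the practice many of the breached applications skipped.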
And consider the recent HP 2012 Cyber Risk Report, which revealed these dismal findings on Web application security culled from static-analysis testing:
• 92% were vulnerable to information leakage and improper error handling
• 88% had insecure cryptographic storage
• 86% had injection flaws
• 75% had insecure direct object reference
• 61% had broken authentication and session management
The same report also includes these statistics, gathered from dynamic testing:
• 45% were vulnerable to cross-site scripting
• 26% had insufficient transport-layer protection
• 25% had security misconfiguration
• 13% had broken authentication and session management
• 9% had injection flaws
Bear in mind that most of these vulnerabilities have been on the OWASP Top 10 list for years, and they are not at all exotic. So why are these failure rates still so high?
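As an illustration of how unexotic these flaws are, here is a minimal sketch of an injection flaw and its standard fix, using Python's built-in sqlite3 module with an in-memory database. The table and the attacker input are hypothetical examples, not drawn from the HP report.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

user_input = "nobody' OR '1'='1"  # attacker-controlled value

# Vulnerable: string concatenation lets the input rewrite the query itself.
unsafe = conn.execute(
    "SELECT * FROM users WHERE name = '" + user_input + "'"
).fetchall()  # matches every row despite the bogus name

# Fixed: a parameterized query treats the input as data, never as SQL.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()  # matches no rows

print(len(unsafe), len(safe))  # 1 0
```

The fix is a one-line change that has been standard advice for as long as injection has been on the Top 10, which makes the persistent failure rates all the more striking.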
Overly broad categories
The Top 10 list wasn’t designed to be a prescriptive guide for secure development, but that’s how many organizations have been treating it. It’s simply too broad to be used for specific requirements. Take “sensitive data exposure” (A6 on the list) as one example. Unlike more specific concerns like “cross-site scripting” (A3), in which it’s quite clear what the developer needs to look for, sensitive data exposure (like several other Top 10 categories) is much more open-ended and could mean several things.
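The contrast can be shown in code. For cross-site scripting, the developer's task is concrete: encode user-supplied text before placing it in HTML. The sketch below uses Python's standard html module; the page fragment and the attacker payload are invented for illustration. No comparably specific check exists for a category as open-ended as "sensitive data exposure."

```python
import html

user_comment = "<script>alert('xss')</script>"  # attacker-controlled value

# Vulnerable: interpolating raw input into HTML lets the browser execute it.
unsafe_page = "<p>" + user_comment + "</p>"

# Fixed: escaping makes the browser render the input as inert text.
safe_page = "<p>" + html.escape(user_comment) + "</p>"

assert "<script>" in unsafe_page
assert "<script>" not in safe_page  # now &lt;script&gt;...
```

A requirement like "escape all user input rendered in HTML" is testable; "don't expose sensitive data" is not, until the organization decides what counts as sensitive and where it flows.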