How much emphasis does your organization place on the Open Web Application Security Project (OWASP) Top 10 list?
The 2013 Top 10 list of Web application security flaws was recently finalized, and it is likely to become the de facto yardstick that many developers use to test the security of their applications. Even the Payment Card Industry’s data security standards defer to the Top 10 list.
But there’s one problem: OWASP’s list of security flaws doesn’t map cleanly to software security requirements. After all, the Top 10 list includes a number of broad categories of threats with large subsets that are easy to miss.
While the Top 10 list is a useful awareness tool for developers, it should not be viewed as a prescriptive list of how to build secure software. Think of all the breaches in popular Web applications over the past few years, especially those involving applications that didn’t follow basic best practices such as hashing and salting passwords. Is it really possible that none of these applications was PCI compliant or had passed a Top 10 assessment?
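To make the "hashing and salting" baseline concrete, here is a minimal sketch of per-user salted password hashing using Python's standard library. The function names and the iteration count are illustrative choices, not a prescription; real deployments should follow current guidance on work factors.

```python
import hashlib
import hmac
import os


def hash_password(password, salt=None):
    """Return (salt, digest) using PBKDF2-HMAC-SHA256.

    A fresh random salt per password means identical passwords
    hash to different values, defeating precomputed rainbow tables.
    """
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest


def verify_password(password, salt, expected):
    """Recompute the digest with the stored salt and compare in constant time."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)
```

Because each call draws a new salt, two users with the same password still end up with different stored digests, which is the property plain unsalted hashing lacks.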
And consider the recent HP 2012 Cyber Risk Report, which revealed these dismal findings from static-analysis testing of the Web applications it examined:
• 92% were vulnerable to information leakage and improper error handling
• 88% had insecure cryptographic storage
• 86% had injection flaws
• 75% had insecure direct object reference
• 61% had broken authentication and session management
The same report gathered these statistics from dynamic testing:
• 45% were vulnerable to cross-site scripting
• 26% had insufficient transport-layer protection
• 25% had security misconfiguration
• 13% had broken authentication and session management
• 9% had injection flaws
Bear in mind that most of these vulnerabilities have been on the OWASP Top 10 list for years, and they are not at all exotic. So why are these failure rates still so high?
Overly broad categories
The Top 10 list wasn’t designed to be a prescriptive guide for secure development, but that’s how many organizations have been treating it. It’s simply too broad to be used for specific requirements. Take “sensitive data exposure” (A6 on the list) as one example. Unlike more specific concerns like “cross-site scripting” (A3), in which it’s quite clear what the developer needs to look for, sensitive data exposure (like several other Top 10 categories) is much more open-ended and could mean several things.
For instance, it could mean a common step such as encrypting confidential data in transit or at rest. But it could also refer to less-common tasks such as avoiding the caching of confidential data in temporary files or steering clear of unsafe cryptographic modes. This is the trap many organizations fall into.
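To show why "unsafe cryptographic modes" is a real requirement and not a footnote, here is a small sketch of the classic ECB-mode weakness: identical plaintext blocks always encrypt to identical ciphertext blocks, so patterns in the data survive encryption. The block cipher here is a toy keyed transform invented for illustration, not a real cipher.

```python
import hashlib
import hmac


def toy_encrypt_block(key, block):
    """Stand-in for a real 16-byte block cipher: a deterministic keyed transform."""
    return hmac.new(key, block, hashlib.sha256).digest()[:16]


def ecb_encrypt(key, plaintext):
    """ECB mode: each block is encrypted independently, with no chaining or IV."""
    return b"".join(
        toy_encrypt_block(key, plaintext[i:i + 16])
        for i in range(0, len(plaintext), 16)
    )


key = b"0123456789abcdef"
# Blocks 1 and 3 of the plaintext are identical...
plaintext = b"A" * 16 + b"B" * 16 + b"A" * 16
ciphertext = ecb_encrypt(key, plaintext)
# ...so blocks 1 and 3 of the ciphertext are identical too, leaking structure.
```

A scanner that only checks "is the data encrypted?" would pass this code, which is exactly why open-ended categories like sensitive data exposure need to be broken down into specific, testable requirements.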
The current Top 10 list also includes four other threat categories that are too general to be used as requirements: “injection” (A1), “broken authentication and session management” (A2), “security misconfiguration” (A5), and “missing function-level access control” (A7).
Few organizations understand how to properly assess these open-ended categories. Instead, they limit their scanning to a small subset of the actual threats and then assume the application has passed that requirement. The same is true with organizations that perform penetration testing to verify the application’s security.
Key vulnerabilities left out
By its very nature, the Top 10 list is limited in scope and doesn’t include all relevant threats that might pertain to specific Web applications. Here are just a few security issues that are left out of the current list:
• Mass assignment vulnerability
• Buffer overflow (not usually captured in “injection flaws”)
• Failure to correctly validate the client certificate chain of trust
• Transactional authentication (rarely falls under “broken authentication” or “authorization”)
• Security controls enforced on the client but not the server (e.g., input validation)
• Integer overflow
• Hard-coded secret keys
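As one illustration, the mass-assignment issue at the top of that list arises when an application copies every submitted field onto a model object. The sketch below is hypothetical (the class and field names are invented) and contrasts that pattern with an explicit allowlist:

```python
class Account:
    def __init__(self, username):
        self.username = username
        self.is_admin = False  # sensitive field the user must not control


def unsafe_update(account, form):
    """Mass assignment: blindly copies every submitted field onto the model."""
    for field, value in form.items():
        setattr(account, field, value)


def safe_update(account, form, allowed=("username",)):
    """Only copies fields that appear on an explicit allowlist."""
    for field in allowed:
        if field in form:
            setattr(account, field, form[field])


victim = Account("alice")
unsafe_update(victim, {"username": "alice", "is_admin": True})
# victim.is_admin is now True: privilege escalation via an extra form field

protected = Account("bob")
safe_update(protected, {"username": "bob", "is_admin": True})
# protected.is_admin remains False: the extra field is ignored
```

None of the Top 10 categories names this pattern directly, yet it has led to well-publicized compromises in Web frameworks that bind request parameters to models by default.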
It’s also important to keep in mind that the OWASP Top 10 list is only designed for Web applications. Today’s applications are more complex and multifaceted: They might have mobile app components, rich clients and Web services, all of which have their own set of security requirements that may or may not be covered by the overly broad Top 10 categories.
What developers should use instead
The OWASP community is aware of the limitations of its Top 10 list; that’s why they created a more comprehensive software security requirements program called the Application Security Verification Standard (ASVS) project. This program hasn’t had the same level of appeal as the Top 10, however, so fewer organizations have adopted it.
It’s time for developers to stop using the Top 10 list in the wrong way. It was meant to be an awareness tool, not a list of requirements. Instead, they should use a more effective tool like the ASVS. Effective security requirements need to be specific, so that they’re not open to interpretation; testable, so that developers can be sure they’ve met them; and relevant, so that they actually apply to the application in question.
Rohit Sethi is vice president of SD Elements, a security consulting company. He also created the OWASP Design Patterns Security Analysis project.