3. Neglecting to authorize after authentication: Knowing a user’s identity is not enough to decide whether they may perform certain actions, according to the report. “Authorization should be conducted as an explicit check, and as necessary even after an initial authentication has been completed,” said the report. “Authorization depends not only on the privileges associated with an authenticated user, but also on the context of the request.”
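The point about context can be made concrete with a small sketch. This is a hypothetical example (the `User`, `Request`, and `is_authorized` names are not from the report): the check considers both the user's privileges and the context of the specific request, rather than assuming authentication alone settles the question.

```python
# Hypothetical sketch: authorization as an explicit, per-request check that
# weighs both the user's privileges and the context of the request.
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    roles: set = field(default_factory=set)

@dataclass
class Request:
    user: User
    action: str
    resource_owner: str

def is_authorized(req: Request) -> bool:
    # Privilege check: admins may perform any action.
    if "admin" in req.user.roles:
        return True
    # Context check: ordinary users may only read or update their own resources.
    return req.action in {"read", "update"} and req.resource_owner == req.user.name

alice = User("alice", {"user"})
assert is_authorized(Request(alice, "update", "alice"))      # own resource
assert not is_authorized(Request(alice, "delete", "bob"))    # someone else's
```

The key design property is that `is_authorized` runs on every request, so a change in context (a different resource, a different action) is re-evaluated rather than grandfathered in by an earlier login.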

4. Lack of strict separation between data and control instructions: “Commingling data and control instructions in a single entity, especially a string, can lead to injection vulnerabilities,” according to the report. “Lack of strict separation between data and code often leads to untrusted data controlling the execution flow of a software system.”
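SQL injection is the canonical instance of this flaw, and it makes a useful illustration (the table and payload below are invented for the example): concatenating user input into a query string lets data cross into the control channel, while a parameterized query keeps the two strictly separated.

```python
# Sketch of data/control separation using SQLite from the standard library.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "' OR '1'='1"  # classic injection payload

# Vulnerable: the input is spliced into the SQL text, so it becomes control.
rows_bad = conn.execute(
    "SELECT secret FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: the placeholder keeps the input strictly in the data channel.
rows_good = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (user_input,)
).fetchall()

print(len(rows_bad), len(rows_good))  # the concatenated query leaks a row; the parameterized one does not
```

The same principle applies beyond SQL: shell commands, LDAP filters, and HTML all have injection-safe APIs that accept data separately from the command.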

5. Not explicitly validating all data: Software systems and components often make assumptions about the data they operate on, and if designers don’t explicitly make sure that such assumptions hold, vulnerabilities will appear, according to the Center. “As such, it is important to design software systems to ensure that comprehensive data validation actually takes place and that all assumptions about data have been validated when they are used.”
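A minimal sketch of what “validating assumptions explicitly” looks like in practice (the `parse_port` function is a hypothetical example, not from the report): each assumption the rest of the system relies on is checked at the point where the data enters, instead of being trusted implicitly.

```python
# Hypothetical sketch: check every assumption about incoming data explicitly
# rather than trusting that a caller already did.
def parse_port(raw: str) -> int:
    # Assumption 1: the value is a non-negative integer at all.
    if not raw.isdigit():
        raise ValueError(f"not a number: {raw!r}")
    port = int(raw)
    # Assumption 2: it lies in the valid TCP/UDP port range.
    if not 1 <= port <= 65535:
        raise ValueError(f"port out of range: {port}")
    return port

assert parse_port("8080") == 8080
```

Centralizing such checks in one function also makes it easy to audit exactly which assumptions the design depends on.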

6. Misuse of cryptography: Common pitfalls of cryptography include rolling your own cryptographic algorithms or implementations, misusing libraries and algorithms, poor key management, randomness that is not random, failure to centralize cryptography, and failure to allow for algorithm adaptation and evolution, according to the report. To avoid these pitfalls, the Center recommends working with an expert whenever possible so that developers aren’t making decisions about algorithms and cipher modes on their own.
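Two of the pitfalls named above, non-random “randomness” and hand-rolled comparisons, have well-known fixes in Python's standard library, shown here as an illustration (the token-matching scenario is invented for the example):

```python
# Sketch of avoiding two common cryptographic pitfalls with stdlib tools,
# rather than rolling your own.
import secrets
import hmac

# Pitfall: the `random` module is predictable and unsuitable for secrets.
# Fix: `secrets` draws from the operating system's CSPRNG.
session_token = secrets.token_urlsafe(32)

# Pitfall: comparing secret values with `==` can leak timing information.
# Fix: `hmac.compare_digest` compares in constant time.
def tokens_match(presented: str, stored: str) -> bool:
    return hmac.compare_digest(presented, stored)
```

Note that this only addresses the mechanical pitfalls; key management and algorithm choice are exactly the decisions the report says should involve an expert.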

7. Failure to identify sensitive data and how it should be handled: To make sure data is handled properly, designers need to identify different levels of data classification and factor all relevant considerations into the design, according to the report. Not all data protection requirements are the same, and designers need to identify all the ways that data could be exposed or manipulated, the report said.
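One way designers can make classification levels actionable, sketched here as a hypothetical example (the `Classification` levels and `render_for_log` helper are invented, not from the report), is to attach the classification to the data itself so handling rules are enforced by code rather than remembered by developers:

```python
# Hypothetical sketch: carry a data classification alongside the value so
# handling rules (here, log redaction) can be enforced mechanically.
from enum import Enum
from dataclasses import dataclass

class Classification(Enum):
    PUBLIC = 1
    INTERNAL = 2
    SENSITIVE = 3

@dataclass
class Record:
    value: str
    classification: Classification

def render_for_log(record: Record) -> str:
    # Sensitive values must never reach the logs in the clear.
    if record.classification is Classification.SENSITIVE:
        return "[REDACTED]"
    return record.value
```

The same tagged value can drive other decisions the report alludes to, such as whether a field may be cached, exported, or sent to a third party.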

8. Failure to consider the user: Software systems interact with humans in one way or another, which is why all relevant stakeholders should always be considered, according to the report. “The security stance of a software system is inextricably linked to what its users do with it. It is therefore very important that all security-related mechanisms are designed in a manner that makes it easy to deploy, configure, use and update the system securely,” the Center wrote.

9. Failing to anticipate how integrating external components can open a vulnerability: Designers and security architects have to make sure time is allocated to consider how an external component is going to impact a system, and assume that all external components are untrustworthy until proven otherwise, according to the report.
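“Untrustworthy until proven otherwise” usually means validating what an external component returns at the boundary, before it flows into the rest of the system. A hypothetical sketch (the exchange-rate scenario, the payload shape, and the plausibility bounds are all invented for illustration):

```python
# Hypothetical sketch: treat an external component's output as untrusted
# input and validate it at the integration boundary. The payload stands in
# for the response of any third-party service.
def validate_rate(payload: dict) -> float:
    rate = payload.get("rate")
    if not isinstance(rate, (int, float)) or isinstance(rate, bool):
        raise ValueError("external component returned a non-numeric rate")
    if not 0 < rate < 1000:  # plausibility bounds for this hypothetical use
        raise ValueError(f"implausible rate from external component: {rate}")
    return float(rate)
```

Confining this check to a single boundary function also localizes the blast radius if the external component changes behavior in a later version.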

10. Brittleness in the face of future changes: Software security must be designed for change, according to the report. The Center recommended designing for security updates; for security properties that change over time (for example, when code is updated); for the ability to isolate or toggle functionality, making it possible to turn off compromised parts and turn on performance-affecting mitigations if needed; for changes to things intended to be kept secret, such as encryption keys and passwords; for changes in the security properties of components beyond your control; and for changes to entitlements (for example, when a staffer leaves the organization or changes job functions).
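The “isolate or toggle functionality” recommendation above can be sketched as a runtime kill switch. This is a hypothetical, minimal example (the `FeatureFlags` class and the `legacy_export` feature are invented): the point is that a compromised feature can be disabled by operators without shipping new code.

```python
# Hypothetical sketch of "design for change": a kill switch that lets
# operators disable a compromised feature at runtime, without a redeploy.
class FeatureFlags:
    def __init__(self) -> None:
        self._flags = {"legacy_export": True}

    def enabled(self, name: str) -> bool:
        return self._flags.get(name, False)

    def disable(self, name: str) -> None:
        # In a real system this would be flipped from a config service.
        self._flags[name] = False

flags = FeatureFlags()

def export_report(data: str) -> str:
    if not flags.enabled("legacy_export"):
        return "export disabled"
    return f"exported: {data}"
```

The same mechanism covers the inverse case the report mentions: a performance-affecting mitigation can ship disabled and be switched on only when needed.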

“The reason why design flaws don’t get as much attention as software bugs is because they are hard to deal with,” said Cigital’s McGraw. “It’s much easier to solve the bug problem because we have technology to help us solve it. There is no such thing at the design level, but that doesn’t mean they aren’t as important.”