As teams are pressured to release software more rapidly, more and more aspects of software development are being forced to “shift left,” moving earlier in the development lifecycle.
Because of the speed at which code is updated and delivered, security can no longer be an afterthought, said Rani Osnat, VP of product marketing at Aqua Security, a company that specializes in container security. “That’s why we profess to shift left security and basically embed it as early as possible in the development process so that developers can do a lot of the work in advance as they deliver the applications and not expect to throw it over the fence and have someone else take care of it.”
Operations teams can no longer accept an application as is and plan on securing it once it is deployed in the runtime environment, Osnat said.
Application security used to act as governance and as a gate that security teams applied to evaluate the security of software before it was deployed. “I think as trends like agility or trends like continuous delivery or DevOps come into play, that role as a point-in-time gate and as a governance function is being questioned,” John Steven, senior director of software security at Synopsys, an application security company, explained.
He added that when teams go to implement security, they often search through regulations or information on the web to look for what they should care about. “I think organizations are struggling to figure out what’s the difference between what the web tells me I should look for in terms of security problem and what would impact my business in terms of risk,” said Steven. “And so they’re struggling to figure out what they need to pay attention to.”
They question how attackers will explore their organization and attack its assets and how that is different from what they paid attention to in the past. They also question how they will adapt the sensors that are already in place to look for vulnerabilities, Steven explained.
According to Arkadiy Miteiko, co-founder and CEO of CodeAI, an AI-based security platform, the top performers in the industry typically have three things implemented in their security workflows:
- They inject code analysis tools into the development process and enforce fixes prior to deployment,
- They automate attacks against pre-production code and prevent that code from reaching production if the attacks succeed, and
- They continually test the production environment for weaknesses in an automated fashion.
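The first two practices above amount to a gate that sits between the build and production. A minimal sketch of such a gate follows; the finding format, severity levels, and attack-result shape are illustrative assumptions, not the output of any particular tool:

```python
# Hypothetical pre-deployment security gate: block the release if static
# analysis reports serious findings or if automated pre-production attacks
# succeeded. The dict shapes here are assumptions for illustration.

BLOCKING_SEVERITIES = {"critical", "high"}

def gate(findings, attack_results):
    """Return True if the build may proceed to production.

    findings: static-analysis results, each with a "severity" field.
    attack_results: outcomes of automated attacks against pre-production
    code, each with a boolean "succeeded" field.
    """
    if any(f["severity"] in BLOCKING_SEVERITIES for f in findings):
        return False  # enforce fixes prior to deployment
    if any(a["succeeded"] for a in attack_results):
        return False  # a pre-production attack got through
    return True
```

In a real pipeline the same decision would typically be expressed as a failing CI stage rather than a Python return value, but the logic is the same: the release is held until the findings are fixed.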
Though many organizations have already adopted DevOps, a newer trend is DevSecOps, which brings security into the mix as a shared responsibility alongside development and operations.
Osnat believes that security teams should be responsible for creating and enforcing security policies and determining what is an acceptable level of risk. The implementation of those policies, however, should be handled jointly by security and development teams.
Part of the pressure comes from a shortage of cybersecurity professionals, one that is not getting any smaller, Osnat said. A survey published by the National Institute of Standards and Technology this June counted 301,000 open cybersecurity jobs throughout the United States, and a report from Cybersecurity Ventures predicts that the number of openings will rise to 3.5 million by 2021.
“On the other hand, there are many more developers in the world,” said Osnat. “If you look at it as a global issue, basically what’s happening is that developers are developing more applications faster and delivering code faster than security can catch up to. That’s something where really the only way to address it is not to just give more work to security, but to move some of the burden to the developers in using best practices to secure applications when they are developed. But, of course they need to be taught and told what they are expected to do. It’s not something that you can expect them to just figure out on their own.”
The shortage of cybersecurity professionals can also be addressed by incorporating artificial intelligence into DevOps and security workflows. “The future belongs to intelligent machines which are able to augment some of the security testing functions while working alongside humans,” he said. “A shortage of skilled security professionals on both sides (AppSec and CyberSec), and their relatively high cost will drive an adoption of intelligent automation powered by AI systems and Quantum computing.”
Shifting culture as well
There is also the issue that shifting testing left requires a huge cultural change within the organization. “Cultural imperatives are very hard for organizations to adopt because organizations reject culture change like viruses,” said Synopsys’ Steven.
Even though the spirit of DevOps involves breaking down the silos between developers and operations, that does not always happen, explained Steven. Often, organizations will hire a DevOps engineer, typically reporting up to operations. “They’ve taken this cultural imperative to break down the walls, and they’ve turned it into a role in one of the silos, which is of course a perversion of the intent.”
“I would hate for DevOps just to become a set of tools that a security group or operations group buys to engage developers more effectively, but they all stay in their silo,” Steven continued.
Steven explained that the companies that have scaled and performed well are the ones that effectively broke down those silos. Those organizations made security everyone’s job, with the security team acting as a coach on the sidelines while providing visibility into what was going well, what was going poorly, and where more time needed to be spent, he said.
When organizations aren’t able to break down those silos and let developers handle security, it may be a result of organizations not planning out their goals correctly from the top of the organization down to the individual teams, explained Pete Chestna, director of developer engagement at CA Veracode, a provider of an automated end-to-end service that simplifies application security. Companies should look at their goals and whether or not the development teams are accountable for what they build. If they’re not, that’s an area that needs to be addressed within the organization.
When development teams have the option, they may push the responsibility onto some other group. “Once that becomes a non-option then they start to make that change real,” said Chestna.
“There’s a lot of automation that you can do, which again is absolutely mandatory in these environments because of the speed in which code moves in the pipeline,” said Osnat of Aqua Security. “It is just not manageable with purely manual control.”
A role for artificial intelligence
Introducing AI into the equation can solve some of the issues here. “Generally speaking, AI is extremely good at recognizing patterns and making statistical predictions based on its pattern recognition,” said Miteiko, of CodeAI. “Noise is a recognizable pattern. Once it has been recognized it can be filtered out. The quality and security issues that we are dealing with in code today are the same coding errors we fixed years ago.”
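Miteiko’s point about noise being a recognizable, filterable pattern can be illustrated even without machine learning: fingerprinting scanner findings lets a pipeline suppress results that have already been triaged as false positives in earlier builds. The finding fields and fingerprint scheme below are illustrative assumptions:

```python
# Hypothetical noise filter for static-analysis output: findings previously
# triaged as false positives are identified by a stable fingerprint and
# dropped from later scans. The finding format is an assumption.
from hashlib import sha256

def fingerprint(finding):
    """Stable identity for a finding based on rule, file, and code snippet.

    Line numbers are deliberately excluded, since they shift between
    builds even when the underlying issue is unchanged.
    """
    key = f'{finding["rule"]}|{finding["file"]}|{finding["snippet"]}'
    return sha256(key.encode()).hexdigest()

def filter_noise(findings, known_noise):
    """Return only the findings whose fingerprints are not known noise."""
    return [f for f in findings if fingerprint(f) not in known_noise]
```

An AI-based system generalizes this idea: instead of matching exact fingerprints, it learns which patterns of findings tend to be noise and suppresses new instances it has never seen verbatim.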
Shifting the burden to developers seems like the ideal solution, but often the developers’ education did not properly prepare them to code securely. “It’s a muscle that development organizations don’t have,” CA Veracode’s Chestna explained.
“If you allow developers to continue to code incorrectly and then correct them later, you’re not really helping them be better,” said Chestna. “DevOps is all about continuous improvement. So we need to take the knowledge of what they struggle with and we feed that back to them in the form of training and then measure whether or not that training was effective, and they would get better in that process.”
According to Chestna, secure coding can be taught; it is just a matter of whether organizations will put pressure on universities to change their curricula. “They’re not going to do that until we change the requirements,” he said. “So until you start to say that this is something that I want to hire, and I want your university to support this – that’s something that’s not going to happen, but that’s really the shift left that I want to see.”
He explained that 25-30 years ago, students knew that when they wrote code, someone would test it afterwards. Later, students were taught to write tests as they coded, whether via test-driven development or unit tests.
“Similarly, if we start to put security into the vein, we’ll have the same effect where graduates walk into a company and know their code has to be secure, it has to function, and it has to perform well,” said Chestna. “Those are things where if they’re taught earlier on, it just becomes part of their nature.”
Looking towards the future, many experts agree that there is still much to be done.
“I think the fact is it is growing,” said Osnat. “I think the first generation of solutions that were out there were very much tied to specific programming languages and specific environments. I think as we move into cloud-native applications a lot of these things start to go away because they are created to run in different environments, to be a lot more flexible.”
Osnat also believes that we are not very far away from a day where a lot of companies that provide development platforms will embed security tools in those platforms.
“If, in the next five years, vendors are able to provide the industry with tools that have the capabilities required to win this security game, we’ll begin to see drastic improvements in the overall security posture,” said Miteiko of CodeAI.
In the future, the burden will not just fall to the developers and security teams. Software vendors will be expected to integrate security into their tooling as well.