As security and privacy grow in importance, regulatory compliance is becoming an increasing priority for most businesses.

But let’s just say it: compliance audits are not fun.  That’s especially true when it comes to engineering and development teams, who are tasked with gathering all of the relevant data – in other words, evidence – needed to assess and demonstrate compliance with various regulatory frameworks.  The more complex the environment, the harder the task.

Making matters worse, evidence collection is a highly manual process that, depending on the size of the organization and the number of audits, can consume hundreds of hours annually, at least.  For many enterprises, the commitment is measured in thousands of hours spread across multiple teams.  This is time those teams and individuals could better spend doing their regular jobs; for development teams, that means building and improving software.

So, compliance is simultaneously a critical need and a huge time suck.  How do we square that circle?  It turns out the answer has been staring us in the face: use automation to make compliance fit into a DevOps approach.

What do I mean by that?  I mean it is entirely possible to automate the collection of evidence across public cloud environments and connected SaaS systems, rapidly gathering technical data such as user access specifics, encryption of data at rest, key management, network segmentation, firewall configurations, vulnerability scan reports and more.  The results can be transformed into formatted reports ready for internal analysis and auditor review and approval.  Even better, the data is considered more complete and accurate from a compliance perspective because it includes time stamps and other metadata that ensure capture and processing integrity.
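To make that concrete, here is a minimal sketch of what a collected evidence record might look like once it carries the capture timestamp and integrity metadata described above.  The function name, field names and the "aws:s3" source label are my own illustrative assumptions, not any vendor's schema; the point is simply that each piece of raw evidence gets wrapped with when it was captured and a digest proving it hasn't changed since.

```python
import hashlib
import json
from datetime import datetime, timezone

def build_evidence_record(source: str, control: str, payload: dict) -> dict:
    """Wrap raw evidence (e.g., an API response describing bucket
    encryption) with a capture timestamp and an integrity digest.
    All names here are hypothetical, for illustration only."""
    captured_at = datetime.now(timezone.utc).isoformat()
    # Canonical JSON form so the digest is stable across runs
    body = json.dumps(payload, sort_keys=True)
    return {
        "source": source,          # e.g., "aws:s3" -- assumed label
        "control": control,        # e.g., "encryption-at-rest"
        "captured_at": captured_at,
        "payload": payload,
        "sha256": hashlib.sha256(body.encode("utf-8")).hexdigest(),
    }

record = build_evidence_record(
    "aws:s3",
    "encryption-at-rest",
    {"bucket": "finance-reports", "sse": "aws:kms"},
)
```

An auditor (or a later audit cycle) can then recompute the digest over the payload to confirm processing integrity, which is exactly the property that makes automated evidence more trustworthy than hand-gathered screenshots.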

Some teams and organizations already try to accomplish this on an ad-hoc basis through scripting.  That’s a natural inclination for a developer, and I’m willing to bet a lot of people reading this article immediately thought “I can build a script for that” – if they haven’t done it already.  But there are a number of problems with using scripts. 

First off, scripts are time-consuming to build, and it can take a lot of them to have a meaningful impact on compliance audits.  Second, those scripts need to be tested and, as systems evolve, modified to keep pace.  Third, who is doing the next audit?  If it’s not you, will they understand your scripts?  Are you going to walk them through the process?

Various vendors and cloud providers have also tackled this challenge, some more successfully than others.  Much of the reason goes back to that notion of complexity.  Are all your systems native on a single cloud platform, or do they stretch across different clouds or hybrid environments?  Are you only going after compliance with a single regulatory framework, or do you want to automate multiple certifications? When you mix in multiple clouds or hybrid infrastructure, add in different SaaS tools and systems, tackle more than one compliance audit annually, etc. – all of a sudden, the scope and scale of evidence collection can become massive.  Only the broadest of vendor approaches will get the job done.

Done properly, however, this type of commercial automation solution offers significant benefits.  Not only does it dramatically lessen the burden (and improve the quality) of evidence collection, it also creates the equivalent of a “system of record” to centralize, organize and codify all compliance data.  Want to look at what was gathered for past audits?  Need to apply the same evidence across multiple frameworks, such as SOC 2 and PCI DSS?  It’s all right there for the taking.

From a DevOps perspective, another intriguing long-term possibility is tackling compliance with a broader, community-based approach. Consider: adherence to industry standards for data handling, privacy and protection is not a competitive issue. We are all better off if everyone shares a basic commitment to complying with these IT security and compliance standards. In that light, the more we can lower the barriers to achieving compliance, the broader the adoption. 

While vendors can broadly address compliance automation for cloud and SaaS platforms, there are still a vast number of potential data sources, and many may be either too limited in scope to justify commercial connectors or built on custom development.  A community mindset might encourage people and organizations to tackle these issues through standardized, open APIs.  Engineers could share custom collectors, cloud and SaaS vendors could support those APIs to facilitate capture and extraction of relevant data, and compliance and tool vendors could design their collection engines, evidence libraries and usage policies so that customization and sharing are possible.  Similarly, automation tools must not only pull data from standard systems, but also support pushing of data from custom systems or legacy infrastructure.  In my view, this bottoms-up approach is the best way to stop wasting DevOps teams’ time on audit prep while still allowing them enough autonomy to do what works best for their specific environment.
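The pull-and-push split above can be sketched as a small shared interface.  Everything here is hypothetical, there is no such standard API today; the sketch just shows how a community-defined collector contract could let standard systems be pulled from while legacy or custom systems push records into the same store.

```python
from abc import ABC, abstractmethod

class EvidenceCollector(ABC):
    """Hypothetical shared interface a community of collectors could
    implement so evidence lands in one normalized shape."""

    @abstractmethod
    def collect(self) -> list[dict]:
        """Pull evidence from a system the collector knows how to query."""

class EvidenceStore:
    """Central store that pulls from registered collectors and also
    accepts pushed records from custom or legacy systems."""

    def __init__(self) -> None:
        self.records: list[dict] = []
        self.collectors: list[EvidenceCollector] = []

    def register(self, collector: EvidenceCollector) -> None:
        self.collectors.append(collector)

    def pull_all(self) -> None:
        # Pull side: standard systems with community-built collectors
        for c in self.collectors:
            self.records.extend(c.collect())

    def push(self, record: dict) -> None:
        # Push side: custom or legacy systems submit their own evidence
        self.records.append(record)

class FirewallCollector(EvidenceCollector):
    """Toy pull-side collector returning canned firewall evidence."""
    def collect(self) -> list[dict]:
        return [{"source": "firewall", "control": "network-segmentation",
                 "rule_count": 42}]

store = EvidenceStore()
store.register(FirewallCollector())
store.pull_all()
store.push({"source": "legacy-hr", "control": "user-access", "accounts": 12})
```

Because both paths land records in one normalized store, the same evidence can feed multiple frameworks without anyone caring whether it was pulled or pushed.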

Ideally – and again, in keeping with a DevOps approach – it should be possible to achieve some semblance of continuous compliance, where automated collectors routinely gather evidence from cloud infrastructure, data that is then fed into relevant reports and audits.  The goal would be to move beyond episodic audits to assess and monitor security, privacy and compliance drift over time.  In much the same way that organizations hopefully don’t take a once-a-year look at their security posture, they shouldn’t assess compliance only during an audit.
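Detecting that drift can be as simple as diffing the latest evidence snapshot against an audited baseline.  A minimal sketch, with made-up control names and values, might look like this:

```python
def compliance_drift(baseline: dict, current: dict) -> dict:
    """Compare two evidence snapshots (control name -> observed value)
    and report controls that changed or disappeared since the baseline.
    Control names and values below are illustrative only."""
    drifted = {
        k: (baseline[k], current[k])
        for k in baseline
        if k in current and current[k] != baseline[k]
    }
    missing = [k for k in baseline if k not in current]
    return {"drifted": drifted, "missing": missing}

# Baseline captured at the last audit vs. today's automated collection
baseline = {"encryption-at-rest": "aws:kms", "mfa-enforced": True}
current = {"encryption-at-rest": "none", "mfa-enforced": True}
report = compliance_drift(baseline, current)
```

Run on every collection cycle, a check like this turns the annual audit scramble into an ongoing signal: the moment a control drifts out of its audited state, someone can fix it, long before the next auditor asks.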

Ultimately, compliance audits are never going to be fun.  But tackling compliance with a DevOps automation mindset will allow organizations to improve security and privacy, and make the process a lot less painful.