“Operationalization is the process of strictly defining variables into measurable factors. The process defines fuzzy concepts and allows them to be measured, empirically and quantitatively.”

-

Here we are again, heading back into the season of industry prognostication. 

Typically we start to see the “next year’s big thing” stories in mid-to-late December; much like your local retailer, I’m going to plant a flag in the ground a bit in advance.

My reasoning? We think that operationalization is and will continue to be one of the hot topics in our space in the coming months, and perhaps even years.

That said, the term you see above is far from new. Like many buzzwords, it prompts an “is operationalization a real word?” result within the first few Google hits; in this case the answer is factually “yes”, and an increasing number of people are using it.

So what does it mean in the context of IT security, or more specifically data protection, and why should you care?

As long as I’ve been in the IT security space – a timeframe now spanning roughly [ack] 15 years, first as a reporter and now as a marketer – the push for better metrics to define and measure success has been a major issue.

Yet here we are, heading into… 2020 [egads], and many practitioners tell us that their organizations are still using outmoded variables such as “how many vulnerabilities we patched this month” to define their ongoing effectiveness.

We all know this is a flawed approach: the sheer volume of remediation, even when it involves critical patches, hardly quantifies anyone’s ability to protect their most sensitive assets and data.

Now back to operationalization. What we clearly still need are more precise security factors that can be “measured, empirically and quantitatively” – especially some key drivers of program success that have remained elusively “fuzzy”. We also need a more effective manner of tying topline initiatives [protect all the data!] to operations-level goals [chase down today’s biggest threats!].

In a completely predictable and wholly biased fashion, I’m going to cite one such related metric that points directly to the value proposition of the technology that my employer Bay Dynamics and its business partner Symantec would like to sell you.

That being: determining, measuring and improving the overall success of information protection. Ultimately, outside of DDoS and other such service interruptions, everything we do in security these days relates directly to protecting the data.

How do we achieve this? Typically, through the alignment of layers of technical and process-based controls. We implement tools such as DLP, endpoint, cloud and access management, among many others, then attempt to optimize them.

And how do we measure it?

Monitoring the efficacy of this ecosystem remains elusive. While industry experts such as SANS and the MITRE folks tell us we need better means of validating controls… we still hear of plenty of board-level reports along the lines of “well, we patched X number of exposures” this month/quarter/year.

This is likely a massive oversimplification, but it is what we’re told and to some extent where we are as an industry.

Enter op-er-a-tion-al-i-za-tion [don’t try to say it fast, if at all; perhaps defer to “ops”, which is both somewhat confusing and somewhat convenient given that moniker’s existing connection to “operations”].

The opportunity we are seeing is this: when targeted, integrated, dare we say “advanced” analytics are tied to information protection infrastructure [for example, DLP], you gain an immediate ability to better measure and quantify the overall success of information protection.

For instance, rapid visibility into the efficacy of security policies [these are too noisy, these are showing us the real risks, these people just need more training] can deliver immediate gains.
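To make that a bit more concrete, here is a minimal sketch of what “measuring policy efficacy” could look like, assuming you can export dispositioned DLP alerts carrying a policy name and an analyst verdict. The field names and verdict labels below are mine, purely for illustration, and not any particular product’s schema.

```python
from collections import Counter

# Sketch: score policy "noise" from dispositioned DLP alerts.
# Assumes each alert records the policy that fired and an analyst verdict;
# these field names and verdict labels are illustrative only.

def policy_efficacy(alerts):
    """Return per-policy alert volume, precision, and false positive rate."""
    by_policy = {}
    for alert in alerts:
        by_policy.setdefault(alert["policy"], Counter())[alert["verdict"]] += 1

    report = {}
    for policy, verdicts in by_policy.items():
        total = sum(verdicts.values())
        report[policy] = {
            "alerts": total,
            "precision": verdicts["true_positive"] / total,
            "false_positive_rate": verdicts["false_positive"] / total,
        }
    return report

alerts = [
    {"policy": "PCI data outbound", "verdict": "true_positive"},
    {"policy": "PCI data outbound", "verdict": "false_positive"},
    {"policy": "Source code to personal email", "verdict": "business_process"},
]
print(policy_efficacy(alerts))
```

A policy with high volume and low precision is “too noisy”; one with high precision is “showing us the real risks”.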

Or, say you can quickly identify which DLP alerts actually represent material risks to your most sensitive data, using both supervised and unsupervised machine learning to pump up the “bad” and de-escalate the merely “not great” or the “broken business process”. Now you can truly begin to augment and advance the efforts of human analysts: to both optimize their work and measure success [versus, say, “I reviewed X number of alerts today”].
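For the curious, here is a rough illustration of that supervised-plus-unsupervised idea, sketched in Python with scikit-learn. The features, labels, weights and synthetic data are invented for the example; this is emphatically not a description of any vendor’s actual pipeline.

```python
# Sketch: blend a supervised model trained on past analyst verdicts with an
# unsupervised anomaly score to rank today's DLP alerts by risk.
# Features, labels, and weights are synthetic and arbitrary, for illustration only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, IsolationForest

rng = np.random.default_rng(0)

# Hypothetical numeric features per alert, e.g. [files_moved, off_hours, sensitivity].
X_history = rng.normal(size=(500, 3))
y_history = (X_history[:, 2] > 0.8).astype(int)   # past verdicts: 1 = material risk
X_today = rng.normal(size=(20, 3))                # today's unreviewed alerts

# Supervised: learn what analysts previously confirmed as "bad".
clf = GradientBoostingClassifier().fit(X_history, y_history)
p_bad = clf.predict_proba(X_today)[:, 1]

# Unsupervised: flag behavior that simply looks unusual for this environment.
iso = IsolationForest(random_state=0).fit(X_history)
anomaly = -iso.score_samples(X_today)             # higher = more anomalous
anomaly = (anomaly - anomaly.min()) / (anomaly.max() - anomaly.min() + 1e-9)

risk = 0.7 * p_bad + 0.3 * anomaly                # blended score; weights are arbitrary
for i in np.argsort(risk)[::-1][:5]:
    print(f"alert {i:2d}  risk={risk[i]:.2f}")    # reviewers start at the top
```

The point is not the particular models; it is that the analyst’s queue now starts with the alerts most likely to matter, and “success” can be measured against that ranking rather than against raw alert counts.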

This of course also drives more targeted remediation of critical exposures, a process that can itself be monitored and measured to gauge ongoing performance.

To be specific, we maintain that by taking this automated analytics-based “operationalization” approach to data protection you can better define and measure:

  • Where the program and tooling are most effective
  • What policies need to be improved
  • How the most pressing alerts are being handled
  • How effectively those issues are being remediated

Thus, “strictly defining variables into measurable factors”.
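And, since the whole point is turning fuzzy goals into numbers, here is a toy sketch of how those four bullets might become actual measurable factors. Every field name, threshold and SLA below is an assumption made for illustration.

```python
# Sketch: express the four bullets above as concrete, trendable metrics.
# All field names, the 30% precision bar, and the 24-hour SLA are assumptions.
from datetime import timedelta
from statistics import median

def program_metrics(policies, high_risk_alerts, critical_exposures):
    return {
        # Where the program and tooling are most effective: top policies by precision.
        "most_effective_policies": [
            p["name"] for p in sorted(policies, key=lambda p: p["precision"], reverse=True)[:3]],
        # What policies need to be improved: precision below an assumed 30% bar.
        "policies_to_tune": [p["name"] for p in policies if p["precision"] < 0.30],
        # How the most pressing alerts are being handled: share triaged within a 24h SLA.
        "high_risk_triaged_in_sla": sum(
            a["time_to_triage"] <= timedelta(hours=24) for a in high_risk_alerts
        ) / max(len(high_risk_alerts), 1),
        # How effectively those issues are being remediated: median days to close.
        "median_days_to_remediate": median(
            e["time_to_close"].days for e in critical_exposures),
    }

print(program_metrics(
    policies=[{"name": "PCI outbound", "precision": 0.62},
              {"name": "Source code exfil", "precision": 0.18}],
    high_risk_alerts=[{"time_to_triage": timedelta(hours=6)}],
    critical_exposures=[{"time_to_close": timedelta(days=9)}],
))
```

None of these numbers are the “right” ones; the point is simply that each question now has a variable you can trend over time.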

Worth noting, security operationalization also supports key tenets of another growing industry movement, the Forrester-led Zero Trust methodology.

Here’s hoping that somebody reading this can supply an even more nuanced and precise interpretation.

Enjoy the holidays!