
Leveraging Gap Analysis to Drive Security Metrics

Gap Analysis Can Serve as a Wonderful Driver for Improving Security Metrics 

In my previous column, I discussed using gap analysis to assess an organization’s information security program and build a work program to improve it. One of the key steps discussed in that article was the need to prioritize and remediate findings. Once the security organization undertakes this important endeavor, how can it ensure that it is properly addressing gaps and remediating the risk that results from them?

One way to achieve this goal is to continually measure progress, performance, and risk. In this regard, gap analysis provides the security organization with another good driver for its metrics. As an added benefit, developing metrics with the intent of staying on top of gaps allows the security team to identify hiccups early and course correct before those hiccups develop into serious problems.

That is all well and good, but practically speaking, how can a security team develop sound, value-added metrics to help it measure its progress, performance, and risk against gaps?


Progress

In a sense, each identified gap is analogous to a homework assignment for the security organization. One great way for the security organization to measure how it is doing on its assignments is to assess its progress against them. This involves the following steps:

● Define desired outcomes

● Determine intermediate milestones

● Devise an accurate measurement framework to assess intermediate progress

● Set realistic dates for intermediate milestones

● Set realistic date for overall completion

● Devise an accurate measurement framework to assess overall progress

● Measure continually throughout the process
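As a rough illustration, the milestone-tracking steps above could be modeled in a few lines of Python. This is a minimal sketch, not a prescribed tool; the gap, milestone names, and dates are entirely hypothetical:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Milestone:
    name: str
    due: date
    done: bool = False

def progress_report(milestones: list[Milestone], today: date) -> dict:
    """Summarize percent complete and flag milestones that are past due."""
    completed = sum(m.done for m in milestones)
    overdue = [m.name for m in milestones if not m.done and m.due < today]
    return {
        "percent_complete": round(100 * completed / len(milestones)),
        "overdue": overdue,
    }

# Hypothetical gap: "enable MFA for all admin accounts"
plan = [
    Milestone("select MFA provider", date(2023, 1, 15), done=True),
    Milestone("pilot with IT admins", date(2023, 2, 15), done=True),
    Milestone("roll out to all admins", date(2023, 3, 31)),
    Milestone("verify coverage", date(2023, 4, 30)),
]

report = progress_report(plan, today=date(2023, 4, 10))
print(report)  # 50% complete, with the rollout milestone overdue
```

Running a report like this continually, rather than only at the end, is what surfaces a slipping milestone while there is still time to course correct.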

As you can see, the process by which security measures its progress toward addressing its gaps depends heavily on developing timely, accurate, and relevant metrics along the way. The security organization’s interest in staying on schedule and continually improving naturally helps it develop and mature its metrics.


Performance

Not all means by which gaps are addressed are created equal. Sadly, some organizations check certain boxes that make it appear they have done their homework when, in fact, they have not effectively addressed the gaps they claim to have addressed. How can a security team demonstrate that it has effectively addressed gaps, rather than merely having checked the appropriate boxes?

This is where performance metrics come into play. Good performance metrics are designed to measure the efficacy of a control, remediation, or solution, rather than whether or not one exists, which is what the checkbox approach measures.

Developing good performance metrics necessitates the following steps:

● Define desired improvement or efficiency gain

● Determine ranges and tolerances for good, acceptable, and unacceptable performance

● Define objective measurements

● Measure and calculate results

● Adjust and reorient trajectory as necessary

● Measure continually to ensure accurate and up-to-date data
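To make the ranges-and-tolerances idea concrete, here is one possible sketch in Python. The metric (mean days to patch critical vulnerabilities) and the thresholds are hypothetical assumptions, not recommended values:

```python
def rate_performance(value: float, good: float, acceptable: float) -> str:
    """Classify a measurement where lower is better (e.g., days to patch).

    At or below `good` is good, at or below `acceptable` is acceptable,
    and anything higher is unacceptable. Thresholds are hypothetical.
    """
    if value <= good:
        return "good"
    if value <= acceptable:
        return "acceptable"
    return "unacceptable"

# Hypothetical metric: mean days to patch critical vulnerabilities,
# measured monthly so the trajectory can be tracked and adjusted.
monthly_mean_days = [21.0, 14.5, 9.0]
ratings = [rate_performance(d, good=7, acceptable=14) for d in monthly_mean_days]
print(ratings)  # ['unacceptable', 'unacceptable', 'acceptable']
```

Note that this measures how well patching is actually performing, not merely whether a patching process exists, which is the distinction between a performance metric and a checked box.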

Just as with measuring progress, measuring performance requires the security team to develop meaningful metrics. The team’s interest in ensuring that it is truly addressing its gaps, rather than just paying lip service to them, goes a long way toward improving the overall state of its security metrics.


Risk

Any gap in an organization’s security posture has the potential to introduce risk into the organization. While risk can never be eliminated, it can be minimized, managed, and mitigated. That sounds good in theory, but in practice, how can a security organization ensure that it focuses on activities that actually reduce risk, rather than those that do not?

This is where risk metrics shine. When designed properly, they are crucial in helping an organization accurately assess changing risk levels and respond to them intelligently and strategically. Achieving proper risk metrics requires that the security organization:

● Understand the risks of concern to the business

● Classify those risks as either key or non-key

● Weight and prioritize risks based on the potential for damage to the business

● Aggregate risks into categories (e.g., NIST CSF categories)

● Define objective measurements to assess risk levels

● Determine acceptable ranges and tolerances for risk levels

● Measure and calculate risk levels for key risk indicators

● Measure and calculate risk levels for other metrics around both key and non-key risks

● Compute aggregated risk levels

● Draw overall risk picture, incorporating weight and priority into the calculation

● Refine continually to ensure accurate measurement of risk levels
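The weighting and aggregation steps above can be sketched as a weighted average per category plus an overall score. This is only one of many possible formulas; the risk register entries, categories, weights, and levels below are hypothetical:

```python
from collections import defaultdict

# Hypothetical risk register:
# (name, NIST CSF category, weight 1-5, risk level 0-10, key risk?)
risks = [
    ("unpatched internet-facing servers", "Protect", 5, 8.0, True),
    ("stale contractor accounts",         "Protect", 3, 6.0, False),
    ("no log retention policy",           "Detect",  2, 4.0, False),
]

def aggregate(risks):
    """Weight-averaged risk level per category, plus an overall score."""
    totals = defaultdict(lambda: [0.0, 0])  # category -> [weighted sum, weight sum]
    for _name, category, weight, level, _key in risks:
        totals[category][0] += weight * level
        totals[category][1] += weight
    per_category = {c: ws / w for c, (ws, w) in totals.items()}
    overall = (sum(w * l for _, _, w, l, _ in risks)
               / sum(w for _, _, w, _, _ in risks))
    return per_category, overall

per_category, overall = aggregate(risks)
print(per_category)        # {'Protect': 7.25, 'Detect': 4.0}
print(round(overall, 2))   # 6.6
```

Because the weights encode potential for damage to the business, a heavily weighted risk moves the category and overall scores far more than a minor one, which is exactly the prioritization the steps above call for.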

The very process used to understand the risks tied to gaps brings significant value in helping the organization build and mature its metrics. Like progress and performance metrics, risk metrics serve a dual purpose: they keep the organization focused on reducing risk while simultaneously driving the maturation of its security metrics.

What Comes Next

Metrics should be living and dynamic. As such, the security organization needs to continually measure, refine, and adjust in accordance with changing risk levels and the changing business environment. If a metric begins to lose value or is measuring something that is no longer relevant, it should be modified or eliminated. If a metric indicates that progress is occurring too slowly, that performance is suffering, or that risk levels are rising, action should be taken strategically. The metrics we produce should guide us and allow us to course correct intelligently, rather than haphazardly or through guesswork. Gap analysis can serve as a wonderful driver for improving security metrics if the security team understands how to seize the opportunity it presents.

Related Reading: Using Gap Analysis to Fix a Leaky Enterprise

Joshua Goldfarb (Twitter: @ananalytical) is currently Director of Product Management at F5. Previously, Josh served as VP, CTO - Emerging Technologies at FireEye and as Chief Security Officer for nPulse Technologies until its acquisition by FireEye. Prior to joining nPulse, Josh worked as an independent consultant, applying his analytical methodology to help enterprises build and enhance their network traffic analysis, security operations, and incident response capabilities to improve their information security postures. He has consulted and advised numerous clients in both the public and private sectors at strategic and tactical levels. Earlier in his career, Josh served as the Chief of Analysis for the United States Computer Emergency Readiness Team (US-CERT) where he built from the ground up and subsequently ran the network, endpoint, and malware analysis/forensics capabilities for US-CERT.