Vulnerability management has historically been treated as an engineering exercise that is disconnected from how security flaws relate to the business and the actual threat they pose.
The growing adoption of poorly defended new technologies such as the Internet of Things (IoT), combined with escalating cybercrime, has given rise to more damaging breaches. The ensuing regulatory and legal scrutiny has revealed the shortcomings of this traditional approach. This raises two questions: what are the limitations of traditional vulnerability management, and what steps can be taken to drive a new, risk-centric approach designed to expose imminent threats for mitigation and more effectively reduce risk across the expanding attack surface?
According to the 2017 U.S. State of Cybercrime Survey, 39 percent of respondents reported that the frequency of cybersecurity events had increased over the past 12 months. This is reflected in daily news reports about data breaches and newly discovered vulnerabilities. In turn, organizations plan to upgrade their IT and data security to avoid cyber-attacks in the years to come. According to the State of the CIO, 2014 – 2017 report, this has become CIOs' second-highest priority, behind only helping to achieve revenue goals.
In light of the wave of data breaches in 2017, we need to consider whether traditional approaches to vulnerability management are still viable, and whether merely upgrading existing methods or tools is sufficient. A typical mid-sized organization faces an average of 200,000 vulnerabilities across its ecosystem, often leaving its security analysts unsure where to start. Given that the enterprise attack surface continues to expand from endpoints, applications, and databases to mobile devices and IoT, things will only get worse. That’s why Gartner, in its 2017 State of the Threat Landscape, observes that “your roofs are leaky, and getting leakier”.
Vulnerabilities are not a new phenomenon – they are as old as computers. And while vulnerability management tools and practices have evolved over the past few decades, adding new capabilities such as authenticated and agent-based scans, at their core they still rely on the Common Vulnerability Scoring System (CVSS), which is maintained by the Forum of Incident Response and Security Teams (FIRST).
It is easy to be misled by CVSS scores and play math games with them. However, these exercises typically only reduce risk on paper – not in reality. Traditional vulnerability management approaches practice gradual risk reduction. They focus remediation actions either on the most severe vulnerabilities, as indicated by a high CVSS score (the so-called vulnerability-centric model), or on the value and exposure of an asset – Internet-facing, accessible to third parties, containing sensitive data, or providing business-critical functions (the so-called asset-centric model). Unfortunately, both practices boil down to trying to reduce the greatest amount of risk with the fewest patches.
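As a rough illustration, the two traditional models described above amount to sorting a vulnerability backlog by different keys. The sketch below is hypothetical: the CVE IDs, scores, asset values, and weighting are made up for demonstration and are not from any real scanner or scoring standard.

```python
from dataclasses import dataclass

@dataclass
class Vulnerability:
    cve_id: str
    cvss_score: float      # CVSS base score, 0.0-10.0
    asset_value: int       # illustrative asset criticality, 1 (low) to 5 (critical)
    internet_facing: bool  # exposure indicator

# Hypothetical inventory, for illustration only
vulns = [
    Vulnerability("CVE-2017-0001", 9.8, 2, False),
    Vulnerability("CVE-2017-0002", 7.5, 5, True),
    Vulnerability("CVE-2017-0003", 5.3, 4, True),
]

# Vulnerability-centric model: remediate the highest CVSS scores first
by_severity = sorted(vulns, key=lambda v: v.cvss_score, reverse=True)

def asset_priority(v: Vulnerability) -> float:
    # Asset-centric model: weight severity by asset value and exposure
    # (the 2x exposure multiplier is an arbitrary illustrative choice)
    exposure = 2.0 if v.internet_facing else 1.0
    return v.cvss_score * v.asset_value * exposure

by_asset = sorted(vulns, key=asset_priority, reverse=True)
```

Note how the two models disagree: the severity-sorted queue leads with the 9.8-score flaw on a low-value asset, while the asset-centric queue promotes the Internet-facing, business-critical system instead. Neither considers whether anyone is actually exploiting the flaw.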
These traditional approaches no longer suffice. Instead, according to Gartner (see “Threat-Centric Vulnerability Remediation Prioritization”), organizations should transform their vulnerability management practices to a threat-centric model, which allows for imminent threat elimination rather than gradual risk reduction. An imminent threat can be identified by correlating vulnerabilities to their prevalence in the wild:
• Is a vulnerability being targeted by malware, ransomware, or an exploit kit?
• Is a threat actor leveraging the vulnerability and targeting organizations like ours?
Under this new model, imminent threats are prioritized and remediated first. While we can’t predict who will attack us, we can predict which attacks could succeed.
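The threat-centric model effectively changes the sort key so that the two questions above dominate, with CVSS severity used only as a tiebreaker. A minimal sketch, again with invented CVE IDs and threat-intelligence flags:

```python
from dataclasses import dataclass

@dataclass
class Vulnerability:
    cve_id: str
    cvss_score: float
    exploited_in_wild: bool  # targeted by malware, ransomware, or an exploit kit?
    actor_targets_us: bool   # leveraged by threat actors targeting organizations like ours?

# Hypothetical findings, for illustration only
vulns = [
    Vulnerability("CVE-2017-0101", 9.1, False, False),
    Vulnerability("CVE-2017-0102", 6.8, True, True),
    Vulnerability("CVE-2017-0103", 7.2, True, False),
]

def threat_priority(v: Vulnerability):
    # Imminent-threat signals outrank raw severity; CVSS only breaks ties
    return (v.exploited_in_wild, v.actor_targets_us, v.cvss_score)

queue = sorted(vulns, key=threat_priority, reverse=True)
```

Here the 6.8-score vulnerability that is being actively exploited against similar organizations jumps ahead of the untargeted 9.1-score one, which is the inversion the threat-centric model is designed to produce.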
Ultimately, organizations planning to “upgrade” their existing vulnerability and patch management practices should move beyond looking at the “what is” of their current security posture. A better approach is to augment existing tools with emerging security analytics and cyber risk management capabilities, enabling a scenario- and objective-based approach that assesses the “what could be” and predicts the impact and outcomes of potential threats.

Torsten George is a cybersecurity evangelist at Absolute Software, which helps organizations establish resilient security controls on endpoints. He also serves as strategic advisory board member at vulnerability risk management software vendor, NopSec. He is an internationally recognized IT security expert, author, and speaker. Torsten has been part of the global IT security community for more than 27 years and regularly provides commentary and publishes articles on data breaches, insider threats, compliance frameworks, and IT security best practices. He is also the co-author of the Zero Trust Privilege For Dummies book. Torsten has held executive level positions with Centrify, RiskSense, RiskVision (acquired by Resolver, Inc.), ActivIdentity (acquired by HID® Global, an ASSA ABLOY™ Group brand), Digital Link, and Everdream Corporation (acquired by Dell).