CVE and NVD – A Weak and Fractured Source of Vulnerability Truth

MITRE is unable to compile a list of all new vulnerabilities, and NIST is unable to subsequently, and consequently, provide an enriched database of all vulnerabilities. What went wrong, and what can be done?

The Common Vulnerabilities and Exposures (CVE) List and the consequent National Vulnerability Database (NVD) can no longer be considered a single central source of vulnerability truth.

Nobody doubts that the current CVE system can and should be improved. Overseen by MITRE (sponsored by the DHS), the CVE List is absorbed and its data enriched by NIST in the NVD. MITRE is responsible for the vulnerability numbering system, while the NVD has become cyber defenders’ source of truth on those vulnerabilities.

Since mid-February 2024, a banner has appeared at the head of NVD entries: “NIST is currently working to establish a consortium to address challenges in the NVD program and develop improved tools and methods. You will temporarily see delays in analysis efforts during this transition.”

[Note: NIST updated its information while this article was being written. The update is included as an addendum at the end of our discussion.]

What went wrong, and what can be done?

The problems

The danger in having a central database of vulnerabilities is that it focuses attention on the content. ‘Vulnerabilities and their details can be found here.’ By implication, if a vulnerability isn’t included, it isn’t a vulnerability.

This is simply wrong. Threat intelligence firm Flashpoint noted in March 2024 that it was aware of 100,000 vulnerabilities with no CVE number and consequently no inclusion in the NVD. More worryingly, it said that 330 of these un-numbered vulnerabilities had been exploited in the wild.

The opposite is also true – inclusion doesn’t necessarily disclose a genuine vulnerability. On February 21, 2024, Daniel Stenberg (founder and lead developer at the cURL project) blogged: “I keep insisting that the CVE system is broken, and that the database of existing CVEs hosted by MITRE (and imported into lots of other databases) is full of questionable content and plenty of downright lies.” For Stenberg, an invalid cURL vulnerability had been given a legitimizing CVE number.

For similar treatment with another CVE (CVE-2023-52071), cURL produced its own spoof CVE listing: 

Bogus report filed by anonymous
Project curl Security Dismissal, January 30, 2024 – Permalink
VULNERABILITY
None. CVE-2023-52071 was filed and made public by an anonymous person due to incompetence or malice. We cannot say which and the distinction does not matter to us…

False negatives (exclusion) are a serious threat. They are invisible vulnerabilities that are not promoted into the defenders’ triaging efforts. False positives (wrongful inclusion) are not a direct security threat but can be a huge waste of resources for both the ‘accused’ provider and security teams engaged in unnecessary vulnerability triaging. Both effects are contrary to the purpose of CVE numbering by MITRE and subsequent vulnerability enrichment by NIST.

On the false negatives, explains Sasa Zdjelar, chief trust officer at ReversingLabs, “The result is that an application free from formal CVEs cannot be considered safe, trusted, or secure. In fact, just the opposite is true: a lack of CVEs does not connote ‘perfect security’, rather it communicates a ‘black box’ approach to software and product security that is often hiding an army of software security ‘skeletons’ in the form of remotely exploitable vulnerabilities.”

The basic problem is one of resources versus the expanding volume and complexity of vulnerabilities. Vulnerabilities are increasing at around 20% per annum (likely to hit 3,000 new vulnerabilities per month by the end of 2024). Omissions could at least partially be solved by MITRE expanding the number of approved CVE Numbering Authorities (CNAs) able to request CVE numbers on their own authority. If more organizations can generate CVEs, there will be more CVEs in the CVE List, and fewer omissions. That’s the theory. But more CNAs increase the possible inclusion of errors while decreasing the value of a single, centralized, trusted source of truth – potentially allowing, in cURL’s words, an incorrect CVE filed “by an anonymous person due to incompetence or malice”. The current roster of around 350 CNAs remains a tiny fraction of the number of code producers who could theoretically monitor and report on their own vulnerabilities.

A similar resource problem makes it difficult for NIST to adequately enrich all the CVE reports as they hit the NVD database. This could explain the halt (it’s not really a delay) in NVD vulnerability analyses while the organization compiles a consortium, presumably to spread the load. At the time of writing, nothing is known about the consortium’s purpose, potential membership, or size, nor is there a projected completion date.

The current state

To improve anything, you must first understand what is lacking.

One of the problems in using the NVD for vulnerability management is that it is effectively historical data. While inclusion in the NVD system implies that a patch is probably available (or at least a workaround), it also means that the vulnerability is known – and that exploitation may follow rapidly. It is impossible for security teams to triage all CVEs – which means the problem is twofold: which vulnerabilities are relevant to specific environments, and which are the most urgent?

NVD attempts to assist in this triaging process in several ways: by enriching the CVE record with a Common Vulnerability Scoring System (CVSS) severity rating (from 0 to 10 – the higher the number, the harsher the potential impact of exploitation), and with product mapping (which links the vulnerability to specific products and versions). The NVD also provides a reference – if applicable – to CISA’s Known Exploited Vulnerabilities (KEV) List. This is a separate list of vulnerabilities known to have been exploited; inclusion in it can add a sense of urgency to the triaging process.
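
For illustration, this enrichment can be read programmatically from the NVD’s public JSON API. The sketch below is a minimal example assuming the field layout of the NVD API 2.0 response (the exact paths should be checked against NIST’s published schema); it pulls the CVSS v3.1 base score and the CPE product-mapping criteria from a single record:

```python
import json
import urllib.request

NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def summarize_nvd_record(record: dict) -> dict:
    """Extract the CVSS v3.1 base score and CPE match criteria from one
    NVD vulnerability record (field paths assumed per the API 2.0 schema)."""
    cve = record["cve"]
    metrics = cve.get("metrics", {}).get("cvssMetricV31", [])
    base_score = metrics[0]["cvssData"]["baseScore"] if metrics else None
    cpes = [
        match["criteria"]
        for config in cve.get("configurations", [])
        for node in config.get("nodes", [])
        for match in node.get("cpeMatch", [])
    ]
    return {"id": cve["id"], "cvss": base_score, "cpes": cpes}

if __name__ == "__main__":
    # Live lookup (requires network; NVD rate-limits unauthenticated calls).
    with urllib.request.urlopen(f"{NVD_API}?cveId=CVE-2021-44228") as resp:
        data = json.load(resp)
    print(summarize_nvd_record(data["vulnerabilities"][0]))
```

A record that NIST has not yet enriched simply returns no metrics – which is exactly the gap the article describes.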

There are two weaknesses in this process. Firstly, if the vulnerability is not in the CVE List, it is not in the NVD, and there is nothing to enrich – not even with a CVSS score. Secondly, if there is no inclusion in the KEV List, there is no additional ‘sense of urgency’.

Yet another separate database attempts to solve this: the Exploit Prediction Scoring System (EPSS) run by FIRST (the Forum of Incident Response and Security Teams). This uses machine learning on historical events to gauge a likely exploitation score for any given CVE from 0 to 1 (a probability, often expressed as a percentage).
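
EPSS scores are themselves available from a public API. The following is a minimal sketch assuming the query format documented by FIRST (the endpoint returns each score as a decimal string between 0 and 1):

```python
import json
import urllib.request

EPSS_API = "https://api.first.org/data/v1/epss"

def parse_epss(payload: dict) -> dict:
    """Map each CVE ID in an EPSS API response to its exploitation
    probability, converting the decimal-string score to a float."""
    return {row["cve"]: float(row["epss"]) for row in payload.get("data", [])}

if __name__ == "__main__":
    # Live lookup (requires network).
    with urllib.request.urlopen(f"{EPSS_API}?cve=CVE-2021-44228") as resp:
        print(parse_epss(json.load(resp)))
```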

So, a CVE from MITRE with a high CVSS and product mapping to my systems from NIST, plus a high EPSS score from FIRST and possible inclusion in the KEV List, would combine to suggest a vulnerability in urgent need of remediation.
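
That combination can be expressed as a simple triage ordering. The sketch below is purely illustrative – the ranking logic (filter on product relevance, then KEV membership first, then EPSS, then CVSS) is an assumption for demonstration, not a published standard:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    cve_id: str
    cvss: float             # NVD severity rating, 0-10
    epss: float             # FIRST exploitation probability, 0-1
    in_kev: bool            # listed in CISA's KEV catalog?
    affects_my_stack: bool  # product (CPE) mapping matched local inventory?

def triage(findings: list[Finding]) -> list[Finding]:
    """Order findings for remediation: anything not mapped to the local
    environment drops out; KEV entries outrank everything else; ties
    break on EPSS, then CVSS. Illustrative weighting only."""
    relevant = [f for f in findings if f.affects_my_stack]
    return sorted(relevant, key=lambda f: (f.in_kev, f.epss, f.cvss), reverse=True)
```

Note how a known-exploited, moderately scored flaw outranks an unexploited critical one – the kind of judgment the raw CVSS number alone cannot make.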

But there remain two problems: firstly, we have already seen that the CVE List is incomplete and has issues of reliability; and secondly, there is nothing here to indicate whether any user’s specific implementation is vulnerable to the steps necessary for this specific exploitation.

As an example, Dana Wang, chief architect at OpenSSF, suggests, “The CVE system lacks the necessary granularity to accurately associate vulnerabilities with specific versions of open source software packages and their fix status. As a result, there is a significant noise-to-signal ratio.”

One thing that could help with the last point is yet another database: MITRE’s ATT&CK framework. ATT&CK describes the different TTPs used in different attacks, but it does not relate those attacks to specific CVEs. However, a newer project from MITRE Engenuity’s Center for Threat-Informed Defense is attempting to map CVEs to TTPs. A user would then be able to compare CVE-known vulnerabilities to the TTPs necessary for exploitation, and gauge whether that user’s own specific environment is susceptible to that vulnerability. This won’t reduce Wang’s concern over noise but could increase the signal part of the noise-to-signal ratio.

Tick all the boxes and you have a vulnerability that requires immediate patching. The problems are that only vulnerabilities given a CVE number can tick those boxes, and that, to coin a phrase, the details are all over the place.

“The issue I’ve come across time and again,” comments Steve Benton, VP threat research at Anomali, “is trying to use CVE/CVSS as the sole guide to risk decisions and response – that’s where it unravels. The CVE/CVSS system provides a comparative landscape that, based on the scoring, can set an initial priority for assessment. But that assessment must take into consideration several other factors to determine a true priority for the organization, and key to that is dynamic use of threat intelligence bonded to security operations.”

Sasa Zdjelar, chief trust officer at ReversingLabs

Stressing the fractured nature of vulnerability data and the difficulty in finding and combining the essential elements, Zdjelar makes an important point: “If the CVE system was optimal, there wouldn’t be so many companies successfully selling solutions that help organizations identify, prioritize, and resolve publicly disclosed vulnerabilities… The fact is, there is no single comprehensive source that captures all relevant information about vulnerabilities.”

What we do not yet know is what MITRE and NIST – who are both clearly working on a solution – are planning to do.

Attempted assists

We shouldn’t dismiss the current CVE system. There is much good that can be found and much good that has been done. It has been pivotal, for example, in moving the old debate about full disclosure toward responsible disclosure – that is, allowing vendors reasonable time to patch vulnerabilities before they become general knowledge.

“The CVE system is not perfect – but it’s invaluable,” adds Marie Wilcox, security evangelist for Panaseer. “While new vulnerabilities are appearing with greater regularity, making timely updates increasingly important, many attackers still rely on older, well-known CVEs because they know that organizations haven’t patched them. While in an ideal world we would have the most up to date and complete information on vulnerabilities, there is still a lot of value in what the database offers today. Having this source of data can help businesses to improve their defenses, identify potential coverage gaps and deploy resources effectively.”

Suggestions for further improvement abound. “These,” comments Tyler Bellue, threat hunter team lead at Critical Start, “could include enhancing the transparency and methodology behind CVSS scoring to better reflect the practical risk and impact of vulnerabilities. Additionally, fostering a more collaborative environment between researchers, vendors, and defenders could improve the timeliness and quality of information shared, ultimately leading to a more robust and responsive vulnerability management ecosystem.”

Well-meaning organizations, such as FIRST, have tried to improve the situation with their own additions. One such is Wang’s OpenSSF. She accepts the primacy of MITRE and NIST but suggests a solution must go beyond the two parties. “Improving the system necessitates ongoing dialogue among MITRE, NIST, industry leaders, and open source communities. Building consensus will require a collaborative effort that takes time.”

But she adds, “In the meantime, OpenSSF encourages the adoption of the Open Source Vulnerability (OSV) system designed specifically for open source projects. This system comprises the OpenSSF OSV schema, a vulnerability scanner, and an open source vulnerability database. The OSV schema offers a human and machine-readable data format that precisely outlines vulnerabilities and aligns them with specific open-source package versions or commit hashes. The database aggregates vulnerabilities from nearly 20 vulnerability data sources, including NVD.”
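
For illustration, OSV can be queried by package name, ecosystem, and version through its public API. The sketch below assumes the documented query format at api.osv.dev; field names should be verified against the OSV schema:

```python
import json
import urllib.request

OSV_API = "https://api.osv.dev/v1/query"

def osv_ids(payload: dict) -> list:
    """Extract the vulnerability IDs from an OSV query response."""
    return [vuln["id"] for vuln in payload.get("vulns", [])]

if __name__ == "__main__":
    # Live lookup (requires network): ask OSV which known vulnerabilities
    # affect a specific package version in a given ecosystem.
    query = {"version": "2.4.1", "package": {"name": "jinja2", "ecosystem": "PyPI"}}
    req = urllib.request.Request(
        OSV_API,
        data=json.dumps(query).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(osv_ids(json.load(resp)))
```

Because the query is keyed to an exact package version, the answer addresses precisely the granularity gap Wang describes.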

This would no doubt help the OSS vulnerability scene (no small matter), but further dissipates the concept of a single, central source of truth for all vulnerabilities.

OpenSSF is not alone. In a blog from March 8, 2024, Anchore noted that NIST’s CVE enrichment came to a halt in mid-February 2024 (around the time its banner appeared). “Starting February 12th, thousands of CVE IDs have been published without any record of analysis by NVD. Since the start of 2024 there have been a total of 6,171 CVE IDs with only 3,625 being enriched by NVD. That leaves a gap of 2,546 (42%!) IDs.”

Anchore believes that NIST is regrouping its efforts, but that this will take time, leaving user companies more vulnerable. To fill this gap, it has launched its own new project, currently called NVD Data Overrides. “We’re working on adding the same type of thing NVD used to do to the CVE data,” says Josh Bressers, VP of security at Anchore. “The data is licensed CC0, anyone can use it for anything.” At the time of writing, this new database includes more than 500 entries. It intends to harness the strength of the open source community, adding, “In the event NVD returns, or some other project takes over the current task of NVD, we expect to continue to maintain this project.”

The CVE cleft stick

MITRE is a not-for-profit private organization (although heavily funded by government). NIST, as part of the Department of Commerce, is a government agency. It is probably the right balance that government is not involved in the collection and numbering of new vulnerabilities but can lend its weight and authority to describing and prioritizing remediation efforts. So, the CVE List of privately collected vulnerabilities (note that it is called a ‘list’) and the National Vulnerability Database (NVD) of vulnerabilities and their relevant details should together provide a solid bedrock of vulnerability truth.

But both the List and the Database are failing to deliver. This is not a failure of design, but a failure of resources: MITRE is unable to compile a list of all new vulnerabilities, and NIST – with its own resource problems in the enrichment process – is unable to subsequently provide an enriched database of all vulnerabilities.

The first problem is for MITRE to improve its collection of vulnerabilities. It could achieve this by increasing the number of CNAs. But where do you stop? How do you balance collecting all vulnerabilities with allowing everybody and anybody to request a CVE reference? Increasing the collection pool simultaneously increases the potential for false positives. On the one hand, Zdjelar comments, “Only around 350 organizations are registered as CVE Numbering Authorities (CNAs) with the ability to assign CVE numbers to vulnerabilities. That’s a minuscule fraction of the number of companies producing code. The lack of universal participation in the CVE system, and the time-consuming process for obtaining CVE numbers, greatly limit the reach and, therefore, the usefulness of the CVE system.”

But on the other hand, increasing the number of CNAs tends to decrease the value of submissions to MITRE. “Initially, the vulnerabilities listed in the National Vulnerability Database (NVD) were reported by seasoned researchers or well-established practitioners, and the assignment of a CVE was recognition of their contributions to the industry,” comments Brian Fox, CTO and founder of Sonatype. “As software security gained traction over time, an influx of budding researchers used the NVD and CVE credit as a way to jumpstart their careers – leading to the convergence of bug bounties and widespread accessibility that created a land grab race to the bottom.”

Consequently, seeking to include all vulnerabilities within the CVE List stretched MITRE’s own resources and then bottlenecked NIST’s resources. “This flood of low quality reports has exacerbated challenges within the NVD program, and we are now seeing month-long delays in the analysis of new reports,” continues Fox.

Dan Lorenc, founder and CEO at Chainguard

Third party organizations have attempted to ease the burden by developing their own ‘enriched’ vulnerability databases. But this merely weakens the value of having a single central source of truth backed by the authority of government. Users must now interrogate multiple different databases to find the information they need to triage and, where necessary, remediate vulnerabilities.

That is far from optimum. “I don’t think a distributed group can really replace the main value of NIST,” comments Dan Lorenc, founder and CEO at Chainguard. Following NIST recommendations is central to government advice. “What we need is a central group that is trusted by the government to be doing the scoring, even when they get things wrong. And there are mechanisms where the community can correct them – but it’s not going to be data that the government is going to trust. The government is only really going to trust the data coming from them, which is where all these standards and regulations that say you must use this data comes from.”

Consider, for example, the SEC disclosure rules. “If you miss a vulnerability scan because NIST had it wrong,” he continued, “that will look a lot different than if you miss a vulnerability scan because you and your team or somebody else messed something up. So, it’s one of those cases where the perfect solution from a security perspective isn’t a perfect one from a compliance and regulatory one, which happens all too often.”

The bottom line is that effective cybersecurity and cybersecurity compliance in the US requires a government-backed single central source of vulnerability truth. NIST, with its current resources, cannot provide this. It says it is seeking assistance from a ‘consortium’. Whether it can achieve this and remain a government single source of truth remains to be seen. But something must be done, and quickly.

Addendum: NIST updated its information while this article was being written. The April 2nd update is included below:

NIST maintains the National Vulnerability Database (NVD), a repository of information on software and hardware flaws that can compromise computer security. This is a key piece of the nation’s cybersecurity infrastructure.

There is a growing backlog of vulnerabilities submitted to the NVD and requiring analysis. This is based on a variety of factors, including an increase in software and, therefore, vulnerabilities, as well as a change in interagency support.

Currently, we are prioritizing analysis of the most significant vulnerabilities. In addition, we are working with our agency partners to bring on more support for analyzing vulnerabilities and have reassigned additional NIST staff to this task as well.

We are also looking into longer-term solutions to this challenge, including the establishment of a consortium of industry, government, and other stakeholder organizations that can collaborate on research to improve the NVD.

We will provide more information as these plans develop. NIST is committed to its continued support and management of the NVD.

Related: VulnCheck Raises $3.2M Seed Round for Threat Intel

Related: How to Predict Your Patching Priorities

Related: Vulnerability Management Firm Vicarius Raises $30 Million

Related: Vulnerability Management Fatigue Fueled by Non-Exploitable Bugs

Written By

Kevin Townsend is a Senior Contributor at SecurityWeek. He has been writing about high tech issues since before the birth of Microsoft. For the last 15 years he has specialized in information security; and has had many thousands of articles published in dozens of different magazines – from The Times and the Financial Times to current and long-gone computer magazines.
