
Why the Vulnerability Management Industry Failed Us, and What To Do About It

How far have vulnerability management tooling vendors actually come since the practice emerged in the '90s?

Twenty-five years ago, security vulnerabilities in software were the focus of penetration testers. It was the era that birthed white hat/black hat hacker culture, when cultivating the best toolset for finding unguarded open doors was the de facto strategy.

When one of the first open-source vulnerability scanners was released in 1998, it quickly became a favorite of security practitioners. The tool scanned for known vulnerabilities that might exist on a remote system by checking for old versions of operating systems and applications with known, exploitable vulnerabilities.

It wasn’t long before companies adopted this method as the standard for assessing vulnerability hygiene. A paradigm of scan, analyze, remediate, repeat was born. A vulnerability management company was eventually founded to shape the original concept into a full business model.

This paradigm is how most organizations still approach vulnerability management today, but it’s fundamentally flawed. As an industry, we continue to fall further behind the curve as new vulnerabilities emerge faster than a hands-on methodology can remediate them.

Why vulnerability management tools are failing us 

The average time to fix a vulnerability today is between 60 and 150 days. The clock starts when the vulnerability “window” opens – the gap between a zero-day being publicly disclosed and developers publishing a fix – and stops when patches are finally applied to affected systems. Where an organization falls in that range depends heavily on how effective it is at patching.

While the vulnerability management space continues to be dominated by a single vendor, there is a persistent failure to address the limited usefulness of remote scanning as the means of gathering accurate vulnerability data from assets. The vulnerability window is far too large and gives attackers ample opportunity to methodically work through an ever-growing list of exploitable vulnerabilities.

Remaining wedded to the scanning paradigm, despite an industry shift to endpoint agents that provide actual telemetry, has resulted in a myopic dance around vulnerability edge cases – those that exist on non-endpoint devices such as network appliances, routers, firewalls, and the like.

These edge cases are effectively a red herring from a legacy vendor, distracting from the real problem, which can’t be solved by their proprietary means. With approximately 90% of vulnerabilities affecting endpoints, scanning offers no compelling advantage over an endpoint agent.

Legacy vulnerability management tools aren’t management tools – they’re low-fidelity scanning tools that produce a report full of problems that can be punted to the next team.

Why the legacy vulnerability management paradigm doesn’t work

The vulnerability management paradigm still being perpetuated is fundamentally flawed for the modern age. For more than twenty years the focus has been to scan, import, analyze, and then decide on a course of remediation. That doesn’t work in 2021. Here’s why:

Scanning, by nature, is periodic. Scans are scheduled on an arbitrary or recommended interval, and by the time the returned findings are imported and analyzed, the data is practically useless in the context of the modern vulnerability window, as the situation changes rapidly. 

Remediation rate will flounder until it can be automated. Manual relay of information between a vulnerability analyst and patching team stands no chance of contending with the insurmountable volume of vulnerabilities. 

Vulnerabilities and configuration security should be considered the most urgent security issues. The Center for Internet Security’s Top 20 Controls – the specific rules and recommendations for securing systems and data – rank continuous vulnerability management well ahead of malware defenses. The inference is that vulnerability management is a daily focus, not a monthly cycle.

Seeing a vulnerability through the remediation phase is the only way it can be considered resolved. The legacy paradigm is so focused on simply gathering data that it doesn’t bridge the gap to the actual critical measure of success: patches being applied. It’s almost as if the strategy stops at reporting and that’s considered a resolution. It’s only halfway to solving the problem. 

What should we have been focused on to solve vulnerability management? 

A more modern philosophy of vulnerability management treats it as a continuous cycle: reliable endpoint data feeds tooling that automates patching, and the loop is closed with visibility into patching success and performance indicators that cover the entire process.

The data-gathering step should employ endpoint agents with full privileged local access that report accurate telemetry on software inventory and operating system specifics. Relying on anything slower than real-time telemetry to determine endpoint vulnerabilities falls well short of what current technology makes possible.
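As a rough illustration of what that telemetry might look like, here is a minimal sketch of an agent gathering OS details and installed-package inventory locally rather than inferring them from a remote scan. The payload shape and the Debian-style package query are assumptions for illustration, not any particular vendor’s agent.

```python
# Minimal sketch of the data-gathering step: an endpoint agent collecting
# local software inventory and OS details with local privileges.
# The payload shape is an illustrative assumption, not a specific product's API.
import json
import platform
import subprocess
from datetime import datetime, timezone

def collect_inventory() -> dict:
    """Gather OS specifics and installed-package inventory from the local host."""
    packages = []
    try:
        # Debian/Ubuntu example; a real agent would branch per platform.
        out = subprocess.run(
            ["dpkg-query", "-W", "-f=${Package} ${Version}\n"],
            capture_output=True, text=True, check=True,
        )
        packages = [line.split(" ", 1) for line in out.stdout.splitlines() if line]
    except (FileNotFoundError, subprocess.CalledProcessError):
        pass  # unsupported platform in this sketch

    return {
        "hostname": platform.node(),
        "os": platform.system(),
        "os_release": platform.release(),
        "collected_at": datetime.now(timezone.utc).isoformat(),
        "packages": [{"name": name, "version": version} for name, version in packages],
    }

if __name__ == "__main__":
    # A real agent would stream this continuously to the management platform;
    # here we simply print the telemetry payload.
    print(json.dumps(collect_inventory(), indent=2))
```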

The remediation step should begin instantly. Admins can configure tooling to automatically apply patches for vulnerabilities. If this sounds optimistic, it is. Sure, upgrades can always encounter problems, but if the majority of patches are applied with the absolute minimum vulnerability window, it’s vastly more secure than waiting for decision makers to review a report and assign personnel to apply the patches manually on an indeterminate timetable.
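To make the idea concrete, here is a minimal sketch of an auto-remediation policy, assuming a hypothetical Finding record, severity threshold, and apply_patch() hook; a real platform would plug its own patch orchestration in at that point.

```python
# Illustrative sketch of "remediation begins instantly": a policy that
# auto-approves patches for new findings instead of queueing them for review.
# The Finding shape, threshold, and apply_patch() hook are assumptions.
from dataclasses import dataclass

@dataclass
class Finding:
    host: str
    cve_id: str
    severity: float          # CVSS base score reported by the endpoint agent
    patch_available: bool

AUTO_PATCH_SEVERITY = 4.0    # assumed policy: auto-patch medium severity and above

def apply_patch(finding: Finding) -> None:
    # Placeholder for the patch-orchestration call (e.g. a package manager
    # or configuration-management job) a real platform would make.
    print(f"[auto] patching {finding.cve_id} on {finding.host}")

def triage(findings: list[Finding]) -> list[Finding]:
    """Auto-remediate what policy allows; return only what needs human review."""
    needs_review = []
    for f in findings:
        if f.patch_available and f.severity >= AUTO_PATCH_SEVERITY:
            apply_patch(f)   # remediation starts the moment the finding lands
        else:
            needs_review.append(f)
    return needs_review

if __name__ == "__main__":
    backlog = triage([
        Finding("web01", "CVE-2021-0001", 9.8, True),
        Finding("db02", "CVE-2021-0002", 3.1, True),
    ])
    print(f"{len(backlog)} finding(s) left for manual review")
```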

Any manual review should be relegated to governance and performance statistics in the interest of tuning and maturity. 
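For example, one of the simplest performance indicators that review can track is mean time to remediate, computed from detection and patch timestamps. The sketch below uses made-up data purely to show the calculation.

```python
# Small sketch of the governance layer: computing mean time to remediate
# (MTTR) from detection and patch timestamps. The data is illustrative only.
from datetime import datetime
from statistics import mean

records = [
    # (detected, patched)
    (datetime(2021, 8, 1, 9, 0), datetime(2021, 8, 2, 14, 0)),
    (datetime(2021, 8, 3, 11, 0), datetime(2021, 8, 3, 18, 30)),
    (datetime(2021, 8, 5, 8, 0), datetime(2021, 8, 9, 10, 0)),
]

def mttr_hours(pairs) -> float:
    """Mean time to remediate, in hours, across closed findings."""
    return mean((patched - detected).total_seconds() / 3600
                for detected, patched in pairs)

print(f"MTTR: {mttr_hours(records):.1f} hours")
```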

What should we be doing about vulnerability management today? 

Today, we should be embracing a forward-thinking approach built on instant vulnerability data and automated patching. The volume of vulnerabilities is expanding relentlessly and is simply too large to remediate manually.

Legacy vulnerability scanners would have you believe they’re leading the charge in vulnerability management, but their fundamental strategy is so far off target that they’re not worth seriously entertaining as an effective solution.
