
Big Data: Noise, or Actionable Cyber Security Info?

You Can’t Respond to Big Data with Big Collection… 

Big Data. The phrase conjures a sense of next-gen problem-solving through sheer informational might. There is a whole lot of hype today around Big Data, leaving many to assume that it can only contribute positively: the more data, the merrier. But in truth, Big Data is not making life any easier for security professionals. In fact, it is making them far less than merry.

What do we know for certain? We know that today’s security tools are detecting huge numbers of potential security events, generating massive amounts of data that analysts are then expected to comb through to connect the dots.

Security analysts collect every event, then struggle to filter out irrelevant signals and isolate the important events from the rest of the noise.

Bad idea.

Large organizations continue to be breached, with slow-moving investigations taking 100 days or more just to recognize the initial breach. Bottom line: organizations have labored for years to address Big Data’s implications, with too few effective results to show for it.

You just can’t respond to Big Data with Big Collection. Why? The problem with collecting and analyzing everything is that it requires huge amounts of computing resources, which dramatically slows these systems’ analyses. The unintended consequence is more false positives: when a system has to work through so much noise, its error rate increases.

There is simply no way to “carefully” select the relevant data, analyze it at the scale of these mountains of data, and still respond in a timely fashion. This is the case regardless of sector and application.

What’s more, even if throwing head-count at the problem were at all effective, the sheer number of people necessary to analyze these inputs would be prohibitively expensive. Without scaling, there’s no effective way to analyze so much information in real time. That means important security events and threats are routinely missed, drowned out in the sea of information generated by all the data.

As real attacks go unnoticed and analysis and response times slow, organizations end up investigating the past rather than tackling the present.

CSOs are frustrated. Their CEOs insist that they identify all possible threats to their business, mitigate them in real-time, and do so within budget constraints. It’s a seemingly impossible mission.

The New Paradigm

In truth, some security tools are well suited to certain types of attack vectors and dysfunctional when it comes to others. For example, don’t count on IDS or reputation-system capabilities from a sandbox tool (many sandbox products integrate IDS and reputation feeds that are far from best of breed in these areas). The reverse is also true: don’t trust IPS devices for all cases. Some are better at privilege escalation because their research teams have focused on those threat vectors, while others are better at DoS and buffer-overflow attack vectors.

The key is to understand and rate the capabilities of each security tool: its detection strengths, its investigation functions, and its mitigation/remediation options.
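
As a rough illustration, here is a minimal sketch in Python of what such a scorecard might look like. Every tool name, attack vector, and score below is an assumption for illustration, not a measured benchmark.

```python
from dataclasses import dataclass

# Hypothetical scorecard: rate each security tool per attack vector.
# All names and scores are illustrative assumptions, not benchmarks.

@dataclass
class ToolRating:
    tool: str
    vector: str          # e.g. "privilege_escalation", "dos"
    detection: int       # subjective 0-10 score per capability
    investigation: int
    mitigation: int

RATINGS = [
    ToolRating("ips_vendor_a", "privilege_escalation", 9, 6, 8),
    ToolRating("ips_vendor_b", "dos",                  8, 5, 9),
    ToolRating("sandbox_x",    "malware_execution",    9, 8, 4),
    ToolRating("sandbox_x",    "privilege_escalation", 3, 4, 2),  # weak outside its niche
]
```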

Once this has been ascertained, it becomes possible to add an intelligence layer that identifies each step of an attack, understands its intent, and correlates it with the optimal security response (i.e., the features within the best-rated tools). That would go a long way toward solving Big Data’s problem of plenty.
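
Continuing the sketch, one way to picture that correlation layer is a routing table that maps each identified attack stage to the response of the tool rated highest for it. The stage, tool, and action names here are, again, purely illustrative assumptions.

```python
# Hypothetical correlation layer: route each identified attack stage to the
# response of the tool rated highest for it. All stage, tool, and action
# names are assumptions for illustration.

# Best-rated tool and its response per attack stage, derived from a
# scorecard like the one sketched above.
RESPONSE_PLAN = {
    "initial_access":       ("ids_vendor_a", "isolate_host"),
    "privilege_escalation": ("ips_vendor_b", "block_session"),
    "exfiltration":         ("dlp_vendor_c", "cut_egress"),
}

def respond(stage: str) -> str:
    """Invoke the best-rated tool's response for an identified attack stage."""
    tool, action = RESPONSE_PLAN.get(stage, ("siem_fallback", "open_ticket"))
    return f"{stage}: invoke {tool}.{action}"

print(respond("privilege_escalation"))
# -> privilege_escalation: invoke ips_vendor_b.block_session
```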

The advantages of this process are enormous. It allows data to be collected from the tools best able to triumph over the most pressing threats. Rather than blindly collecting every data point, which does nothing to reduce the noise level, we gather higher-quality data.

Higher-quality data means you can correlate events effectively, mapping them to the most relevant and effective investigation and mitigation functions, all with higher accuracy and in real time.

In short, instead of hoarding data in the hopes of taming it for a company’s business and security advantage, the future will require an enhanced understanding of an enterprise’s actual capabilities and of the intent of the attacks targeting it. Only this will allow organizations to structure the best response for each stage of an attack campaign. This new paradigm will solve the real, fundamental problem that the industry, by and large, continues to ignore: Big Data at your own big risk.

Related: The Role of Big Data in Security
