Organizations Need Robust Security Controls to Ensure Data Remains Separated
NEW YORK – GIGAOM STRUCTURE DATA CONFERENCE – When it comes to Big Data initiatives, the goal is to derive as much insight from a broad array of data as possible. However, organizations face a number of security challenges when integrating various data sources, a speaker told attendees at GigaOm’s Structure Data conference on Wednesday.
One such difficulty is the fact that compliance rules and other regulations dictate just what organizations can do with data.
“There are literally laws on the book that say: Thou shalt not integrate two data sources,” said Peter Guerra, a principal at Booz Allen Hamilton.
There is no silver bullet when it comes to designing Big Data architecture, Guerra said. Organizations in regulated industries have to think hard about how to bring data together while keeping it separate, Guerra added.
There are two ways to discuss security in the context of Big Data. The first is the more obvious: securing the data that has been collected to ensure it doesn’t fall into malicious hands. The second aspect, however, is just as compelling, as it focuses on using the data to improve security. Consolidating all the data sources makes it possible to “ask horizontal questions” organizations couldn’t ask before, Guerra said.
Guerra described how a commercial/financial client who had experienced a data breach used Big Data to track down how the attacker compromised the system, when, and why. The organization eventually found the answer inside help desk call logs and tickets as the attacker had called in pretending to need a password reset.
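The article doesn’t describe how the organization searched its help desk records, but the general technique is straightforward log correlation: filter ticket data for password-reset requests filed shortly before the known compromise. A minimal sketch, in which the field names (`opened_at`, `category`, `account`) and the 48-hour window are illustrative assumptions, not details from the incident:

```python
from datetime import datetime, timedelta

def suspicious_resets(tickets, compromise_time, window_hours=48):
    """Return password-reset tickets opened shortly before a known compromise.

    tickets: list of dicts with 'opened_at' (datetime), 'category', 'account'.
    The field names and window size are assumptions for this sketch.
    """
    start = compromise_time - timedelta(hours=window_hours)
    return [
        t for t in tickets
        if t["category"] == "password_reset"
        and start <= t["opened_at"] <= compromise_time
    ]

# Hypothetical usage: one reset request lands inside the window.
t0 = datetime(2013, 3, 20, 12, 0)
tickets = [
    {"opened_at": t0 - timedelta(hours=5), "category": "password_reset", "account": "jdoe"},
    {"opened_at": t0 - timedelta(days=30), "category": "password_reset", "account": "mlee"},
    {"opened_at": t0 - timedelta(hours=2), "category": "printer_jam", "account": "kray"},
]
hits = suspicious_resets(tickets, t0)
```

In a real investigation the same filter would run over millions of records in a Big Data platform rather than an in-memory list, but the logic is the same.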
The compliance rules mean organizations need a robust security program to ensure the data remains separated. When data lived in IT silos, it was actually easier to keep each source of data and each system separate, as compliance requires. Data was also protected from insider threats, as no single person had access to all the information.
One way to design security in Big Data architectures is to build in controls right from the start so that no one person has unfettered access to all the data, Guerra suggested. Data-tagging to track each piece of data, combined with granular rules on who can access which kind of data under which conditions, makes it possible to always know who touched data and when.
“It gives them the ability to be almost like the GPS for data,” he said.
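Guerra doesn’t name a specific product, but the pattern he describes — tag every record, check tags against per-role rules on each access, and log every attempt — can be sketched in a few lines. Everything below (class names, roles, tags) is a hypothetical illustration of the pattern, not any particular vendor’s API:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Record:
    record_id: str
    payload: dict
    tags: frozenset  # e.g. frozenset({"pii"}) or frozenset({"finance"})

class TaggedDataStore:
    """Toy store: every record carries tags, every read is checked and logged."""

    def __init__(self, role_permissions):
        # Maps a role to the set of tags that role is allowed to read.
        self.role_permissions = role_permissions
        self.records = {}
        self.audit_log = []  # the "GPS for data": who touched what, and when

    def put(self, record):
        self.records[record.record_id] = record

    def get(self, user, role, record_id):
        record = self.records[record_id]
        # Access is allowed only if the role covers every tag on the record.
        allowed = record.tags <= self.role_permissions.get(role, set())
        self.audit_log.append({
            "user": user, "role": role, "record": record_id,
            "allowed": allowed,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        if not allowed:
            raise PermissionError(f"{user} ({role}) may not read {set(record.tags)}")
        return record.payload

# Hypothetical usage: an analyst may read finance data but not PII.
store = TaggedDataStore({"analyst": {"finance"}, "admin": {"finance", "pii"}})
store.put(Record("r1", {"amount": 100}, frozenset({"finance"})))
store.put(Record("r2", {"ssn": "redacted"}, frozenset({"pii"})))
```

Because every read, allowed or denied, lands in the audit log, no single person can quietly roam the whole repository — the insider-threat protection that silos used to provide by accident is rebuilt deliberately.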
The possibility that having all the data in a centralized repository may make it easier for an attacker to steal the information is a concern only if the Big Data architecture was not designed “in the right way,” Guerra said.
Big Data doesn’t necessarily mean organizations have to adopt public cloud infrastructure, even though major cloud providers have done a good job of addressing compliance concerns, Guerra said.
For many organizations, it’s not a question of regulations, but just their comfort level with having data outside their boundaries.
“There is a creepiness factor” about exposing that data, Guerra said.