This past weekend, my wife and I took in a Saturday matinée to see the film adaptation of Thomas Pynchon’s Inherent Vice.
As any film adaptation of a Pynchon book is apt to be, it came across as noisy, otherworldly, confusing, intermittently meaningful and more like some kind of aimless Ludovico technique than a movie. And that’s not a bad thing. It’s really all you can ultimately achieve when you try to make a movie out of one of his novels.
Good try, Paul Thomas Anderson. Good try.
As I sat there watching, thoughts from my day job kept nagging at me as if the movie had actually stirred up something in my subconscious.
Specifically, I kept thinking about the recent trend of so much cyber threat intelligence from so many solutions providers delivering feeds from so many similar sources. And, altogether more bothersome, how so many of those providers lately are partnering up with one another to offer access to each other’s threat intelligence feeds inside their own platforms. In my head, I kept seeing one of those photos of a person with the same photo of the same exact person dressed the same exact way hanging on the wall behind them, and inside that photo another photo… well, you get it.
Suddenly the movie started making more sense to me than the threat intelligence market. Then it hit me.
Perhaps not coincidentally, many of Pynchon’s works have something to do with information theory and its effects on our lives. In particular, he’s keen on understanding the effect of information entropy on human communications. Entropy itself is a more than a little confusing subject: it has applications across multiple domains and can be interpreted in many different ways, both conceptual and applied.
Even after many years of trying to grasp it fully, I confess that true understanding of all its facets remains elusive to me. But the core concepts make all too much sense. In fact, via its inventor, Claude E. Shannon, information theory and entropy actually form the foundation for all modern computing and networking.
Sounds tough. And it is. But it’s the application of information theory and entropy in a conceptual way that interests me here with regard to the preponderance of threat intelligence.
Information theory is actually a branch of mathematics that attempts to explain the nature of communications data as it is transmitted, stored or received, as well as the variables that affect its transfer: noise, the amount of data transmitted, the number of distinct message sources, the type and size of the channel it travels over, and its reliability and intelligibility.
Seen this way, information can essentially be treated as a mathematically defined quantity.
Related to this, entropy is a measure of the information contained in a given message. As far as communications go, it can be read as a measure of loss in transmitted messages. By evaluating the variables of a transmission, like the newness and uniqueness of the messages and their sources, how often messages repeat, and how much they’re amplified over time and distance, it can be used to estimate the likelihood of a message successfully reaching its intended receiver and, by extension, the effectiveness of the message itself, i.e. the knowledge and info it transmits.
Make sense? If you take all that in – and for the purposes of this article – let’s just say that entropy = boring, noisy, redundant old news.
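To make that a little more concrete, here’s a minimal sketch of Shannon’s entropy formula applied to a toy example. The feed contents and names here are entirely made up for illustration; the point is simply that a feed full of the same repeated indicator carries very little information, while a feed of distinct items carries much more.

```python
import math
from collections import Counter

def shannon_entropy(messages):
    """Shannon entropy, in bits per message, of a stream of messages.

    H = -sum(p * log2(p)) over the probability p of each distinct message.
    Low entropy means the stream is repetitive and redundant; high entropy
    means each message tends to be new.
    """
    counts = Counter(messages)
    total = len(messages)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Hypothetical threat feeds (invented names, for illustration only):
# one dominated by a single repeated indicator, one with all-distinct items.
noisy_feed = ["ioc-bad-ip-1.2.3.4"] * 9 + ["ioc-new-domain.example"]
diverse_feed = [f"ioc-{i}" for i in range(10)]

print(round(shannon_entropy(noisy_feed), 3))    # low:  ~0.469 bits/message
print(round(shannon_entropy(diverse_feed), 3))  # high: ~3.322 bits/message
```

In Shannon’s terms, the repetitive feed delivers a fraction of a bit of actual information per message, which is exactly the “boring, noisy, redundant old news” problem in numeric form.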
Again, you may be wondering how this relates to cyber threat intelligence specifically.
To put it in plain terms, there are so many similar pieces of threat intelligence being transmitted so often (i.e. redundantly), from so many similar sources, at such an alarming rate as to make deriving any real, worthwhile meaning by the receiver of all these threat messages, well, highly unlikely. In other words, there’s so much of the same data being sent from so many of the same sources as to render it essentially useless for the consumer.
In 1986, Professor Orrin Klapp wrote a series of essays about all this titled “Overload and Boredom: Essays on the Quality of Life in the Information Society” in which he stated:
The larger the amounts of information processed or diffused, the more likely it is that information will degrade toward meaningless variety, like noise or information overload, or sterile uniformity…The more information is repeated and duplicated, the larger the scale of diffusion, the greater the speed of processing, the more opinion leaders and gatekeepers and networks, the more filtering of messages, the more kinds of media through which information is passed, the more decoding and encoding, and so on– the more degraded information might be.
Interesting perspective (even if he already kinda drives the point home right there in the title, am I right?).
For Klapp, writing in 1986 about the rising volume of communications and media he was witnessing, the constant inundation of information leads to a degradation of meaning. In fact, for him, information had potentially become so entropic as to affect the quality of human life itself across the world.
For those of us today in the micro world of cybersecurity and the business of cyberdefense, it’s getting awfully noisy too.