Years ago on the wall of an IT department, I spotted an innocent looking red box, labeled “Break In Case of Emergency”. Inside was a hefty wire-cutter – the ultimate firewall. The humor of this won’t be lost on security-minded folks. The job of a firewall is to block all traffic by default, allowing only certain traffic by exception. Considering only security and ignoring business requirements, the best firewall is one that allows no traffic to pass through. That image stuck with me, and I find myself contemplating what it represents when applied to public cloud adoption.
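The default-deny idea can be sketched in a few lines. This is a hypothetical allowlist filter for illustration only, not any real firewall's API:

```python
# Minimal sketch of a default-deny policy: all traffic is blocked
# unless a rule explicitly allows it. The rule set is invented.
ALLOWED = {("tcp", 443), ("tcp", 22)}  # the exceptions

def permit(protocol: str, port: int) -> bool:
    """Return True only for traffic matching an explicit allow rule."""
    return (protocol, port) in ALLOWED

print(permit("tcp", 443))  # allowed by exception -> True
print(permit("udp", 53))   # denied by default -> False
```

The wire-cutter, in these terms, is simply an empty allow set.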
From a security perspective, a wire cutter is the best solution if avoiding any false negative is the only requirement. Conversely, if ensuring there are no false positives is the goal, a perfect score is as simple as removing all security. In reality, security doesn’t operate in a perfect theoretical vacuum like a classical physics problem; it is closer to calculus with prognostic variables. Call it a spectrum that asks us to solve a maxima/minima problem, weighing, roughly, the cost of having security against the cost of not having it.
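That maxima/minima framing can be made concrete with a toy model. The cost curves and constants below are invented for illustration; the only point is that total cost has a minimum somewhere between the two extremes:

```python
# Toy model of the security spectrum as a minimization problem:
# total cost = cost of controls + expected cost of incidents.
# The curve shape and the breach_cost constant are assumptions.

def total_cost(spend: float, breach_cost: float = 100.0) -> float:
    """More spend raises control cost linearly but cuts expected
    incident cost, modeled here as breach_cost / (1 + spend)."""
    return spend + breach_cost / (1.0 + spend)

# Scan candidate spend levels and pick the one with minimum total cost.
best = min(range(0, 51), key=total_cost)
print(best, round(total_cost(best), 1))  # -> 9 19.0
```

Spending nothing (the no-security extreme) and spending everything (the wire-cutter extreme) both lose to a point in between.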
Applying that spectrum to public cloud adoption is an interesting thought experiment. From what I have observed, avoiding cloud isn’t exactly reaching for the wire-cutters, but it certainly removes a highly useful Swiss army knife from the toolbox. Declaring “no public cloud, ever” disposes of the most versatile tool you own and reaches for the wire-cutters instead.
Not every organization will be able to take advantage of public clouds in the short to medium term, and some never will. There are wire-cutter-worthy systems out there, but they are few and far between. If you’re certain your organization isn’t using public cloud at all, I’ll repeat a common warning: you can only be sure after checking the expense reports. The vast majority of organizations aren’t at either extreme of the spectrum, so the real question is which workloads go on the cloud.
Not every workload is appropriate for public cloud. For enterprises, regulatory compliance poses a problem. Regulations and compliance mechanisms (imposed externally or internally) often don’t keep up; they are not built to deal with the bleeding, leading, or even dull edge of innovation. Still, most enterprise datacenters do have prime candidates for public cloud: workloads that can be migrated with varying levels of care and effort. Other workloads are not likely to leave the private datacenter model any time soon.
For the vast majority of organizations, there is a spectrum of workloads. On one end are the wire-cutter systems that will never touch public cloud; on the other are the security-free, libertine workloads. Most organizations will therefore run a hybrid of public cloud and private datacenter computing.
A content streaming enterprise is a good example of a hybrid candidate today. The peaks and valleys of streaming demand can be absorbed by engaging public cloud providers, while the systems that process customer payments stay in-house to make Payment Card Industry (PCI) audits more predictable. Another example is a software-as-a-service start-up that has one thousand end-users today but may need to provision twenty-five thousand additional users tomorrow. Of course, most organizations don’t have needs that are quite as clear-cut.
What, then, fuels the wire cutter crowd? I’ve heard two common objections: control and security.
Control, when it comes to a service, is best called “insight”. A lack of control rears its head when a public cloud provider experiences an outage. No wire-cutter proponent will claim that private datacenters are impervious to outages; the difference is that, in-house, teams can follow the troubleshooting during an incident and get a detailed root-cause analysis afterward, which keeps trust in their own hands. Providers still have ground to cover in giving customers real insight into outages.
On the security front, public cloud providers face similar issues. As with virtualization vendors, efforts on the security front have been passive at best. Endpoint security vendors, in particular, have generally done a poor job of adapting to virtualization, and a nearly non-existent job of working with public cloud providers. While moving perimeter security from a physical to a virtual appliance is relatively simple, adapting endpoint security to a security-as-a-service model (that is, usage-based and on-demand) has been rare.
This being IT, the spectrum shifts constantly, but in this case the direction is predictable. Organizations will move workloads to public cloud, and faster than one could have expected five years ago. It was just ten years ago that ESX was barely one year old; Amazon EC2 launched five years later. Security vendors are also building innovation, or buying it (when there is someone to buy in this quickly moving market). Amazon has taken some early steps to collect and promote layers of computing that run on AWS via their marketplace, including security.
Overall, declaring “no public cloud, period” echoes “no virtualization, ever” or “paper only and always – computers are just for arithmetic”. As public cloud vendors gain more enterprise experience and security vendors create public-cloud-specific offerings, the natural evolution of virtualization, cloud, and public cloud continues.
As security professionals, we would find life much simpler with the liberal application of wire cutters, though operations folks would not be very pleased with us. As the organizations we work with turn to cloud, public and private, we provide our greatest value by innovating for today’s virtualization spectrum and anticipating what’s on the horizon.