Virtualized Data Center Security Part 2 - What Changes, What Stays the Same

As You Adopt Cloud and Virtualization Technologies in Your Data Center, Which Security Requirements Change, and Which Stay the Same?

In part one of the Virtualized Data Center Security series, I wrote about how important it is to ensure security solutions are integrated with orchestration systems so that security does not slow down the dynamic nature of virtualization and cloud environments. In today’s installment, I want to continue the discussion on security requirements beyond automation and orchestration - what stays the same and what changes as you move toward virtualization and cloud?

What Stays the Same?

The objective of the data center is to serve up applications quickly, reliably and securely. This has not changed, and the issues that challenge a traditional data center - namely performance, threat protection, access control and flexible networking integration - are still applicable in a virtualized data center. However, these requirements - in particular threat protection and access control - are now complicated by three factors:

New application landscape – Applications used to be characterized by the ports and protocols they used to communicate: for example, port 53 for DNS, and port 21 for FTP. This is no longer true. Many enterprise applications use multiple ports and non-standard ports, and exhibit evasive behaviors like port-hopping. Microsoft SharePoint, for example, requires ports 80, 137, 443, 1433, 32843, 138, 139, 1434 and more depending on the specific function to be enabled, and uses “web” ports even though its primary function is internal enterprise collaboration. SAP and Oracle likewise use large sets of high-numbered ports.

Clearly, traditional security solutions that rely on ports and protocols to identify and control traffic in the data center are ineffective, leaving holes that both unsanctioned application traffic and threats can traverse. They also make the security administrator’s job excruciatingly painful; imagine trying to keep up with the proliferation of applications in a virtualized environment.
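To make the gap concrete, here is a minimal sketch (not any vendor’s actual API; all rule tables and function names are invented for illustration) contrasting a port-based decision with an application-based one. Once SharePoint and an evasive application both ride port 443, the port-based rule cannot tell them apart:

```python
# Illustrative sketch, not a real firewall API: why port-based rules fail
# for modern applications. All names and rule tables are hypothetical.

# A port-based rule set: "allow web" really means "allow ports 80/443".
PORT_RULES = {80: "allow", 443: "allow", 21: "deny"}

def port_based_decision(dst_port: int) -> str:
    """Classic Layer-4 filtering: the decision sees only the port number."""
    return PORT_RULES.get(dst_port, "deny")

# An application-based rule set keys on the identified application,
# however it was detected (signatures, decoders, heuristics).
APP_RULES = {"sharepoint": "allow", "evasive-p2p": "deny"}

def app_based_decision(identified_app: str) -> str:
    """App-aware filtering: the decision sees what the traffic actually is."""
    return APP_RULES.get(identified_app, "deny")

# Both SharePoint and an evasive peer-to-peer app can ride port 443,
# so the port-based rule treats them identically:
print(port_based_decision(443))            # -> allow (for both flows)
print(app_based_decision("sharepoint"))    # -> allow
print(app_based_decision("evasive-p2p"))   # -> deny
```

The point of the sketch is the key of the lookup: a port number versus an identified application. Everything interesting in a real product lives in how that identification is done, which is well beyond a few lines of Python.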

Threat landscape – The threat landscape has also evolved from notoriety-driven attacks to targeted, sophisticated attacks driven by cybercriminals and nation-states. Threats use a complete framework to infiltrate an organization, from application-enabled vectors and exploits to modern malware. The challenge is that traditional network solutions operate in silos: firewall helpers such as IPS and anti-virus appliances bolted on behind a firewall cannot understand threats that coordinate across multiple disciplines. These threat protection solutions also apply protection on a foundation of ports and protocols, which means they miss threats that use the same evasive behaviors as applications and tunnel across non-standard ports. And when these protections are applied across all ports, they can completely crush the performance of the data center.

Distributed enterprise – The third factor that complicates security of the virtualized data center is who can now access its applications: internal employees, mobile and remote users, and business partners, all from a variety of devices and locations. The consumerization of IT and BYOD become a network problem once the user or device gets on the network; the challenge is enabling access based not only on user, application and content, but also on the device used to access the application.

Therefore, while the key security requirements for the virtualized data center stay the same, the approach to resolve them must take into consideration these factors. It requires a security solution that can identify applications regardless of ports and protocol, and that can enable the applications by user or group so the policies can be appropriately distinguished for employees, contractors and business partners. Finally, it requires the ability to address new modern threats that utilize multiple vectors, without impacting the performance of the data center.

What’s Changed?

Of course, you now also have to deal with new security challenges from the virtualized infrastructure itself, which is made up of many different components, from the hypervisor to the guest operating systems and applications. Each of these components needs to be secured to protect the virtualized environment. Protecting this server infrastructure depends largely on software hardening best practices, as well as on external security appliances that front-end the virtualized servers.

More importantly, there are additional characteristics of the virtualized environment that introduce challenges to security:

Dynamic nature of virtualization - Virtual machines can be highly dynamic, with frequent add, move and change operations. This complicates the ability to track security policies to virtual machine movement so that requirements and regulatory compliance continue to be met.
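One common way to keep policy attached to a moving workload is to key rules on stable orchestration metadata (tags) rather than on IP addresses. The sketch below is hypothetical (the `VM` class, rule tables, and tag names are invented, not any product’s schema), but it shows why an IP-keyed rule breaks on a move while a tag-keyed rule follows the workload:

```python
# Hypothetical sketch: binding policy to orchestration metadata (VM tags)
# instead of IP addresses, so rules survive VM moves. All names are invented.

from dataclasses import dataclass
from typing import FrozenSet, Optional

@dataclass
class VM:
    name: str
    ip: str                  # changes when the VM is moved or re-provisioned
    tags: FrozenSet[str]     # stable metadata from the orchestration system

# IP-keyed rule: breaks as soon as the VM lands on a new address.
IP_RULES = {"10.0.1.5": "allow-db"}

# Tag-keyed rule: follows the workload wherever it runs.
TAG_RULES = {"role:database": "allow-db"}

def lookup_by_ip(vm: VM) -> Optional[str]:
    return IP_RULES.get(vm.ip)

def lookup_by_tag(vm: VM) -> Optional[str]:
    for tag in vm.tags:
        if tag in TAG_RULES:
            return TAG_RULES[tag]
    return None

db = VM("db01", "10.0.1.5", frozenset({"role:database"}))
print(lookup_by_ip(db), lookup_by_tag(db))   # both rules match initially

# Simulate a live-migration-style move to another host/subnet:
db.ip = "10.0.9.42"
print(lookup_by_ip(db), lookup_by_tag(db))   # IP rule is now stale; tag rule still matches
```

In practice the tags would be pulled from the orchestration system’s API rather than hard-coded, which is exactly the orchestration integration discussed in part one of this series.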

Visibility of intra-host communications - Virtualized computing environments also enable direct communication between virtual machines within a server. These intra-host communications may not be visible to network-based security appliances residing outside the virtual server, and routing intra-host virtual machine traffic out to external physical appliances may not be ideal because of the performance and latency impact. Visibility into intra-host communications is also required when virtual applications of different trust levels reside within a single server. In these use cases, a virtual firewall is needed.
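The visibility problem above can be reduced to a toy model (an assumption-laden sketch, not a product feature; host names and the `visible_to` helper are invented): a flow between two VMs on the same host is switched inside the hypervisor and never reaches a physical appliance, so only a virtual firewall can see it.

```python
# Toy model: which enforcement point can even observe a given flow?
# Traffic between two VMs on the same host may never leave the virtual
# switch, so only a virtual firewall observes it. Names are hypothetical.

HOST_OF = {"web01": "esx-a", "app01": "esx-a", "db01": "esx-b"}

def visible_to(src: str, dst: str) -> str:
    if HOST_OF[src] == HOST_OF[dst]:
        # Flow is switched entirely inside the hypervisor's vSwitch.
        return "virtual-firewall-only"
    # Flow crosses the physical network and can be steered to an appliance.
    return "physical-or-virtual-firewall"

print(visible_to("web01", "app01"))  # same host: virtual-firewall-only
print(visible_to("web01", "db01"))   # crosses hosts: physical-or-virtual-firewall
```

If `web01` and `app01` carry different trust levels, this is precisely the case where relying on a perimeter appliance leaves the flow uninspected.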

The consideration of security for a virtualized data center must therefore include not only the ability to address the new application and threat landscape, but also provide the flexibility to be able to accommodate the characteristics of the new virtualization technology.

Alright, what does this all mean?

Let’s summarize the five key requirements as you embrace virtualization and cloud computing technologies in your data center. What stays the same? What changes?

1. Safe application enablement – Safe application enablement means identifying applications regardless of port or protocol, enabling them by user or group, and inspecting their content for threats. This applies to all traffic, including traffic between virtualized applications, which may require a virtualized firewall. (new!)

2. Threat protection – This requirement stays the same but the approach should change to solutions that feature a complete threat framework to accommodate the new sophisticated, modern and targeted attacks in the wild today. Oh yes, most importantly this must be done without impacting performance! (new!)

3. Flexible integration in the network – Security cannot be a burden in the network, so it must easily integrate into the networking architecture, and be able to accommodate changes in the security posture without rearchitecting. (same!)

4. Cloud readiness – To accommodate the new architecture, the security solution must keep up with virtual machine movement, and be able to integrate with orchestration systems. (new!)

5. Management – We haven’t touched on management, but of course this requirement is table stakes for any security design. The more complicated it is to manage security policies, the less visibility you have into what’s actually happening in your network. The requirement, therefore, is centralized management with a single unified policy that integrates application, user and content, and, more importantly, the same management platform for both physical and virtualized firewalls. (same!)

I hope this was helpful and I welcome your feedback in the comments. In the third and final installment of the series, I’ll address the choice between physical and virtualized firewalls. I touched on the need for virtualized firewalls in this column, but in the final installment I will define exactly which form factor is ideal depending on your environment.

Danelle is CMO at Blue Hexagon. She has more than 15 years of experience bringing new technologies to market. Prior to Blue Hexagon, Danelle was VP Marketing at SafeBreach where she built the marketing team and defined the Breach and Attack Simulation category. Previously, she led strategy and marketing at Adallom, a cloud security company acquired by Microsoft. She was also Director, Security Solutions at Palo Alto Networks, driving growth in critical IT initiatives like virtualization, network segmentation and mobility. Danelle was co-founder of a high-speed networking chipset startup, co-author of an IP Communications Book and holds 2 U.S. Patents. You can follow her at @DanelleAu.