Now that virtualized resources and interwoven cloud services are displacing ever more of the legacy IT infrastructure footprint, many businesses are realizing the convenience of on-demand computing power, quick application deployment and scalable storage. At the same time, however, these new service models, almost by definition, ensure that IT operations are no longer tied to particular, discrete pieces of hardware. This is a double-edged sword for IT departments. On the one hand, it may mitigate the risk of a standalone server or PC failing and causing data loss or downtime, but it also means that traditional approaches to perimeter security, which create essentially a single comprehensive layer around local information silos, may become less effective.
To meet rising access demand from mobile users and handle big data applications, data centers have expanded, in the process tapping into network-distributed resources. In attempting to decentralize IT operations, at least in terms of the basic hardware and power sources that they use, organizations may have instead ironically created an even more centralized target for attack and disruption: the data center, which is now home to copious amounts of critical information and a magnet for cybercrime.
Going forward, perimeter security will need to be supplemented by measures that specifically address vulnerabilities like Web applications. In other words, locking down the perimeter will not matter unless the assets inside are also individually insulated, especially since hackers may not always attack from the proverbial front gate, instead targeting weak points via tactics like email phishing. The perimeter will still require attention, but organizations should take the security overhaul opportunity to also rethink data storage practices, discarding extraneous data and focusing on storing everything else such that it attracts less attention and produces fewer repercussions if breached.
Perimeter security now about securing many locations, devices and applications
In a recent piece for Datacenter Dynamics, Nigel Stephenson likened the traditional approach to data center security to a moat encircling a castle. Under this model, administrators can tightly control application access and authentication, ensuring that even mission-critical data stores are secure. Moreover, they are safe not simply because of the access controls, but because they are not overly networked, meaning that breaching one of them, while costly, would not necessarily entail the subsequent penetration of the whole network.
However, the prospect of attacks on networked data centers has already become reality, and it requires a new security mindset attuned to Internet-based vulnerabilities found across multiple endpoints and applications. Stephenson proposed the “motel” as a new metaphor, under which organizations secure each of their virtual machines and devices individually against threats, room by room, rather than relying on a single outer wall. Underscoring the stakes, a recent Verizon study of data breaches discovered a rise in external attacks using a broad range of tactics aimed at users and remotely hosted assets, including social engineering, malware distribution, email phishing and distributed denial of service campaigns.
Over 90 percent of data breaches were attributed to external actors, primarily organized crime groups. Web applications were an especially popular vector for attacks and provided entry for opportunistic hackers, who initiated over 70 percent of attacks on businesses, easily outnumbering targeted incidents.
The Verizon authors did not discover any particular hacker propensity for attacking virtualized resources rather than internal ones. However, they did highlight the appeal of targeting remotely hosted devices, which may be less carefully configured and secured than their on-site equivalents. From 2008 to 2012, servers were the largest compromised asset class, finally losing the top spot to human users in 2013. The frequency with which hackers targeted servers and individual users hints at their ongoing attachment to one of the most effective tactics for bypassing network security, namely, email phishing.
The efficacy of email phishing in the networked enterprise
The Verizon researchers discovered that email phishing was an enormously successful way to get users to take action. While the payloads may not always be deadly, the efficacy of phishing indicates the difficulty of securing all of an enterprise’s networked users and appliances against danger.
“Running a campaign with just three emails gives the attacker a better than 50 percent chance of getting at least one click,” observed the study’s authors. “Run that campaign twice and that probability goes up to 80 percent, and sending ten phishing e-mails approaches the point where most attackers would be able to slap a ‘guaranteed’ sticker on getting a click.”
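The figures quoted above follow from treating each phishing email as an independent trial. A minimal sketch, assuming a per-email click probability of roughly 25 percent (an illustrative value; the study's actual rate is not stated here), reproduces the shape of those numbers:

```python
# Model: each phishing email is an independent Bernoulli trial.
# The probability that at least one recipient clicks is the complement
# of the probability that nobody clicks.
# p_click = 0.25 is an assumed per-email click rate for illustration.

def p_at_least_one_click(p_click: float, n_emails: int) -> float:
    """1 - P(no clicks) across n independent emails."""
    return 1 - (1 - p_click) ** n_emails

for n in (3, 6, 10):
    print(f"{n:2d} emails -> {p_at_least_one_click(0.25, n):.0%}")
# 3 emails clear 50%, 6 approach the quoted 80%, and 10 near certainty.
```

The lesson for defenders is that per-user awareness training lowers the per-email probability, but the attacker's odds still compound quickly with volume.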
Stephenson provided extra color for the dilemma that CIOs face, highlighting the growing centrality of mobile devices that offer around-the-clock email access in addition to weakly secured Web applications and possible attack channels like social networks. On a more technical level, the basic notion of highly distributed software, whether delivered via virtual machine or a public/private cloud, means that managing access can become a headache and that any device could potentially become a cybercriminal’s gateway into a heavily consolidated network.
“Lego-like applications, built on re-usable services, are increasingly common in today’s data centers,” explained Stephenson. “They accelerate development considerably but they also make it more difficult to enforce access requirements because of the fan-out hierarchy per user session and high number of TCP connections per client interaction.”
Encrypting data, tracking IP addresses and securing the network
To get a handle on virtualized networks, Stephenson recommended that enterprises “flatten the network architecture to as few layers as possible,” grant access based on IP address and apply security solutions for virtual machines and Web applications. In addition, they may need to revise their counterstrategies for DDoS attacks, which have evolved to maintain low profiles as they eat up processor resources.
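Stephenson's suggestion to grant access based on IP address amounts to checking each client against an allowlist of trusted subnets. A minimal sketch using Python's standard `ipaddress` module, with hypothetical network ranges, shows the idea:

```python
from ipaddress import ip_address, ip_network

# Hypothetical allowlist of trusted subnets (illustrative values only;
# 203.0.113.0/24 is a documentation-reserved range).
ALLOWED_NETWORKS = [
    ip_network("10.0.0.0/8"),      # internal corporate range
    ip_network("203.0.113.0/24"),  # e.g., a partner VPN egress
]

def is_allowed(client_ip: str) -> bool:
    """Return True if the client address falls inside any trusted subnet."""
    addr = ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETWORKS)

print(is_allowed("10.1.2.3"))      # inside 10.0.0.0/8 -> True
print(is_allowed("198.51.100.7"))  # outside both ranges -> False
```

In a flattened architecture this kind of check sits close to the application tier, so fewer intermediate layers need to replicate the policy.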
In a piece for FierceMarkets, writer David Weldon examined the current state of cloud security, observing that it remains a chief obstacle to the adoption of additional virtual and remotely hosted resources. For heavily regulated industries like healthcare, potential cloud projects are also saddled with specific requirements governing any virtual infrastructure.
Cloud providers can help organizations set up secure, high-performing networks, but ultimately the onus is on CIOs to realize that virtualization, cloud computing and mobility mean that ever more data is in transit and at risk. Accordingly, they may need to turn to encryption as another key component of their security strategy.
“Always assume that any data that leaves the building is at risk,” High Cloud Security CEO Bill Hackenberger told Weldon. “You need to bring your own security. Encrypt the data before it goes to the cloud. Your data is your responsibility.”
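Hackenberger's point is that the key material never leaves the organization: the cloud provider stores only ciphertext. A toy sketch of that separation, using a one-time pad built from OS randomness purely as a stand-in for a real cipher such as AES-GCM (in practice, use a vetted cryptography library, never a hand-rolled scheme):

```python
import secrets

# "Encrypt before it goes to the cloud": the key stays on-premises,
# only the ciphertext blob is uploaded. The XOR one-time pad below is
# an illustrative stand-in for a production cipher, not a recommendation.

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Return (key, ciphertext); the key must never leave the building."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

record = b"patient-id:1234"          # hypothetical sensitive record
key, blob = encrypt(record)          # blob is what the provider stores
assert blob != record                # provider sees only ciphertext
assert decrypt(key, blob) == record  # round trip with the local key
print("round trip ok")
```

The design choice being illustrated is key custody: even if the hosted storage is breached, the attacker obtains only ciphertext, because decryption requires material that never left the organization.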