
Enterprises are in the midst of developing massive amounts of new, cutting-edge software and websites optimized for mobility. To this end, most are simultaneously upgrading their IT networks, implementing virtualized and cloud-based resources that make it more feasible to distribute applications across a wide range of devices.
The shift from legacy IT infrastructure to remotely hosted computing has made it more important than ever that businesses diligently test any software or network before it goes live and becomes mission-critical. Unfortunately, too many of them neglect to seek help in the form of professional IT security health checks (ITHCs), which could shore up operational security and ensure that businesses do not lose customers or revenue to avoidable oversights.
Given the complexity of virtualized technologies and cloud-based implementations, which are often amalgams of infrastructure, software and development tools from a variety of vendors and open source projects, it is imperative that enterprises take the time to carefully vet their networks and software. The need for thorough testing is exacerbated by the growing number of Web-based attacks and outages affecting businesses across the globe.
Evolution of malicious Web traffic underscores potential risks of inadequate testing
Writing for ZDNet, Larry Seltzer analyzed Akamai’s most recent State of the Internet Report, which found growing variance in where malicious Internet traffic originates and which ports and protocols it targets. Countries that once dominated the share of distributed and Web-based attacks, such as the U.S. and China, have given way to an increasingly diversified assortment of nations. Even so, Asia accounted for nearly 80 percent of all malicious traffic in the second quarter of 2013.
In the past, cybercriminals frequently targeted the Microsoft-DS port (port 445), which is used for Microsoft and Samba SMB networking. More recently, however, they have changed tack and devoted more resources to the ports that carry HTTP, HTTPS and SSL/TLS traffic.
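For teams that want a quick baseline of which of these ports are even reachable on their own hosts, a simple check can be scripted. The following is a minimal sketch in Python (the host name is a placeholder and check_open_ports is an illustrative helper, not a real tool); it only attempts TCP connections and is no substitute for a proper scanner or a professional ITHC:

```python
import socket

# Ports discussed in Akamai's report: Microsoft-DS (SMB) and the
# Web-facing HTTP/HTTPS ports. The list is illustrative, not exhaustive.
PORTS = {445: "Microsoft-DS (SMB)", 80: "HTTP", 443: "HTTPS (SSL/TLS)"}

def check_open_ports(host, ports, timeout=2.0):
    """Return the ports on `host` that accept a TCP connection."""
    open_ports = []
    for port in ports:
        try:
            # create_connection raises OSError on refusal or timeout
            with socket.create_connection((host, port), timeout=timeout):
                open_ports.append(port)
        except OSError:
            pass  # closed, filtered or unreachable
    return open_ports

if __name__ == "__main__":
    host = "example.com"  # placeholder: only scan hosts you are authorized to test
    for port in check_open_ports(host, PORTS):
        print(f"Port {port} ({PORTS[port]}) is reachable")
```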
In light of these trends, businesses cannot become complacent after securing only a single port on the network or an isolated aspect of their product. Akamai’s research also revealed that most distributed denial-of-service attacks still go after high-profile assets such as enterprise websites and ecommerce platforms, some of which may be opportunistic targets.
Gmail outage illustrates that even remote possibilities can become reality
Regular ITHCs are essential for mitigating security risks during the rollout of any complex IT project. However, TechTarget contributor Robert Newby recently highlighted the difficulties associated with these processes, which are often beset by delays, inadequate documentation and improper budgeting of time. Lead times can stretch to weeks, even when the website, application or network needs to go live on a much shorter timetable.
Newby’s general prescriptions for improving ITHCs centered on companies taking a proactive approach and planning for testing and vulnerability scanning far in advance. Even large, mature organizations can run into issues when their seemingly airtight networks fail to account for improbable (but potentially costly) contingencies, so it is crucial that IT departments get professional help in identifying and addressing the full range of risks.
In a piece for Mashable, Samantha Murphy Kelly chronicled a recent Gmail quasi-outage that delayed message delivery. The incident was the result of a highly unlikely network failure and should serve as an example of the diligence organizations need to bring to the table to avoid falling victim to similar events.
“The message delivery delays were triggered by a dual network failure,” stated Google via its official blog. “This is a very rare event in which two separate, redundant network paths both stop working at the same time.”
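To see why Google describes a dual-path failure as "very rare," it helps to run the arithmetic. Assuming the two redundant paths fail independently, and using purely illustrative availability numbers (the real figures are not public), the probability of both being down at once is the product of the individual failure probabilities:

```python
# Illustrative numbers only -- Google has not published its per-path
# availability. If each redundant path is independently down 0.1% of
# the time, both being down simultaneously is a one-in-a-million event.
p_single = 0.001          # hypothetical failure probability of one path
p_both = p_single ** 2    # independence => probabilities multiply
print(f"P(both paths down) = {p_both:.6%}")  # 0.000100%
```

Of course, independence is exactly the assumption that correlated failures violate, which is why even "very rare" events deserve a place in test plans.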
NIST tools can make software testing easier for businesses and developers
While networks and websites are common targets for attack, the business software that they are designed to support may harbor numerous risks in its own right. In a piece for Forbes, Bruce Rogers examined the possibility that enterprises are widely vulnerable to simple attacks that exploit loopholes in “patchwork” software.
Many of the software tools for running servers and supporting development have roots in open source projects; even the security-challenged Android operating system is technically a distribution of Linux. Over the past six years, requests for open source components have risen from 100 million to 20 billion as developers seek inexpensive, flexible frameworks for building applications.
However, coordinating different components and ensuring that they work together securely has been a challenge, with the recent rollout of the U.S. government’s Affordable Care Act website being a prime example. In 2008, the National Institute of Standards and Technology released a software testing system that made it easier to identify flaws before a project went live, but many developers have not changed their practices despite having access to more data.
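The article does not name the NIST system, so the sketch below only illustrates the general principle behind systematic pre-launch testing: enumerate the combinations of components a project depends on and exercise each one before go-live. All names here (the configuration axes and run_smoke_test) are hypothetical placeholders:

```python
import itertools

# Hypothetical configuration axes for a multi-component web project.
BROWSERS = ["chrome", "firefox", "safari"]
DATABASES = ["postgres", "mysql"]
TLS_VERSIONS = ["1.0", "1.2"]

def run_smoke_test(browser, database, tls):
    """Placeholder for a real end-to-end check of one configuration,
    e.g. deploy the stack, hit a health endpoint, assert a 200."""
    return True  # a real implementation would return the actual result

# Exercise every combination and collect the ones that fail.
failures = [combo
            for combo in itertools.product(BROWSERS, DATABASES, TLS_VERSIONS)
            if not run_smoke_test(*combo)]

total = len(BROWSERS) * len(DATABASES) * len(TLS_VERSIONS)
print(f"Tested {total} combinations, {len(failures)} failed")
```

Exhaustive enumeration works for a handful of axes; as the number of components grows, combinatorial techniques that cover every pair or triple of options keep the test count manageable.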
Going forward, it will be important for the public and private sectors to work together on projects in the spirit of NIST’s tool, while creating a better framework for organizations to apply testing results to development practices.
Internet of Things will require more scrupulous IT security testing
The seemingly insatiable demand for new software development paradigms will likely continue as enterprises build out the “Internet of Things,” or ecosystems of networked appliances that can also run advanced applications.
Inevitably, these devices, which could include anything from a smartwatch to a sophisticated server cabinet, will house sensitive data and support critical networks. Accordingly, they will need the same level of security scrutiny as traditional assets.
“[The growth of the Internet of Things] is not a hard step; it is more of a gradual slope,” said Forrester Research principal analyst Andrew Rose. “Many things are connected to the Internet now, and we will see an increase in this and the advent of contextual data sharing and autonomous machine actions based on that information.”
The physicality of the Internet of Things means that security testing will also require careful attention to both physical and virtual threats. ITHCs must evolve to account for increasingly complex, multi-component IT infrastructure that utilizes a dizzying array of hardware, virtualized assets and open source software. NIST has provided a good blueprint for how to get moving on improving software testing, but the cybersecurity community must take additional steps to shore up IT projects.