Continuous monitoring is rapidly becoming the new baseline for enterprise network security. Today's IT architectures are complex assemblages of systems that process enormous amounts of data while serving as the foundation of organizational continuity and efficiency. With so many systems to keep tabs on, enterprises have moved away from discrete, checklist-driven compliance exercises and toward continuous monitoring, a risk management approach that better equips them to deal with issues such as advanced persistent threats.
Continuous monitoring basics: What it is and how it protects the network
The U.S. federal government has been one of the foremost champions of continuous monitoring, mandating its uptake for agencies via a 2010 update to the Federal Information Security Management Act, dubbed FISMA 2.0 to differentiate it from the original 2002 legislation. The reasoning behind the overhauled FISMA is that new processes are not merely useful but necessary for managing risks as cloud computing, diversified endpoint fleets and data analytics become the norm. Continuous monitoring usually entails:
- Oversight of how systems are authorized and which individuals are accessing them. Responsibility and accountability for security mechanisms may also be assigned to specific stakeholders.
- Network analytics that screen for anomalous activity. Log source data from routers and switches, as well as enterprise resource planning and other applications, is collated to create a baseline against which behavior can be evaluated.
- Real-time and historical analysis of data. Events can be assessed on their own and/or in relation to similar occurrences from the past. The real-time aspect gives IT teams the ability to make quick yet informed decisions about how to address problems. To this end, automation of processes and systems may help.
- Broad integration and coordination between information systems and the organization at large, so as to encourage ideal execution of cybersecurity strategy.
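The baselining described above can be reduced to a simple statistical test. The sketch below is illustrative only, not drawn from any particular monitoring product: it summarizes historical log volume as a mean and standard deviation, then flags observations that deviate beyond a z-score threshold. Function names and the threshold value are assumptions for the example.

```python
from statistics import mean, stdev

def build_baseline(hourly_event_counts):
    """Summarize historical log volume (e.g., events per hour collated
    from routers, switches and ERP applications) as (mean, stdev)."""
    return mean(hourly_event_counts), stdev(hourly_event_counts)

def is_anomalous(observed, baseline, threshold=3.0):
    """Flag an observation that deviates from the baseline by more than
    `threshold` standard deviations (a basic z-score test)."""
    mu, sigma = baseline
    if sigma == 0:
        return observed != mu
    return abs(observed - mu) / sigma > threshold

# Example: steady traffic followed by a sudden spike
history = [120, 130, 125, 118, 122, 128, 131, 124]
baseline = build_baseline(history)
print(is_anomalous(127, baseline))  # False - within normal variation
print(is_anomalous(400, baseline))  # True - far outside the baseline
```

Real deployments layer many such signals (per-user, per-application, per-segment) and feed them into the real-time and historical analysis described above, but the core idea is the same: evaluate current behavior against a learned norm.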
The National Institute of Standards and Technology has positioned continuous monitoring as a broad-based approach, predicated on technology in addition to processes and individual activity. Recently, the government's commitment to this strategy has begun bearing fruit.
The Department of Homeland Security, for example, confirmed the implementation of its Einstein 3 software, which covers areas such as network flow monitoring, intrusion detection and automated remediation. The Marine Corps has also made progress in setting up mechanisms to safeguard its complex IT architectures, which encompass many systems that connect remotely to its main service network.
Why continuous monitoring is needed to shield enterprises from APTs, DDoS and other threats
While continuous monitoring is most commonly discussed in the context of the public sector, it has consequences for businesses, too. As demonstrated in the Trend Micro report "Continuous Monitoring in a Virtual Environment," traditional cybersecurity tools such as antivirus software may no longer be sufficient to beat back the rising tide of APTs, distributed denial-of-service attacks, ransomware such as CryptoLocker and modularized malware such as FLAME and the BlackHole exploit kit.
At least one major cybersecurity vendor has already declared antivirus "dead," citing its waning ability to thwart intrusions that result in data breaches. Certainly, the current threat landscape is strewn with problems that demand comprehensive monitoring:
- Social engineering is on the rise, buoyed by media such as Facebook, Twitter and LinkedIn. A group of Iranian operatives recently drew U.S. officials into a years-long scheme designed to build trust and ultimately siphon sensitive information.
- DDoS tactics have evolved. Attacks now exploit novel surfaces such as the legacy Network Time Protocol. At the same time, they're growing in intensity. June elections in Hong Kong were marred by a 300 Gbps DDoS labeled one of the most sophisticated ever.
- APTs put data at risk by evading detection for months, if not years. At least 75 U.S. airports were targeted by one such campaign in 2013. The nonprofit Center for Internet Security, which discovered the issue, documented twice as many security incidents in 2013 as in 2012, with many of them involving ransomware, vulnerabilities in Web servers and APTs.
Moreover, the rise of the cloud and the consumerization of IT have given immense power, in terms of both raw technical capability and access to valuable data, to an increasingly broad range of actors. Staging a sophisticated attack can be as easy as having an Internet connection and basic knowledge of how to operate a server or craft a convincing online persona. Such was the case in an early 2014 DDoS attack on CloudFlare that clocked in at a then-record-breaking 400 Gbps of peak bandwidth.
"Remarkably, it is possible that the attacker used only a single server running on a network that allowed source IP address spoofing to initiate the requests," stated CloudFlare CEO Matthew Prince at the time.
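The arithmetic behind such reflection attacks helps explain why a single spoofing server can be so dangerous. The sketch below is a back-of-the-envelope estimate, assuming the roughly 556x amplification factor that US-CERT cited for the NTP `monlist` command abused in early-2014 attacks; the function name and example bandwidth figure are illustrative assumptions.

```python
def reflected_bandwidth_gbps(attacker_gbps, amplification_factor):
    """Estimate the traffic a reflection attack delivers to the victim:
    small requests with a spoofed source IP elicit responses roughly
    `amplification_factor` times larger, all aimed at the forged address."""
    return attacker_gbps * amplification_factor

# Amplification factor cited by US-CERT for NTP's monlist command,
# which returns up to 600 recent client addresses per tiny query.
NTP_MONLIST_FACTOR = 556

# A single server with under 1 Gbps of upstream capacity could, in
# principle, reflect roughly 400 Gbps at a victim.
print(reflected_bandwidth_gbps(0.72, NTP_MONLIST_FACTOR))  # about 400
```

This asymmetry is why continuous monitoring of outbound traffic matters as much as inbound: networks that permit source IP spoofing become unwitting launch pads.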
Enterprises still have a ways to go in modernizing their monitoring
Shoring up defenses to account for new threats may seem difficult, especially given the size and complexity of many enterprise IT systems. A recent ESG survey found that firms were lacking in key areas that could leave them open to cyberattack. Asked to identify the weakest links in their security chains, the study's 257 respondents cited:
- Monitoring user activity (28 percent)
- Endpoint monitoring (25 percent)
- Threat intelligence (24 percent)
- Data access monitoring (23 percent)
- Network traffic monitoring and analytics (22 percent)
Taken together, these data points make a strong case for continuous monitoring implementation, even for organizations outside the public sector. Keeping assets safe requires more than just installing antivirus software to shield machines from malware. IT departments must account for what users are doing, which applications are in use and how the network is functioning overall.
Moreover, a sensible continuous monitoring strategy also helps enterprises deal with a different but related problem: having too much data to make sense of. A 2012 MeriTalk survey of 151 federal CIOs and IT managers found that virtually all of them expected their stored data volumes to keep growing over the next few years. With continuous monitoring, that data can be intelligently utilized to thwart APTs and other modern threats.