TrendLabs Security Intelligence Blog
    Author Archive - Vic Hargrave (Solution Project Lead)




    This is a continuation of our previous post on Hadoop security.

    As we mentioned in our earlier post, we can use OSSEC to monitor the file integrity of existing Hadoop and HBase systems. OSSEC also creates logs that a system administrator can use to check for various system events.

    It’s worth noting that big data systems of all kinds, not just Hadoop and HBase, produce significant amounts of log data.  Installing a big data cluster is non-trivial, to say the least, and these logs play a crucial role in helping IT staff set up clusters and diagnose system problems. Big data system administrators are, in effect, already used to checking log files for potential problems.

    Some of the important Hadoop security events that OSSEC can monitor are:

    • Failed HDFS operations
    • HBase logins
    • Kerberos ticket granting
    • Root logins to nodes

    Configuring an OSSEC agent to monitor one or more Hadoop log files involves adding the paths of the log file directories to the agent's ossec.conf file. For an HDFS namenode we want to monitor the hadoop-hdfs-namenode-{host}.log file, where {host} is the name or IP address of the namenode. This file is normally located in the /var/log/hadoop-hdfs/ directory. Similarly, for an HMaster node we are interested in monitoring the hbase-hbase-master-{host}.log file in the /var/log/hbase directory. This gets our Hadoop and HBase log files from the OSSEC agents to the server.
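    As an illustration, the agent-side additions to ossec.conf might look like the sketch below. The {host} placeholder stands for the node name or IP address as above, and the syslog log format simply tells OSSEC to treat the file as one event per line, which suits these log4j-style logs.

        <ossec_config>
          <!-- HDFS namenode log on the namenode host -->
          <localfile>
            <log_format>syslog</log_format>
            <location>/var/log/hadoop-hdfs/hadoop-hdfs-namenode-{host}.log</location>
          </localfile>

          <!-- HBase HMaster log on the HMaster host -->
          <localfile>
            <log_format>syslog</log_format>
            <location>/var/log/hbase/hbase-hbase-master-{host}.log</location>
          </localfile>
        </ossec_config>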

    The next step is to write decoders to parse the logs and rules to generate alerts based on their content. Decoders consist of regular expressions that the OSSEC server uses to find log lines of interest and map words to standard fields recognized by the server. Rules let the server examine the decoded fields for content that is indicative of important security events. When decoded event data matches a rule, the server generates the alert defined by that rule.
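    Our actual decoders and rules are not reproduced here, but a rough sketch for the "failed HDFS operations" case might look like the following, assuming the namenode audit log's key=value format (allowed=..., ugi=..., ip=..., cmd=...); field separators can vary by Hadoop version. The decoder would go in the server's local_decoder.xml and the rule in local_rules.xml, and the rule IDs and alert level here are arbitrary placeholders.

        <!-- local_decoder.xml: pick out HDFS audit lines and map their fields -->
        <decoder name="hdfs-audit">
          <prematch>FSNamesystem.audit: </prematch>
        </decoder>

        <decoder name="hdfs-audit-fields">
          <parent>hdfs-audit</parent>
          <regex>allowed=(\w+)\s+ugi=(\S+)\s+ip=/(\S+)\s+cmd=(\w+)</regex>
          <order>status, user, srcip, action</order>
        </decoder>

        <!-- local_rules.xml: alert when an HDFS operation is denied -->
        <group name="hadoop,hdfs,">
          <rule id="100100" level="0">
            <decoded_as>hdfs-audit</decoded_as>
            <description>HDFS audit event.</description>
          </rule>
          <rule id="100101" level="7">
            <if_sid>100100</if_sid>
            <match>allowed=false</match>
            <description>Failed HDFS operation.</description>
          </rule>
        </group>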

    Visualizing Hadoop Security Events

    The simplest way to visualize OSSEC security alerts is to continuously display the alerts log file. Although this works after a fashion, it's like looking at raw data in a spreadsheet: it is difficult, if not impossible, to spot trends in the data.
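    With a default installation, this amounts to tailing the server's alerts log:

        tail -f /var/ossec/logs/alerts/alerts.log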

    OSSEC comes equipped to send alert data via syslog to any SIEM (security information and event management) tool that provides syslog compatibility.  One SIEM that we like to use is Splunk, together with an open source application called Splunk for OSSEC. This can be installed on the OSSEC server directly from the Splunk application console.
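    On the OSSEC server, forwarding alerts over syslog is a matter of declaring a syslog output and enabling the client-syslog process. In this sketch, the address and port are placeholders for wherever your Splunk instance is listening:

        <ossec_config>
          <syslog_output>
            <server>192.168.1.10</server>
            <port>514</port>
          </syslog_output>
        </ossec_config>

    followed by:

        /var/ossec/bin/ossec-control enable client-syslog
        /var/ossec/bin/ossec-control restart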

    Splunk for OSSEC is designed to take OSSEC alerts and produce summaries, as well as perform trend analysis. An example of an OSSEC dashboard on Splunk is shown below. Here you see summaries of events over time, including the HBase and HDFS events discussed earlier.

    Figure 2. Splunk for OSSEC

    (Image originally from http://vichargrave.com/securing-hadoop-with-ossec/)

    Summary

    Big data systems can benefit from the host intrusion detection services provided by a HIDS like OSSEC. This kind of monitoring helps ensure the safety and security of big data deployments, which is essential for organizations adopting big data. We are contributing the OSSEC rules for Hadoop back to the OSSEC Project to promote their use in the OSSEC and Hadoop communities, in line with our previous support for open source projects.

     



    Over the years, the Hadoop development community has steadily added facilities to Hadoop and HBase that improve operational security. These features include Kerberos user authentication, encrypted data transfer between nodes in a cluster, and HDFS file encryption. Trend Micro has contributed several security features that were incorporated into the public Hadoop ecosystem (see our previous post Securing Big Data and Hadoop for details).

    Although these security facilities are important, they are primarily focused on protecting Hadoop data. They do not give IT staff visibility into security events inside their Hadoop clusters. That’s where a good host intrusion detection system comes into the picture.  We have been working on enhancing big data security by applying OSSEC, our open source host intrusion detection system (HIDS), to add security monitoring to Hadoop and HBase systems. In this post, we’ll go over the capabilities of OSSEC.

    OSSEC Overview

    OSSEC provides several important security capabilities, including file integrity checking, system log analysis, and alert generation. OSSEC has an agent/server architecture: agents monitor logs, files, and (on Windows systems) registries, then send the relevant data in encrypted form to the server over UDP. Intrusions on agent systems are usually detectable through file changes or logged security events.

    Figure 1. Securing Hadoop with OSSEC
    (Image originally from http://vichargrave.com/securing-hadoop-with-ossec/)

    On the server, the logs are parsed with decoders and interpreted with rules that generate security alerts. OSSEC comes out of the box with a large number of decoders and rules that support a wide range of systems and events. OSSEC’s coverage can also be expanded by custom log decoders and security alert rules.

    Hadoop File Integrity Checking

    Hadoop and HBase systems rely on numerous configuration files and Java files to work properly. Unauthorized changes to any of these files can adversely affect a cluster. This is particularly true of the HDFS namenodes in a Hadoop system and the HMaster nodes in an HBase system: the former control HDFS operations, while the latter coordinate the region servers that handle HBase I/O.

    OSSEC can detect changes to these important Hadoop files. When an OSSEC agent is started, it recursively scans user-specified directories, calculating MD5 and SHA1 hash values for each file it encounters. The file names and hashes are stored in a database on the OSSEC server. The agent repeats this operation at user-specified intervals (usually every few hours). When the server receives a hash value for a given file that differs from the hash previously stored, it generates a security alert. The OSSEC server records each security alert in its own alerts log file.

    Normally, the configuration files for Hadoop and HBase systems are located in the /etc directory while the Java files are located in /usr/bin and /usr/sbin. Out of the box, OSSEC is designed to do file integrity checking on all files in these directories.  However, if these files are stored in other directories it is a simple matter to check these directories as well by modifying the agent configuration file, ossec.conf.
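    For example, pointing the agent's syscheck section at a non-default directory might look like the sketch below; the paths and the six-hour frequency are illustrative placeholders, not values from our deployment:

        <ossec_config>
          <syscheck>
            <!-- Rescan every 21600 seconds (six hours) -->
            <frequency>21600</frequency>
            <!-- Hypothetical non-default locations for Hadoop/HBase files -->
            <directories check_all="yes">/opt/hadoop/conf,/opt/hbase/conf</directories>
          </syscheck>
        </ossec_config>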

    In the second part of this entry, we will discuss how these tools can be used to quickly detect and graphically show potential intrusion into Hadoop/HBase systems.

     


     
