TrendLabs Security Intelligence Blog

November 20th, 2013




Throughout 2013, there have been numerous revelations about how the NSA conducts mass surveillance on the Internet. These have sent the Internet engineering community reeling: protocols that have been in use for decades, built heavily on intrinsic trust, have had that trust violated.

This has caused the Internet standards community to take a hard look at the need for encryption. Specifically, it has been discussed whether HTTP/2.0 – the upcoming version of the protocol that powers much of the Internet – should be encrypted by default. Overall, this is a positive trend, but there are some challenges that should be considered.

First, encryption without pre-existing trust adds little value. It can prevent casual eavesdropping, but it is ineffective against a sophisticated operator. Consider, for example, self-signed certificates (often found in small or local web applications). An attacker could easily impersonate the server with a key and certificate of their own creation, then proxy your traffic to the real web server, giving them the ability to read, modify, and inject traffic within your session.
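To make this concrete, here is a minimal sketch (using Python's standard ssl module; the host name is just a placeholder) of the difference between an encrypted connection that verifies the server's certificate and one that accepts any certificate. Both encrypt traffic on the wire, but only the first can tell a man-in-the-middle apart from the real server.

import socket
import ssl

host = "example.com"  # placeholder host

# Verified: the server certificate is checked against trusted CAs and the
# host name, so an impostor presenting a self-signed certificate is
# rejected during the handshake.
verified = ssl.create_default_context()

# Unverified: traffic is still encrypted, but any certificate is accepted --
# including one minted on the fly by a man-in-the-middle that quietly
# proxies the session to the real server.
unverified = ssl.create_default_context()
unverified.check_hostname = False
unverified.verify_mode = ssl.CERT_NONE

for label, ctx in (("verified", verified), ("unverified", unverified)):
    with socket.create_connection((host, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            print(label, tls.version(), tls.getpeercert() or "peer certificate not validated")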

Second, certificate authorities are not always reliable or secure, either. CAs such as Comodo, DigiNotar, GlobalSign, and StartCom have all suffered security incidents of some kind. One can argue (at great length) whether untrusted CAs or no encryption at all is “better”. DANE (specified in RFC 6698) allows service operators to publish keys and certificates within DNSSEC, which means that certificates can be verified without a CA being involved. Challenges such as typo-squatted domains and compromised DNS infrastructure remain, but it is technically possible to establish publicly trusted encryption without the involvement of a CA. Whether it will see wide use is unclear.
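As a rough sketch of how DANE works in practice, the fragment below derives the data for a “3 1 1” TLSA record (DANE-EE usage, SubjectPublicKeyInfo selector, SHA-256 matching) from a server certificate. It assumes the third-party cryptography package, and the file name and domain are placeholders; the resulting record would be published in a DNSSEC-signed zone for clients to check against the key they see in the TLS handshake.

import hashlib
from cryptography import x509
from cryptography.hazmat.primitives import serialization

# Load the server's certificate (placeholder file name).
with open("server.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

# Selector 1: hash the SubjectPublicKeyInfo rather than the whole certificate,
# so the record pins the server's own public key and no CA is required.
spki = cert.public_key().public_bytes(
    serialization.Encoding.DER,
    serialization.PublicFormat.SubjectPublicKeyInfo,
)
digest = hashlib.sha256(spki).hexdigest()

# Usage 3 (DANE-EE), selector 1, matching type 1 (SHA-256).
print(f"_443._tcp.www.example.com. IN TLSA 3 1 1 {digest}")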

Third, what percentage of traffic needs to be encrypted? In the past, encryption was used sparingly because of its cost in processing power and other resources; banking-related pages and transactions were the most common cases where it was used. However, improvements by CDNs and gains in processing power have reduced the relative costs to the point where it is feasible to encrypt all traffic, and many sites are doing just that today.
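For a site operator, “encrypting all traffic” usually amounts to serving everything over HTTPS and bouncing any plain-HTTP request to its secure equivalent. The toy WSGI middleware below sketches that idea; the fallback host name is a placeholder, and the snippet glosses over real-world details.

# Toy sketch: redirect every plain-HTTP request to its HTTPS equivalent.
# Plain WSGI, no framework assumed; the fallback host is a placeholder.
def https_everywhere(app):
    def middleware(environ, start_response):
        if environ.get("wsgi.url_scheme") != "https":
            host = environ.get("HTTP_HOST", "www.example.com")
            path = environ.get("PATH_INFO", "/")
            start_response("301 Moved Permanently",
                           [("Location", "https://" + host + path)])
            return [b""]
        return app(environ, start_response)
    return middleware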

Finally, we have to look at the encryption primitives themselves. The security of some of these critical building blocks has been called into question. Some worry that these algorithms have been weakened in such a way that government agencies can decrypt otherwise secure traffic. For example, it has been alleged that the NSA compromised the Dual_EC_DRBG random number generator (RNG) by specifying insecure constants. While this is by no means a master key, a cryptanalyst with any insight into the next number likely to come out of an RNG would have an enormous head start.
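The details of Dual_EC_DRBG are beyond the scope of this post, but the toy example below (which uses an ordinary seeded PRNG, not Dual_EC_DRBG) illustrates why any insight into a generator's internal state is so valuable: an attacker who shares that insight can reproduce every “secret” value the generator emits.

import random

SEED = 1234  # stand-in for whatever part of the RNG state an attacker has insight into

server = random.Random(SEED)    # the server draws "secret" session tokens
attacker = random.Random(SEED)  # the attacker runs the same generator

for _ in range(3):
    token = server.getrandbits(128)
    guess = attacker.getrandbits(128)
    print(hex(token), "predicted:", guess == token)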

Of course, some would say that wide-scale HTTP encryption is not necessary: “If I haven’t done anything wrong, I have nothing to hide.” Such arguments fall rather flat in the face of bulk, large-scale data collection by governments. Could simply making the same set of search-engine queries as a terrorist put you on a watch list?

This may seem like hyperbole, but in the world of big data, small similarities often trigger associations that may or may not exist in reality. Did you sell your couch to the brother-in-law of a terrorist three years ago on Craigslist? Connections that we would consider insignificant in person can take on new meaning as part of data correlation.

We have come a long way from the early days of the World Wide Web, when everything was in plain text and images were a novelty. Now that so much of our lives exists online, it is increasingly important to have trustworthy infrastructure behind the services we use. Changes in the threat landscape mean that our infrastructure has to change too. HTTP/2.0 won’t solve all of the problems facing the Internet, but it is a step in the right direction.

     