Follow up on the NSA decryption revelations
Last week, I wrote a post about allegations that the NSA has the capability to defeat various Internet encryption technologies, and about the potential ramifications of that. You can find that post here.
Over the weekend, I found a blog post from Professor Matthew Green, a cryptographer and research professor at Johns Hopkins University. The post, On the NSA, is a very good and more detailed description of the issues raised by Edward Snowden's allegations.
Here are my thoughts on some of the issues he’s raised:
- He has not seen the actual leaked documents (nor have I). Sadly, without the actual documents, many of the allegations are just that: allegations. Given their seriousness, the documents need to be made available to experts who can determine exactly what capabilities the NSA (and its British sister agency, GCHQ) possess. Normally, I'd be against exposing such capabilities, but without that transparency the worst is assumed, and that is severely damaging the US's international credibility.
- On the topic of credibility, the National Institute of Standards and Technology, or NIST, plays a crucial role in the development and standardization of encryption standards (as well as a large number of other standards). This US government organization's reputation has been largely untarnished, and it must remain so to ensure that the standards it endorses are trusted. Without trust, these standard protocols, APIs, and algorithms will not be widely deployed; organizations will instead rely on a host of privately developed, closed solutions that will not be properly vetted by the appropriate community (e.g., the cryptography community for encryption algorithms) and will not work well together. These allegations have the potential to undermine NIST's reputation. Again, we now need transparency.
- He confirms my suspicion that the only practical method to decrypt large numbers of encrypted streams is via a backdoor: you must have the keys. Brute-forcing AES-256 (one of the ciphers commonly used to protect SSL traffic) can only be done against a static (not real-time) encrypted dataset, and even then it would take an astronomically long time.
- He mentions the two most widely deployed encryption packages used for securing web SSL traffic: Microsoft's Crypto API (which is part of their IIS server) and the open-source OpenSSL. OpenSSL is by far the most dominant, while the Microsoft package has about 20% market share. If one were going to install some form of backdoor, these two would give the most bang for the buck. The Microsoft solution would be the better target because, as a closed system, it would be difficult for outside experts to find the backdoor. However, it would require help from the inside, with or without the knowledge of Microsoft management. OpenSSL would be more difficult to subvert due to its openness, but Green notes that the code base is complex and gnarly (my word), and there is likely not one person who understands it completely.
- Regarding smartphone 4G technology, Green notes that there is a single key database at each vendor. I'm not sure how big a concern this is without understanding some other details, such as when keys get generated for a given phone: at initial startup, per session, or per call? The implication here is that the vendors might be providing these databases to the NSA. Keep in mind that telecommunications vendors have historically allowed the NSA access to telecommunication trunks for snooping; providing the keys would amp that cooperation up significantly.
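
To put the brute-force point above in perspective, here's a back-of-the-envelope sketch of the AES-256 keyspace arithmetic. The trial rate is a deliberately generous made-up assumption, not a real benchmark:

```python
# Rough sketch of why brute-forcing AES-256 is infeasible.
# The trial rate below is an assumed, very generous figure for a
# hypothetical massive cluster, chosen only for illustration.
keyspace = 2 ** 256              # total possible AES-256 keys (~1.2e77)
trials_per_second = 10 ** 18     # assumed aggregate key-guessing rate
seconds_per_year = 60 * 60 * 24 * 365

# On average you find the key after searching half the keyspace.
years = (keyspace / 2) / (trials_per_second * seconds_per_year)
print(f"Expected brute-force time: {years:.2e} years")
```

Even with that absurdly optimistic guessing rate, the expected time is on the order of 10^51 years, which is why the conversation turns to backdoors and key access rather than raw computation.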
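
On the library-backdoor point: one classic way to subvert a crypto package without touching the cipher itself is to weaken its random number generator so that "random" keys come from a small, searchable space. This is a toy illustration of the idea, not how any real library works:

```python
# Illustrative toy only: a deliberately weakened key generator whose
# "256-bit" keys are derived from a 16-bit seed. Anyone who knows the
# trick can enumerate all 65,536 possible keys.
import hashlib

def weak_keygen(seed16: int) -> bytes:
    # Only 2**16 possible seeds -> only 2**16 possible keys.
    return hashlib.sha256(seed16.to_bytes(2, "big")).digest()

victim_key = weak_keygen(12345)   # what the subverted library would emit

# The attacker recovers the key by brute-forcing the tiny seed space.
recovered = next(weak_keygen(s) for s in range(2 ** 16)
                 if weak_keygen(s) == victim_key)
assert recovered == victim_key
```

The key looks like a full-strength 256-bit value to anyone inspecting the output, which is why a weakness like this is so hard for outside experts to spot in a closed system.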
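
To illustrate why the vendor key database matters, here's a simplified sketch of per-session key derivation from a long-term per-phone key. This is not the actual 4G key hierarchy; the HMAC construction and all values are stand-ins for illustration:

```python
# Hypothetical sketch: if each phone's session keys are derived from a
# long-term key held in a vendor database, then anyone holding that
# database can re-derive every session key and decrypt recorded traffic.
import hashlib
import hmac

long_term_key = b"per-phone secret from vendor database"  # illustrative value

def derive_session_key(key: bytes, session_nonce: bytes) -> bytes:
    # Simplified stand-in for the real derivation; HMAC-SHA-256 here is
    # only for illustration.
    return hmac.new(key, session_nonce, hashlib.sha256).digest()

# The network and the phone both compute the same session key...
phone_key = derive_session_key(long_term_key, b"nonce-123")
# ...and so can anyone else who holds the long-term key plus the
# nonce, which is exchanged in the clear.
eavesdropper_key = derive_session_key(long_term_key, b"nonce-123")
assert phone_key == eavesdropper_key
```

This is why the question of when and how keys are generated matters so much: if every session key traces back to one database entry, handing over the database hands over everything.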