Mad, Beautiful Ideas
SSL Weaknesses Shown

This last week at the Chaos Communication Congress in Berlin, a group of international security researchers presented research exposing some interesting flaws in the Secure Sockets Layer (SSL) certificate system we use to validate that a connection is secure.

The attack was very specific, and was based on a collision attack against the MD5 digest algorithm first demonstrated in 2004. That work showed that it is possible, on commodity hardware today, to compute two blocks of data which, though different, share the same MD5 hash. This is a huge problem in cryptography, because when a file is cryptographically signed, the private key doesn't actually sign the file itself (that would amount to encrypting the whole file with the private key); rather, it encrypts a digest of the file. This means that if you have two plaintexts that share the same digest, signing one plaintext effectively signs both.
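To make that concrete, here's a toy Python sketch of the sign-the-digest structure. The stand-in functions are mine, not anyone's real implementation; a real CA would encrypt the digest with its RSA private key, but an identity function is enough to show why colliding plaintexts share a signature:

```python
import hashlib

def toy_sign(message: bytes, signer) -> bytes:
    # Only the 16-byte MD5 digest is ever signed, never the message.
    return signer(hashlib.md5(message).digest())

def toy_verify(message: bytes, signature: bytes, recover) -> bool:
    # Verification recomputes the digest and compares it with the
    # digest recovered from the signature.
    return recover(signature) == hashlib.md5(message).digest()

sign_stub = lambda digest: digest   # stand-in for the RSA private-key operation
recover_stub = lambda sig: sig      # stand-in for the RSA public-key operation

msg = b"to-be-signed certificate body"
sig = toy_sign(msg, sign_stub)
assert toy_verify(msg, sig, recover_stub)
# For any other message m2 with md5(m2) == md5(msg), the same 'sig'
# verifies as well -- signing one colliding plaintext signs both.
```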

Now, even though MD5 has been known for nearly four years to be cryptographically unsuitable for generating signatures, it has remained in relatively high use, and the researchers found several companies still issuing SSL certificates that would be trusted by default in IE and Firefox, but signed using MD5. Most certificate vendors have migrated to SHA-1, and we're beginning to see migration to SHA-2, but the holdouts saw little reason to update, until recently, I suppose.
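If you're curious which digest a given certificate was signed with, you can check it programmatically. A minimal sketch, assuming the third-party Python `cryptography` package is installed (the file name is illustrative):

```python
from cryptography import x509

# Load a PEM-encoded certificate from disk (illustrative path).
with open("example_cert.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

# A result of 'md5' would mark this certificate as a candidate for
# exactly the attack described above.
print("Signature hash:", cert.signature_hash_algorithm.name)
```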

The attack involves crafting two certificates that produce the same MD5 digest, so that when the Certificate Authority signs the legitimate request, the signature is equally valid on the illegitimate certificate. There were a few problems with this, because two parts of the signed certificate were out of their control: first, the serial number, and second, the time at which the certificate became valid. The serial number turned out to be easy, as RapidSSL, the provider they were using, assigns serial numbers incrementally, and its certificates become valid exactly six seconds after submission. With that in mind, and having checked the serial number counter a few days before the attack, they were able to predict what the serial number and validity time would be.
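The researchers didn't publish their tooling, so the sketch below is only my reconstruction of the arithmetic involved, with made-up sample numbers: sample the incrementing counter twice, estimate the issuance rate, and extrapolate to the planned submission time.

```python
from datetime import datetime, timedelta

# Hypothetical samples of the CA's serial counter, taken by buying
# or inspecting certificates on two different days.
s0, t0 = 643015, datetime(2008, 12, 26, 12, 0, 0)
s1, t1 = 643287, datetime(2008, 12, 28, 12, 0, 0)

rate = (s1 - s0) / (t1 - t0).total_seconds()  # serials issued per second

planned_submit = datetime(2008, 12, 30, 12, 0, 0)
predicted_serial = s1 + round(rate * (planned_submit - t1).total_seconds())

# RapidSSL certificates became valid exactly six seconds after
# submission, so the notBefore field is predictable too.
predicted_not_before = planned_submit + timedelta(seconds=6)

print(predicted_serial, predicted_not_before)
```

Both predicted values have to be baked into the colliding certificate before the collision is computed, which is why controlling them mattered so much.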

They then used a cluster of 200 PlayStation 3 consoles to generate a colliding certificate which was itself an intermediate CA (meaning the 'rogue' certificate could sign valid-looking certificates of its own). The researchers did a lot of work to make the process of generating the collision faster, but they apparently don't plan to release any details of that work. I understand their decision, but I'm not sure I agree with it. Perhaps in a year or so, when more people have migrated away from MD5.
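What makes a certificate an intermediate CA rather than an ordinary leaf is the basicConstraints extension with CA set to true. A small sketch of checking it, again assuming the third-party `cryptography` package and an illustrative file name:

```python
from cryptography import x509

with open("suspect_cert.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

try:
    bc = cert.extensions.get_extension_for_class(x509.BasicConstraints)
    # True here means this certificate can itself sign certificates.
    print("Can act as a CA:", bc.value.ca)
except x509.ExtensionNotFound:
    print("No basicConstraints extension: cannot act as a CA")
```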

So what does this mean? Well, paired with a DNS spoofing attack, it means that attackers could redirect your bank's website from your bank's servers to their own, all with a valid SSL certificate. It means that a man-in-the-middle attack could be performed over a secure connection while looking 100% valid the entire time.

Ultimately, it doesn't change anything for most users. Most users click right through invalid certificate warnings, for which I place the blame on the high cost of SSL certificates. Perhaps with the new EV certificates we'll see prices on the less expensive certificates drop, but I doubt it. For those users who do pay attention, it means that they can be effectively tricked.

Luckily, this research has convinced those few MD5 holdouts to switch to SHA-1, which effectively renders this attack impossible (at least until SHA-1 is broken). But the last thing the research revealed was the problem with revoking SSL certificates: the browser depends on the certificate itself to tell it where to check whether the certificate has been revoked, but the rogue certificates don't supply that information; the researchers had to overwrite it with random data. The workaround involves running your own revocation server, but that is just not reasonable for most users.
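A sketch of why revocation fails here: clients learn where to check for revocation from extensions embedded in the certificate itself, so if those pointers are missing or garbage, there is nowhere to ask. Again assuming the third-party `cryptography` package and an illustrative file name:

```python
from cryptography import x509

with open("site_cert.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

try:
    crl = cert.extensions.get_extension_for_class(x509.CRLDistributionPoints)
    print("CRL distribution points:", crl.value)
except x509.ExtensionNotFound:
    print("No CRL pointer -- the browser has no revocation list to consult")

try:
    aia = cert.extensions.get_extension_for_class(x509.AuthorityInformationAccess)
    print("Authority info (incl. OCSP responder):", aia.value)
except x509.ExtensionNotFound:
    print("No OCSP responder -- the browser cannot ask whether it's revoked")
```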

SSL is still a good thing, and something we should take care to verify we have, but it seems that it may require some fundamental changes, not only in how it's used, but in the specification itself.