Software security vulnerabilities get reported all the time. Generally, the response when a vulnerability is uncovered is to thank (or, in many cases, pay) the researcher who found it, and then fix the problem. That’s the standard response in the industry.
A decidedly non-standard response would be to sue the people who reported the vulnerability to stop them from talking about it, and then spend two years trying to hide the issue. Sadly, that’s exactly what German car maker Volkswagen did.
The vulnerability in question was a flaw in some cars’ keyless ignition systems. These systems, a high-end alternative to conventional keys, are supposed to prevent the car from unlocking or starting unless the key-fob is nearby. The chip at the heart of the system is called the “Megamos Crypto,” and is purchased from a third-party manufacturer in Switzerland. The chip is supposed to detect a signal from the car, and respond with a cryptographically signed message assuring the car that it’s okay to unlock and start.
Unfortunately, the chip uses an outdated cryptographic scheme. When researchers Roel Verdult and Baris Ege noticed this fact, they were able to create a program that breaks the encryption by listening to the messages between the car and the key-fob. After hearing two such exchanges, the program is able to narrow the range of possible keys down to about 200,000 possibilities – a number which can be easily brute-forced by a computer.
This process allows the program to create a “digital duplicate” of the key-fob, and unlock or start the car at will. All of this can be done by a device (like a laptop or a phone) that happens to be near the car in question. It does not require physical access to the vehicle. In total, the attack takes about thirty minutes.
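To get a feel for why 200,000 possibilities is no obstacle, here’s a toy sketch in Python. This is emphatically not the real Megamos Crypto cipher (which the researchers’ paper describes); the HMAC stand-in, the key sizes, and all the names are my own illustrative assumptions. It just shows how quickly a key pool of that size falls once two eavesdropped exchanges pin down the key:

```python
import hashlib
import hmac
import os

# Toy stand-in for the key-fob's response function -- NOT the real cipher.
def respond(key: bytes, challenge: bytes) -> bytes:
    """Reply the fob would send for a given challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()[:4]

# Pretend the attack has already narrowed the search to ~200,000 keys,
# as the researchers' technique does after two overheard exchanges.
candidates = [n.to_bytes(3, "big") for n in range(200_000)]
secret = candidates[123_456]  # the fob's actual key, unknown to the attacker

# Two eavesdropped challenge/response transcripts.
transcripts = [(c := os.urandom(8), respond(secret, c)) for _ in range(2)]

# Brute force: keep only keys consistent with every observed transcript.
recovered = [k for k in candidates
             if all(respond(k, c) == r for c, r in transcripts)]

print(recovered == [secret])  # the search finishes in seconds on a laptop
```

Even in unoptimized Python this loop runs in a few seconds; dedicated hardware would be faster still, which is why a 200,000-key pool offers essentially no protection.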
If this attack sounds theoretical, it isn’t. According to London’s Metropolitan Police, 42% of car thefts in London last year were performed using attacks against keyless entry systems. This is a practical vulnerability that puts millions of cars at risk.
All of this is all the more tragic because keyless entry systems can be a great deal more secure than conventional keys. The underlying cryptographic tools are far more powerful than any physical lock could ever be; the only reason these systems are vulnerable is incompetence.
The researchers originally disclosed the vulnerability to the creator of the chip, giving them nine months to fix the vulnerability. When the creator refused to issue a recall, the researchers went to Volkswagen in May of 2013. They originally planned to publish the attack at the USENIX conference in August 2013, giving Volkswagen about three months to begin a recall/retrofit, before the attack would become public.
Instead, Volkswagen sued to stop the researchers from publishing the paper. A British high court sided with Volkswagen, saying “I recognise the high value of academic free speech, but there is another high value, the security of millions of Volkswagen cars.”
It’s taken two years of negotiations, but the researchers are finally being allowed to publish their paper, minus one sentence which contains a few key details about replicating the attack. Volkswagen still hasn’t fixed the key-fobs, and neither have the other manufacturers who use the same chip.
Security By Litigiousness
Obviously, Volkswagen’s behavior here is grossly irresponsible. Rather than trying to fix the problem with their cars, they instead poured god-knows how much time and money into trying to stop people from finding out about it. That’s a betrayal of the most fundamental principles of good security. Their behavior here is inexcusable, shameful, and other (more colorful) invectives that I’ll spare you. Suffice to say this is not how responsible companies should behave.
Unfortunately, it’s also not unique. Automakers have been dropping the security ball an awful lot lately. Last month, it was revealed that a particular model of Jeep could be wirelessly hacked through its entertainment system, something that would be impossible in any security-conscious car design. To Fiat Chrysler’s credit, they recalled more than a million vehicles in the wake of that revelation, but only after the researchers in question demoed the hack in an irresponsibly dangerous and vivid way.
Millions of other Internet-connected vehicles are likely vulnerable to similar attacks – but nobody’s recklessly endangered a journalist with them yet, so there’s been no recall. It’s entirely possible that we won’t see change on these until someone actually dies.
The trouble here is that car makers have never been software makers before – but now they suddenly are. They have no security-conscious corporate culture, and they lack the institutional expertise to handle these problems in the right ways, or to build secure products in the first place. When they’re faced with these problems, their first response is panic and censorship, not fixes.
It took decades for modern software companies to develop good security practices. Some, like Oracle, are still stuck with outdated security cultures. Unfortunately, we don’t have the luxury of simply waiting for car makers to develop these practices. Cars are expensive (and extremely dangerous) machines. They’re one of the most critical areas of computer security, after basic infrastructure like the electric grid. With the rise of self-driving cars in particular, these companies must do better, and it’s our responsibility to hold them to a higher standard.
While we’re working on that, the very least we can do is get the government to stop enabling this bad behavior. Companies shouldn’t even try to use the courts to hide issues with their products. But, so long as some of them are willing to try, we certainly shouldn’t let them. It’s vital that we have judges who are aware enough of the technology and practices of the security-conscious software industry to know that this kind of gag order is never the right answer.
What do you think? Are you concerned about the security of your vehicle? Which auto maker is best (or worst) at security?
Image Credits: opening his car by nito via Shutterstock