Security has finally become a serious issue for smartphone owners. The high-profile leaks of celebrities' private iPhone photos, combined with a constant stream of warnings about Android bugs and malware, have left many people feeling uncertain. Even governments cannot be trusted to respect their citizens' privacy.
Apple's Tim Cook has turned this into a selling point, pledging that "We're not like all the others" when it comes to security. The message is clear: buy an iPhone if you want privacy. Here's how iOS 8 fulfills, or fails, that promise.
The Perfect Passcode?
Tim Cook announced at the last Apple Keynote that, as of iOS 8, even Apple will no longer be able to get around your passcode and decrypt your device. This means anyone who has your iPhone or iPad, whether they’re a thief or a police officer trying to execute a warrant, will find cracking your phone to be nearly impossible. Apple says this is a big step forward for device security, but is that true?
In a word, yes. A passcode on an iOS device may look like just a number, but once enabled it activates encryption of your entire device. The encryption key is generated by combining your device passcode with a secret device key. In the past, Apple retained a way around this to help customers who'd forgotten their codes. This announcement means Apple no longer does, so bypassing the code by acquiring the full encryption key is no longer viable.
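Apple doesn't publish the exact derivation, which runs inside dedicated hardware, but the principle of entangling a passcode with a per-device secret is easy to sketch. The PBKDF2 call, names, and parameters below are illustrative stand-ins, not Apple's actual scheme:

```python
import hashlib
import os

def derive_key(passcode: str, device_secret: bytes) -> bytes:
    # Entangle the user's passcode with a secret unique to this device,
    # so the encryption key can only be derived on the device itself.
    # The iteration count makes each guess deliberately slow.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_secret, 100_000)

device_secret = os.urandom(32)  # stands in for the device's unique hardware ID
key = derive_key("123456", device_secret)
```

Because the device secret never leaves the hardware, an attacker who copies the encrypted storage off the phone can't brute-force the passcode on a faster machine; every guess has to go through the device.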
With this move Apple is taking itself out of the loop. The company is telling both hackers and governments that communicating with or attacking the company to gain a device's encryption key is pointless. While a problem for anyone who forgets their passcode, this is a boon for everyone else, including Apple, which can now sincerely reply to law enforcement requests with a simple "Sorry, we can't help you."
The new encryption strategy is particularly effective on the iPhone 5S, 6 and 6 Plus. These phones have a chip called the Secure Enclave which is built with a unique ID that is not accessible to other components and not remembered by Apple.
Apple Pay The Secure Way
Secure Enclave was widely speculated to have a robust future when Apple added it to the iPhone 5S, and those prophecies have proved correct. Apple has indeed introduced a payment system, and it appears to use the secure chip to keep your data safe.
The terminology is somewhat confusing, as Apple has taken to using the term “Secure Element” when describing where payment information is stored rather than “Secure Enclave” as before. Whatever the name, this is likely the same hardware introduced with the iPhone 5S, and Apple has combined it with NFC in the iPhone 6 and upcoming Apple Watch to enable payments.
What makes Apple Pay more secure than most is the fact that your credit card information never needs to leave your device. Paying does not relay your credit card number but instead relays a unique payment ID that's valid only for that specific transaction. This is known as tokenization. It's been used before by secure credit card payment systems, but Apple Pay is the first to apply the idea to paying via phone.
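The exact token format Apple and the card networks use isn't public, but the core idea of tokenization fits in a few lines. The `TokenVault` class below is a hypothetical stand-in for the card network's token service; the merchant only ever sees the token:

```python
import secrets

class TokenVault:
    """Toy stand-in for a card network's token service: it maps
    single-use tokens to real card numbers so merchants never see them."""

    def __init__(self):
        self._tokens = {}

    def issue_token(self, card_number: str) -> str:
        token = secrets.token_hex(8)  # unique payment ID for one transaction
        self._tokens[token] = card_number
        return token

    def redeem(self, token: str) -> str:
        # pop() consumes the entry, so a token is valid exactly once;
        # a replayed token raises KeyError.
        return self._tokens.pop(token)

vault = TokenVault()
token = vault.issue_token("4111 1111 1111 1111")
card = vault.redeem(token)  # only the token service can map it back
```

The design point is that a breach at the merchant leaks only dead tokens, never reusable card numbers.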
The system is hardened against thieves, as well, because (on the iPhone 6, at least) the fingerprint reader is used to make a payment. No fingerprint means no payment. Also, as mentioned, iOS 8 devices are very difficult to crack if protected by a passcode, so the typical thief won’t even get as far as attempting to use a fingerprint.
Your Data Doesn’t Make Apple Money
Tim Cook's recent statements about privacy don't just focus on the hardware. They also cover the company's philosophy. Apple's new privacy page, which is essentially a statement from Cook, points out that Apple is different from its competitors because it doesn't make money off user data. This puts it in contrast to Google, which makes the majority of its money from advertising.
This is a good point. While Apple dipped into the advertising pool a bit with iAd, it's far from the company's focus. Unlike Google, which has openly admitted to reading a wide variety of user data including emails, Apple has little incentive to dip into user data. Doing so wouldn't generate revenue and would only expose the company to liability.
In other words, Apple is saying "Look, we all know companies exist to make money. The only thing dictating their behavior is how they make money. We don't make money from your data, so you can trust us." Cynical? Perhaps. But I think it rings true.
Not everyone will agree with this argument, but its core is difficult to fault. The fact that competitors like Google and Microsoft have openly admitted to using user data to drive profit in various services strengthens Apple's position.
What About Those Celebrity Photos?
Tim Cook's sudden commitment to privacy was likely planned in advance of the unfortunate leaking of nude photographs from several celebrity iCloud accounts, but the event probably made him double down on emphasizing the point. Security and privacy have always been part of Apple's strategy, so his strong reiteration of his company's goal can also be interpreted as PR damage control.
But how were the photos obtained in the first place? The answer is simple: the "hackers" cracked their targets' passwords. This was probably made possible through a Python script that brute-forced the victims' iCloud accounts with a list of commonly used passwords.
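The script in question reportedly worked because failed login attempts weren't throttled. Here's a minimal, offline sketch of a dictionary attack; `check_login`, the secret password, and the word list are all invented for illustration and nothing here touches a real service:

```python
# A short list of commonly used passwords, like the one the attack script used.
COMMON_PASSWORDS = ["123456", "password", "qwerty", "letmein", "monkey"]

def check_login(password: str) -> bool:
    """Stand-in for a remote login endpoint. The key weakness being
    illustrated: repeated failures are never throttled or locked out."""
    return password == "letmein"

def dictionary_attack(wordlist):
    # Without rate limiting, each guess costs only one cheap request,
    # so a weak password falls in seconds.
    for attempts, guess in enumerate(wordlist, start=1):
        if check_login(guess):
            return guess, attempts
    return None, len(wordlist)

guess, attempts = dictionary_attack(COMMON_PASSWORDS)
```

The defense is equally simple in principle: lock the account or add escalating delays after a handful of failures, which turns a seconds-long attack into an impractical one.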
Apple has emphatically denied that an iCloud security flaw was responsible, even though it has since patched the vulnerability thought to have enabled the attack. If Apple is right, the hackers simply obtained their victims' passwords through guessing, social engineering, or passwords leaked from other sites.
At this time it's impossible to prove either side of this debate. Even the security flaw said to have enabled the attack is now in dispute, as a software developer named Ibrahim Balic has come forward with a different flaw he says might have been used, one he reported to Apple six months ago.
While it’s not clear how the attack went down, it is clear that two-factor authentication would have foiled it. Unfortunately, 2FA didn’t protect iCloud at the time the attacks occurred. Apple has since fixed that, but there’s no specific feature in iOS 8 that makes two-factor authentication easier or even highlights it for users.
This is an area where Apple should do more. Two-factor authentication is very important and it should be recommended as the default for iOS and iCloud users. So far Apple has not taken that step, leaving many users unaware they’re vulnerable.
Do you trust Apple, or do you think its new focus on user privacy is all just marketing fluff? Let us know in the comments.