Last week, Manhattan District Attorney Cyrus Vance released a whitepaper calling for the federal government to pass laws mandating that Google, Apple, and any other smartphone vendor build deliberate backdoors into their devices. Ever since iOS 8, Apple has automatically enabled full-phone encryption. As of this writing, 61.8% of all Apple devices are running iOS 9, 20.1% are on iOS 8, and just 13.4% are on iOS 7. The total share of Apple users on iOS 7 or below is estimated at ~18%. Android, however, is an entirely different story: an overwhelming majority of users (74%) are still on versions of the operating system that Google can access without the user’s permission.
TheNextWeb has details on the overall issue, and the problem is fairly significant on the Android side of the fence. Only 26% of devices are running Android 5.x or 6.x. Since encryption was optional in Lollipop and only became mandatory in Marshmallow, that means just 0.3% of the Android user base is protected from potential snooping by default. Lollipop owners have the option to turn encryption on, but the performance hit for doing so can be significant, depending on which device you own.
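The percentages above can be sanity-checked with a few lines of arithmetic. This is a minimal sketch using the article's approximate late-2015 numbers; the 25.7% Lollipop share is inferred from the 26% running 5.x/6.x minus the 0.3% on Marshmallow, and the figures are illustrative rather than current data.

```python
# Back-of-the-envelope check of the adoption figures cited above.
# All shares are the article's approximate late-2015 numbers.

ios_share = {"iOS 9": 61.8, "iOS 8": 20.1}  # versions encrypted by default

android_share = {
    "Marshmallow (6.x)": 0.3,    # full-disk encryption mandatory
    "Lollipop (5.x)": 25.7,      # encryption available but optional
    "KitKat and earlier": 74.0,  # no default full-disk encryption
}

# iOS devices on iOS 7 or older: everything not on iOS 8/9.
ios_unprotected = 100.0 - sum(ios_share.values())

# Android devices encrypted out of the box: Marshmallow only.
android_default_encrypted = android_share["Marshmallow (6.x)"]

print(f"iOS 7 or below:               {ios_unprotected:.1f}%")
print(f"Android encrypted by default: {android_default_encrypted:.1f}%")
```

Running this reproduces the article's ~18% iOS figure and the 0.3% Android figure, which is the entire gap the whitepaper's backdoor proposal would exploit.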
The larger question is whether or not Google would assist in the unlocking at all, and that’s where a recent court case could come into play. Last month, the federal government asked Apple to decrypt and access an older iPhone running iOS 7 as part of an investigation. The government argues that requiring Apple to provide access to a device it has a legal warrant to search is permissible under the All Writs Act, which states that federal courts may “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.” It was written in 1789, though it’s been amended several times since, most recently in 1911.
Apple has argued that providing access to devices running iOS 7 and earlier, while technically possible, puts the company in a dubious legal position. If Apple performs unlocks on iOS 7 and earlier devices, are the engineers who perform the work now legal witnesses in an ongoing criminal case? Apple raised this point in a recent legal filing as well.
As Apple notes, while it may be possible to assist the government in some situations without undue hardship, the sheer volume of cases that could wind up dumped on the company could prove extremely burdensome. Furthermore, Apple isn’t a branch of law enforcement, and knowing that the company cooperates in all such investigations could harm the reputation of its products. Google didn’t join Apple in its filing, but all of the same arguments apply to Android just as much as iOS. The stakes in Android’s case are actually much higher, given the number of devices that aren’t running the latest version of the operating system.
Google hasn’t entered the fray on this topic quite as loudly as Apple has, but the company is clearly moving towards a similar model in which all devices are fully encrypted from the moment they leave the manufacturer. The fact that Google’s exposure is much higher than its principal rival’s is another example of how the existing Android security model is broken. Google needs a system that allows it to push critical security updates more easily and a way to ensure that more devices are kept up to date on the latest software. There’s no guarantee that Apple’s challenge will actually succeed in court, and if the federal courts rule that companies must provide access to devices when presented with a warrant, full-phone encryption will be the only way to avoid the problem.
All of this assumes, of course, that the US government doesn’t pass laws limiting the use of encryption in the wake of the Paris attacks. We’ve already seen members of Congress calling for increased federal oversight of social media, and even the FCC chair, Tom Wheeler, repeating now-discredited reports that the PS4 or encryption was used to keep authorities from learning about the Paris attacks before they happened. The fact that metadata analysis failed to prevent the attack is paradoxically used as evidence that we need even more invasive “security.”
The NSA has already admitted that its bulk data collection programs have failed to stop a single terrorist attack. Of the 227 terror investigations initiated since 9/11 in the US, only 17 came from the NSA’s surveillance program. The single target convicted as a result was punished for sending money to Somalia, not for planning any kind of terrorist attack. Requiring Apple and Google to hand over encryption keys in routine law enforcement investigations has been sold as a means of keeping us all safer. But the government’s record where these programs are concerned is evidence that there’s a very real cost to that security, and the benefits of paying that cost have yet to emerge.