Many iOS encryption measures go unused, say cryptographers

iOS doesn’t use its built-in encryption measures as much as it could, leaving potentially unnecessary security vulnerabilities, according to cryptographers at Johns Hopkins University (via Wired).


Using public documentation from Apple and Google, law enforcement reports on circumventing mobile security features, and their own analysis, the cryptographers assessed the soundness of iOS and Android encryption. They found that although the encryption infrastructure on iOS “sounds great”, it largely goes unused:

“On iOS, in particular, the infrastructure is in place for this hierarchical encryption that sounds great,” said Maximilian Zinkus, the study’s lead researcher on iOS. “But I was definitely surprised to see then how much of it is unused.”

When an iPhone boots, all stored data is in a state Apple calls “Complete Protection”, and the user must unlock the device before anything can be decrypted. While this is extremely secure, the researchers pointed out that once the device is unlocked for the first time after a reboot, a large amount of data moves to a state that Apple calls “Protected Until First User Authentication”.

Because devices are rarely restarted, most data sits in the “Protected Until First User Authentication” state rather than “Complete Protection” most of the time. The advantage of this less secure state is that the decryption keys are stored in quick-access memory, where applications can reach them rapidly.
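As a rough illustration of these protection classes, here is a hedged Swift sketch of how an iOS app can opt a file into the stronger “Complete Protection” class instead of the default “Protected Until First User Authentication” class; the file name and contents are purely illustrative.

```swift
import Foundation

// Illustrative file location inside the app's Documents directory.
let url = FileManager.default
    .urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("secrets.dat")

// Write data under Complete Protection: the file's decryption key is
// evicted from memory shortly after the device locks.
try Data("sensitive".utf8).write(to: url, options: .completeFileProtection)

// Alternatively, tighten the protection class of an existing file.
try FileManager.default.setAttributes(
    [.protectionKey: FileProtectionType.complete],
    ofItemAtPath: url.path
)
```

Files written without an explicit option get the less strict default class, which is precisely the state the researchers say most data ends up in.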

In theory, an attacker could find and exploit certain types of security vulnerabilities in iOS to obtain the encryption keys held in quick-access memory, allowing them to decrypt large amounts of data on the device. This is believed to be how many smartphone access tools work, such as those from forensic access company Grayshift.

While it is true that attackers need a specific operating system vulnerability to reach the keys, and Apple and Google fix many of these flaws as they are noticed, the researchers argue the exposure could be avoided by hiding the encryption keys more deeply.

“I was really shocked, because I came into this project thinking that these phones really protect user data well,” says Johns Hopkins cryptographer Matthew Green. “Now I’ve come out of the project thinking almost nothing is protected as much as it could be. So why do we need a backdoor for law enforcement, when the protections that these phones actually offer are so bad?”

The researchers directly shared their findings and a number of technical recommendations with Apple. An Apple spokesman gave a public statement in response:

“Apple devices are designed with multiple layers of security to protect against a wide range of potential threats, and we’re constantly working to add new protections for our users’ data. As customers continue to increase the amount of sensitive information they store on their devices, we will continue to develop additional protections in both hardware and software to protect their data.”

The spokesman also told Wired that Apple’s security work focuses primarily on protecting users from hackers, thieves, and criminals who want to steal personal information. They also noted that the types of attacks the researchers pointed out are very expensive to develop, require physical access to the target device, and only work until Apple releases a patch. Apple also stressed that its goal with iOS is to balance security and convenience.
