
Understanding How Apple Security Research Devices Are Likely to Work and Remain Secure



Apple recently announced the Security Research Device (SRD) program, in which selected security researchers receive special iPhones to help them find iOS security vulnerabilities (see “Apple Releases Dedicated Security Research Device,” July 23, 2020). Apple can then fix any vulnerabilities they find, hopefully before they are exploited.

Based on Apple’s announcement, supported by some logical inference, we can speculate on how this program and the devices themselves will work.

Apple SRD

When you buy an iPhone from Apple, it is a production phone, with a production code-signing key burned into the system on a chip (SoC). Having a production key is referred to as being production fused (often abbreviated to prod-fused). A production iPhone runs release versions of iOS, which are signed with Apple’s production certificate (internally called the prod cert). When an iPhone boots, code burned into the SoC checks the operating system. If it is not properly signed with the prod cert, the iPhone refuses to boot at all. Jailbreaking is (in part) the art of finding a way around this limitation.
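To make the idea concrete, here is a minimal sketch of that boot-time check in Swift, assuming (purely for illustration) a P-256 signature. Apple’s real boot chain, key types, and image formats are not public, and the key pair here is a stand-in.

```swift
import CryptoKit
import Foundation

// A minimal sketch of the boot-time check, assuming (purely for
// illustration) a P-256 signature; Apple's real boot chain, key types,
// and image formats are not public. The key pair is a stand-in.
let signingKey = P256.Signing.PrivateKey()   // held by Apple, never on the device
let fusedPublicKey = signingKey.publicKey    // burned into the SoC at the factory

let osImage = Data("iOS kernel and system image".utf8)
let signature = try! signingKey.signature(for: osImage)  // done at Apple's build farm

// What the boot ROM conceptually does before handing control to iOS:
if fusedPublicKey.isValidSignature(signature, for: osImage) {
    print("signature matches the fused key: boot")
} else {
    print("refuse to boot")
}
```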

Similarly, iOS only runs apps signed with the App Store cert, which prevents sideloading, the ability to install apps directly without going through the App Store. There are a few exceptions, such as a third-party developer signing an app they build with their development certificate, which allows the app to run on their own iPhones. In practice, this process is often full of problems. (This description simplifies the details of code signing, but it covers the general approach.)
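On macOS, the Security framework exposes a user-space version of this kind of check; iOS enforces an equivalent check in the kernel before launching any app. This sketch only illustrates the concept of validating a bundle’s code signature, and the bundle path is just an example.

```swift
import Foundation
import Security

// Validate a bundle's code signature from user space on macOS.
// iOS performs an equivalent, kernel-enforced check before running an app.
let appURL = URL(fileURLWithPath: "/Applications/Safari.app")
var staticCode: SecStaticCode?
if SecStaticCodeCreateWithPath(appURL as CFURL, [], &staticCode) == errSecSuccess,
   let staticCode = staticCode {
    let status = SecStaticCodeCheckValidity(staticCode, [], nil)
    print(status == errSecSuccess ? "valid signature, OK to run"
                                  : "invalid signature, refuse to run")
}
```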

Developing software internally at Apple would be extremely difficult with these limitations in place. Internally, Apple engineers use iPhones with a development key (these are called dev-fused), and they run development builds of iOS. Dev builds of iOS include a shell, debugging and profiling code in many apps, test hooks, and internal frameworks. Dev iOS builds are signed with a dev cert, so they boot on dev-fused iPhones.

In addition, dev builds of iOS do not check an app’s certificate before running it. You can load any app you want onto a dev-fused iPhone. This fact makes it much easier for Apple engineers to do their jobs. One reason the app-signing process works so poorly for third-party developers is probably that Apple engineers do not use it themselves, so they exert no pressure to fix its problems.

Even if you managed to get a copy of an iOS development build and somehow load it onto your production iPhone, it would not boot, because it is not signed with the production certificate. In the same way, iOS release builds will not boot on dev-fused iPhones.

As I mentioned, dev builds of iOS include a shell, the program behind Terminal that runs Unix commands and scripts. macOS and iOS used to include the bash shell. macOS has recently moved to zsh; iOS may have moved to zsh as well.

Since an iPhone is a full-fledged computer, Apple engineers log in to their dev iPhones using ssh and work in the shell. It is no different from server engineers logging in to a remote server to work on server code. It is very difficult to work on a computer you cannot log in to. Release versions of iOS do not include a shell, specifically to improve security (and to make life difficult for jailbreakers), since that leaves nothing to log in to.
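For a sense of the workflow, here is what such a session might look like driven from Swift on a Mac. The hostname and account are entirely hypothetical, since dev-fused devices live on Apple’s internal network.

```swift
import Foundation

// Hypothetical host name: dev-fused iPhones on Apple's internal network
// presumably have their own naming scheme. This only illustrates the
// workflow of running a shell command on the phone over ssh.
let ssh = Process()
ssh.executableURL = URL(fileURLWithPath: "/usr/bin/ssh")
ssh.arguments = ["root@dev-iphone.example.internal", "uname -a"]
try! ssh.run()
ssh.waitUntilExit()
```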

If Apple wants to help security researchers find vulnerabilities, Apple needs to give them iPhones with a shell they can ssh into. These phones are presumably fused with a special certificate; call them SRD-fused. They will run a special iOS build that has some of the features of Apple’s internal iOS builds and is signed with an SRD cert. This means the SRD build of iOS will run neither on production iPhones nor on Apple’s internal dev iPhones.

Apple is also likely to give researchers a special internal build of Xcode, plus tools designed to load software onto iPhones and debug deep inside iOS. Apple has a wide range of internal tools for peering into the dark recesses of iOS where normal third-party developers never go. These tools will certainly help security researchers. Apple creates new internal builds of iOS, Xcode, and supporting frameworks every day. While security researchers are unlikely to receive daily builds, they are likely to receive regular updates.

SRD builds of iOS probably also run apps without checking that they are code signed, just like iOS dev builds. I do not have any specific information indicating that this is true, but it would make sense. It would allow a security researcher to set any entitlements they want on an app and learn a great deal about how iOS’s internal protections actually work. It would also mean that they do not have to struggle with Xcode’s code-signing features every time they load an app onto the iPhone.
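Entitlements are just key-value pairs baked into an app’s code signature. On macOS you can read them back with the Security framework, as this sketch shows; the bundle path is only an example, and how an SRD build would expose or honor them is unknown.

```swift
import Foundation
import Security

// Read the entitlements embedded in a bundle's code signature.
// The path is just an example; any signed bundle works.
let url = URL(fileURLWithPath: "/Applications/Safari.app")
var code: SecStaticCode?
guard SecStaticCodeCreateWithPath(url as CFURL, [], &code) == errSecSuccess,
      let code = code else { fatalError("could not read code object") }

var info: CFDictionary?
SecCodeCopySigningInformation(code, SecCSFlags(rawValue: kSecCSSigningInformation), &info)
if let dict = info as? [String: Any],
   let entitlements = dict[kSecCodeInfoEntitlementsDict as String] {
    print(entitlements)   // e.g. sandbox and keychain entitlements
}
```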

iOS has many layers of security. A fully functional exploit is usually built by chaining several individual vulnerabilities. For example, one vulnerability could allow a malformed JPEG to take over the JPEG decoder. But the JPEG decoder is sandboxed, specifically so that if it is compromised, it does not gain access to very much. Another vulnerability could allow escaping the sandbox. A third vulnerability could allow escalating privileges to root. Chain these three together and you have an exploit where just looking at a picture compromises your iPhone. (It’s called a “drive-by exploit.”)
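A toy Swift model makes the chaining logic clear: each bug converts one capability into a stronger one, and the full exploit is the composition of all the links. Everything here is illustrative; none of these names correspond to real vulnerabilities.

```swift
// Each vulnerability converts one capability into a stronger one.
// Patching any single link (making it return nil) breaks the whole chain,
// which is why fixing individual bugs is still valuable.
struct MalformedJPEG {}
struct DecoderControl {}   // code execution inside the sandboxed decoder
struct UserControl {}      // code execution outside the sandbox
struct RootControl {}      // full device compromise

typealias Link<A, B> = (A) -> B?

let decoderTakeover: Link<MalformedJPEG, DecoderControl> = { _ in DecoderControl() }
let sandboxEscape:   Link<DecoderControl, UserControl>   = { _ in UserControl() }
let rootEscalation:  Link<UserControl, RootControl>      = { _ in RootControl() }

// The drive-by exploit is the composition of all three links.
func driveBy(_ image: MalformedJPEG) -> RootControl? {
    decoderTakeover(image).flatMap(sandboxEscape).flatMap(rootEscalation)
}
```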

While a few of iOS’s dozens of security protections will be turned off in the SRD builds of iOS so that security researchers can more easily do their jobs, most will remain in place. Apple does not need security researchers to build full-fledged multi-step exploits. Apple will be happy if researchers find ways around individual security protections so that engineers can fix those bugs one at a time.

Have you ever wondered what prevents the bad guys from getting their hands on one of these SRDs? Apple probably has some strong measures in place to prevent SRDs and the special iOS builds from leaking beyond the intended recipients. First, there is probably a non-disclosure agreement as tight as any in the industry. In the past, Apple security has required that people with access to unreleased hardware store it in a locked, windowless room and give keys only to certain people. Apple sometimes uses different code names with different groups, so if a code name leaks, it can be traced.

Inside Apple, unreleased iPhones have to call home over the Internet every few days, so Apple knows that the employee entrusted with each one still has it. The SRDs may well do this too.
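If the SRDs do call home, the check-in could be as simple as a periodic authenticated request. This sketch is pure guesswork; the endpoint, payload, and interval are all invented.

```swift
import Foundation

// Pure speculation about what "calling home" might look like:
// a periodic request proving the device is still online and accounted for.
// The endpoint, payload, and interval are invented for illustration.
func callHome(deviceID: String) {
    var request = URLRequest(url: URL(string: "https://srd-checkin.example.com/ping")!)
    request.httpMethod = "POST"
    request.httpBody = Data("device=\(deviceID)".utf8)
    URLSession.shared.dataTask(with: request) { _, _, _ in }.resume()
}

// Check in every few days.
Timer.scheduledTimer(withTimeInterval: 3 * 24 * 60 * 60, repeats: true) { _ in
    callHome(deviceID: "SRD-0001")
}
```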

Although the SRD is likely to be a standard iPhone 11, albeit with a special certificate, Apple still will not want pictures of it, or screenshots from it, circulating. In the past, Apple security has put “random” markings on the cases of unreleased products, so if an image appeared on the Internet, Apple would know which device it came from. Every SRD build of iOS could have identifying markers added, so if a build leaked, Apple would know where it came from. An Apple engineer once told me that some internal OS builds use steganography to hide the iPhone’s IP address and MAC address in the low-order bits of the screen. The data is invisible to the naked eye, but if a screenshot appears online, Apple has tools that can tell where it came from.
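Least-significant-bit steganography of this kind is straightforward. Here is a minimal sketch over raw 8-bit pixel data; Apple’s actual tooling and encoding are unknown, so treat this purely as an illustration of the technique.

```swift
// Embed a message into the least significant bit of each pixel byte.
func embed(_ message: [UInt8], into pixels: inout [UInt8]) {
    precondition(pixels.count >= message.count * 8, "image too small")
    var i = 0
    for byte in message {
        for bit in 0..<8 {
            let bitValue = (byte >> (7 - bit)) & 1      // MSB first
            pixels[i] = (pixels[i] & 0xFE) | bitValue   // overwrite the LSB
            i += 1
        }
    }
}

// Recover `count` bytes from the pixel LSBs.
func extract(count: Int, from pixels: [UInt8]) -> [UInt8] {
    var message = [UInt8](repeating: 0, count: count)
    for i in 0..<(count * 8) {
        message[i / 8] = (message[i / 8] << 1) | (pixels[i] & 1)
    }
    return message
}

var screen = [UInt8](repeating: 200, count: 256)   // stand-in pixel data
embed(Array("10.0.1.7".utf8), into: &screen)       // hide a (made-up) address
print(String(decoding: extract(count: 8, from: screen), as: UTF8.self))
```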

If an SRD were stolen, then depending on how the SRD builds of iOS are signed, Apple might be able to revoke the certificate for that device to prevent future iOS builds from running on it. Apple can probably also brick an SRD remotely, although I suspect anyone smart enough to steal an SRD will keep it in a Faraday bag and make sure it never appears on the Internet to receive the brick command.

Recent high-profile iPhone hacks may have prompted Apple to rethink how it supports security research. Given how secretive Apple usually is, the Security Research Device program is an unusual move. It is also one that undoubtedly required a significant amount of work to ensure that it could not be exploited by organized crime or government intelligence agencies. Hopefully we will all get safer iPhones as a result.

