Apple is inviting security research on its Private Cloud Compute (PCC) system, which handles more compute-intensive Apple Intelligence requests. The company is also expanding its bug bounty program, offering payouts of up to $1 million to anyone who discovers vulnerabilities in PCC.
The company has touted how much AI functionality (branded as Apple Intelligence) it can run on-device, without data ever leaving a user's Mac, iPhone, or other Apple hardware. For more demanding requests, however, Apple sends them to PCC servers built on Apple silicon and a hardened operating system.
Many other companies' AI applications also rely on servers to complete more difficult requests, yet users have little insight into how secure those server-side operations are. Apple, of course, has long made big claims about how much it values user privacy, and poorly secured cloud servers for AI could puncture that image. To prevent that, Apple says it designed PCC so that its security and privacy guarantees are enforceable and so that security researchers can independently verify those guarantees.
To support that verification, Apple is offering researchers several resources.
Through the bug bounty program, Apple is offering payments ranging from $50,000 to $1,000,000 for vulnerabilities discovered across several different categories. It also plans to evaluate for potential rewards any other security issue that "significantly impacts PCC."
The first Apple Intelligence features will be available to everyone in iOS 18.1, which is scheduled to arrive next week. Some of the larger Apple Intelligence features, including Genmoji and ChatGPT integration, appeared in the first iOS 18.2 developer beta, released yesterday.