Apple has created a virtual research environment that gives the public access to test the security of its Private Cloud Compute system, and has released the source code for several “key components” so researchers can analyze the architecture’s privacy and safety features.
The company is also looking to improve the security of its systems, expanding its bounty program with rewards of up to $1 million for vulnerabilities that could violate “PCC’s fundamental security and privacy guarantees.”
Private Cloud Compute (PCC) is Apple’s cloud intelligence system, designed to handle complex AI processing of data from user devices in a privacy-preserving way.
This is achieved through end-to-end encryption, which ensures that personal data sent from your Apple device to PCC is accessible only to you; not even Apple can observe it.
Shortly after Apple announced PCC, the company provided early access to select security researchers and auditors so they could verify the system’s privacy and security promises.
In a blog post today, Apple announced that access to PCC is being made public, allowing anyone interested to inspect how it works and verify whether it lives up to its promised requirements.
The company has published a “Private Cloud Compute Security Guide” that explains the architecture of PCC’s components and the technical details of how they work.
Apple also offers a Virtual Research Environment (VRE) that locally replicates the PCC environment, so researchers can inspect it, test its security, and explore issues on their own machines.
“The VRE runs the PCC node software in a virtual machine with only minor modifications,” Apple explains, sharing documentation on how to set up the Virtual Research Environment on your device.
The VRE is included in the macOS Sequoia 15.1 Developer Preview and requires a device with Apple silicon and at least 16 GB of unified memory.
Tools available in the virtual environment let you launch a PCC release in an isolated setting, modify and debug the PCC software for more thorough inspection, and run inference against demonstration models.
To facilitate researchers’ work, Apple has decided to release the source code of several PCC components that implement security and privacy requirements.
- CloudAttestation project – Responsible for building and validating the attestations of PCC nodes.
- Thimble project – Contains the privatecloudcomputed daemon that runs on the user’s device and uses CloudAttestation to enforce verifiable transparency.
- splunkloggingd daemon – Filters the logs that PCC nodes can emit, to protect against accidental data disclosure.
- srd_tools project – Contains VRE tools and can be used to understand how VRE enables the execution of PCC code.
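The verifiable-transparency idea behind these components can be illustrated in miniature: a client only sends data to a node whose attested software measurement appears in a public log of known-good releases. The following Python sketch is purely hypothetical (the names, log, and toy encryption are assumptions for illustration, not Apple’s actual protocol or APIs):

```python
import hashlib

# Hypothetical transparency log seeded with one known-good release
# measurement. In a real system this would be a public, append-only log.
KNOWN_RELEASE = b"pcc-node-release-1.0"
TRANSPARENCY_LOG = {hashlib.sha256(KNOWN_RELEASE).hexdigest()}

def measure(node_software: bytes) -> str:
    """Stand-in for the cryptographic measurement a node attests to."""
    return hashlib.sha256(node_software).hexdigest()

def send_request(request: bytes, attested_measurement: str, node_key: bytes) -> bytes:
    """Refuse to send data unless the node attests to a logged release."""
    if attested_measurement not in TRANSPARENCY_LOG:
        raise ValueError("node is not running a publicly logged release")
    # Toy stand-in for encrypting the request to the attested node's key;
    # a real client would use an authenticated public-key scheme.
    keystream = hashlib.sha256(node_key).digest()
    return bytes(b ^ keystream[i % len(keystream)] for i, b in enumerate(request))
```

The key design point mirrored here is that the privacy check happens on the client side before any data leaves the device: an unrecognized measurement means the request is never sent.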
Apple is also incentivizing research with new PCC categories in its security bounty program, covering accidental data disclosure, external compromise via user requests, and physical or internal access.
The maximum reward of $1 million is for a remote attack on request data that results in remote code execution with arbitrary privileges.
Researchers who demonstrate how to gain access to users’ request data or other sensitive user information could receive a $250,000 reward.
Carrying out the same type of attack from a privileged network position pays between $50,000 and $150,000.
However, Apple has stated that it will consider awarding bounties for any issue with significant impact on PCC, even if it falls outside the published bug bounty categories.
The company believes that PCC is “the most advanced security architecture ever deployed in large-scale cloud AI computing,” and hopes, with the help of researchers, to further strengthen its security and privacy.