Apple releases unprecedented amount of information on new AI security

Company will pay up to $1 million to people who find issues with AI security

Andrew Griffin
Thursday 24 October 2024 18:55 BST


Apple has released an unprecedented amount of information on the security of its new AI systems, along with a commitment to pay up to $1 million to anyone who finds a problem with them.

The new information, tools and rewards are part of Apple’s plan to ensure that its new Apple Intelligence systems are private and secure.

Apple said the publication is an invitation to security and privacy researchers as well as “anyone with interest and a technical curiosity” to dig deeper into the security technology and make sure it is safe.

When Apple introduced its new AI tools earlier this year, it said that they would rely on a mix of on-device processing and more powerful cloud computers for particularly intensive requests. It also said that it had built an entirely new cloud computing system to ensure that those requests were handled with the same privacy and security they would have if they were processed on the device.

That system is called Private Cloud Compute, and it means that personal data is not accessible to anyone other than the user, including Apple. The company has said that building it “meant a whole bunch of technical invention”, including creating kinds of AI cloud processing systems that had not been made before.

Apple said that, in order to ensure people trusted the system, it would allow researchers to inspect and verify the security and privacy promises of Private Cloud Compute. Now it has released more information on those promises.

The new announcement is really three in one: first, a new security guide that offers deep detail on how Private Cloud Compute was built; second, a virtual research environment that allows security experts to recreate those cloud computers on their own Macs; and third, a bug bounty to incentivise that research.

That programme offers a reward of $1,000,000 to researchers who find the most dangerous kind of vulnerability, which would allow hackers to run their own code and break into the central parts of the system. Less severe bugs will receive smaller payouts, and Apple says that some bugs might not fit into existing categories but that it will evaluate them nonetheless.

The other major announcements offer ways for security researchers to find those bugs. The security document offers information on how the systems were built, while the virtual research environment will let people examine how the cloud compute systems work and look at their source code.

Apple said that the new announcements were part of its belief that privacy is a human right, and that security is part of that. It invited security researchers to test the system in the hope of making it stronger.

“We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale, and we look forward to working with the research community to build trust in the system and make it even more secure and private over time,” it wrote in a blog post.
