Security experts are assessing Apple Intelligence’s data protection measures

Despite concerns on social media about Apple’s integration of OpenAI’s ChatGPT, security experts have praised the data protection system behind the Apple Intelligence AI services coming to the latest iOS and macOS devices in September.

Initial reaction to Apple’s data handling has been positive, primarily because of an architecture that avoids storing information off-device: the company promises to delete data users enter into Apple Intelligence as soon as responses are delivered from models running on specially designed Apple servers in its data centers.

“I’ve actually never seen anything like this,” Alex Stamos, chief trust officer at security firm SentinelOne, said of Apple’s technology. “People talked about it in papers, but I’ve never seen anyone do it in production.”

Apple has built a Private Cloud Compute (PCC) server to process AI requests from the company’s devices. According to the company, once the process is complete, the PCC server deletes all data and retains nothing.
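
Apple has not published PCC’s server code, but the stateless pattern the company describes is easy to illustrate. The sketch below is a hypothetical Swift handler, not Apple’s implementation: the request exists only in memory for the duration of the call, and nothing is logged or written to disk, so there is nothing left to retain once the response is returned.

import Foundation

// Hypothetical sketch of stateless request handling; not Apple's code.
struct InferenceRequest {
    let prompt: String
}

struct InferenceResponse {
    let text: String
}

// Placeholder standing in for the actual model invocation.
func runModel(on prompt: String) -> String {
    "response to: \(prompt)"
}

func handle(_ request: InferenceRequest) -> InferenceResponse {
    // The prompt is processed entirely in memory: no logging and no
    // writes to persistent storage, so nothing survives this call.
    let output = runModel(on: request.prompt)
    return InferenceResponse(text: output)
}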

Apple encrypts data in transit from the device to an approved PCC node. Encryption and decryption are handled by the Secure Enclave, the subsystem on Apple silicon dedicated to cryptographic operations.
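
Apple has not published the client wire protocol, but CryptoKit’s HPKE types show the general shape of encrypting a payload so that only one node’s key can open it. In the sketch below, the node key, info label and message are placeholders; it illustrates the technique, not Apple’s actual protocol, and on a real PCC node the matching private key would live inside the Secure Enclave.

import CryptoKit
import Foundation

// Illustrative only: encrypt a request so that only the holder of this
// node's private key can read it. Not Apple's actual PCC protocol.
func encryptForNode(message: Data,
                    nodeKey: P256.KeyAgreement.PublicKey) throws -> (encapsulatedKey: Data, ciphertext: Data) {
    // HPKE binds the ciphertext to the recipient's public key. The info
    // string is a hypothetical context label for this example.
    var sender = try HPKE.Sender(recipientKey: nodeKey,
                                 ciphersuite: .P256_SHA256_AES_GCM_256,
                                 info: Data("example-pcc-request".utf8))
    let ciphertext = try sender.seal(message)
    return (sender.encapsulatedKey, ciphertext)
}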

“It’s basically the best thing you can do for data sent to the cloud for these types of queries, while blocking Apple from accessing the data,” Stamos said.

Apple PCC security in the factory

According to Apple, the security of PCC hardware begins at the manufacturing stage. Apple performs high-resolution imaging of PCC components before sealing each server and activating its tamper switch for shipment. When a server arrives at the data center, Apple re-verifies it.

At the end of the process, Apple issues a certificate for the keys stored in each server’s Secure Enclave. An iPhone, iPad or Mac will not send data to a PCC node unless it can verify that certificate.
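
The gate Apple describes, where no data leaves the device until the node’s certificate checks out, reduces to a simple precondition. The sketch below is hypothetical: the pinned root key, the certified key format and the function names are all illustrative, not Apple’s scheme.

import CryptoKit
import Foundation

// Hypothetical gate: refuse to talk to a node unless a pinned root key
// has signed its key material. Names and formats are illustrative.
func verifiedNodeKey(certifiedKeyBytes: Data,
                     rootSignature: Data,
                     pinnedRoot: P256.Signing.PublicKey) -> P256.KeyAgreement.PublicKey? {
    guard let signature = try? P256.Signing.ECDSASignature(rawRepresentation: rootSignature),
          pinnedRoot.isValidSignature(signature, for: certifiedKeyBytes),
          let nodeKey = try? P256.KeyAgreement.PublicKey(x963Representation: certifiedKeyBytes)
    else {
        return nil  // Verification failed: this node receives no data.
    }
    return nodeKey
}

A client would pass the returned key straight into an encryption step like the one sketched above; a node whose certificate fails the check never receives anything to decrypt.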

Once PCC is launched, Apple plans to allow security researchers to inspect software images of every production server. Apple will reward researchers who find security issues as part of its bounty program.

“The Apple Security Bounty will reward research across the entire Private Cloud Compute software stack – with particularly significant payouts for any issues that challenge our privacy claims,” the company said on its website.

Security experts say companies that allow employees to bring their own devices to work are likely to be satisfied with Apple’s data protection system. Organizations that must adhere to strict data handling rules set by regulators, including financial services firms, government agencies and healthcare institutions, will likely opt out of Apple Intelligence.

“Some customers, especially those in highly regulated industries, will still want to control the flow of corporate data from managed devices to any external service, even with a highly secure Apple implementation,” said Weldon Dodd, senior vice president of community at mobile device management (MDM) company Kandji.

Typically, organizations under regulatory scrutiny are only allowed to use IT hardware and software that meet rigorous standards, such as the Federal Risk and Authorization Management Program (FedRAMP) and the Health Insurance Portability and Accountability Act (HIPAA).

Some less regulated organizations may also be wary of Apple’s claims, said Matt Vlasach, vice president of product engineering and sales at MDM provider Jamf.

“Apple says it won’t retain data, but companies are always skeptical of such claims,” he said.

Apple promises to provide organizations with the ability to audit its artificial intelligence system, which should allay concerns, Vlasach said. However, enterprises will likely disable Apple Intelligence while the service is evaluated.

“Initially, regardless of what Apple claims, there is always significant hesitation to leverage cloud services, much less new AI technologies, until legal and compliance teams can really dig into the details,” Vlasach said.

ChatGPT integration on iOS devices

This week, Apple also addressed privacy concerns related to its collaboration with ChatGPT developer OpenAI. Under the agreement, Apple will give device users the option to use OpenAI’s ChatGPT for capabilities Apple does not offer, such as image and document understanding. Users will also be able to invoke ChatGPT from Apple’s writing tools to create content.

According to Apple, it hides users’ IP addresses from OpenAI, and OpenAI does not store ChatGPT requests.
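
Apple has not detailed the routing, but one standard way to hide a client’s address is a relay that forwards only an opaque, already-encrypted payload. The sketch below is a conceptual illustration of that pattern with a hypothetical destination; it is not Apple’s or OpenAI’s infrastructure.

import Foundation

// Conceptual relay sketch, not Apple's implementation. The destination
// sees the relay's address, not the user's IP, and because the body is
// encrypted end to end, the relay cannot read the request either.
func relay(_ encryptedPayload: Data, to destination: URL) async throws -> Data {
    var request = URLRequest(url: destination)
    request.httpMethod = "POST"
    request.httpBody = encryptedPayload  // Opaque bytes; no client identifiers attached.
    let (responseBody, _) = try await URLSession.shared.data(for: request)
    return responseBody
}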

Security experts say some companies may balk at using a device that doesn’t prevent employees from uploading company data to the public version of ChatGPT. Apple says organizations will be able to disable ChatGPT on devices running the iOS 18, iPadOS 18 and macOS 15 operating systems, which are scheduled to launch this year with Apple Intelligence.

“This is exactly the type of certainty that enterprise customers need to carefully and thoughtfully consider how to adapt their GRC (governance, risk management and compliance) policies to the new capabilities of iPads, iPhones and Macs,” Dodd said.

Apple did not provide comment by press time.

Antone Gonsalves is the lead editor of TechTarget Editorial, writing about industry trends key to enterprise technology buyers. He has worked in tech journalism for 25 years and lives in San Francisco.