Apple defends new system scanning iCloud for child abuse

By Laura Berrill

Apple has defended its new system that will scan iCloud for illegal child sexual abuse materials, or CSAM, amid controversy over whether it reduces user privacy and could be used by governments to surveil citizens.

Last week Apple announced it had started testing the system that uses sophisticated cryptography to identify when users upload collections of known child pornography to its cloud storage service. It said it is able to do this without learning about the contents of a user’s photos stored on its servers.
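In broad strokes, systems like this match fingerprints of uploaded images against a database of fingerprints of known illegal material, and flag an account only once a threshold of matches is reached. The sketch below is a toy illustration of that general idea only: Apple's actual design uses a perceptual hash (NeuralHash) combined with private set intersection and threshold secret sharing, none of which is reproduced here, and every name and value in the code is hypothetical.

```python
import hashlib


def fingerprint(data: bytes) -> str:
    """Stand-in for a perceptual image hash such as NeuralHash.

    A real perceptual hash matches visually similar images; a plain
    SHA-256 digest is used here only to keep the sketch runnable.
    """
    return hashlib.sha256(data).hexdigest()


# Hypothetical database of fingerprints of known images.
KNOWN_FINGERPRINTS = {
    fingerprint(b"known-image-1"),
    fingerprint(b"known-image-2"),
}

# Account is flagged only after this many matches (illustrative value).
MATCH_THRESHOLD = 2


def uploads_cross_threshold(uploads: list[bytes]) -> bool:
    """Return True if the uploaded images cross the match threshold."""
    matches = sum(1 for data in uploads if fingerprint(data) in KNOWN_FINGERPRINTS)
    return matches >= MATCH_THRESHOLD
```

Note that in Apple's described design the matching is done cryptographically, so the server learns nothing about photos that do not match and nothing at all until the threshold is crossed; this sketch omits that entirely.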

Yesterday Apple reiterated that its system is more private than those used by companies such as Google and Microsoft, because it uses both Apple's servers and software installed on people's iPhones through an iOS update. Privacy advocates, however, worry that new laws in some countries could expand the system to check for other images, such as those with political content. Apple countered that governments cannot force it to add non-CSAM images and that the technology is limited to detecting CSAM stored in iCloud.

Companies in the US are required to report CSAM to the National Center for Missing & Exploited Children and can face fines of up to $300,000 if they discover illegal images and fail to report them.

A reputation to protect

Any controversy over the system threatens Apple's public reputation for building secure and private devices, which the company has used to break into new markets in personal finance and healthcare. Apple continues to defend the system as a genuine improvement that protects children and will reduce the amount of CSAM being created, while still protecting iPhone users' privacy.

Apple said the system is significantly stronger and more private than previous ones by every privacy metric the company tracks, and that it has gone out of its way to build a better way of detecting these illegal images. Unlike other current systems, which run in the cloud and cannot be inspected by security researchers, Apple's can be examined through its distribution in iOS, a spokesperson said. The company added that it does not scan private photo libraries that have not been uploaded to iCloud. The changes will roll out through an iPhone update later this year.


