Aug 10, 2021
Laura Berrill

Apple defends new system scanning iCloud for child abuse


Apple has defended its new system that will scan iCloud for illegal child sexual abuse materials, or CSAM, amid controversy over whether it reduces user privacy and could be used by governments to surveil citizens.

Last week Apple announced it had begun testing a system that uses sophisticated cryptography to identify when users upload collections of known child sexual abuse images to its cloud storage service. The company says it can do this without learning the contents of the photos a user stores on its servers.
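At a conceptual level, this kind of detection compares fingerprints of uploaded images against a database of fingerprints of known illegal images, and only flags an account once enough matches accumulate. The sketch below is a deliberately simplified illustration of that matching-and-threshold logic; Apple's actual system uses a perceptual hash (NeuralHash) and cryptographic techniques such as private set intersection, whereas the plain SHA-256 hashes, database contents, and threshold value here are stand-in assumptions.

```python
import hashlib

# Hypothetical fingerprint database of known images. In the real system
# this would hold perceptual hashes supplied by child-safety organizations,
# not SHA-256 digests of raw bytes.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

# Illustrative threshold: an account is flagged only after this many
# uploads match the database, reducing the impact of any single false match.
THRESHOLD = 2

def match_count(uploaded_images):
    """Count how many uploaded images match the known-hash database."""
    return sum(
        hashlib.sha256(img).hexdigest() in KNOWN_HASHES
        for img in uploaded_images
    )

def should_flag(uploaded_images):
    """Flag only when the number of matches reaches the threshold."""
    return match_count(uploaded_images) >= THRESHOLD

print(should_flag([b"known-image-1", b"holiday-photo"]))   # → False (1 match)
print(should_flag([b"known-image-1", b"known-image-2"]))   # → True (2 matches)
```

Note that this sketch omits the cryptography that is the real system's distinguishing feature: in Apple's design the matching happens in a blinded form, so neither the device nor the server learns which individual photos matched until the threshold is crossed.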

Yesterday Apple reiterated that its system is more private than those used by companies such as Google and Microsoft because it uses both Apple's servers and software installed on users' iPhones through an iOS update. Privacy advocates, however, worry that some countries could pass laws expanding the system to check for other images, such as those with political content. Apple countered that governments cannot force it to add non-CSAM images and that the technology is limited to detecting CSAM stored in iCloud.

Companies in the US are required to report CSAM to the National Center for Missing & Exploited Children and face fines of up to $300,000 if they discover illegal images and fail to report them.

A reputation to protect

Any controversy over the system threatens Apple's public reputation for building secure and private devices, a reputation the company has used to break into new markets in personal finance and healthcare. Apple continues to defend the system as a genuine improvement that protects children and will reduce the amount of CSAM being created, while still protecting iPhone users' privacy.

An Apple spokesperson said the system is significantly stronger and more private than previous ones by every privacy metric the company tracks, and that Apple went out of its way to build a better way of detecting these illegal images. Unlike current systems, which run in the cloud and cannot be inspected by security researchers, Apple's can be examined through its distribution in iOS, the spokesperson said. The company added that it does not scan private photo libraries that have not been uploaded to iCloud. The changes will roll out in an iPhone update later this year.

 
