Apple is declaring war on child pornography in the USA. The tech company wants to use its “neuralMatch” system to scan devices and iCloud, and promises to comply with data protection rules.
Fighting child pornography with “neuralMatch”
Starting in the fall of 2021, Apple plans to check its users' devices and the files stored in iCloud for child pornography. This was first reported by the Financial Times; Apple has since confirmed the report.
For this purpose, the tech company has developed a system called “neuralMatch”. If the system detects illegal images, human reviewers are automatically notified, who are then expected to “contact the law enforcement authorities”.
How does neuralMatch work?
Apple trained neuralMatch on 200,000 images from the National Center for Missing & Exploited Children.
Apple plans to use hashes to detect problematic material: images on iPhones or in iCloud are matched against the hashes of images already known to contain child pornography.
According to the Financial Times, Apple wants to “tag every photo uploaded to iCloud in the US with a safety voucher.” This voucher encodes whether or not the file is suspect.
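The matching and tagging step described above can be sketched in a few lines of Python. This is only an illustration: the hash function, the voucher structure, and the database contents are all assumptions, and a plain cryptographic hash stands in for Apple's perceptual neural hash.

```python
import hashlib
from dataclasses import dataclass


def image_hash(image_bytes: bytes) -> str:
    # Stand-in for Apple's perceptual neural hash: the real system maps
    # visually similar images to the same value, which SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()


# Hypothetical database of hashes of known illegal material (dummy bytes here).
KNOWN_HASHES = {image_hash(b"known-image-a"), image_hash(b"known-image-b")}


@dataclass
class SafetyVoucher:
    # Hypothetical shape of the "safety voucher" attached to each upload.
    photo_id: str
    suspicious: bool  # whether the photo matched a known hash


def make_voucher(photo_id: str, image_bytes: bytes) -> SafetyVoucher:
    # Tag the uploaded photo with a voucher recording the match result.
    return SafetyVoucher(photo_id, image_hash(image_bytes) in KNOWN_HASHES)
```

The key design point is that only the hash comparison result, not the image content itself, is recorded in the voucher.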
From Apple via an NGO to law enforcement
If a match is found, Apple marks the suspicious image with a certificate, which allows the company to open the file for further examination.
However, the files are only checked once a certain number of matches is reached – Apple has not disclosed exactly how high this threshold is.
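The threshold rule can be sketched as a simple check. The number used here is purely hypothetical, since Apple has not disclosed the real value.

```python
# Hypothetical threshold – Apple has not announced the real number.
MATCH_THRESHOLD = 30


def should_open_for_review(flagged_vouchers: int) -> bool:
    # Files are opened for human review only once an account's count
    # of flagged vouchers reaches the threshold.
    return flagged_vouchers >= MATCH_THRESHOLD
```

Below the threshold, nothing is examined; at or above it, human reviewers step in.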
If the threshold is reached, Apple informs the NGO National Center for Missing & Exploited Children, which in turn notifies law enforcement.
Apple does not plan to notify users in whose accounts it discovers child pornography. However, those users lose access to their Apple accounts.
Apple prepares devices worldwide
The checks will initially be carried out only in the US. However, with the updates to iOS 15, iPadOS 15, watchOS 8 and macOS Monterey, Apple is preparing for a possible worldwide rollout.
This is because the updates will place the hash database on all Apple devices worldwide. It is not yet clear whether and when the checks will also be introduced outside the US.