CSAM Detection on Apple Devices and Privacy

Apple has announced new features aimed at limiting the spread of Child Sexual Abuse Material (CSAM) in the U.S. "The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple," Apple says.

Source: https://www.apple.com/child-safety/

New Features Against CSAM

Apple is introducing new child safety features in three areas. The first, as mentioned above, is on-device machine learning used in the Messages app. Messages will warn children, and inform their parents, when sexually explicit photos are received or sent.
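To make the described behavior concrete, here is a minimal sketch of how such an on-device check might flow. This is not Apple's implementation: the explicit_content_score classifier is a hypothetical placeholder for the on-device ML model, and the confidence cutoff and age limit are assumptions chosen only for illustration.

```python
# Illustrative sketch of the Messages warning flow described above.
# The classifier is a hypothetical stand-in for Apple's on-device model.

WARN_THRESHOLD = 0.9        # assumed confidence cutoff, not Apple's actual value
PARENT_NOTIFY_MAX_AGE = 12  # assumption: parental notification only for young children

def explicit_content_score(image_bytes: bytes) -> float:
    """Placeholder for an on-device ML classifier returning a probability
    that the image is sexually explicit. A real system would run model
    inference here; this stub always returns 0.0."""
    return 0.0

def handle_incoming_photo(image_bytes: bytes, child_age: int) -> None:
    score = explicit_content_score(image_bytes)
    if score >= WARN_THRESHOLD:
        print("Blur the photo and warn the child before viewing.")
        if child_age <= PARENT_NOTIFY_MAX_AGE:
            print("If the child chooses to view it anyway, notify the parents.")
    else:
        print("Deliver the photo normally; nothing leaves the device.")

# Example usage with a dummy payload.
handle_incoming_photo(b"\x00\x01\x02", child_age=11)
```

Since the classifier runs entirely on the device, no photo content needs to be sent to Apple for this warning to work, which is the privacy property Apple's quote emphasizes.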

The second feature targets the spread of CSAM online. "To help address this, new technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC)," says Apple. Apple claims the feature is designed with user privacy in mind: the system performs on-device matching against a database of hashes of known CSAM provided by NCMEC and other child safety organizations.
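The matching step can be pictured with the short sketch below. It is an illustration under simplifying assumptions, not Apple's system: Apple uses a perceptual hash (NeuralHash) and a blinded, encrypted matching protocol, whereas this sketch uses a plain SHA-256 digest and an in-memory set of placeholder hash values purely to show the idea of comparing a photo against a known-hash database on the device.

```python
import hashlib

# Placeholder hash database; in the real system the hashes come from NCMEC
# and other child safety organizations and are not plain SHA-256 digests.
KNOWN_HASHES = {
    "0" * 64,  # dummy entry for illustration
}

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual image hash (Apple's system uses NeuralHash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """On-device check: does the photo's hash appear in the known database?"""
    return image_hash(image_bytes) in KNOWN_HASHES

# Example: check a photo before it is uploaded to iCloud Photos.
print(matches_known_database(b"example photo bytes"))  # False for this dummy input
```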

Another technology, called threshold secret sharing, ensures that Apple cannot interpret the flagged content until a user's account crosses a threshold of known child abuse imagery. Once the threshold is exceeded, the cryptography allows Apple to review the matches and disable the user's account.
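To see what "threshold" means here, the sketch below implements plain Shamir secret sharing. It is a toy model, not Apple's protocol: the assumption is that a decryption key is split into shares, one share is released per matched image, and the key (and hence the content) can only be reconstructed once at least THRESHOLD shares exist.

```python
import random

PRIME = 2**127 - 1  # a large Mersenne prime; all arithmetic is modulo this field

def make_shares(secret: int, threshold: int, num_shares: int):
    """Split `secret` into points on a random polynomial of degree threshold-1;
    any `threshold` points are enough to reconstruct the secret."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 to recover the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Toy model of the threshold: each flagged image contributes one share of a
# decryption key; below the threshold the key cannot be recovered.
THRESHOLD = 3
key = random.randrange(PRIME)
shares = make_shares(key, THRESHOLD, num_shares=10)

print(reconstruct(shares[:THRESHOLD]) == key)      # True: threshold reached
print(reconstruct(shares[:THRESHOLD - 1]) == key)  # False (w.h.p.): below threshold
```

The design intent is that with fewer matches than the threshold, the shares reveal nothing about the key, so Apple learns nothing about accounts below the threshold.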

What About Privacy?

After the announcement, Edward Snowden tweeted, "if they can scan for kiddie porn today, they can scan for anything tomorrow." Researchers also argue that Apple is building a backdoor into its devices and that the Messages app will no longer provide end-to-end encryption.

The changes Apple announced are extremely disappointing. As Edward Snowden put it, if they can scan photos today, they can scan anything tomorrow, and that suggests preserving user privacy will only become harder over time.
