According to the Belarusian website motolko.help, developers with a Belarusian registration address received a letter notifying them of account deactivation. It was also noted that Apple processes payments through Germany's Deutsche Bank, and a number of Belarusian banks, for example Belinvestbank, do not accept payments from it due to sanctions.
motolko.help also quoted Apple's notice: “We noticed an issue when verifying your account. The legal entity information associated with your account fully matches a restricted party or one or more parties from the United States government’s consolidated screening list, another government’s sanctions list, or a restricted regions list.”
Following this news, a post on vc.ru this morning reported that the blocking of Belarusian accounts by Apple was a mistake. According to a new message from Apple, the email stating that the accounts belong to a restricted region was sent in error.
Previously, we posted about Apple’s CSAM detection plans and customers’ worries that the process could be weaponized against users’ privacy. Apple is now temporarily pausing the rollout because of these concerns.
Apple announced the delay on its Child Safety website: “Update as of September 3, 2021: Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
CSAM detection was one of the new features that Apple announced in August. The changes were originally planned to go live with iOS 15 and macOS Monterey later this year in the US. Despite the delay, the company does not appear to have given up on its plan; a new release date for CSAM detection has not yet been announced.
Apple announced new features for limiting the spread of Child Sexual Abuse Material (CSAM) in the U.S. “The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple,” says Apple.
New Features Against CSAM
Apple is introducing new child safety features in three areas. The first, as mentioned above, is on-device machine learning in the Messages app. The Messages app will warn children, and inform their parents, when sexually explicit photos are received or sent.
The second feature targets the spread of CSAM online. “To help address this, new technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC),” says Apple. Apple claims that this feature is designed with user privacy in mind: the system performs an on-device scan against a database of hashes of known CSAM provided by NCMEC and other child safety organizations.
With another technology, called threshold secret sharing, if a user’s account crosses a threshold of known child abuse imagery, the cryptographic protection allows Apple to interpret the matched contents, and the user’s account is disabled.
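To make the idea concrete, here is a heavily simplified sketch of the matching-plus-threshold concept. This is not Apple's actual system: Apple uses a perceptual hash (NeuralHash) and cryptographic threshold secret sharing, whereas this toy uses a plain SHA-256 lookup, and the hash database, threshold value, and function names below are all invented for illustration.

```python
import hashlib

# Hypothetical database of hashes of known CSAM images (stand-in values).
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

# Hypothetical threshold; Apple has not published its real value.
MATCH_THRESHOLD = 2

def account_crosses_threshold(photos: list[bytes]) -> bool:
    """Count photos whose hash appears in the known-hash database and
    report whether the number of matches reaches the threshold."""
    matches = sum(
        1 for photo in photos
        if hashlib.sha256(photo).hexdigest() in KNOWN_HASHES
    )
    return matches >= MATCH_THRESHOLD

# Below the threshold, nothing is flagged; at or above it, the real
# system would allow review and could lead to the account being disabled.
print(account_crosses_threshold([b"holiday.jpg"]))                      # False
print(account_crosses_threshold([b"known-image-1", b"known-image-2"]))  # True
```

The point of the threshold is that a single accidental match reveals nothing; only repeated matches make the account's flagged content interpretable.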
What About Privacy?
After the announcement, Edward Snowden tweeted, “if they can scan for kiddie porn today, they can scan for anything tomorrow.” Researchers also claim that Apple is creating a backdoor on its devices and that the Messages app will no longer provide end-to-end encryption.
The changes Apple announced are extremely disappointing. As Edward Snowden said, if they can scan photos today, they can scan anything one day, and this shows that preserving user privacy will only become more difficult over time.