To mitigate the effects of sanctions following its invasion of Ukraine, Moscow has set up its own certificate authority to issue TLS certificates. As announced on the government's website, the certificates will be made available to Russian websites that are unable to renew or obtain security certificates as a knock-on effect of Western sanctions and of organizations refusing to serve Russian customers.
“It will replace the foreign security certificate if it is revoked or expires. The Ministry of Digital Development will provide a free domestic analogue. The service is provided to legal entities – site owners upon request within 5 working days.”
For a browser to establish a secure connection to a website, it must trust the certificate authority that issued the site's certificate. However, Russia has been silent about which browsers will accept these certificates. Given the heavy sanctions against Russia, it seems unlikely that any major browser will trust certificates issued by the Russian certificate authority. Why, then, was it established?
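The trust requirement can be sketched from the client side: a TLS connection only succeeds if the issuing CA's root certificate is in the client's trust store. The sketch below uses Python's standard `ssl` module; the certificate file name is a hypothetical placeholder, not the actual Russian root certificate.

```python
import ssl

# Default context loads the operating system's set of trusted root CAs.
ctx = ssl.create_default_context()

# For a client (or browser) to accept certificates from the new Russian CA,
# its root certificate would have to be added explicitly. The file name
# below is hypothetical:
# ctx.load_verify_locations(cafile="russian_trusted_root_ca.pem")

# Certificate validation is on by default; connections to sites signed by
# an unknown CA fail with a verification error.
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
```

Without that explicit `load_verify_locations` step, connections to sites using the new certificates will simply fail verification, which is exactly the situation most non-Russian browsers are expected to leave in place.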
Russia does have a home-grown alternative: Yandex, the local counterpart to Google, and its YaBrowser will likely trust this certificate authority. That means YaBrowser users will be able to visit websites whose certificates were issued by the Russian certificate authority.
The certificate includes information about the key, information about the identity of its owner (called the subject), and the digital signature of an entity that has verified the certificate’s contents (called the issuer). If the signature is valid, and the software examining the certificate trusts the issuer, then it can use that key to communicate securely with the certificate’s subject (Wikipedia). The key element in digital certificates is ‘trust’. Several news portals considered this Russian certificate authority dangerous: if the authority is under Putin's control, the Russian government could intercept and decrypt all traffic, violating users' privacy and giving the state even more control over internet users in Russia.
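The subject/issuer structure described above can be illustrated with a toy trust check. The record below is hypothetical but shaped like the output of Python's `ssl.SSLSocket.getpeercert()`; the CA name and the contents of the trusted-roots set are assumptions for illustration only.

```python
# Hypothetical certificate record, shaped like ssl.SSLSocket.getpeercert():
cert = {
    "subject": ((("commonName", "example.ru"),),),
    "issuer": ((("commonName", "Russian Trusted Root CA"),),),  # assumed CA name
}

# The trust decision boils down to: is the issuer among the roots
# the client already trusts? (A browser like YaBrowser would ship
# the Russian root; most Western browsers would not.)
trusted_roots = {"Russian Trusted Root CA"}

issuer_cn = dict(pair[0] for pair in cert["issuer"])["commonName"]
print(issuer_cn in trusted_roots)  # True only if the root is in the store
```

Real verification also checks the issuer's cryptographic signature over the certificate and the validity period, but the trust-store lookup is the step that sanctions and browser vendors control.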
Decryption keys for Egregor, Sekhmet and Maze have been shared by someone claiming to be the developer of all three malware families.
The keys were published on the BleepingComputer forums. According to the forum post, this was a planned leak unrelated to recent law-enforcement operations against attackers. The post also claims that none of the team members will ever return to ransomware attacks and that the malware's source code has been destroyed.
The post contained a link to download a 7zip file with four archives containing the Maze, Egregor and Sekhmet decryption keys, as well as the source code for the M0yv malware used by the operators. However, because the content is malicious, the link was removed from the post; it may be possible to contact the poster to obtain it again.
Meanwhile, some experts have confirmed that the decryption keys work.
Earlier, we covered Apple's CSAM detection plans and customers' worries that the process could be weaponized against users' privacy. Apple is now temporarily pausing the rollout in response to those concerns.
Apple announced the delay on its Child Safety website: “Update as of September 3, 2021: Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
CSAM detection was one of the new features Apple announced in August. The changes were originally planned to go live with iOS 15 and macOS Monterey later this year in the US. Despite the delay, the company does not appear to have given up on the plan; a new release date for CSAM detection has not yet been announced.
Apple announced new features to limit the spread of Child Sexual Abuse Material (CSAM) in the U.S. “The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple,” says Apple.
New Features Against CSAM
Apple is introducing new child safety features in three areas. The first, as mentioned above, is on-device machine learning in the Messages app. The app will warn children, and notify their parents, when sexually explicit photos are received or sent.
The second feature targets the online spread of CSAM. “To help address this, new technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC),” says Apple. Apple claims the feature is designed with user privacy in mind: the system performs an on-device match against a database of hashes of known CSAM provided by NCMEC and other child safety organizations.
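The on-device matching step can be sketched as a hash lookup. Note the hedge: Apple's system uses NeuralHash, a perceptual hash designed to survive resizing and re-encoding; the SHA-256 stand-in below only illustrates the lookup logic, and the "database" entry is fabricated for the example.

```python
import hashlib

# Stand-in for the database of hashes of known CSAM supplied by NCMEC.
# Real entries would be NeuralHash values, not SHA-256 digests.
KNOWN_HASHES = {hashlib.sha256(b"known-sample").hexdigest()}  # hypothetical entry

def matches_known_database(photo_bytes: bytes) -> bool:
    """On-device check: hash the photo and look it up in the database."""
    return hashlib.sha256(photo_bytes).hexdigest() in KNOWN_HASHES

print(matches_known_database(b"known-sample"))   # True: hash is in the database
print(matches_known_database(b"holiday-photo"))  # False: unknown image
```

Because only hashes are compared, the device never needs the original illegal images, and (in Apple's design) a non-matching photo reveals nothing to Apple's servers.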
Another technology, called threshold secret sharing, ensures that Apple can interpret the matching contents only if a user's account crosses a threshold of known child-abuse imagery, at which point the cryptographic material becomes available and the account is disabled.
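The idea behind threshold secret sharing is easiest to see with Shamir's classic scheme: a secret is split into shares so that any `threshold` of them reconstruct it, while fewer reveal nothing. Apple's actual construction (threshold sharing over encrypted "safety vouchers") differs in detail; this is a generic textbook sketch.

```python
import random

PRIME = 2**61 - 1  # a Mersenne prime; field size chosen only for illustration

def make_shares(secret: int, threshold: int, n: int):
    """Split `secret` into n points on a random degree-(threshold-1) polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret (the constant term)."""
    total = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total

# Decryption key released only once 3 of 5 possible shares exist:
shares = make_shares(secret=424242, threshold=3, n=5)
print(reconstruct(shares[:3]))  # prints 424242
```

In Apple's design, each matching photo contributes a share; until the account holds enough matches to meet the threshold, the server mathematically cannot decrypt any of the vouchers.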
What About Privacy?
After the announcement, Edward Snowden tweeted, “if they can scan for kiddie porn today, they can scan for anything tomorrow.” Researchers also claim that Apple is creating a backdoor on its devices and that the Messages app will no longer provide true end-to-end encryption.
The changes Apple announced are extremely disappointing. As Snowden said, if they can scan photos today, they can scan anything tomorrow, and this shows that preserving user privacy is only going to get harder day by day.