Apple makes it clear: CSAM scanning will run only on American iPhones and iPads, and only with iCloud Photos turned on

Apple has issued an official statement about its new tool for detecting child sexual abuse material (CSAM) on iOS and iPadOS.

The announcement raised privacy concerns, with many assuming that photos from users’ albums would be uploaded to the cloud and viewed by service personnel. Apple said it is not going to scan the entire photo libraries of iPhone and iPad users for child pornography. Instead, the company will use cryptography to compare images against a known database of hashes provided by the National Center for Missing & Exploited Children.
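The comparison described above can be sketched as a simple hash lookup. Note the heavy caveats: Apple’s actual system uses a perceptual hash called NeuralHash combined with cryptographic private set intersection, so matching happens without either side learning non-matching values; the exact-match SHA-256 sketch below, with its placeholder database, is only an illustration of the general idea of comparing hashes rather than inspecting photos.

```python
import hashlib

# Hypothetical database of known image hashes (placeholder SHA-256
# digests; the real system uses NeuralHash values derived from the
# NCMEC database, not plain cryptographic hashes).
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known database.

    Exact-match sketch only: changing a single byte yields a different
    digest, whereas a perceptual hash like NeuralHash is designed to
    also match visually similar images.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

# Only a flagged image would trigger any further review; ordinary
# photos simply fail the lookup.
print(matches_known_database(b"known-image-bytes"))    # True
print(matches_known_database(b"user-vacation-photo"))  # False
```

The key point the sketch captures is that the device never needs to send photo contents for inspection; it only needs a yes/no answer about membership in a fixed list of hashes.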


Apple said the system has been in development for years and is not intended as a tool for government surveillance of citizens. Users in Russia and other countries do not need to worry: Apple has made it clear that the system will operate only in the United States, and only when iCloud Photos is enabled.

We are currently testing the iOS 15 beta 4 developer preview, which does not include this feature. It is expected to arrive with the final release of iOS 15 this September.

Earlier, security experts warned that Apple’s new tool, announced yesterday, could be used for surveillance, putting the personal information of millions of people at risk.
