
#Why experts are worried about Apple’s plan to scan every picture on your iPhone

Last night, Apple announced that it will begin scanning iPhones in the US for Child Sexual Abuse Material (CSAM). As part of this initiative, the company is partnering with the government and making changes to iCloud, iMessage, Siri, and Search.

However, security experts are worried about surveillance and the risk of data leaks. Before looking at those concerns, let's understand what exactly Apple is doing.

#How does Apple plan to scan iPhones for CSAM images?

The CSAM-scanning features rely on a database of fingerprinted images provided by the National Center for Missing and Exploited Children (NCMEC).

Apple will scan your iCloud photos and match them against the NCMEC database to detect any CSAM images. Notably, the company isn't doing this in the cloud; it performs these checks on your device. Apple says that before an image is sent to iCloud storage, an on-device algorithm will compare it against known CSAM hashes.
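Conceptually, this on-device check is a set-membership test against known fingerprints. The sketch below is purely illustrative: it uses SHA-256 as a stand-in for Apple's perceptual NeuralHash (the real hash is designed to survive resizing and re-encoding, which SHA-256 is not), and the "known hashes" set is a placeholder, not real data.

```python
import hashlib

# Toy stand-in for a perceptual image hash (assumption for illustration:
# Apple's NeuralHash is a perceptual hash, not a cryptographic one).
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical fingerprint database (placeholder contents).
known_hashes = {image_hash(b"known-bad-image")}

def matches_known_hash(image_bytes: bytes) -> bool:
    """On-device check: is this image's fingerprint in the known set?"""
    return image_hash(image_bytes) in known_hashes

print(matches_known_hash(b"known-bad-image"))  # True
print(matches_known_hash(b"family-photo"))     # False
```

The key design point the article describes is that this comparison happens before upload, on the phone itself, rather than on Apple's servers.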

When a photo is uploaded to iCloud, Apple creates a cryptographic safety voucher that's stored with it. The voucher contains the details needed to determine whether the image matches known CSAM hashes. Crucially, Apple can't read these details unless the number of matching images in your iCloud account exceeds a certain threshold.
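The voucher-and-threshold idea can be sketched roughly as follows. This is a heavy simplification: Apple's actual design reportedly uses private set intersection and threshold secret sharing so that per-photo match status is cryptographically hidden below the threshold, and the `SafetyVoucher` fields and `THRESHOLD` value here are illustrative assumptions, not Apple's API.

```python
from dataclasses import dataclass

THRESHOLD = 30  # illustrative value, not confirmed by the announcement

@dataclass
class SafetyVoucher:
    matched: bool           # whether the photo matched a known hash
    encrypted_payload: str  # match details, unreadable below the threshold

def account_flagged(vouchers: list[SafetyVoucher]) -> bool:
    """Server-side check: payloads only become readable once enough
    vouchers match; below that, the account is left alone."""
    matches = sum(v.matched for v in vouchers)
    return matches >= THRESHOLD
```

For example, an account with five matching vouchers among thousands of photos would stay below the threshold, and the match details would remain unreadable to Apple.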

Keep in mind that if iCloud sync is turned off on your phone, the scanning won't happen.

For iMessage, Apple will scan for and blur CSAM images. When a child views such an image, their parents will receive a notification so they can take appropriate action.

If a child tries to send such an image, they'll be warned first, and if they go ahead anyway, a notification will be sent to their parents.

It's important to note that parental notifications will only be sent if the child is under 13. Teens aged 13-17 will only see a warning on their own phones.
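The age-based rule above amounts to a simple predicate. The function name and signature below are hypothetical, chosen only to mirror the policy described in the article:

```python
def should_notify_parents(child_age: int, viewed_flagged_image: bool) -> bool:
    """Per the announced policy: parents are notified only for children
    under 13; teens 13-17 get only an on-device warning."""
    return viewed_flagged_image and child_age < 13

print(should_notify_parents(10, True))   # True: under 13, parents notified
print(should_notify_parents(15, True))   # False: teen sees only a warning
```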

The company is also tweaking Siri and Search to provide additional CSAM-related resources for parents and children. And if someone performs CSAM-related searches, Siri can intervene and warn them about the content.
