Posted by Richard Willet - Memes and headline comments by David Icke Posted on 7 August 2021

Apple Starts Scanning All Personal Photos And Images Uploaded To iCloud

Apple will report images of child exploitation uploaded to iCloud in the U.S. to law enforcement, the company said on Thursday.

The new system will detect images of what is known as Child Sexual Abuse Material (CSAM) using a process called hashing, which transforms each image into a unique number that corresponds to that image.
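The matching step described above can be sketched as follows. This is a simplified illustration using an ordinary cryptographic hash (SHA-256); Apple's actual system reportedly uses a perceptual "NeuralHash" (so that visually similar images produce matching values) together with cryptographic protocols not shown here, and the hash values below are invented placeholders.

```python
import hashlib

# Hypothetical database of known-image fingerprints (illustrative values only).
# A real system would ship these as an opaque, pre-computed list.
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    """Reduce an image's bytes to a fixed-size numeric fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_image(image_bytes: bytes) -> bool:
    """Compare the fingerprint against the known-hash list,
    without ever inspecting the image content itself."""
    return image_hash(image_bytes) in KNOWN_HASHES

print(matches_known_image(b"example-flagged-image-bytes"))  # True
print(matches_known_image(b"some-other-photo"))             # False
```

The key point the article makes is visible in the sketch: only fingerprints are compared, never the images themselves, though a cryptographic hash like the one used here would miss even trivially altered copies, which is why perceptual hashing is used in practice.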

Apple started testing the system on Thursday, but most U.S. iPhone users won’t be part of it until an iOS 15 update later this year, Apple said.

The move brings Apple in line with other cloud services which already scan user files, often using hashing systems, for content that violates their terms of service, including child exploitation images.

It also represents a test for Apple, which says its system is more private for users than previous approaches to eliminating illegal images of child sexual abuse: it uses sophisticated cryptography on both Apple's servers and user devices, and it scans only hashes, not the actual images.

But many privacy-sensitive users still recoil from software that notifies governments about the contents on a device or in the cloud, and may react negatively to this announcement, especially since Apple has vociferously defended device encryption and operates in countries with fewer speech protections than the U.S.

Law enforcement officials around the world have also pressured Apple to weaken its encryption for iMessage and other software services like iCloud in order to investigate child exploitation or terrorism. Thursday's announcement is a way for Apple to address some of those demands without abandoning its engineering principles around user privacy.
