Apple Formally Gives Up Its Plans To Scan iCloud Photos For Child Abuse Material

One of Apple’s most contentious ideas ever, a plan to scan iCloud photos for child sexual abuse material (CSAM), has been formally dropped.

Apple announced in 2021 that it planned to roll out on-device scanning, a new iOS feature that would silently comb through each user’s photos for signs of CSAM. If the scanner found such material, it would flag it for human reviewers, who could then alert the authorities.

Privacy and security experts reacted angrily to the proposal right away, arguing that the scanning tool could eventually be repurposed to search for other kinds of content.

Critics argued that building such scanning capability into iOS at all was a slippery slope toward broader surveillance abuses, and many feared the technology could quickly become a backdoor for law enforcement.

Apple pushed back against these criticisms at the time, but it ultimately relented and said it would “postpone” the feature’s rollout.

It now appears that that time will never come. Alongside a slew of new iCloud security measures announced on Wednesday, the company confirmed that it will not be moving forward with its plans for on-device scanning.

Apple’s plan appeared well intentioned. The online proliferation of CSAM is a serious problem, one that experts say has worsened in recent years, and trying to address it was clearly a worthwhile goal.

In the end, however, the underlying technology Apple proposed, along with the privacy risks it posed, simply wasn’t the right tool for the job.


texasstandard.news contributed to this report.
