
Published 12:47 IST, August 19th 2021

Policy groups worldwide ask Apple to scrap plan to scan iPhones for child abuse images

Washington-based CDT cautioned that hash-based identification of problematic content is a blunt tool and that governments will “push Apple to expand its use."

Reported by: Zaini Majeed

At least 90 political, human rights, and advocacy groups worldwide have sent a letter to Apple urging the tech giant to scrap its plan to scan iPhones for images or messages related to child sexual abuse material (CSAM). According to multiple reports, an Apple spokesperson said on August 18 that the company was facing pushback over privacy and security concerns.

The advocacy groups stated that while the goals of protecting children and curbing child pornography are important, scanning iPhones without prior user consent threatens the privacy and security of iPhone users worldwide. “Apple is taking the wrong approach to meeting them,” the signatories, including the Washington, DC-based nonprofit Center for Democracy & Technology (CDT), said in a statement on Wednesday.

“Apple plans to use hashes–the digital fingerprints of files–of content that has been previously reported to the National Center for Missing and Exploited Children (NCMEC) and unidentified other child safety organizations. This means every photo a user wants to upload to iCloud will be scanned and evaluated,” CDT warned.
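For illustration, the hash-matching approach the letter describes can be sketched in a few lines of Python. This is a simplified, hypothetical sketch, not Apple's implementation: Apple has described a perceptual "NeuralHash" combined with cryptographic matching techniques, whereas the code below uses an ordinary SHA-256 digest and a plain set lookup, and the database contents, file paths, and function names are invented for this example.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known, previously reported
# images. In the system Apple has described, this role is played by a
# database of perceptual (NeuralHash) hashes supplied by NCMEC and other
# child safety organizations; here it is just a set of hex digests.
KNOWN_HASHES = {
    "9f2b...",  # placeholder entry; a real database would hold many hashes
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 digest of a file (its 'digital fingerprint')."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_for_review(photo: Path) -> bool:
    """Flag a photo if its fingerprint matches a known reported image."""
    return file_hash(photo) in KNOWN_HASHES

# Every photo queued for upload would be checked before reaching iCloud.
for photo in Path("upload_queue").glob("*.jpg"):
    if flag_for_review(photo):
        print(f"{photo} matches a reported image and would be flagged")
```

One relevant design distinction: an exact cryptographic hash like SHA-256 matches only byte-identical files, whereas a perceptual hash is built to also match visually similar images, which is part of why critics worry both about false matches and about how easily a different hash database could be swapped in.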

The Washington-based CDT cautioned that hash-based identification of problematic content is a blunt tool and that governments will “push Apple to expand the use of its hash database beyond scanning content in iCloud.” “Once Apple introduces the capability of client-side scanning for CSAM, it will open the door to demands for such scanning to occur on all images stored on the phone,” it said. The nonprofit questioned Apple’s ‘misguided’ plan and asked Apple to halt such surveillance of the encrypted Messages and iCloud file storage services on iPhones worldwide.

The groups also fear that the mechanism could backfire: it might be exploited to censor protected speech and could prove disastrous for the very children it is meant to protect, they argued in the letter accessed by news agencies. Some of the overseas signatories questioned Apple’s decision, citing encryption and privacy concerns and potential conflicts with the legal systems and privacy laws of many countries. Sharon Bradford Franklin, co-director of the CDT Security and Surveillance Project, told UK broadcasters that Apple’s move was ‘disappointing and upsetting’ as the company had previously built a reputation for defending encryption.

'Dire consequences' of Apple's mechanism

Several signatories raised the issue of nonconsensual scanning by Apple’s software and the deployment of image detection systems that may violate some countries’ digital regulations. The mechanism could weaken encryption, the global advocacy groups argued, adding that it may have dire consequences later on.

Among the signatories objecting to Apple’s recent announcement are groups from India, Mexico, Germany, Argentina, Ghana, and Tanzania that have staunchly opposed device or iMessage scanning. The plan would also threaten children’s privacy when images stored on a parent’s phone are flagged, although Apple argues that it would attempt to blur nudity in children’s images and allow the company to inspect them only after parents were informed. But advocacy groups and security experts fear that Apple’s “sexually explicit images” classifier will make mistakes, and that it will scan all images sent to and from a “child” account in Apple’s “Family Sharing” service.

“Apple’s classifier will also inevitably make errors, meaning that parents will be told that someone is attempting to send their young child ‘sexually explicit photos’ when in reality, it’s just grandpa trying to share a photo of the beach,” CDT stated.
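The groups’ false-positive argument can be made concrete with a minimal, purely hypothetical sketch: any classifier that reduces an image to a score and compares it against a threshold will sometimes flag innocuous photos. The scores, file names, and threshold below are invented for illustration and have nothing to do with Apple’s actual model.

```python
# Minimal illustration of why a score-threshold classifier produces
# false positives: any cutoff trades missed detections against
# mistaken flags. All values here are invented.
THRESHOLD = 0.8  # hypothetical cutoff for "sexually explicit"

incoming_photos = {
    "beach_photo_from_grandpa.jpg": 0.83,  # innocuous, but scores high
    "homework_screenshot.png": 0.05,
    "swim_meet.jpg": 0.78,
}

for name, score in incoming_photos.items():
    if score >= THRESHOLD:
        # In the scheme the groups describe, this is the point where a
        # parent notification would fire, even on a false positive.
        print(f"flagged: {name} (score {score:.2f})")
```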

“Teenagers who receive mistaken warnings about the images they’re sending or receiving are also at risk, whether it’s the shame a girl might feel when her innocuous selfie is flagged as ‘sensitive,’” the groups argued. 

Updated 12:47 IST, August 19th 2021