New child protection tech: Apple has defended its new system that scans users’ phones for child sexual abuse material (CSAM) after it drew a backlash from customers and privacy advocates.

The technology checks photos for matches against known abuse material before they are uploaded to iCloud storage.

Critics have warned that it could become a “backdoor” for spying on people, and more than 5,000 people and organizations have signed an open letter against the technology.

As a result, Apple has pledged not to “expand” the system for any reason.

Last week, digital privacy campaigners warned that authoritarian governments could use the technology to bolster anti-LGBT regimes or suppress political dissidents in countries where protests are considered illegal.

Apple, however, said it would refuse any government demand to expand the system.

The company released a question-and-answer document saying it has numerous safeguards preventing its systems from being used for anything other than detecting child abuse images.

“We have faced demands before to develop and implement government-sanctioned changes that violate user privacy, and we have steadfastly refused those demands. We will continue to refuse them in the future,” the statement said.

Nevertheless, Apple has made some concessions in the past in order to continue operating in countries around the world.

On New Year’s Eve 2020, the tech giant removed 39,000 gaming apps from its Chinese App Store amid a crackdown by the authorities there on unlicensed games.

Apple also said that its anti-CSAM tool does not allow the company to see or scan a user’s entire photo library; it only checks photos that are uploaded to iCloud.

The system performs the matching on the device itself, comparing each photo against a database of hashes of known CSAM images provided by child safety organizations.
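
To make the matching step concrete, here is a minimal sketch in Swift. It is not Apple’s implementation: the function names and the database loader are hypothetical, and a plain SHA-256 digest stands in for Apple’s perceptual “NeuralHash,” which also matches resized or re-encoded copies of a picture. The sketch only illustrates the idea of comparing an on-device fingerprint of each photo against a preloaded set of known-image hashes before upload.

```swift
import Foundation
import CryptoKit

// Minimal sketch of hash-database matching; the names and the database loader
// are hypothetical, and SHA-256 stands in for Apple's perceptual NeuralHash.

/// Placeholder loader for the on-device list of known-image hashes
/// (in practice this would be a blob distributed with the operating system).
func loadKnownHashDatabase() -> Set<String> {
    return []
}

/// Stand-in fingerprint: a hex-encoded SHA-256 digest of the raw image bytes.
/// A cryptographic hash only matches byte-identical files; a perceptual hash
/// would also match resized or re-encoded copies of the same picture.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Returns true if the photo's fingerprint appears in the known-image database.
func matchesKnownMaterial(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}

// Example: check a single photo before it is queued for iCloud upload.
let knownHashes = loadKnownHashDatabase()
let photo = Data() // stand-in for image bytes read from the photo library
print(matchesKnownMaterial(photo, knownHashes: knownHashes)) // prints "false"
```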

Apple also claims that it’s nearly impossible to mistakenly report innocent people to the police. “The likelihood that the system will incorrectly flag an account is less than one in one trillion per year,” the statement says. Positive matches are also verified by humans.
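
As a purely illustrative toy calculation (the per-photo false-match rate and the flagging threshold below are hypothetical numbers, not Apple’s published parameters), requiring several independent matches before an account is ever flagged drives the combined false-alarm probability down very quickly:

```swift
import Foundation

// Toy binomial-tail calculation with hypothetical numbers, not Apple's parameters.
// If each photo has an independent false-match probability p against the hash
// database, and an account is flagged only when at least `threshold` photos
// match, the chance of an innocent account being flagged collapses rapidly.

/// log(n choose k), computed as a sum of logs to avoid overflow.
func logChoose(_ n: Int, _ k: Int) -> Double {
    guard k > 0 && k < n else { return 0 }
    return (1...k).reduce(0.0) { $0 + log(Double(n - k + $1)) - log(Double($1)) }
}

/// P(at least `threshold` false matches among `photos` uploads), via the binomial tail.
func falseFlagProbability(photos n: Int, perPhotoRate p: Double, threshold t: Int) -> Double {
    (t...n).reduce(0.0) { sum, k in
        sum + exp(logChoose(n, k) + Double(k) * log(p) + Double(n - k) * log(1 - p))
    }
}

// Hypothetical example: 10,000 photos uploaded in a year, a one-in-a-million
// per-photo false-match rate, and a threshold of 10 matching photos.
print(falseFlagProbability(photos: 10_000, perPhotoRate: 1e-6, threshold: 10))
// ~3e-27, far below one in one trillion (1e-12)
```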

Privacy advocates, however, argue that the only thing preventing the technology from being used for other purposes is Apple’s promise that it will not be used.

The digital rights group Electronic Frontier Foundation, for example, said that “all that would be required is an extension of machine learning parameters to find additional types of content.”

“It’s not a slippery slope; it’s a completely finished system just waiting for external pressure to make the slightest change,” it warned.

Apple also provided assurances about another new feature, which will warn children, and parents on linked family accounts, when sexually explicit photos are sent or received.

The company claims that its two new features do not use the same technology, and says it will “never” gain access to users’ private messages.

Although Apple’s announcement drew a negative reaction from privacy advocates, some politicians welcomed the new technology.

British Health Secretary Sajid Javid said it was time for other companies, especially Facebook, to follow suit.
