5 Aug, 2021 19:07

Apple to scan photos on all US iPhones for ‘child abuse imagery’ as researchers warn of impending ‘1984’ – reports


US iPhone users’ photos will be scanned by Apple’s automated “neuralMatch” system for pictures of child porn and abuse, according to reports. Security researchers are alarmed the scheme threatens privacy and encryption.

The Financial Times reported on the plan on Thursday, citing anonymous sources briefed on Apple’s plans. The scheme was reportedly shared with some US academics earlier in the week in a virtual meeting.

Dubbed “neuralMatch,” the system will reportedly scan every photo uploaded to iCloud in the US and tag it with a “safety voucher.” Once a certain number of photos – not specified – are labeled as suspect, Apple will decrypt the suspect photos and inform human reviewers – who can then contact the relevant authorities if the imagery can be verified as illegal, the FT report said. The program is initially intended to be rolled out in the US only.
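The reported flow – hash each upload, attach a “safety voucher,” and escalate only once enough vouchers are flagged – can be sketched in miniature. This is an illustrative sketch only: Apple’s actual system reportedly uses a neural perceptual hash, whereas a plain SHA-256 digest stands in here (so only exact copies match), and the threshold value and all names below are hypothetical.

```python
import hashlib

THRESHOLD = 3  # hypothetical: flagged photos required before human review


def digest(photo_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; SHA-256 matches exact duplicates only.
    return hashlib.sha256(photo_bytes).hexdigest()


# Hypothetical blocklist of hashes of known illegal imagery.
blocklist = {digest(b"known-image-%d" % i) for i in range(5)}


def scan_upload(photo_bytes: bytes, vouchers: list) -> list:
    """Tag each upload with a 'safety voucher' recording whether it matched."""
    vouchers.append({"matched": digest(photo_bytes) in blocklist})
    return vouchers


def needs_human_review(vouchers: list) -> bool:
    """Escalate only once THRESHOLD matches have accumulated."""
    return sum(v["matched"] for v in vouchers) >= THRESHOLD
```

The threshold is the key design choice in the reported scheme: no single match triggers decryption or review, only an accumulation of them.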

The plan was described as a compromise between Apple’s promise to protect customer privacy and demands from the US government, intelligence and law enforcement agencies, and child safety activists to help them battle terrorism and child pornography.

Researchers who found out about the plan were alarmed, however. Matthew Green, a security professor at Johns Hopkins University, was the first to tweet about the issue in a lengthy thread late on Wednesday.

The problem with this approach, Green warned, is that whoever controls the list of prohibited imagery “can search for whatever content they want on your phone, and you don’t really have any way to know what’s on that list because it’s invisible to you.”

Depending on how the system works, “it might be possible for someone to make problematic images that ‘match’ entirely harmless images. Like political images shared by persecuted groups,” he added. While he could see internet trolls doing it as a prank, Green added “there are some really bad people in the world who would do it on purpose.”
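The collision attack Green describes can be demonstrated against a toy hash. This is a deliberately weak stand-in, not Apple’s hash function: summing bytes mod 256 makes collisions trivial to construct, which is exactly the property an attacker would hunt for in a real perceptual-matching system.

```python
def weak_hash(data: bytes) -> int:
    # Deliberately weak stand-in for a perceptual hash: sum of bytes mod 256.
    return sum(data) % 256


# A harmless image an attacker wants to get flagged (hypothetical contents).
harmless = b"protest-flyer.png contents"
target = weak_hash(harmless)

# The attacker pads an unrelated file until its hash collides with the target.
crafted = b"cat-photo.jpg contents"
pad = (target - weak_hash(crafted)) % 256
crafted += bytes([pad])

assert weak_hash(crafted) == weak_hash(harmless)
```

With a robust hash, finding such a collision should be computationally hard; Green’s worry is that perceptual hashes, which must tolerate resizing and recompression, offer attackers far more room than cryptographic ones.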

“I don’t particularly want to be on the side of child porn and I’m not a terrorist. But the problem is that encryption is a powerful tool that provides privacy, and you can’t really have strong privacy while also surveilling every image anyone sends,” he tweeted.

Several other researchers echoed Green’s concerns. Apple’s move was “tectonic” and a “huge and regressive step for individual privacy,” Alec Muffett, a security researcher and privacy campaigner who worked at Facebook and Deliveroo, told FT.

“Apple are walking back privacy to enable 1984,” he added.

Ross Anderson, professor of security engineering at the University of Cambridge, called it “an absolutely appalling idea” that will lead to “distributed bulk surveillance” of people’s phones and laptops.

Word about Apple’s snooping plan comes just weeks after the revelation that iPhones around the world – but reportedly not in the US, for some reason – were targeted by Pegasus, spying malware deployed by the Israeli company NSO, to keep tabs on over 50,000 people, including journalists, dissidents and even heads of state.

