To combat child sexual abuse, Apple plans to install software on US iPhones that searches for illegal images, specifically those depicting child abuse or exploitation. Many experts, however, believe the move poses a risk to users' privacy, because it could open the door to surveillance of millions of devices around the world.

Apple plans to scan US iPhones for child abuse images


Apple's new system

Apple detailed its proposed system, which it calls "neuralMatch", to some American academics earlier this week. According to two security researchers briefed on the virtual meeting, the program could be rolled out more widely as soon as this week.

NeuralMatch is an automated system that proactively alerts a team of human reviewers when it believes illegal images have been detected, after which the reviewers contact law enforcement and report the owner of the device.


Apple and law enforcement agencies

Tensions between law enforcement agencies and tech companies such as Apple and Facebook, which have defended their increased use of encryption in their products and services, have simmered since 2016, when Apple refused to help the FBI access the iPhone of a suspect in a mass shooting in San Bernardino, California.

In an effort to satisfy both users and the authorities, Apple announced its new NeuralMatch system as a compromise between its promise to protect customer privacy and persistent demands from governments and law enforcement agencies for more assistance in criminal investigations, including those involving terrorism and child pornography.


How does NeuralMatch work?

The NeuralMatch algorithm will continuously scan photos stored on a US user's iPhone and uploaded to the iCloud backup system, comparing them against known images of child sexual abuse.
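To make the matching step concrete, here is a rough Python sketch of the idea: a digest is computed for each photo on the device and looked up in a database of digests of known abuse images. This is a simplified illustration under assumptions of our own; Apple's system is reported to use a perceptual "NeuralHash" digest and blinded database entries, not the plain file hashes and hypothetical helper functions shown here.

```python
import hashlib
from pathlib import Path

# Hypothetical database of digests of known abuse images, as supplied by a
# clearinghouse such as NCMEC. In the reported design these are perceptual
# "NeuralHash" digests distributed in blinded form, not plain SHA-256 values.
KNOWN_IMAGE_DIGESTS = {
    "3f5a9c...",  # placeholder entries for illustration only
}


def image_digest(path: Path) -> str:
    """Digest of the photo's bytes (a stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def matches_known_image(path: Path) -> bool:
    """True if the photo's digest appears in the known-image database."""
    return image_digest(path) in KNOWN_IMAGE_DIGESTS


# Example: check every photo queued for upload to iCloud.
for photo in Path("photos_to_upload").glob("*.jpg"):
    if matches_known_image(photo):
        print(f"{photo.name}: matches the known-image database")
```

The point of matching digests rather than inspecting photos directly is that the device only ever reports whether a digest is in the known set, not what the photo actually contains.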

Apple trained the system on about 200,000 sexual abuse images collected by the US nonprofit National Center for Missing and Exploited Children.

According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a "security voucher" indicating whether it is suspect. Once a certain number of photos are flagged as suspect, Apple will be able to decrypt all of the flagged photos and, if they prove illegal, report them to the authorities immediately.
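The threshold mechanism can also be sketched in a few lines of code. The class names, the threshold value, and the plain counter below are assumptions made for illustration; in the reported design the gate is cryptographic, so the vouchers cannot be opened at all until enough matches accumulate.

```python
from dataclasses import dataclass, field

# Illustrative value only; Apple has not published the actual threshold.
THRESHOLD = 5


@dataclass
class SecurityVoucher:
    """Simplified stand-in for the "security voucher" attached to each upload."""
    photo_id: str
    suspicious: bool


@dataclass
class AccountReview:
    """Collects one account's vouchers and gates review on a match threshold.

    In the reported design this gate is cryptographic (threshold secret
    sharing), so vouchers cannot be inspected below the threshold; here it
    is modelled as a plain counter.
    """
    vouchers: list = field(default_factory=list)

    def add(self, voucher: SecurityVoucher) -> None:
        self.vouchers.append(voucher)

    def ready_for_human_review(self) -> bool:
        flagged = sum(1 for v in self.vouchers if v.suspicious)
        return flagged >= THRESHOLD


# Example: an account uploads 12 photos, 6 of which match the database.
account = AccountReview()
for i in range(12):
    account.add(SecurityVoucher(photo_id=f"img-{i}", suspicious=(i < 6)))
print(account.ready_for_human_review())  # True: 6 flagged vouchers >= THRESHOLD
```

The point of gating on a threshold rather than a single match is presumably to stop one false positive from triggering a report on its own.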


Apple and Privacy

Apple may believe that privacy is every user's right, but despite its good intentions on the issue of child exploitation and abuse, many security experts worry that the company risks enabling governments around the world to seek access to their citizens' personal data, potentially far beyond the original goal.

"It's an absolutely horrific idea, because it would lead to mass surveillance, because Apple's system could be manipulated to look for any other targeted images and text, and most importantly, Apple's idea would increase pressure on other tech companies to use similar technologies," says Ross Anderson, professor of security engineering at the University of Cambridge. .

Alec Muffett, a security researcher and privacy activist who previously worked at Facebook and Deliveroo, said Apple's move was "completely unexpected and a significant step backwards for privacy."

Matthew Green, a security professor at Johns Hopkins University who is believed to be the first researcher to tweet about the issue, also argues that Apple's move will open the floodgates, because governments will pressure other companies to do the same.

Alan Woodward, professor of computer security at the University of Surrey, said: “Apple’s system is less invasive, since the scanning is done on the phone and the authorities are notified only if there is a match; this decentralised approach is about the best you could adopt to preserve privacy.”

In your view, is what Apple is doing the right call, or a violation of privacy? Tell us in the comments.

Source:

Financial Times
