
Apple’s plan to scan iPhones for child porn worries some security experts

Apple recently announced a set of new child safety features for all of its software platforms. The features will roll out with iOS 15, iPadOS 15, watchOS 8, and macOS Monterey in the US later this year, and they aim to limit the spread of Child Sexual Abuse Material (CSAM), among other things. One of them will essentially scan iPhones and iPads for CSAM and report it to the National Center for Missing and Exploited Children (NCMEC). Although Apple claims that its method of detecting known CSAM “is designed with user privacy in mind,” it has raised concerns among security experts.

According to a recent Financial Times report, security researchers have warned that Apple’s new tool could be misused for surveillance, putting millions of people’s personal information at risk. Their concern is based on details Apple shared with some US academics earlier this week. Two unnamed security researchers who attended Apple’s briefing revealed that the proposed system, called “neuralMatch,” would proactively alert a team of human reviewers if it detected CSAM on an iPhone or iPad. The reviewers would then contact law enforcement if they could verify the material.

Although security researchers support Apple’s efforts to limit the spread of CSAM, some have raised concerns about the potential for this tool to be misused by governments to gain access to their citizens’ data. Ross Anderson, Professor of Security Engineering at the University of Cambridge, said, “It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of… our phones and laptops.” Matthew Green, Professor of Computer Science at the Johns Hopkins Information Security Institute, also raised his concerns on Twitter, writing:

But even if you believe Apple won’t allow these tools to be misused…there’s still a lot to be concerned about. These systems rely on a database of “problematic media hashes” that you, as a consumer, can’t review…Hashes use a new and proprietary neural hashing algorithm Apple has developed, and gotten NCMEC to agree to use…We don’t know much about this algorithm. What if someone can make collisions?
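Green is describing a blocklist of image hashes that devices would match their photos against. As a rough, hypothetical illustration of how that kind of hash matching works in general, the Python sketch below uses a simple perceptual “average hash” and a made-up blocklist; it is not Apple’s proprietary NeuralHash, whose details are not public, and the file names are placeholders.

# Illustrative sketch only: a toy perceptual "average hash" matcher,
# NOT Apple's NeuralHash. It shows the general idea of comparing an
# image's hash against a database of known hashes.
from PIL import Image  # pip install Pillow

HASH_SIZE = 8  # 8x8 grid -> 64-bit hash

def average_hash(path: str) -> int:
    """Downscale to 8x8 grayscale and set one bit per pixel above the mean."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist: hashes of previously flagged images
# (in the real system the database is curated by NCMEC and is
# not reviewable by the user).
known_hashes = {average_hash(p) for p in ["flagged_1.jpg", "flagged_2.jpg"]}

def is_flagged(path: str, threshold: int = 5) -> bool:
    """Flag an image if its hash is within `threshold` bits of any known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)

print(is_flagged("photo_on_device.jpg"))

Because the match is a near-match on a compact hash rather than an exact byte comparison, two unrelated images can land within the threshold of each other; that false-positive possibility is the “collision” risk Green alludes to.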

While the algorithm is currently trained to spot CSAM, it could be adapted to scan other targeted imagery or text, like anti-government signs, making it an exceptionally useful tool for authoritarian governments. Apple’s precedent could also force other tech giants to offer similar features, potentially resulting in a privacy nightmare.

Apple is yet to share a response to these concerns. We’ll update this post as soon as the company releases a statement. For more information about the CSAM detection feature, follow this link.




