Apple plans to scan US iPhones for child sexual abuse images

by Startup Miles

Apple has unveiled plans to scan photo libraries stored on U.S. iPhones for known images of child sexual abuse, drawing praise from child protection groups but raising concern among security researchers and privacy campaigners that the system could be misused by governments looking to surveil their citizens. The company will also examine the contents of end-to-end encrypted messages for the first time.

Apple said its messaging app will use on-device machine learning to warn about sensitive content without making private communications readable by the company. A separate tool, which Apple calls “neuralMatch”, will detect known images of child sexual abuse without decrypting people’s messages. If it finds a match, the image will be reviewed by a human, who can notify law enforcement if necessary.

But researchers worry the matching tool – which does not “see” images, just mathematical fingerprints that represent them – could be put to different purposes.
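To illustrate how such fingerprint matching works in principle, the sketch below compares a photo’s fingerprint against a list of known hashes without ever interpreting the image itself. It is a minimal illustration, not Apple’s implementation: real systems use perceptual hashes such as PhotoDNA or Apple’s NeuralHash, which tolerate resizing and re-encoding, whereas the cryptographic SHA-256 hash used here is only a simple stand-in and the hash list is a placeholder.

```python
# Minimal sketch of fingerprint matching against a list of known-image hashes.
# Illustrative only: real systems use perceptual hashes (PhotoDNA, Apple's
# NeuralHash) that survive resizing and re-encoding; SHA-256 is a stand-in.
# The matcher never interprets the image, it only compares fingerprints.
import hashlib

# Hypothetical, placeholder fingerprints of known abusive images,
# distributed to the device as opaque values.
KNOWN_HASHES = {
    "placeholder-fingerprint-1",
    "placeholder-fingerprint-2",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a fixed-length fingerprint of the raw image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_image(image_bytes: bytes) -> bool:
    """True if this image's fingerprint appears in the known-image list."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```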

Matthew Green, a cryptography researcher at Johns Hopkins University, warned that the system could theoretically be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child abuse material. “This is a thing that you can do,” Green said. “Researchers have been able to do this pretty easily.”
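Green’s point concerns the matching layer rather than the images themselves. The toy snippet below is not the technique researchers actually used, which relies on subtly perturbed images that fool perceptual hashes; it simply brute-forces a deliberately tiny, hypothetical fingerprint to show that when a fingerprint carries few effective bits, unrelated inputs can be made to collide with a target.

```python
# Toy illustration only: real attacks on perceptual-hash systems use carefully
# perturbed images, not random bytes. Here a deliberately tiny 16-bit
# fingerprint stands in for an approximate hash, and brute force quickly finds
# an unrelated input that matches a target fingerprint.
import hashlib
import itertools
import os

def tiny_fingerprint(data: bytes, bits: int = 16) -> int:
    """Top `bits` bits of SHA-256, standing in for a low-precision fingerprint."""
    digest = int.from_bytes(hashlib.sha256(data).digest(), "big")
    return digest >> (256 - bits)

target = tiny_fingerprint(b"bytes of some known image")  # hypothetical target

# Try random inputs until one collides with the target fingerprint
# (roughly 65,000 attempts on average for a 16-bit fingerprint).
for attempt in itertools.count(1):
    if tiny_fingerprint(os.urandom(32)) == target:
        print(f"unrelated input matched the fingerprint after {attempt} tries")
        break
```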

Tech companies including Microsoft, Google and Facebook have for years been sharing digital fingerprints, or “hash lists”, of known child sexual abuse images. Apple has used those fingerprints to scan user files stored in its iCloud service, which is not as securely encrypted as its messages, for such images. But the decision to move that scanning onto the device itself is unprecedented among major technology companies.

Alongside the neuralMatch technology, Apple plans to scan users’ encrypted messages as they are sent and received using iMessage. An AI-based tool will attempt to automatically identify sexually explicit images, enabling parents to turn on automatic filters for their children’s inboxes. That system, which is aimed purely at providing tools to “warn children and their parents when receiving or sending sexually explicit photos”, will not result in those images being sent to Apple or reported to the authorities, but parents can choose to be notified if their child sends or receives such a photo.
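The reported behaviour of that iMessage feature amounts to a purely local decision. The sketch below assumes a hypothetical on-device classifier has already flagged an image; the account fields and function names are illustrative, not Apple’s API, and nothing in the flow leaves the device.

```python
# Sketch of the locally made decision described above. The classifier itself
# is not shown; `flagged` is assumed to come from a hypothetical on-device
# model, and the field and function names are illustrative, not Apple's API.
from dataclasses import dataclass

@dataclass
class Account:
    is_child: bool
    parental_notifications_enabled: bool

def handle_flagged_image(flagged: bool, account: Account) -> dict:
    """Decide, on the device only, whether to blur the image, warn the child,
    and notify a parent. The image is never forwarded off the device."""
    warn = flagged and account.is_child
    return {
        "blur_and_warn_child": warn,
        "notify_parent": warn and account.parental_notifications_enabled,
        "report_to_apple_or_authorities": False,  # never, per Apple's description
    }

# Example: a flagged image arriving on a child's account with parental alerts on.
print(handle_flagged_image(True, Account(is_child=True, parental_notifications_enabled=True)))
```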

The company has been under pressure from governments and law enforcement to allow surveillance of encrypted data. Devising these measures required Apple to perform a delicate balancing act between cracking down on the exploitation of children and keeping its high-profile commitment to protecting the privacy of its users.

But the Electronic Frontier Foundation, an online civil liberties pioneer, called Apple’s compromise on privacy protections “a shocking about-face for users who have relied on the company’s leadership in privacy and security”.

The computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child abuse images online, acknowledged the potential for abuse of Apple’s system but said it was far outweighed by the imperative of tackling child sexual abuse.

“Apple’s expanded protection for children is a game changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement. “With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.”

Julia Cordua, the CEO of Thorn, said that Apple’s technology balances “the need for privacy with digital safety for children.” Thorn, a non-profit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.
