Apple’s iPhone includes new tools to flag child sexual abuse

Jack Nicas, The New York Times

Last week, Apple announced changes to the iPhone designed to catch cases of child sexual abuse, a move likely to please parents and police but one that worried privacy watchdogs.

Later this year, the company will begin using complex technology to detect images of child sexual abuse, commonly known as child pornography, that users upload to Apple’s iCloud storage service. Apple also said parents will soon be able to turn on a feature that flags when their children send or receive nude photos in text messages.

Apple said it designed the new features in a way that protects user privacy, including by ensuring that Apple never sees or learns of any nude images exchanged in a child’s text messages. The scanning is done on the child’s device, and notifications are sent only to parents’ devices. Apple provided quotes from cybersecurity experts and child-safety groups that praised the company’s approach.

Other cybersecurity experts were still concerned. Matthew Green, a cryptography professor at Johns Hopkins University, said Apple’s new features set a dangerous precedent by creating surveillance technology that law enforcement or governments could exploit.

“They’ve been selling privacy to the world and making people trust their devices,” Green said. “But now they’re basically capitulating to the worst possible demands of every government. I don’t see how they’re going to say no from here on out.”

Apple’s move follows a 2019 investigation by The New York Times that revealed a global criminal underworld exploiting flawed and insufficient efforts to rein in the explosion of images of child sexual abuse. The investigation found that many tech companies failed to adequately police their platforms and that the amount of such content was increasing drastically.

Although the material predates the internet, technologies such as smartphone cameras and cloud storage have allowed the images to be shared more widely. Some images circulate for years, continuing to harm and haunt the people depicted.

The mixed reviews of Apple’s new features show the thin line that tech companies must walk between aiding public safety and ensuring customer privacy. Law enforcement officials have complained for years that technologies such as smartphone encryption hamper criminal investigations, while tech executives and cybersecurity experts argue that such encryption is crucial to protecting people’s data and privacy.

With Thursday’s announcement, Apple tried to thread that needle. It said it had developed a way to help root out child predators that does not compromise the security of the iPhone.

According to Apple, the iPhone will use a technique called image hashing to find child sexual abuse material, or CSAM, uploaded to iCloud. The software boils a photo down to a unique set of numbers, a sort of image fingerprint.
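Apple has not published the internals of its hashing technology, so the following is only a rough illustration of the general idea, not Apple’s method: a simple perceptual “average hash,” sketched here in Python, boils a photo down to a short bit string that survives small changes such as resizing. The function name and parameters are illustrative.

```python
from PIL import Image  # Pillow; illustrative only, not Apple's implementation

def average_hash(path, hash_size=8):
    # Shrink the photo, convert it to grayscale, and record which pixels
    # are brighter than the mean -- a crude "fingerprint" that survives
    # resizing and small edits. Apple's actual system is far more sophisticated.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return int(bits, 2)  # a 64-bit integer fingerprint for an 8x8 grid
```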

The iPhone’s operating system will soon store a database of hashes of known child sexual abuse imagery, provided by organizations such as the National Center for Missing & Exploited Children, and will run those hashes against the hash of each photo in a user’s iCloud to see whether there is a match.

Once an account reaches a certain number of matches, the photos will be shown to an Apple employee to verify that they are indeed images of child sexual abuse. If they are, they will be forwarded to the National Center for Missing & Exploited Children, and the user’s iCloud account will be locked.
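A minimal sketch of that match-and-count flow, reusing the illustrative average_hash above; the database contents, helper names, and threshold value are hypothetical, since Apple has not disclosed how many matches it requires:

```python
KNOWN_HASHES = set()   # fingerprints of known abuse imagery, supplied by a child-safety group
MATCH_THRESHOLD = 30   # hypothetical value; Apple has not published its threshold

def scan_icloud_uploads(photo_paths):
    # Compare each photo's fingerprint against the known database and count
    # the matches. Per Apple's description, nothing is escalated for human
    # review until the count crosses the threshold.
    matches = [p for p in photo_paths if average_hash(p) in KNOWN_HASHES]
    return matches if len(matches) >= MATCH_THRESHOLD else []
```

The threshold is meant to keep a stray false match from triggering a human review on its own.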

Apple said the approach means that, for people who do not have child sexual abuse material on their phones, neither Apple nor the authorities will ever see their photos.

“If you’re storing a collection of CSAM material, yes, this is bad for you,” said Erik Neuenschwander, Apple’s privacy chief. “But for the rest of you, this is no different.”

Apple’s system does not scan videos uploaded to iCloud, even though criminals have used the format for years. In 2019, for the first time, the number of videos reported to the national center exceeded the number of photos. The center often receives multiple reports for the same piece of content.

U.S. law requires technology companies to report cases of child sexual abuse to the authorities, and Apple has historically flagged far fewer cases than its peers. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision, made in the name of user privacy, not to scan for such material.

Apple’s other feature, which scans photos in text messages, will be available only to families with shared Apple iCloud accounts. If parents turn it on, their child’s iPhone will analyze every photo sent or received in text messages to determine whether it includes nudity. Nude photos sent to a child will be blurred, and the child will have to choose whether to view them. If a child under 13 chooses to view or send a nude photo, his or her parents will be notified.
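A minimal sketch of the decision flow described above, with hypothetical names standing in for the on-device classifier and the notification mechanism, neither of which Apple has documented publicly:

```python
PARENT_ALERT_MAX_AGE = 12  # parents are notified only for children under 13

def contains_nudity(photo_bytes):
    # Stand-in for Apple's on-device classifier; a real implementation
    # would run a machine-learning model entirely on the phone.
    return False

def handle_message_photo(photo_bytes, child_age, child_chose_to_view, notify_parent):
    if not contains_nudity(photo_bytes):
        return "show"            # nothing flagged; nothing leaves the device
    if not child_chose_to_view:
        return "keep_blurred"    # photo stays blurred until the child opts in
    if child_age <= PARENT_ALERT_MAX_AGE:
        notify_parent()          # an alert only; the image itself is never sent
    return "show_after_warning"
```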

Green said he worried that such a system could be abused because it showed law enforcement and governments that Apple now has a way to flag certain content on a phone while keeping its encryption intact. Apple has previously argued to the authorities that encryption prevents it from retrieving certain data.

“What if another government asks Apple to use it for other purposes?” Green asked. “What is Apple going to say?”

Neuenschwander dismissed those concerns, saying that safeguards are in place to prevent abuse of the system and that Apple would reject any such demands from a government.

“We will inform them that we did not build the thing they’re thinking of,” he said.

The Times reported this year that Apple had compromised the personal data of its Chinese users in China and had proactively censored apps in the country in response to pressure from the Chinese government.

Hany Farid, a computer science professor at the University of California, Berkeley, who helped develop early image-hashing technology, said the potential risks of Apple’s approach were worth it for the sake of children’s safety.

“I think the benefits outweigh the drawbacks if reasonable safeguards are in place,” he said.

This article originally appeared in The New York Times.
