Apple unveils plan to scan phones for child sexual abuse photos


GREENVILLE, S.C. (WSPA) – In an effort to crack down on Child Sexual Abuse Material (CSAM), Apple plans to scan users’ photos for known abuse images and report them when found.

According to Apple, the scanning will arrive in upcoming versions of iOS and iPadOS and will run before photos are uploaded to iCloud. If a match is detected, the new technology will allow the company to report those instances to the National Center for Missing and Exploited Children (NCMEC).

Though the technology is complex, the process is simple in outline. According to the company, the system performs on-device matching against a database of known CSAM image hashes provided by NCMEC and other child safety organizations.

Those hashes act like digital fingerprints. According to Trend Micro, a file’s contents are processed through a “cryptographic algorithm, and a unique numerical value – the hash value – is produced, that identifies the contents of the file.”
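For readers curious what that looks like in practice, here is a minimal Swift sketch of hashing in the generic sense Trend Micro describes, using SHA-256 from Apple’s CryptoKit as an illustrative stand-in. Apple’s actual system uses a perceptual image hash designed to match visually similar photos, which a plain file hash like this does not replicate.

```swift
import Foundation
import CryptoKit

// Minimal illustration of cryptographic hashing: a file's bytes run
// through an algorithm (SHA-256 here) to produce one fixed-length
// value that identifies the contents. Illustrative only; Apple's
// system uses a perceptual image hash, not a plain file hash.
let photoBytes = Data("example photo contents".utf8)
let digest = SHA256.hash(data: photoBytes)
let hashValue = digest.map { String(format: "%02x", $0) }.joined()
print("hash value:", hashValue)
```

Because the output depends on every byte of the file, two identical files always produce the same hash value, which is what makes matching against a database of known values possible.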

But Apple employees aren’t actively reviewing the photos; the matching happens entirely in software. The company converts the database into an unreadable set of hashes that is “securely stored on users’ devices.”
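Conceptually, the on-device matching step might look like the hedged sketch below. The names and the plain set lookup are hypothetical, not Apple’s API; in Apple’s described design the comparison is cryptographic, so the device itself never learns which photos matched.

```swift
// Hypothetical sketch of the on-device matching step. The plain Set
// lookup is illustrative only; Apple's described design performs the
// comparison cryptographically so the result is hidden from the device.
func loadKnownHashDatabase() -> Set<String> {
    // Stand-in for the unreadable hash database shipped with the OS;
    // placeholder values, not real hashes.
    ["3f2a9c…", "b81c44…"]
}

let knownHashes = loadKnownHashDatabase()

func isKnownMatch(_ photoHash: String) -> Bool {
    knownHashes.contains(photoHash)
}
```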

According to a release from Apple, if an account crosses a certain threshold of CSAM matches, the account is disabled and the user is reported. “The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account,” the company said.
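A rough sketch of that threshold rule follows: matches accumulate per account, and only when the count crosses the threshold is the account flagged. The structure and the value 30 are placeholders for illustration; Apple’s release does not state the exact figure.

```swift
// Illustrative sketch of the threshold rule: an account is flagged only
// after its count of matched photos crosses the threshold. The value 30
// is a placeholder, not a figure from Apple's release.
struct AccountScan {
    private(set) var matchCount = 0
    let threshold = 30  // placeholder value

    mutating func record(photoMatched: Bool) {
        if photoMatched { matchCount += 1 }
    }

    var shouldFlagForReview: Bool { matchCount >= threshold }
}
```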

Chelsey Hucker is the Executive Director for Foothills Alliance, an agency that assists victims and survivors of sexual assault. She believes this move by Apple is a game changer.

“If you can track that photo information, you can track where it came from, who took it, where it was taken,” Hucker said. “And so, you know, that will obviously enhance prosecution of those types of crimes.”

She said it’s about protecting our most vulnerable citizens.

Attorney John Reckenbeil said it’s obvious that no one has a right to privacy in possessing child pornography, adding that it’s against both federal and state law.

“Whenever you’re dealing with child pornography, you never have a right to privacy to such criminal behavior as possessing child pornography. Obviously, that’s against federal and state law,” Reckenbeil said.

But he argues that isn’t where the issue lies.

“However, the way that it seems with what Apple is doing is that they’re going to be scanning your phone kind of like a governmental entity,” Reckenbeil said. “Apple would be triggering the Fourth Amendment because they would be acting as somewhat of a quasi-governmental entity. If they’re acting in the shoes of the government, where they’re scanning your phone for some illegal activity, then there has to be the warrant requirement.”

He advises users to check with their provider, read the terms of use, and know their rights.

“’Cause ultimately, what is going to happen is someone is going to say no, Apple is going to say yes, there’s going to be a lawsuit filed, and a judge is going to have to make a determination,” Reckenbeil said.


If a user feels their account has been mistakenly flagged, they can file an appeal to have it reinstated.

