Apple Announces New Photo Scanning Policy to Fight Child Pornography

 Posted on September 27, 2021 in Criminal Defense

Last month, Apple announced new policies and programs intended to protect children from people who use electronic communication tools to recruit and exploit children in order to produce and distribute Child Sexual Abuse Material (CSAM). In addition to safety features that will let parents monitor their children’s phones, Apple announced that it will begin scanning iCloud photos for CSAM and reporting what it finds to the authorities.

Scanning Photos

It may come as a surprise to some users, but most cloud services, including Microsoft, Google, and Dropbox, already scan user files for content that may be illegal or that violates the company’s terms of service. In the past, Apple declined to scan users’ files in this way, but that is changing with the recent announcement.

The company has developed a new technology called NeuralHash. The system computes a “hash,” essentially a compact digital fingerprint, for each photo on a user’s device and compares it against a database of hashes of known CSAM stored by Apple. The CSAM hashes in that database are provided to the company by child protection organizations, such as the National Center for Missing & Exploited Children (NCMEC) and others.

If NeuralHash finds a hash on the user’s device that is similar to one or more of the CSAM hashes stored in Apple’s database, the photo will be flagged as potential CSAM and reported to the authorities. However, unlike a fingerprint match, which is very precise, this kind of hash match is approximate and can be imprecise.
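To make the mechanism concrete, here is a minimal sketch in Python of how matching an image’s perceptual hash against a database of known-image hashes can work in principle. It uses a simple “average hash,” not Apple’s actual NeuralHash algorithm, and the database entry and threshold below are made-up placeholders.

```python
# Illustrative sketch only -- NOT Apple's NeuralHash algorithm.
# It shows the general idea: hash each photo, then compare the hash
# against a database of hashes of known prohibited images.

from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to an 8x8 grayscale image and set one bit per pixel above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bits on which two hashes differ."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of known prohibited images
# (in a real system these would come from organizations such as NCMEC).
known_hashes = {0x8F3C_2A91_D04B_7E15}  # placeholder value

def is_flagged(path: str, threshold: int = 5) -> bool:
    """Flag the photo if its hash is 'close enough' to any known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold for known in known_hashes)
```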

How Accurate Is the NeuralHash System?

Basically, all Apple devices will contain a database of hashes of forbidden images. Every time the user downloads or uploads an image, the device will check it for matches against that database. However, because the matching is approximate rather than exact, there is a real possibility that photos that are not inappropriate or illegal will be flagged and the user’s name turned over to law enforcement, triggering an investigation.
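Continuing the toy sketch above, the snippet below illustrates why approximate matching cuts both ways: a threshold loose enough to catch a lightly edited copy of a known image can also flag an unrelated image whose hash happens to land within the same distance. All hash values and the threshold are invented for illustration.

```python
# Toy illustration of approximate matching (values are made up).
# A small Hamming-distance threshold catches edited copies of a known
# image, but an unrelated image can also fall within the threshold.

def hamming(a: int, b: int) -> int:
    """Number of bits on which two hashes differ."""
    return bin(a ^ b).count("1")

known = 0b1011_0110_1100_1010                 # a hash from the database
edited_copy = known ^ 0b0000_0000_0000_0011   # copy of the image, 2 bits changed by a crop/resize
unrelated = 0b1011_0110_1100_0101             # different image whose hash differs by only 4 bits

THRESHOLD = 4
print(hamming(known, edited_copy) <= THRESHOLD)  # True -- the intended match
print(hamming(known, unrelated) <= THRESHOLD)    # True -- a false positive
```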

Another troubling possibility is that a third party could flood a user’s account with inappropriate images, causing the account to be flagged by Apple. This could lead to unwarranted investigations and perhaps even false charges being filed against the user as a way to set them up, for example as revenge for a breakup or leverage in a child custody battle.

One way for Apple device users to avoid the issue is to not use iCloud. NeuralHash will not be used for customers who do not use iCloud, even if they are using an iPhone or some other Apple device.

Contact a Collin County Defense Attorney

If you have been accused of possessing, creating, or distributing child pornography, you could face significant prison time if you are convicted. Do not try to defend against these charges without a seasoned Plano, TX crimes against children lawyer advocating for you and protecting your rights. Call The Crowder Law Firm, P.C. today at 214-544-0061 to schedule a free and confidential consultation and find out how our firm can help.

Source:

https://www.apple.com/child-safety/pdf/Apple_PSI_System_Security_Protocol_and_Analysis.pdf
