Here's the Situation With Apple's New Photo Scanning Technology and Privacy

For years, Apple has billed itself as a champion of its users' privacy. The biggest example of this is likely its 2016 dispute with the FBI, when it refused to help unlock an iPhone recovered from one of the perpetrators of the mass shooting in San Bernardino, California.

Investigators had recovered the work phone of one of the shooters, but it was locked with a passcode that would erase all of its data after 10 failed attempts. The FBI asked Apple for help breaking into the phone, and the tech giant declined. Its argument was that if it made a privacy exception in this case, there would be no telling where the exceptions would end – a significant problem for the millions of iPhone users around the world. The FBI would eventually get a third party to help unlock the iPhone, but the end result was the same: Apple showed on a world stage that it cared about the privacy of its users more than just about anything else.

Flash forward to 2021, and that stance is coming under some scrutiny. Apple recently announced that beginning with iOS 15, set to be released in the fall of this year, the Messages app will automatically scan for "explicit" photos on children's accounts. Not only that, but all photos uploaded to iCloud – a feature that is enabled by default – will be scanned for child sexual abuse material as well. Naturally, this has caused a fair amount of controversy, for reasons that are certainly worth exploring.

Apple's Scanning Technology: An Overview

As for the Messages application: with the release of iOS 15, an iPhone will automatically scan both sent and received photos for images considered "sexually explicit" in nature. When a photo matching that description is identified, the device displays a warning outlining the risks of such material and asks the user to confirm that they really want to see the image in question. If the account belongs to a child under the age of 13, their parents will also be notified automatically – provided that both parties have opted into the feature.

To its credit, Apple has indicated that it will use on-device machine learning to make this work, and it claims the model is already accurate enough to tell which images are explicit and which aren't. Crucially, the entire process is executed on the phone itself – no images are sent over the Internet for someone else to review. While this has caused a fair amount of controversy in and of itself, it's nothing compared to the discussion that has started around the automatic scanning for child sexual abuse material.
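To make that flow concrete, here is a minimal Swift sketch of the logic described above. Apple has not published an API for this feature, so the classifier call, the type names, and the opt-in checks below are hypothetical stand-ins rather than real system calls.

```swift
import Foundation

// Hypothetical sketch of the Messages photo check. Nothing here is a real
// Apple API; the classifier stands in for the private, on-device model.
enum ImageVerdict {
    case benign
    case sexuallyExplicit
}

/// Placeholder for the on-device model; no image data leaves the phone.
func classify(_ imageData: Data) -> ImageVerdict {
    return .benign
}

struct MessagesAccount {
    let age: Int
    let familyOptedIn: Bool
}

/// Decides what the Messages UI should do with a sent or received photo.
func handlePhoto(_ imageData: Data, account: MessagesAccount) -> (warnBeforeViewing: Bool, notifyParents: Bool) {
    guard classify(imageData) == .sexuallyExplicit else {
        return (false, false)
    }
    // The user sees a warning and must confirm before the image is shown.
    let warnBeforeViewing = true
    // Parents are notified only for children under 13 whose family opted in.
    let notifyParents = account.age < 13 && account.familyOptedIn
    return (warnBeforeViewing, notifyParents)
}
```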

For all iPhone owners who use iCloud Photos – which, again, is enabled by default and lets you share photos and videos across multiple devices – uploaded content will be checked against a database of known child sexual abuse material. If at least 30 of someone's uploads match images in the database from NCMEC (the National Center for Missing & Exploited Children), their account will automatically be shut down. At that point, they will also be reported to NCMEC, which will likely contact the proper authorities.
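Conceptually, this part of the system works like a membership test plus a counter. The Swift sketch below models it with hypothetical names; Apple's actual pipeline uses a perceptual hash it calls NeuralHash and cryptographic matching rather than a plain set lookup.

```swift
// Hypothetical model of the matching and threshold logic described above.
struct UploadedPhoto {
    let perceptualHash: String
}

struct CSAMMatchPolicy {
    /// Hashes derived from NCMEC's database of known material.
    let knownHashes: Set<String>
    /// Apple's stated threshold before any account-level action is taken.
    let reportThreshold = 30

    /// Counts how many uploads match known material and reports whether that
    /// count crosses the threshold for disabling and reporting the account.
    func evaluate(_ uploads: [UploadedPhoto]) -> (matches: Int, disableAndReport: Bool) {
        let matches = uploads.filter { knownHashes.contains($0.perceptualHash) }.count
        return (matches, matches >= reportThreshold)
    }
}
```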

The important thing to understand is that for a photo to be classified as child sexual abuse material, it needs to already appear both in the NCMEC database and in a second database operated by an independent third-party organization. If you try to upload one of these photos, your iPhone will automatically attach what is called a "safety voucher" to it before it is uploaded to iCloud.
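A rough way to picture the voucher step is a double lookup plus an attachment, as in the sketch below. The names are hypothetical, and Apple's real voucher is an encrypted payload that neither the device nor Apple can read on its own – that encryption is left out of this simplified version.

```swift
// Hypothetical sketch of the "safety voucher" step; the encryption used in
// Apple's actual design is omitted here.
struct SafetyVoucher {
    let photoHash: String
    let matchedBothDatabases: Bool
}

/// A hash only counts as known CSAM if it appears in the NCMEC database *and*
/// in a second database run by an independent organization.
func makeVoucher(photoHash: String,
                 ncmecHashes: Set<String>,
                 secondOrgHashes: Set<String>) -> SafetyVoucher {
    let matched = ncmecHashes.contains(photoHash) && secondOrgHashes.contains(photoHash)
    return SafetyVoucher(photoHash: photoHash, matchedBothDatabases: matched)
}

// The voucher is attached to the photo and travels with it to iCloud.
```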

When it announced the system, Apple said that employees won't be able to look at any of these photos until an account has accumulated at least 30 different matches. Once that happens, the safety vouchers for the matching photos can be decrypted and an actual, human Apple employee will review them. If the reviewer decides that the photos do match child sexual abuse material, Apple will notify the National Center for Missing & Exploited Children, which will then take matters into its own hands.
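In effect, the vouchers behave like sealed envelopes that only open once enough of them match. The sketch below replaces the actual cryptography (a threshold secret-sharing scheme) with an ordinary count check, purely to illustrate the gate; the names are hypothetical.

```swift
// Simplified model of the review gate. In Apple's design the gate is enforced
// cryptographically; here it is just a count check over per-photo match flags.
struct VoucherRecord {
    let photoID: String
    let matchedKnownCSAM: Bool
}

/// Returns the records a human reviewer may see: nothing below the threshold,
/// and only the matching records once the threshold is reached.
func recordsAvailableForReview(_ records: [VoucherRecord], threshold: Int = 30) -> [VoucherRecord] {
    let matched = records.filter { $0.matchedKnownCSAM }
    return matched.count >= threshold ? matched : []
}
```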

Obviously, it's hard to argue against an effort framed around protecting the health and safety of children. But at the same time, privacy advocates can't help but ask: where does this end? What's to stop Apple from one day deciding that some other type of image is questionable? Unfortunately, there are no easy answers right now – but it's a situation that people will definitely be paying close attention to moving forward.