Apple plans to start scanning all U.S. iPhones for images of child sexual abuse using a tool called “neuralMatch” in early press coverage and “NeuralHash” by Apple, matching on-device hashes against a database of known images. Womble Bond Dickinson’s JDSupra article covers many of the high-level privacy concerns and explores Apple’s plan for a companion service that will scan encrypted messages for sexually explicit content to provide parental notice. Using a generated hash to check for matches to known felony-level images is a completely different proposition than using one of the many nudity/NSFW detection systems available.

When my old team first processed bulk email for the Enron and related investigations, we came up with a ‘5,000 to 1 rule’: for every 5,000 emails we reviewed, at least one was referred to HR for unprofessional content or images. That NSFW content still shows up in mobile device acquisitions, internet caches, picture folders and chat logs with amazing frequency. Most employees do not have the technical skills or discipline to maintain the digital divide between work and personal content and communications in a BYOD, work-from-home reality. In recent interviews I have encountered knowledge workers continuously channel hopping between multiple accounts and services, professional and personal. Some have hundreds of windows/apps open to manage the firehose of data. BlackBerrys and early smartphones broke the work-life boundary for executives in the late 1990s. The only practical way to keep work and personal communications separate today is to go back to carrying multiple devices.

Questions to consider:

  • Will Apple make that NSFW detection service available to corporations who provide iPhones to their employees?
  • Will corporate mobile device policies push it to employee BYOD iPhones?
  • Would this surface a flood of acceptable use/HR violations? I bet it would.
  • What happens to corporate sensitive data when an executive’s iPhone is locked and seized by officers from an Internet Crimes Against Children (ICAC) task force?
  • Have you considered that Apple can remotely review flagged images, lock an employee’s iPhone and then decrypt the flagged content for law enforcement?
  • How does this NeuralHash technology work?
  • Can it be fooled or used in a cyber-attack scenario? Yes, it can.

Digging into the techie details, Apple’s CSAM system will compute a NeuralHash on the device for each new image uploaded to iCloud Photos and wrap the match result in a cryptographic safety voucher. If an account crosses a set threshold of content matching known images, Apple can decrypt the vouchers and review the flagged images. (Not it for that review job.) If confirmed, Apple disables the user’s account and sends a report to NCMEC for investigation. It is probably a stretch to say that this kind of monitoring could break attorney-client privilege, but my attorney friends are more than happy to comment on that. Apple’s CSAM system uses threshold secret sharing to manage the decryption keys, so Apple cannot view any vouchers until the match threshold is crossed, which protects Apple from accusations of eavesdropping without cause.
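To make that threshold gate concrete, here is a deliberately naive Python sketch. This is not Apple’s protocol; the real design uses private set intersection and threshold secret sharing so the server learns nothing about an account below the threshold. Every name and value below is a hypothetical stand-in.

```python
# Toy illustration of threshold-gated matching. NOT Apple's protocol:
# the real system keeps per-image results cryptographically sealed
# until the account-level threshold is crossed. All names and the
# threshold value here are hypothetical.

KNOWN_IMAGE_HASHES = {"a1b2c3", "d4e5f6"}  # stand-in for the NCMEC hash list
MATCH_THRESHOLD = 30                       # hypothetical threshold value

def upload_photo(account: dict, photo_hash: str) -> None:
    """Record a match voucher per upload; flag the account for human
    review only once matching vouchers cross the threshold."""
    if photo_hash in KNOWN_IMAGE_HASHES:
        account["match_vouchers"] += 1
    if account["match_vouchers"] >= MATCH_THRESHOLD:
        account["flagged_for_review"] = True  # human review, then NCMEC report

account = {"match_vouchers": 0, "flagged_for_review": False}
for h in ["a1b2c3"] * 29 + ["not_a_match"] * 100 + ["d4e5f6"] * 2:
    upload_photo(account, h)
print(account["flagged_for_review"])  # True: 31 matches crossed the threshold
```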

This research paper, Image hash using Neural Networks, is a dry read unless you are a fellow forensic geek. My takeaway is that the system will catch images that have been rotated, cropped or had a Gaussian filter applied. I can think of many image processing approaches that might produce enough change to yield a different hash value (which I am not going to explore in a public venue).
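Apple has not released NeuralHash as a public API, but the open-source imagehash library implements the same class of perceptual hashing, so you can test the robustness claim yourself. The file name below is a placeholder for any local test image.

```python
# Experiment with the open-source imagehash library (pip install pillow
# imagehash). This is NOT NeuralHash, but it is the same class of
# perceptual hash: visually similar images should land on nearby hashes.
from PIL import Image, ImageFilter
import imagehash

original = Image.open("sample.jpg")  # placeholder: any local test image
base = imagehash.phash(original)     # 64-bit perceptual hash

# Transforms a robust perceptual hash is supposed to survive
rotated = imagehash.phash(original.rotate(5, expand=True))
blurred = imagehash.phash(original.filter(ImageFilter.GaussianBlur(2)))

# Subtracting two hashes gives the Hamming distance in bits; a matcher
# declares a hit when the distance falls under some chosen threshold.
print("rotate 5 deg :", base - rotated)
print("gaussian blur:", base - blurred)
```

Mild transforms usually move the hash only a few bits, which is exactly why the match threshold matters: any edit that pushes the distance past the threshold defeats the match without visibly changing the picture.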

It has been decades since my crime lab days searching for child porn files buried on suspects’ drives. We frequently found BBS logs where these perpetrators shared apps, code and techniques to change file extensions, password-protect archives and otherwise hide digital evidence. Serial abusers who share photos in the dark corners of the web frequently have better security protocols than our financial institutions. While I hope that systems like CSAM may stop some abusers, I do not have confidence that they will have any significant impact.

On the other hand, I can see governments and corporations salivating over a system that would catch data leaks at the device level. After all, a PDF is just a container that can carry JPEG image streams and other compressed content, so the same matching approach could be pointed at sensitive documents. Imagine a sales rep’s phone locking up because he had attempted to send a confidential price list to a competitor he was interviewing with. As long as modern employees have to juggle multiple data silos and constantly channel hop to drive their days, we will see personal images, PII and other data contamination in corporate discovery.
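If a vendor repurposed this pipeline for leak detection, the device-side check could be as simple as the hypothetical sketch below. I am using exact SHA-256 hashes for clarity; a real system would presumably use perceptual hashes of rendered pages, like the CSAM pipeline above, so that re-saves and screenshots still match. All paths and names are invented.

```python
# Hypothetical device-side leak screen, sketched with exact SHA-256
# hashes for clarity. A real system would more likely hash rendered
# document pages perceptually so edited copies still match.
import hashlib
from pathlib import Path

def sha256_file(path: Path) -> str:
    """Exact-match fingerprint of a file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Fingerprints of every file the company has classified as confidential
CONFIDENTIAL = {sha256_file(p) for p in Path("confidential").glob("*.pdf")}

def screen_outbound(path: Path) -> bool:
    """Return True when an outgoing attachment matches a known file."""
    return sha256_file(path) in CONFIDENTIAL

if screen_outbound(Path("outbox/price_list.pdf")):
    print("Attachment blocked; incident reported to compliance.")
```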

Greg Buckles wants your feedback, questions or project inquiries at Greg@eDJGroupInc.com. Contact him directly for a free 15-minute ‘Good Karma’ call. He solves problems and creates eDiscovery solutions for enterprise and law firm clients.

Greg’s blog perspectives are personal opinions and should not be interpreted as professional judgment or advice. Greg is no longer a journalist and all perspectives are based on the best public information. Blog content is neither approved nor reviewed by any providers prior to being published. Do you want to share your own perspective? Greg is looking for practical, professional, informative perspectives free of marketing fluff, hidden agendas or personal/product bias. Outside blogs will clearly indicate the author, company and any relevant affiliations.

See Greg’s latest pic on Instagram.
