A new system introduced by Apple that scans iPhones for known images of child sexual abuse has raised privacy concerns. The technology works by checking photos against a database of known abusive content before they are uploaded to the iCloud storage service. Critics warn that it could be used to spy on Apple users, and a petition against its use has been signed by several organizations and more than 5,000 people.
While Apple stated it would not expand the technology, digital privacy advocates fear that governments could use the system to monitor dissidents or to target LGBTQ people under repressive regimes. The company said it would not comply with any government request to expand the system beyond its current scope. In a Q&A document, Apple said a number of safeguards block the technology from being used for anything other than detecting images of child sexual abuse. The company added that it had “steadfastly refused” government demands that violate user privacy and would continue to do so, although it has made concessions in some countries in order to keep doing business there.
Apple said the new technology does not scan users’ photo albums and only examines images being shared to iCloud; private images are never seen by the company. During a scan, the tool looks for matches against hashes of known images, supplied by child safety organizations. The company also emphasized that it was “nearly impossible” for the system to falsely flag innocent people, putting the odds at less than one in one trillion per year, and said that every positive match undergoes careful and thorough human review.
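The matching step described above can be illustrated with a short sketch. Apple’s real system uses a perceptual hash (NeuralHash) that tolerates resizing and re-encoding; the sketch below substitutes an ordinary cryptographic hash purely to keep the example self-contained, and the database contents and function names are hypothetical.

```python
import hashlib

def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash. Apple's actual system uses
    NeuralHash; SHA-256 is used here only so the sketch runs as-is."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of known abusive images,
# as would be supplied by child safety organizations.
KNOWN_HASHES = {
    image_hash(b"known-image-bytes-1"),
    image_hash(b"known-image-bytes-2"),
}

def flag_for_review(upload: bytes) -> bool:
    """Flag an upload only if its hash matches the known database.
    Photos that match nothing are never surfaced to anyone."""
    return image_hash(upload) in KNOWN_HASHES

print(flag_for_review(b"known-image-bytes-1"))  # True  (known image)
print(flag_for_review(b"my-holiday-photo"))     # False (private photo)
```

The key property the article describes follows from this design: the system can only recognize images already in the database, so it cannot discover new content on its own.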
Privacy advocates are not convinced. They argue that beyond Apple’s promise, there is no real barrier preventing the technology from being used in other ways. The Electronic Frontier Foundation warned that even a small expansion of the tool’s machine learning capability would allow it to search for other types of content, describing it as a fully built system waiting only for a change to be applied.
The company also reassured users that a separate feature, which uses linked family accounts to warn both children and parents when explicit photos are sent or received, relies on different technology. That feature does not, and Apple says it never will, have access to a user’s private communications, such as call logs, messages, and private images stored on a device or in the cloud.
While privacy advocates pushed back against the new scanning technology, the political world saw it in a more positive light. In the United Kingdom, Health Secretary Sajid Javid said the technology was useful and that it was time for other tech giants, such as Facebook and Google, to adopt similar measures. Whether they will do so, now or in the near future, remains unclear.