Apple steps back from image scanning algorithms on user devices
07-09-2021 | By Robin Mitchell
Apple recently announced that it would integrate algorithms to detect illicit images on user devices, but it has since stepped back from this plan. What was Apple’s original plan, why has Apple made a U-turn, and what does this tell us about user privacy?
What was Apple’s original plan?
Recently, Apple has been surrounded by controversy over its plan to introduce image-scanning algorithms onto user devices. The idea was that, to prevent illicit material from being uploaded to Apple’s cloud system, a hash-like algorithm would produce a fingerprint of each image being uploaded. This fingerprint would then be sent to a remote server, where it would be compared against a database of fingerprints generated from known illicit material.
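As a rough illustration of that matching step, the Python sketch below hashes an image file and checks the result against a hypothetical database of known fingerprints. Apple’s actual system is reported to use a perceptual hash rather than the cryptographic SHA-256 used here, and the file and database contents are invented for the example, but the comparison principle is the same.

```python
import hashlib

# Hypothetical database of fingerprints of known illicit material.
# (Values here are made up purely for illustration.)
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Produce a fixed-length fingerprint of an image's contents."""
    return hashlib.sha256(image_bytes).hexdigest()

def check_upload(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint matches a known entry."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

if __name__ == "__main__":
    # "holiday_photo.jpg" is a hypothetical file standing in for an upload.
    with open("holiday_photo.jpg", "rb") as f:
        flagged = check_upload(f.read())
    print("Match found" if flagged else "No match")
```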
While some would say that this could help protect minors, others see it as a gross violation of user privacy. Because the algorithm scans all content being uploaded to the cloud, it has access to all private data, whether that is a screenshot of bank details, a user’s home address, or pictures of a private nature. Of course, the algorithm doesn’t upload the raw contents of a file, but this doesn’t mean that the nature of an image can’t be determined.
For example, there are significant similarities between images of naked individuals, and these files will produce fingerprints that are similar to each other. As such, the algorithm can determine if someone is uploading content that includes nude images, and a successful cyber-attack could therefore reveal whether naked images of the user exist.
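To show why visually similar images can yield similar fingerprints, the sketch below implements a basic “average hash”, one of the simplest perceptual-hashing techniques (not the algorithm Apple uses). The file names are hypothetical; the point is that two similar photos differ in only a few bits, while an unrelated image differs in many.

```python
from PIL import Image  # Pillow, used here purely for illustration

def average_hash(path: str, size: int = 8) -> int:
    """A simple perceptual 'average hash': shrink the image, then set one
    bit per pixel depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bits in which two fingerprints differ."""
    return bin(a ^ b).count("1")

# Hypothetical files: two similar photos and one unrelated image.
h1 = average_hash("photo_a.jpg")
h2 = average_hash("photo_a_cropped.jpg")
h3 = average_hash("unrelated.jpg")

print(hamming_distance(h1, h2))  # small distance -> similar content
print(hamming_distance(h1, h3))  # large distance -> different content
```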
This is not the first time companies have scanned user material (Google’s Gmail, for example, scans user emails for keywords). What makes this situation different is that Apple, as a company, prides itself on user privacy, which makes the move appear hypocritical.
Why has Apple made a U-turn?
When Apple announced its plans to introduce the algorithm, it likely expected praise for its attempt to fight illicit content. However, the reaction from the general public was closer to outrage than praise, and news reports covering the algorithm almost unanimously raised concerns about user privacy.
The idea that a business could scan all user content to create digital fingerprints and then send them to a remote server outside of the user’s control was seen as taking the invasion of privacy too far.
Furthermore, the algorithm could also be adapted to ban content that a company and/or government deems inappropriate, from memes to political jokes. This is a particular concern in countries such as China, where human rights are already violated daily.
As such, Apple announced that it would be holding back on deploying its algorithm in future devices. While this does not mean that the algorithm won’t eventually be used, it does demonstrate that user privacy is already highly valued by the general public.
What does this tell us about user privacy?
If electronics engineers can learn anything from this incident, it is that the general public highly values its privacy and wants it protected as much as possible. The desire for user privacy, combined with regulations such as GDPR, means that engineers need to consider protecting user data when designing electronics.
For example, a design that uses FLASH memory to store personal data should allow the user to wipe that data (or at least encrypt it and discard the key). This can be achieved in either hardware or software; what is essential to recognise is that even the simplest devices can hold private data.
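As a minimal sketch of the “encrypt it and throw away the key” approach (sometimes called crypto-erase), the Python example below uses the cryptography package’s Fernet cipher. A real embedded design would more likely use a hardware crypto engine and a protected key store, but the principle is the same: destroying the key renders the stored data unreadable without erasing the FLASH itself.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# The key would normally live in a small protected key store,
# separate from the bulk storage holding the encrypted data.
key = Fernet.generate_key()
cipher = Fernet(key)

# Personal data is only ever written to FLASH in encrypted form.
stored_blob = cipher.encrypt(b"user's home address and bank details")

# Normal operation: data is readable while the key exists.
print(cipher.decrypt(stored_blob))

# "Wipe": destroy the key instead of overwriting the whole FLASH part.
key = None
cipher = None
# stored_blob is now just meaningless ciphertext.
```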
Privacy is an essential metric in modern designs, and it also leaves room for engineers to capitalize on privacy protection. A product that offers a level of privacy guarantee could be seen as more desirable, thereby creating a competitive edge.
Overall, Apple has demonstrated that providing privacy is more important than providing security. The last thing that customers want from electronic products is for those products to perform surveillance on them. How can one use a device if they can’t trust it?