After backlash, Apple delays its plan to scan devices for child abuse material

Detection technology that could scan devices for child abuse material has been put on hold due to privacy criticism

Privacy issues made Apple stop its plan for the new feature. The update, which was supposed to come out this year, will be improved further after privacy issues with the child abuse material detection were raised by various groups.

Technology giant Apple has paused its plans to scan devices for child sexual abuse and exploitation material after the move caused a major backlash among users and privacy groups. The features were announced in August and were intended for inclusion in iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. However, the reaction showed that many consider an on-device scanning toolset a dangerous thing.

The features would monitor the Messages application, with client-side machine learning implemented to scan images and alert users when sexually explicit material is sent or received. User input would be required, as recipients would choose whether or not they want to view the material.[1] Apple now states that more time is needed to improve the controversial safety tools.

Additionally, changes would be implemented in Siri and Search. The updates would provide additional information to parents and children to warn them about unsafe situations, as well as intervene if a user searched for Child Sexual Abuse Material (CSAM).

Finally, Apple planned to implement a CSAM-scanning tool to protect children from predators who use communication tools to exploit them. The tool would help limit the spread of CSAM online while maintaining users' privacy. Company officials stated that CSAM detection could provide valuable information to law enforcement.[2]

New controls sparked alarm among privacy groups and researchers

Whatever plans and good ideas the company had, everything was put on hold due to significant concern and backlash from customers, human rights organizations, and privacy groups. Apple released a statement saying that, based on negative feedback from customers, advocacy groups, and researchers, it has decided to take additional time to make improvements before releasing the new features.[3]

The feedback was mainly negative, and the Electronic Frontier Foundation announced that it had amassed more than 25,000 signatures from consumers asking to stop the rollout. Around 100 policy and rights groups, including the American Civil Liberties Union, asked Apple to abandon or reconsider its plans to roll out the technology.

Security experts point out that governments could abuse the new feature to implicate innocent people or manipulate the system for their own ends. The company later clarified the situation, stating that the new technology is limited to detecting CSAM and won't be expanded at any government's request.

With growing concerns about cybersecurity, US White House officials maintain a careful narrative around privacy and civil liberties. Officials state that there are no plans to seek additional monitoring of US-based networks. Instead, the administration is said to focus on tighter partnerships and improved information-sharing with private-sector companies that have broad visibility into the domestic internet.[4]

NeuralHash technology presents more questions than answers

NeuralHash technology is based on collaboration with law enforcement officials. Law enforcement agencies maintain a database of known child sexual abuse images and translate those images into hashes: numerical codes that positively identify an image but cannot be used to reconstruct it.
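To make the idea concrete, here is a minimal Python sketch of such a one-way hash database. It uses the standard-library SHA-256 hash purely as a stand-in for the specialised image hashes described above, and the directory and file names are illustrative, not part of any real system:

```python
import hashlib
from pathlib import Path


def hash_image(path: Path) -> str:
    """Return a hex digest identifying the file's contents.

    The hash is one-way: it positively identifies the image,
    but the image cannot be reconstructed from the digest.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


def build_hash_database(image_dir: Path) -> set[str]:
    """Hash every known image, so later files can be checked
    against the database without storing the images themselves."""
    return {hash_image(p) for p in image_dir.glob("*.jpg")}


# Checking a new file then becomes a simple set lookup
# (hypothetical paths, for illustration only):
known = build_hash_database(Path("known_images"))
is_flagged = hash_image(Path("candidate.jpg")) in known
```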

Apple would use a similar database with NeuralHash, but the database would be stored directly on iPhones. Put simply, NeuralHash is a hashing algorithm that is insensitive to small changes in the input image. However, as it is a relatively new technology, the full extent of its vulnerabilities may not yet be fully explored.[5]
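A cryptographic hash like the one in the sketch above changes completely if a single pixel changes, which is why perceptual schemes such as NeuralHash compare hashes by distance rather than exact equality. The sketch below assumes a hypothetical neural_hash() function that returns a fixed-width integer; the bit width and matching threshold are illustrative assumptions, not Apple's actual parameters:

```python
def hamming_distance(a: int, b: int) -> int:
    """Count the bits in which two hash values differ."""
    return (a ^ b).bit_count()


def is_match(candidate: int, known_hashes: set[int],
             threshold: int = 4) -> bool:
    """Flag a candidate whose hash lies within `threshold` bits of
    any known hash. A small tolerance keeps the match robust to
    minor edits (resizing, re-encoding) that a cryptographic hash
    would not survive, while still rejecting unrelated images."""
    return any(hamming_distance(candidate, known) <= threshold
               for known in known_hashes)
```

Researchers have shown that such distance-based hashes can produce collisions, where unrelated images yield near-identical hashes, which is one of the vulnerabilities alluded to above.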

The feature was due to launch later this year, but the privacy issues surrounding the child abuse material detection technology had to be addressed first. As Apple put it:

Last month we announced plans for features intended to help protect children from predators who use communication tools. Based on feedback, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

About the author
Ugnius Kiguolis - The mastermind

Ugnius Kiguolis is a professional malware analyst who is also the founder and owner of 2-Spyware. He currently serves as Editor-in-Chief.


References