Apple's CSAM scanning system could spark worldwide persecution


More than ninety rights groups around the world have signed a letter condemning Apple's plans to scan devices for Child Sexual Abuse Material (CSAM). According to the human rights advocates, the technology could introduce "censorship, surveillance and persecution on a global basis."


As a reminder, Apple earlier announced a system for scanning the photos of iPhone users for illegal images. The software reportedly inspects images in search of prohibited material, such as child pornography or other exploitation of minors. The system is designed to run on users' devices: if suspicious content is found in the photo library, the images are forwarded to Apple employees for verification.
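
The sketch below is a minimal illustration of the matching flow described above, not Apple's actual implementation: it assumes a generic perceptual-hash function, and the names `scan_library`, `fingerprint`, and `known_hashes` are hypothetical.

```python
from typing import Callable, Iterable, List, Set


def scan_library(photos: Iterable[bytes],
                 fingerprint: Callable[[bytes], int],
                 known_hashes: Set[int]) -> List[bytes]:
    """Return the photos whose fingerprints match a database of known-bad hashes.

    In the reported design, the comparison happens on the device itself, and only
    matching items are escalated for human review.
    """
    return [photo for photo in photos if fingerprint(photo) in known_hashes]


# Toy usage: a stand-in fingerprint (sum of bytes modulo 256) and a one-entry database.
demo_fingerprint = lambda data: sum(data) % 256
library = [b"fake-photo-bytes-1", b"fake-photo-bytes-2"]
database = {demo_fingerprint(b"fake-photo-bytes-2")}
flagged = scan_library(library, demo_fingerprint, database)
print(len(flagged))  # 1: only the matching item would be queued for review
```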


The US-based nonprofit Center for Democracy & Technology organized an open letter urging Apple to abandon its approach to mass scanning. Signatories include organizations such as Liberty, Big Brother Watch, the Tor Project, and Privacy International.


The letter raises concerns about the accuracy of Apple's technology, noting that such algorithms "tend to mistakenly tag artwork, health information, educational resources, advocacy messages, and other images." It also warns of the consequences of government pressure.


"Once this capability is built into Apple products, the company and its competitors will face tremendous pressure, and possibly legal requirements, from governments around the world to scan photos not only for CSAM but also for other images that a government deems undesirable," the letter says.


According to the signatories, such images could include documentation of human rights abuses, material linked to political protests, content labeled as terrorist or extremist, or even unflattering pictures of the very politicians applying the pressure. That pressure could extend to all images stored on a device, not just those uploaded to iCloud. In this way, the groups argue, Apple would lay the foundation for worldwide censorship, surveillance, and persecution.


User distrust of the new system only increased when cybersecurity researchers demonstrated a collision in the hash function built into iOS. The issue affects NeuralHash, the hashing algorithm that lets Apple check user content against child abuse images from the National Center for Missing & Exploited Children (NCMEC) databases. Apple says it will have no access to photos that are not in those databases, nor to other user information.
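
To illustrate why a collision matters, here is a toy sketch, not NeuralHash or any real perceptual hash: it uses a deliberately weak one-byte fingerprint so that two different inputs mapping to the same value are easy to find, which is exactly the failure mode the researchers demonstrated against the real algorithm.

```python
import hashlib
from typing import Dict, Tuple


def weak_hash(data: bytes) -> int:
    """Return a 1-byte fingerprint; real perceptual hashes are larger but still finite."""
    return hashlib.sha256(data).digest()[0]


def find_collision() -> Tuple[bytes, bytes]:
    """Brute-force two distinct inputs that share the same weak fingerprint."""
    seen: Dict[int, bytes] = {}
    counter = 0
    while True:
        candidate = counter.to_bytes(8, "big")
        fp = weak_hash(candidate)
        if fp in seen:
            return seen[fp], candidate  # same fingerprint, different bytes
        seen[fp] = candidate
        counter += 1


a, b = find_collision()
print(a != b, weak_hash(a) == weak_hash(b))  # True True: a "match" without matching content
```

In a matching system built on such fingerprints, a crafted collision means an innocuous file could be flagged as if it were a database entry, which is why the finding fueled doubts about the design.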
