The feature that will search for photos of child abuse on users’ iPhones has divided Apple’s own ranks.
Apple, which for years has made privacy the centerpiece of its marketing campaigns, announced the development of algorithms capable of running on users’ iPhones that will search for files related to child abuse, such as child pornography.
Although Apple has presented the functionality enthusiastically and claims to have received support from organizations and institutions, internally the company is divided. Reuters has obtained testimonies from Apple employees describing both negative and positive reactions to the feature.
Like many other companies during the growth of telecommuting, Apple has Slack channels that employees can use to communicate with each other. One of these channels has been overwhelmed, with more than 800 messages in a week, a surprising volume and duration according to internal sources, since most channels focus on maintaining social contact, and secrecy about products in development is part of the company culture.
Most of the messages come from employees who do not work in security- or privacy-related roles, and who likely learned about the new feature at the same time as the rest of the world. Many have expressed concerns, such as that the feature could be exploited by repressive governments to censor material and make arrests.
Other messages warn that the move could backfire on Apple’s image, damaging its reputation as a company that protects the privacy of its users.
However, these messages have not gone unanswered, with some employees claiming that Apple’s solution is a “reasonable response” to pressure from governments to stop the trafficking of illegal material. File scanning is only activated if the user uploads files to iCloud, and the algorithm runs not on Apple’s servers but on the device itself; the only files sent for human review, and possible reporting to the police, are those that have tested positive against a database of known material, and only after the number of matches exceeds a certain threshold.
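The threshold mechanism described above can be illustrated with a heavily simplified sketch. Apple’s real system uses perceptual hashing (NeuralHash) and cryptographic protocols so that neither side learns about matches below the threshold; the plain SHA-256 comparison, the `known_hashes` set, and the `MATCH_THRESHOLD` value below are all hypothetical stand-ins used only to show the "report nothing until the quota is reached" logic.

```python
import hashlib

# Hypothetical quota of positive matches before anything is flagged
# for review; Apple has not published its real threshold here.
MATCH_THRESHOLD = 30

# Hypothetical database of hashes of known illegal material.
known_hashes = {
    hashlib.sha256(b"known-bad-sample-1").hexdigest(),
    hashlib.sha256(b"known-bad-sample-2").hexdigest(),
}

def scan_before_upload(files: list) -> bool:
    """Hash each file on-device and count matches against the known set.

    Returns True (flag account for review) only when the match count
    reaches the threshold; below it, nothing is reported at all.
    """
    matches = sum(
        1 for data in files
        if hashlib.sha256(data).hexdigest() in known_hashes
    )
    return matches >= MATCH_THRESHOLD
```

A handful of isolated matches therefore reveals nothing; only an account accumulating matches past the quota would be surfaced, which is the property defenders of the design point to.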
This is precisely what gives some Apple employees hope that this functionality is the beginning of a greater commitment to privacy, since it would allow Apple to implement stronger encryption in iCloud, encryption that could not be easily broken by attackers or governments. However, Apple has not announced anything of the sort yet.
From the outset, the measure was rejected by cybersecurity experts, who see this functionality as a step “towards surveillance and control” of users. The great fear is that authoritarian governments may force Apple to search for other types of files, such as those containing religious or LGBT references, facilitating the persecution of minorities. In response, Apple has said that it simply “will not accept” such demands from governments, but the question remains what it will do if it is required to by law.