Apple: The controversial child sexual abuse material (CSAM) detection feature announced by Apple in August may have been abandoned. On Wednesday (15), Apple removed references to the tool from its child safety page, according to MacRumors, suggesting a change of direction regarding the technology.
The possible withdrawal is yet another chapter in the story of the system, which would scan photos stored in iCloud for images depicting child sexual abuse. Since its announcement, criticism of the tech giant has been incessant.
One of the critics of the photo-scanning system was Edward Snowden, the former US National Security Agency (NSA) contractor, who called the feature an attempt at mass surveillance. Security researchers, politicians, and even Apple employees have also taken a stand against the tool.
In most cases, critics argued that this kind of technology poses risks to users' privacy and could be repurposed by authoritarian governments. Many also pointed out that there is no evidence of the system's effectiveness in detecting images of child abuse.
Explanations failed to convince
In an effort to allay concerns about the CSAM detection feature, the Cupertino giant released documents and interviews with executives detailing how the system works. Publishing an FAQ was another step taken to reassure iCloud users.
However, the explanations did not have the desired effect, and the company ultimately decided to postpone the launch in September. At the time, Apple said the decision was based on feedback from researchers, users, and advocacy groups, and that it would take additional time to develop the system further before releasing it to the general public.
That statement has now disappeared from the website, along with all other information about the CSAM feature. Is the company abandoning the project because of the controversies of recent months, or was the content removed for some other reason?
So far, the company has not issued a statement on the matter.