Apple Makes Privacy Changes to Protect Children


Photo by: Laurenz Heymann, Unsplash
By Andrea Villar | Editorial Manager - Fri, 08/06/2021 - 13:35

Apple rolled out new tools to more effectively detect images of sexual content involving children on iPhone and iPad devices and on its iCloud servers in the US. “We want to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of child pornography,” the company said in a statement.

The company will use cryptographic tools to scan images and assign each one a unique numerical identifier, or hash. It plans to compare the hashes of photos uploaded to iCloud with those of images stored in an archive managed by the National Center for Missing and Exploited Children (NCMEC), a private, nonprofit US organization that works to reduce child sexual exploitation.
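Illustratively, the matching step compares hashes rather than the images themselves. The sketch below is a simplified, assumption-laden Python example: Apple's actual system uses a perceptual hash ("NeuralHash") and private set intersection rather than a plain SHA-256 lookup, and the function and variable names here are hypothetical.

```python
import hashlib

def image_digest(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of an image.

    A real system would use a hash that tolerates resizing, cropping,
    and re-encoding; SHA-256 is used here only for illustration.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes of known material, as might be supplied by a clearinghouse such as NCMEC.
# Populated from the hash database in practice; empty here as a placeholder.
KNOWN_HASHES: set[str] = set()

def should_flag(image_bytes: bytes) -> bool:
    """Return True if the image's digest matches a known hash.

    Only the digest is compared; the image content itself is not inspected
    in this step.
    """
    return image_digest(image_bytes) in KNOWN_HASHES
```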

Following Apple's announcement, some privacy protection organizations and cybersecurity providers expressed concern about the move and the potential for misuse. The changes “represent a significant departure from long-established privacy and security protocols,” the Center for Democracy and Technology (CDT) said. The CDT added that the Tim Cook-led company was replacing its existing end-to-end encrypted messaging system with a surveillance and censorship infrastructure "vulnerable to abuse and misuse not just in the US but around the world." 

To allay concerns, Apple said it does not have direct access to the images. When a photo's hash matches one in the archive, Apple will manually review it, disable the user's account if necessary and send a report to NCMEC.

The Cupertino-based company said it is also planning new tools to warn children and parents who have a Family Sharing account when sexually explicit photos are sent to a child's device. Those photos will be blurred and accompanied by a warning that the child is not obliged to open the image, and parents can opt to be notified when their child opens such a photo. Similar protections will apply when children attempt to send sexually explicit photos.

Voice assistant Siri can also intervene when users search for images of child pornography, warning them that such content is problematic. These tools will gradually become available with upcoming updates to Apple's mobile and desktop operating systems. It is not yet known whether they will be available in other countries, including Mexico, where the incidence of child pornography offenses, from production and distribution to consumption, increased 16 percent in 2020 compared to 2019, according to data from the National Guard.
