
Apple update will check iPhones for images of child sexual abuse

Apple's iPhone messaging app will also use machine learning to recognise and warn children and their parents when receiving or sending sexually explicit photos.

AFP
A man looks at his phone as he walks past a store of US tech giant Apple in a retail district in Beijing, in this Dec 13, 2019 file photo. Photo: AP

Apple on Thursday said that iPhones and iPads will soon start detecting images containing child sexual abuse and reporting them as they are uploaded to iCloud.

The software tweak to Apple’s operating systems will monitor pictures, allowing Apple to report findings to the National Center for Missing and Exploited Children, according to a statement by the Silicon Valley-based tech giant.

“We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material (CSAM),” Apple said.

The new technology will allow the phones’ operating systems to match abusive photos on a user’s phone against a database of known CSAM images provided by child safety organisations, then flag the images as they are uploaded to iCloud, Apple said.
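In broad strokes, matching against a database of known images works by computing a compact fingerprint of each photo and checking it against a set of fingerprints of previously identified material. The sketch below is purely illustrative and is not Apple's implementation: Apple's system uses an on-device perceptual hash with cryptographic safeguards, whereas this example uses an ordinary cryptographic hash to stay self-contained, and every name in it is invented for clarity.

import hashlib
from pathlib import Path

# Hypothetical stand-in for the database of fingerprints of known CSAM
# images supplied by child safety organisations. In a real system these
# would be perceptual hashes, which tolerate resizing and re-encoding;
# SHA-256 is used here only to keep the sketch runnable.
KNOWN_IMAGE_HASHES: set[str] = {
    "placeholder-fingerprint-1",
    "placeholder-fingerprint-2",
}

def image_fingerprint(path: Path) -> str:
    """Return a fingerprint of the image file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_on_upload(path: Path) -> bool:
    """Check an image against the known database as it is uploaded."""
    return image_fingerprint(path) in KNOWN_IMAGE_HASHES

The key design point this illustrates is that only matches against already-known images are flagged; a plain hash comparison like the one above says nothing about the content of images that are not in the database.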

The feature is part of a series of tools heading to Apple mobile devices, according to the company.

Apple’s iPhone messaging app will additionally use machine learning to recognise and warn children and their parents when receiving or sending sexually explicit photos, the company said in the statement.

And personal assistant Siri will be taught to “intervene” when users try to search for topics related to child sexual abuse, according to the company.