Apple drops references to CSAM detection technology
Apple has updated its Child Safety webpage to remove all references to the controversial CSAM detection feature, which it first announced in August.
The change appears to have been made between December 10 and 13. Despite the update to its website, however, the company says its plans for the feature have not changed.
Two of the three child safety features, which shipped earlier this week with iOS 15.2, remain on the page, titled "Expanded Protections for Children."
References to the more controversial CSAM detection feature, whose launch was postponed after strong opposition from privacy advocates, have been removed, however.
A company spokesperson said Apple's position has not changed since September, when it first announced it was delaying the launch of CSAM detection.
"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the company's September statement read.
Apple's statement does not mean the feature has been scrapped entirely; documentation explaining how it works can still be found on the company's website.
The CSAM detection feature proved controversial when it was announced because it involves scanning photos bound for iCloud and matching them against a database of known child sexual abuse images.
Apple claimed the approach would let it detect users uploading known child abuse imagery, and report them to the authorities, without compromising the privacy of its customers at large. It also noted that user data encryption is unaffected and that the matching runs on the device itself.
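As a rough illustration of what on-device hash matching looks like, the sketch below checks a photo's digest against a local set of known digests before upload. This is a deliberately simplified stand-in: Apple's actual design used a perceptual "NeuralHash" and a private set intersection protocol rather than a plain cryptographic hash, and the type names and sample values here are hypothetical.

```swift
import Foundation
import CryptoKit

// Simplified illustration only. Apple's real system used a perceptual
// hash plus a private set intersection protocol, not SHA-256; the
// struct name and sample digests below are hypothetical.
struct KnownHashDatabase {
    /// Hex-encoded digests of known flagged images (placeholder values).
    private let knownDigests: Set<String>

    init(digests: Set<String>) {
        self.knownDigests = digests
    }

    /// Hash the image bytes on-device and check for a database match,
    /// so no image content has to leave the device for the comparison.
    func matches(imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownDigests.contains(hex)
    }
}

// Usage: the scan runs locally before upload; only a match result
// (never the photo itself) would need to be reported.
let database = KnownHashDatabase(digests: ["placeholder-digest-1"])
let photo = Data([0x00, 0x01, 0x02]) // stand-in for real image bytes
print(database.matches(imageData: photo) ? "Match found" : "No match; upload proceeds")
```

Running the comparison on the device, rather than server-side, is what let Apple argue that the photos themselves never needed to be inspected in the cloud.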
Apple said the feature was delayed rather than canceled
Critics, however, argued that Apple's system put end-to-end encryption at risk, with some calling it a back door, and warned that governments around the world could pressure the company to expand scanning beyond CSAM. For its part, Apple said it would not accede to government requests to broaden the system's scope.
Although CSAM detection has yet to receive a new release date, the two other child safety features announced in August shipped this week with iOS 15.2. One warns children when they receive nude images in Messages; the other surfaces additional resources when users search for child exploitation terms through Siri, Spotlight, or Safari.