Apple has hinted that it may not revive its controversial effort to scan for CSAM (Child Sexual Abuse Material) images any time soon. MacRumors notes that Apple has removed all references to the scanning feature from its child safety website. Visit now and you'll only see iOS 15.2's optional nude photo detection in Messages and interventions when people search for child exploitation terms.
It's not certain why Apple pulled the references. We have asked the company for comment. This doesn't necessarily represent a complete retreat from CSAM scanning, but at the least it suggests a rollout isn't imminent.
The CSAM detection feature drew criticism from privacy advocates because it involves on-device scans whose results could be reported to law enforcement. While Apple stressed the presence of multiple safeguards, such as thresholds before an account is flagged and its reliance on image hashes supplied by child safety organizations, there were concerns that the system could produce false positives or that its scope could expand under pressure from authoritarian governments.
Apple indefinitely postponed the feature's launch in order to "make improvements." It's now clear, however, that the company is in no hurry to complete those changes, nor does it want to set expectations to the contrary.