Apple quietly deletes details of derided CSAM scanning tech from its Child Safety page without explanation
Non-consensual on-device image analysis was perhaps difficult to reconcile with privacy protests
Apple has evidently decided against forcing customers to run its sex-crime detection software on their iPhones in order to refer those stashing illegal child abuse images in iCloud to the authorities. …
from The Register