Apple has announced details of a system to find child sexual abuse material (CSAM) on US customers' devices.
Before an image is stored in iCloud Photos, the technology will check it for matches against already known CSAM.
Apple said that if a match is found, a human reviewer will then assess and...
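To illustrate the general flow being described (not Apple's actual implementation, which relies on its unpublished perceptual NeuralHash and cryptographic matching protocol), here is a minimal Swift sketch of checking an image against a set of known fingerprints before upload. The hash set, the placeholder value in it, and the function names are hypothetical, and SHA-256 is used only as an exact-match stand-in for a perceptual hash.

```swift
import CryptoKit
import Foundation

// Hypothetical set of fingerprints of known CSAM images: a stand-in for the
// hashed database Apple says is supplied by child-safety organisations.
let knownImageHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
]

// Apple's real system uses a perceptual hash (NeuralHash) so visually
// identical images still match after re-encoding; SHA-256 here only
// demonstrates the exact-match flow.
func imageFingerprint(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Returns true if the image should be escalated for human review rather
// than stored in iCloud Photos without flagging.
func shouldFlagBeforeUpload(_ imageData: Data) -> Bool {
    knownImageHashes.contains(imageFingerprint(imageData))
}
```

In the system Apple describes, a match on-device does not immediately expose the photo; flagged matches are escalated for human review, as noted above.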