Governments were already discussing how to misuse CSAM scanning technology even before Apple announced its plans, say security researchers.
The biggest concern raised when Apple said it would scan iPhones for child sexual abuse material (CSAM) was that there would be scope creep, with governments insisting the company scan for other types of images, and there now seems to be good evidence for this …
Apple insisted that it had solid safeguards in place to protect privacy and prevent misuse: it would only match images against known CSAM databases; it would check at least two databases and require an image to appear in both; action would only be triggered by 30 matching images; and there would be a manual review before law enforcement was alerted...
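To make the safeguard logic concrete, here is a minimal illustrative sketch of how such a policy could be expressed in Swift. The type and function names are hypothetical and the hashes are simplified strings; this is not Apple's actual NeuralHash or on-device matching implementation, only the two-database-and-threshold rule as described above.

```swift
import Foundation

// Hypothetical sketch of the described safeguard policy (not Apple's real API).
struct SafeguardPolicy {
    // Two independent CSAM hash databases; an image only counts as a match
    // if its hash appears in BOTH, per the policy described above.
    let databaseA: Set<String>
    let databaseB: Set<String>
    // Number of matching images required before anything is escalated.
    let reviewThreshold = 30

    // True only when the hash is present in both databases.
    func isKnownCSAM(_ imageHash: String) -> Bool {
        databaseA.contains(imageHash) && databaseB.contains(imageHash)
    }

    // Counts matches among an account's image hashes and decides whether a
    // human review should be triggered; law enforcement would only be
    // contacted after that manual review, per the article.
    func shouldTriggerManualReview(imageHashes: [String]) -> Bool {
        let matches = imageHashes.filter(isKnownCSAM).count
        return matches >= reviewThreshold
    }
}
```

The researchers' concern is precisely that the two database parameters in a design like this are controlled by whoever supplies the hash lists, so nothing in the matching logic itself prevents a government from demanding that other image types be added.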
To read more, visit 9To5MAC.