Apple this week released iOS 15.2 with a host of new features, including a communication safety feature for the Messages app that’s focused on child protection. However, the company did not include its controversial iCloud Photos child sexual abuse material scanning feature, and it appears to have scrubbed all mention of its existence.

By way of recap, iOS 15.2 was originally intended to ship with a child sexual abuse material (CSAM) detection feature. The update would have implemented on-device scanning aimed at sweeping through a user’s iCloud photo library for known CSAM. The company noted that it would do this on-device, for maximum privacy.

Privacy experts objected, noting that the very concept of scanning a user’s photo library for prohibited material was a privacy violation in itself, one that could expand to cover other material once the precedent had been set.

In Apple’s words: “Previously, we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material.”
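For readers curious what “on-device scanning against known CSAM” means mechanically, here is a minimal, heavily simplified sketch. It is not Apple’s implementation: Apple’s announced design relied on a proprietary perceptual hash (NeuralHash) with threshold-based, blinded matching, none of which is public API. The sketch substitutes a plain SHA-256 digest (so only exact byte-for-byte copies would match) and a hypothetical in-memory blocklist, purely to illustrate the local matching flow.

```swift
import Foundation
import CryptoKit

// Hypothetical, hex-encoded digests of known prohibited files. In Apple's
// announced design this role was played by a blinded perceptual-hash
// database; here it is just an in-memory set for illustration.
let knownDigests: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855" // SHA-256 of empty data
]

// Compute a SHA-256 digest for a file. A real system would use a perceptual
// hash so that resized or re-encoded copies still match; SHA-256 flags
// exact duplicates only.
func digest(of fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    return SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Scan every file in a directory locally and return the matches. Nothing
// leaves the device; that is the "on-device" property the article refers to.
func scanLibrary(at directory: URL) throws -> [URL] {
    let files = try FileManager.default.contentsOfDirectory(
        at: directory,
        includingPropertiesForKeys: nil
    )
    return try files.filter { knownDigests.contains(try digest(of: $0)) }
}

// Example usage:
// let matches = try scanLibrary(at: URL(fileURLWithPath: "/tmp/photos"))
// print("Matched \(matches.count) file(s)")
```

The key point the sketch captures is that the hash comparison happens entirely on the user’s hardware, which is the privacy argument Apple made, and also exactly what critics objected to: once local scanning infrastructure exists, the blocklist it consults could in principle be expanded.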