Apple's New Child Safety Features - UPDATED
Category: News & Politics • By: evilgenius • one month ago • 11 comments
In the upcoming iOS 15 update, Apple announced it would introduce two new features for child safety. The first is for minors on a family iCloud account: the Messages feature would filter incoming and outgoing images. If sexually explicit content is detected, it would blur the image and ask the child whether they really want to send or view it, with an option for parental notification. Apple states that iMessage conversations will remain protected with end-to-end encryption, keeping them private.
The second is more concerning for privacy advocates. Here Apple states it will automatically scan images in iCloud Photos against a database of known Child Sexual Abuse Material (CSAM) and report matches to the National Center for Missing and Exploited Children (NCMEC). The details are quite technical, involving hashing technology and a database of known images, and Apple claims a high level of accuracy, with a false-positive rate of less than one in one trillion. It's worth noting that Facebook, Twitter, Reddit, and many other companies already scan users' files against hash libraries and report CSAM to NCMEC. It's not a new concept or practice.
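To make the general idea concrete: the simplest form of hash matching computes a fingerprint of each file and checks it against a set of known fingerprints. Note that this sketch uses an ordinary cryptographic hash (SHA-256) purely for illustration; Apple's actual system uses a perceptual hash (NeuralHash) plus cryptographic protocols so that neither side learns about non-matches, and the `KNOWN_HASHES` set below is a made-up placeholder, not a real database.

```python
import hashlib

# Hypothetical stand-in for a database of known-bad image hashes.
# This entry is simply the SHA-256 digest of the bytes b"test".
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw file bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known(data: bytes) -> bool:
    """Report whether a file's hash appears in the known-hash set."""
    return sha256_hex(data) in KNOWN_HASHES

print(matches_known(b"test"))         # matches the sample entry above -> True
print(matches_known(b"family photo")) # any other content -> False
```

A cryptographic hash like this only matches byte-identical files, which is why real deployments use perceptual hashes that survive resizing and re-compression; the matching step against the database, however, works on the same principle.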
Some people online are conflating these new features and blowing them out of proportion. Claims of law enforcement getting notified over "baby's first bath" or "kid's black eye" will not happen. What does concern privacy advocates is that this opens a security back door into the iPhone: adding an opening only allows others to exploit it later. Another concern is that Apple is scanning client data in the first place. It doesn't matter that Apple claims it will only scan for child exploitation; there is nothing to stop authoritarian governments from pushing Apple to scan for their version of objectionable content. Apple says it will refuse any demands that would abuse the system in these ways and boasts of its encryption safeguards, but Apple can modify those safeguards at will.
On Friday, September 3rd, Apple announced it would delay the rollout of these child safety measures. Responding to criticism from privacy groups, the company announced, "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features." What this means, and how long the "step back" will last, remains to be seen.