Apple's New Child Safety Features - UPDATED

  

Category:  News & Politics

By:  evilgenius  •  2 months ago  •  11 comments

In the upcoming iOS 15 update, Apple announced it would introduce two new features for child safety. The first applies to minors on a family iCloud account: the Messages feature would filter incoming and outgoing images, and if sexually explicit content is detected it would blur the image and ask the child whether they really want to send or view it. There is also an option for parental notification. Apple states that iMessage conversations will remain protected with end-to-end encryption, keeping them private.

The second feature is more concerning for privacy advocates. Here Apple states it will automatically scan images in iCloud Photos against a database of known Child Sexual Abuse Material (CSAM) and report matches to the National Center for Missing and Exploited Children (NCMEC). The technical details involve hashing technology and a database of known-image hashes, and Apple claims a high level of accuracy, with less than a one-in-one-trillion chance per year of incorrectly flagging a given account. It's worth noting that Facebook, Twitter, Reddit and many other companies already scan users' files against hash libraries and report CSAM to NCMEC. It's not a new concept or practice.
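In broad strokes, the matching step is simple: compute a hash of each image and check it against a set of hashes of known CSAM. Here is a minimal sketch in Python, using SHA-256 and a placeholder hash set for illustration; Apple's actual system uses a perceptual hash called NeuralHash plus cryptographic private set intersection, neither of which this toy reproduces:

```python
import hashlib

# Placeholder database of known-bad image hashes (hex digests).
# In the real system the hash list comes from NCMEC; these bytes are made up.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def hash_image(image_bytes: bytes) -> str:
    """Return a hex digest of the image's raw bytes.

    A cryptographic hash like SHA-256 only matches byte-identical files;
    Apple's NeuralHash is a perceptual hash that also matches resized or
    re-encoded copies of the same image. SHA-256 is used here purely to
    illustrate the lookup, not the real matching behavior.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def scan(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash database."""
    return hash_image(image_bytes) in KNOWN_HASHES

# Only an image whose hash is already in the database can match;
# novel photos (a "baby's first bath") produce unrelated hashes.
assert scan(b"example-known-image-bytes") is True
assert scan(b"family-photo-bytes") is False
```

This is why, in the design as Apple described it, a match can only occur against images already catalogued in the database, not against new photos a user takes.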

Some people online are conflating and blowing these new features out of proportion. Claims that law enforcement will be notified of "baby's first bath" or "kid's black eye" will not happen; only images matching the database of known material can be flagged. What does concern privacy advocates is that this opens a security back door into the iPhone, and adding an opening invites others to exploit it later. Another concern is that Apple is scanning client data in the first place. It doesn't matter that Apple claims it will only scan for child exploitation; there is nothing to stop authoritarian governments from pushing Apple toward their own definition of objectionable content. Apple says it will refuse any demands that would abuse the system in these ways and boasts of its encryption safeguards, but Apple can modify those safeguards at will.

UPDATE: 

On Friday, September 3rd, Apple announced it would delay the rollout of these child safety measures. Responding to criticism from privacy groups, the company said, "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features." What this means, and how long the "step back" will last, remains to be seen.


evilgenius
Professor Guide
1  author  evilgenius    2 months ago

There is a lot of hysteria out there on these new features that simply isn't warranted...yet.

 
 
 
Kavika
Professor Principal
2  Kavika     2 months ago

I'm happy that my kids are no longer kids.

 
 
 
evilgenius
Professor Guide
2.1  author  evilgenius  replied to  Kavika @2    2 months ago

My son will be 27 tomorrow, but the granddaughter is due Sep 1st.

 
 
 
Kavika
Professor Principal
2.1.1  Kavika   replied to  evilgenius @2.1    2 months ago

Congratulations on the new granddaughter.

 
 
 
evilgenius
Professor Guide
2.1.2  author  evilgenius  replied to  Kavika @2.1.1    2 months ago

Thanks! We will be going down at the end of Sep.

 
 
 
Trout Giggles
Professor Principal
2.1.3  Trout Giggles  replied to  evilgenius @2.1    2 months ago

Congratulations!

My oldest is 28 and my son will be 27 next month. No grandkids on the horizon....

 
 
 
Ender
Professor Principal
3  Ender    2 months ago

Why do people think all their data is private anyway?

I would venture to guess that none of it is.

 
 
 
evilgenius
Professor Guide
3.1  author  evilgenius  replied to  Ender @3    2 months ago

A frequent criticism of Apple's end-to-end encryption was that the iPhone was the device of choice for pedophiles, because law enforcement couldn't get into it without third-party help and weeks to months of time.

 
 
 
Ender
Professor Principal
3.1.1  Ender  replied to  evilgenius @3.1    2 months ago

It kinda sounds like Apple is trying to address that.

I just don't know if I would really trust algorithms.

 
 
 
evilgenius
Professor Guide
3.1.2  author  evilgenius  replied to  Ender @3.1.1    2 months ago
It kinda sounds like Apple is trying to address that.

That's certainly what they are stating. The detractors are only saying that once the cat is out of the bag there is no putting it back, AND that cat could turn into a tiger. IF Apple is true to their word then everything is gravy, but it may only take a dump truck of cash or two to have the code tweaked to identify something like, say, political opponents.

I just don't know if I would really trust algorithms.

Algorithms can only do what they are programmed to do. It's just code. I'd trust well-tested algorithms more than I'd trust a well-tested human. Algorithms can't have a sleepless night or an argument with a co-worker.

 
 
 
evilgenius
Professor Guide
4  author  evilgenius    one month ago

No "child safety features" in iOS 15? At least not at first...

 
 