Apple sued over abandoning CSAM detection for iCloud

Anthony Ha

Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM). According to The New York Times, the lawsuit argues that by not doing more to prevent the spread of this material, Apple is forcing victims to relive their trauma. The suit describes […]

© 2024 TechCrunch. All rights reserved. For personal use only.
