A child safety group pressed Apple on why it abandoned its announced CSAM detection feature, and the company has now given its most detailed explanation yet for backing off its plans.
I think Apple’s ultimate decision on this is the correct one. The world is an ugly place and there’s no silver bullet that solves a problem like CSAM and ensures it can’t be abused.
I wish it weren’t so, as this likely would have made a huge impact against child abusers, but thankfully degrading every Apple user’s privacy isn’t the only effective way to fight them.