Tech giant Apple will roll out its system for detecting child abuse imagery in photos on a country-by-country basis, depending on local laws, the company said on Friday.

In a media briefing on Friday, Apple said plans to expand the service would depend on the laws of each country where it operates.


The company said safeguards built into its system, such as the “safety vouchers” passed from an iPhone to Apple’s servers, which individually contain no useful data, will protect it from government pressure to identify material other than child abuse images.
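Apple’s accompanying technical summary describes the mechanism behind that claim as threshold secret sharing: each voucher carries a cryptographic share of a per-account decryption key, and the server can reconstruct the key, and read the match data, only once an account exceeds a set number of matches. The sketch below is a minimal illustration of that general idea using Shamir’s scheme in Python; the threshold value, field size, and function names are illustrative assumptions, not Apple’s actual implementation.

```python
# Minimal sketch of threshold secret sharing (Shamir's scheme).
# Illustrative only -- not Apple's implementation; Apple's real
# match threshold and cryptographic parameters differ.
import secrets

PRIME = 2**127 - 1  # prime field modulus
THRESHOLD = 3       # shares needed to reconstruct (assumed for the demo)

def make_shares(secret: int, n: int, t: int = THRESHOLD):
    """Split `secret` into n shares; any t recover it, fewer reveal nothing."""
    # Random degree-(t-1) polynomial with the secret as constant term.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the secret from t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Demo: with THRESHOLD = 3, two shares reveal nothing about the key,
# but any three reconstruct it exactly.
key = secrets.randbelow(PRIME)
shares = make_shares(key, n=5)
assert reconstruct(shares[:3]) == key
```

Below the threshold, the shares are statistically independent of the key, which is the mathematical basis for the claim that individual vouchers contain no useful data.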

Apple added that a human review process acts as a backstop against government abuse: the company will not pass reports from its photo-checking system to law enforcement if the review finds no child abuse imagery.
