Apple has confirmed that it scans user images in an effort to detect evidence of child abuse, but the company has revealed little about how the scans work, raising concerns about data privacy and the reach of intrusive tech firms.
While it’s unclear when the image scans started, Apple’s chief privacy officer Jane Horvath confirmed at an event in Las Vegas this week that the company is now “utilizing some technologies to help screen for child sexual abuse material.”
Apple initially suggested it might inspect images for abuse material last year, and only this week added a disclaimer to its website acknowledging the practice, but Horvath's remarks are the first confirmation that the company has gone ahead with the scans.
A number of tech giants, including Facebook, Twitter and Google, already employ an image-scanning tool known as PhotoDNA, which cross-checks photos with a database of known abuse images. It is unknown whether Apple’s scanning tool uses similar technology.
READ MORE: https://on.rt.com/a8qe