PhotoDNA was developed by Microsoft Research and Hany Farid, professor at Dartmouth College, beginning in 2009. From a database of known images and video files, it creates unique hashes to represent each image, which can then be used to identify other instances of those images.[4]
The hashing method initially relied on converting images into a black-and-white format, dividing them into squares, and quantifying the shading of those squares.[5] It did not employ facial recognition technology, nor could it identify a person or object in the image.[citation needed] The method was designed to be resistant to alterations in the image, including resizing and minor color alterations.[4] Since 2015,[6] similar methods have been applied to individual video frames in video files.[7]
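The exact PhotoDNA algorithm is proprietary, but the publicly described steps above (grayscale conversion, a grid of squares, quantified shading, and tolerance to resizing and minor color changes) can be illustrated with a toy sketch. The following Python example, using the Pillow imaging library, is an assumption-laden illustration rather than the actual algorithm: the grid size, the L1 distance metric, and the match threshold are all arbitrary choices made for demonstration.

```python
from PIL import Image

GRID = 8           # cells per side; the real grid size is not public
CELL = 16          # pixels per cell after normalization
SIZE = GRID * CELL

def toy_hash(img: Image.Image) -> list[int]:
    """Toy perceptual hash: grayscale -> fixed-size resize -> per-cell mean shading."""
    gray = img.convert("L").resize((SIZE, SIZE))  # drop color, normalize size
    px = gray.load()
    vec = []
    for gy in range(GRID):
        for gx in range(GRID):
            total = 0
            for y in range(gy * CELL, (gy + 1) * CELL):
                for x in range(gx * CELL, (gx + 1) * CELL):
                    total += px[x, y]
            vec.append(total // (CELL * CELL))    # mean intensity of the cell
    return vec

def distance(h1: list[int], h2: list[int]) -> int:
    """L1 distance between two hash vectors; small means visually similar."""
    return sum(abs(a - b) for a, b in zip(h1, h2))

def matches(h: list[int], database: list[list[int]], threshold: int = 500) -> bool:
    """Compare a candidate hash against a database of known-image hashes."""
    return any(distance(h, known) <= threshold for known in database)

if __name__ == "__main__":
    # Build a simple checkerboard test image and a scaled-down copy of it.
    base = Image.new("RGB", (300, 200), "white")
    for x in range(300):
        for y in range(200):
            if (x // 30 + y // 30) % 2 == 0:
                base.putpixel((x, y), (200, 40, 40))
    variant = base.resize((150, 100))      # resized copy of the known image

    db = [toy_hash(base)]                  # "database of known images"
    print(matches(toy_hash(variant), db))  # True: the hash survives resizing
```

Because every input is first resized to the same fixed grid, scaling an image changes its hash very little, and the distance threshold absorbs small shading or compression differences; this mirrors, in miniature, the robustness properties described above.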
In 2016, Hany Farid proposed extending the use of the technology to terrorism-related content.[25] In December 2016, Facebook, Twitter, Google and Microsoft announced plans to use PhotoDNA to remove extremist content such as terrorist recruitment videos or violent terrorist imagery.[26] In 2018, Facebook stated that PhotoDNA was used to automatically remove al-Qaeda videos.[13]
By 2019, big tech companies including Microsoft, Facebook and Google had publicly announced that, since 2017, they had been running the GIFCT (Global Internet Forum to Counter Terrorism) as a shared database of content to be automatically censored.[2] As of 2021, Apple was thought to be using NeuralHash for similar purposes.[27]
In 2022, The New York Times covered the story of two fathers whose Google accounts were closed after photos they had taken of their children for medical purposes were automatically uploaded to Google's servers.[28] The article compares PhotoDNA, which requires a database of known hashes, with Google's AI-based technology, which can recognize previously unseen exploitative images.[29][30]
In 2021, Anish Athalye was able to partially invert PhotoDNA hashes with a neural network, raising concerns about the reversibility of PhotoDNA hashes.[41]
^ Douze, Matthijs; Tolias, Giorgos; Pizzi, Ed; Papakipos, Zoë; Chanussot, Lowik; Radenovic, Filip; Jenicek, Tomas; Maximov, Maxim; Leal-Taixé, Laura; Elezi, Ismail; Chum, Ondřej; Ferrer, Cristian Canton (February 21, 2022). "The 2021 Image Similarity Dataset and Challenge". arXiv:2106.09672 [cs.CV]. "Image fingerprints, such as PhotoDNA from Microsoft, are used throughout the industry to identify images that depict child exploitation and abuse."
^ Reuter, Markus; Rudl, Tomas; Rau, Franziska; Hildebr, Holly. "Why chat control is so dangerous". European Digital Rights (EDRi). Retrieved August 21, 2022.
^ Abelson, Hal; Anderson, Ross; Bellovin, Steven M.; Benaloh, Josh; Blaze, Matt; Callas, Jon; Diffie, Whitfield; Landau, Susan; Neumann, Peter G.; Rivest, Ronald L.; Schiller, Jeffrey I.; Schneier, Bruce; Teague, Vanessa; Troncoso, Carmela (2024). "Bugs in our pockets: The risks of client-side scanning". Journal of Cybersecurity. 10. arXiv:2110.07450. doi:10.1093/cybsec/tyad020.
^ Hill, Kashmir (August 21, 2022). "A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal". The New York Times. ISSN 0362-4331. Retrieved August 21, 2022. "A bigger breakthrough came along almost a decade later, in 2018, when Google developed an artificially intelligent tool that could recognize never-before-seen exploitative images of children. [...] When Mark's and Cassio's photos were automatically uploaded from their phones to Google's servers, this technology flagged them."
^ Roth, Emma (August 21, 2022). "Google AI flagged parents' accounts for potential abuse over nude photos of their sick kids". The Verge. Retrieved August 28, 2022. "Google has used hash matching with Microsoft's PhotoDNA for scanning uploaded images to detect matches with known CSAM. [...] In 2018, Google announced the launch of its Content Safety API AI toolkit that can "proactively identify never-before-seen CSAM imagery so it can be reviewed and, if confirmed as CSAM, removed and reported as quickly as possible." It uses the tool for its own services and, along with a video-targeting CSAI Match hash matching solution developed by YouTube engineers, offers it for use by others as well."