[“A dad took photos of his naked toddler for the doctor. Google flagged him as a criminal”](https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html) by Kashmir Hill for *The New York Times* ([Apple News+ Link](https://apple.news/ARIOAqbokTkW4ZEBxNn7JZA) for *The Seattle Times* republication)

> In December, Mark received an envelope in the mail from the San Francisco Police Department. It contained a letter informing him that he had been investigated as well as copies of the search warrants served on Google and his internet service provider. An investigator had asked for everything in Mark’s Google account: his internet searches, his location history, his messages and any document, photo and video he’d stored with the company.

Abby and I were already conscious of the [consequences of sharing pictures of children](https://youtu.be/pDXcika12xE). It’s something we’ve discussed, with our own little girl [[The Next Generation|on the way]]. An angle I had never considered was the risk of being flagged for producing CSAM (Child Sexual Abuse Material).

I think back to Apple’s plan for detecting CSAM on devices, which the company [detailed](https://daringfireball.net/linked/2021/08/09/apple-csam-faq) and then [delayed](https://daringfireball.net/linked/2021/12/15/apple-child-safety-csam) last year. John Gruber’s [thoughts](https://daringfireball.net/linked/2022/08/22/hill-csam) on this New York Times story do a good job explaining the two main methods companies use to flag CSAM: matching hashes against known material, and using AI to discover new material. Throw into the mix server-side versus on-device scanning, which was the crux of the discussion surrounding Apple’s proposed method last year.

> The on-device vs. on-server debate is legitimate and worth having. But I think it ought to be far less controversial than Google’s already-in-place system of trying to identify CSAM that isn’t in the NCMEC known database.

Now I am even more aware that the photos I take of my kid could harm both them and me. This is a difficult and delicate issue. I certainly will not be sharing photos via telemedicine, if asked.

> Mark did not remember this video and no longer had access to it, but he said it sounded like a private moment he would have been inspired to capture, not realizing it would ever be viewed or judged by anyone else.
>
> “I can imagine it. We woke up one morning. It was a beautiful day with my wife and son, and I wanted to record the moment,” Mark said. “If only we slept with pajamas on, this all could have been avoided.”