Yahoo's Efforts to Combat Child Sexual Abuse Material (CSAM)

Yahoo prohibits child sexual abuse material (CSAM) under our Terms of Service and our Community Guidelines. Yahoo operates a robust digital safety program designed to detect and remove CSAM from our platforms. Yahoo’s efforts are spearheaded by a dedicated Trust & Safety team.

How Yahoo identifies CSAM

Yahoo uses a combination of automated scanning and human review, as permitted by law, to detect CSAM:

  • Yahoo uses PhotoDNA and CSAI Match technologies, which match the digital signature of an uploaded image or video against large databases of known CSAM maintained by the National Center for Missing & Exploited Children (NCMEC);
  • Our team of expert human reviewers evaluates the images detected through automated scanning to ensure that all confirmed CSAM in an account is reported to NCMEC; and
  • Finally, we carefully review and take action on abuse reports sent to us by users and by child-safety organizations like NCMEC.

What Yahoo does when it finds CSAM

Yahoo reports all CSAM to NCMEC and includes subscriber information for the user who uploaded the CSAM, which helps NCMEC identify the alleged offender. NCMEC acts as a clearinghouse for U.S. law enforcement and sends reports for offenders located outside the U.S. to partner agencies in the relevant country.

After Yahoo files reports with NCMEC, Yahoo’s investigators identify particularly serious cases that warrant further investigation. By leveraging internal data and open-source information, Yahoo’s investigators are often able to identify and locate offenders responsible for uploading CSAM to Yahoo’s platforms. Yahoo then transmits this information to NCMEC in the form of a supplemental report. These supplemental reports have resulted in hundreds of child rescues and arrests, in some cases less than 24 hours after filing.

Yahoo also responds to search warrants and other legal processes obtained by U.S. and international law enforcement in accordance with our Global principles for responding to government requests.

How many accounts Yahoo reports to NCMEC

Each year, Yahoo reports accounts to NCMEC for trafficking in CSAM on our platforms and files additional supplemental reports:

  • 2023: 2,431 accounts reported, plus 183 supplemental reports;
  • 2022: 3,329 accounts reported, plus 234 supplemental reports;
  • 2021: 5,498 accounts reported, plus 341 supplemental reports;
  • 2020: 7,182 accounts reported, plus 397 supplemental reports;
  • 2019: 5,359 accounts reported, plus 458 supplemental reports.

Report CSAM and child sexual exploitation online: If you encounter CSAM or believe a child is being sexually exploited online, report it directly to NCMEC through the CyberTipline.