
YouTube pulls ads on almost 2 million inappropriate videos aimed at kids

July 10, 2025

YouTube is caught up in another scandal. Earlier this year, scores of advertisers pulled their ads from the platform after it was discovered that some ads were running on videos with hate speech or other offensive material. Now, after accusations over inappropriate videos and comments aimed at children, more advertisers are suspending their activity on the video streaming service. These advertisers include big names like Adidas, Mars, and Hewlett-Packard.

In a statement to Vice, YouTube says it has terminated over 270 accounts and removed 150,000 videos that it deemed to violate its terms of service. It has also disabled the comments section on over 625,000 videos targeted by child predators.


YouTube is also battling autofill search results that suggest pedophiliac themes. When a user searched “how to,” the site would suggest completions like “have sx kids” and “have sx with your kids,” according to BuzzFeed. It appears YouTube is making progress on this front: those searches no longer return the inappropriate autocomplete suggestions.

Earlier this year, YouTube announced that it would restrict content creators who use family-friendly characters inappropriately. These videos place well-known characters like Elsa from Frozen and Spider-Man in ridiculous situations that can stray into violent or sexual themes. They’re slapped together by content mills with titles intended to game YouTube’s auto-play algorithm and keep kids watching.


In addition to those changes, YouTube says it is also in the process of implementing a new content filtering policy. When a video is flagged as inappropriate, it will be age-restricted in the main YouTube app. Age-restricted videos are not allowed in YouTube Kids, so this should cut down on how many kids can view them.

YouTube is stepping up enforcement, but unfortunately, it’s not enough. New channels are created every day, and according to the BBC, the tools meant to screen predatory comments haven’t been working correctly for over a year. That has allowed between 50,000 and 100,000 predatory accounts to remain on YouTube. It’s also debatable whether the new filtering system will make a meaningful impact, since it’s mostly kids watching this content, and they are less likely to report inappropriate videos.

Do you think YouTube is doing enough to protect kids? Will its new filtering program make a difference? Let us know down in the comments.
