YouTube Steps Up Removal of Offensive Kids' Videos
YouTube has stepped up enforcement of its guidelines for videos aimed at children, the unit of Alphabet Inc’s Google said on Wednesday, responding to criticism that it has failed to protect children from adult content.
The streaming video service removed more than 50 user channels in the last week and has stopped running ads on over 3.5 million videos since June, YouTube vice president Johanna Wright wrote in a blog post.
“Across the board, we have scaled up resources to ensure that thousands of people are working around the clock to monitor, review and make the right decisions across our ads and content policies,” Wright said. “These latest enforcement changes will take shape over the weeks and months ahead as we work to tackle this evolving challenge.”
YouTube has become one of Google’s fastest-growing operations in terms of sales by simplifying the process of distributing video online while putting in place few limits on content.
Parents, regulators, advertisers and law enforcement have become increasingly concerned about the open nature of the service. They have contended that Google must do more to banish and restrict access to inappropriate videos, whether it is propaganda from religious extremists and Russia or comedy skits that appear to show children being forcibly drowned.
Concerns about children’s videos gained new force in the last two weeks after reports in BuzzFeed and other media outlets.
A forum on the Reddit internet platform dubbed ElsaGate, named after the Walt Disney Co princess, also became a repository of problematic videos.
Common Sense Media, an organization that monitors children’s content online, did not immediately respond to a request to comment on YouTube’s announcement.
YouTube’s Wright cited “a growing trend around content on YouTube that attempts to pass as family-friendly, but is clearly not” as the reason for the new efforts “to remove them from YouTube.”
The company relies on review requests from users, a panel of experts and an automated computer program to help its moderators identify material possibly worth removing.
Moderators are now instructed to delete videos “featuring minors that may be endangering a child, even if that was not the uploader’s intent,” Wright said. Videos with popular characters “but containing mature themes or adult humour” will be restricted to adults, she said.
In addition, commenting functionality will be disabled on any videos where comments refer to children in a “sexual or predatory” manner.