
YouTube removed 8M videos in 3 months, and machines did most of the work


YouTube has been having a terrible time of late, with a number of high-profile brands pulling their ads from the streaming service after discovering that some of them were running alongside extremist content.

To reassure advertisers and deter regulators, YouTube recently decided to begin posting a quarterly Community Guidelines Enforcement Report highlighting its efforts to purge the site of content that breaches its terms of service.

The first of these reports, posted on Monday, April 23, reveals that the Google-owned company wiped 8.3 million videos from its servers between October and December 2017. YouTube said the majority of the videos were spam or contained sexual content. Others featured abusive, violent, or terrorist-related material.

The data, which doesn’t include content deleted for copyright or legal reasons, shows that YouTube’s automated systems are now doing most of the work, flagging the majority of the videos that ended up being removed. Interestingly, YouTube noted that of the 6.7 million videos first flagged by its machine-based technology, 76 percent were removed before they received a single view.

The report also highlighted how the firm’s technology is helping to speed up identification and removal of unsuitable content. “At the beginning of 2017, 8 percent of the videos flagged and removed for violent extremism were taken down with fewer than 10 views,” runs a related YouTube statement. “We introduced machine-learning flagging in June 2017. Now more than half of the videos we remove for violent extremism have fewer than 10 views.”

Humans still play a role in keeping the service free of objectionable content. Just over a million of the deleted videos were flagged by a “trusted individual,” while “YouTube users” flagged another 0.4 million. A small number of videos were flagged by non-governmental organizations and government agencies. Flagged videos are not automatically deleted — some will be deemed OK by YouTube’s review system, while others will be slapped with an age-restriction notice.

YouTube also employs its own human reviewers who look at suspect content passed on by its machine-based system. The company is working to create a team of 10,000 reviewers by the end of 2018, and is also hiring full-time specialists with expertise in violent extremism, counterterrorism, and human rights. Regional expert teams are also being expanded, the company said.

The number of videos removed by YouTube in just three months may surprise some, though it’s worth bearing in mind that more than 400 hours of content are uploaded to the site every minute.

YouTube clearly still faces many challenges in cleaning up its service, but it insists it’s committed to ensuring it “remains a vibrant community with strong systems to remove violative content,” adding that future reports should demonstrate ongoing improvements regarding its procedures and technology for getting rid of unsuitable material.

Trevor Mogg
Contributing Editor