
Logan Paul’s graphic YouTube video may have cleared initial human reviewers

So Sorry.
Users are often quick to point fingers at artificial intelligence when graphic content slips through social media filters, but a recent video of an apparent suicide victim may have slipped past more than the software. YouTuber Logan Paul, a 22-year-old with 15 million subscribers, apologized twice after posting the video earlier this week.

The video, which showed a corpse hanging from a tree in Aokigahara, nicknamed Japan’s “suicide forest,” was removed within 24 hours by Paul, but not before the video made YouTube’s trending section. Paul’s followers reportedly include many users under the age of 18, with previous vlogs covering topics from Pokémon to stunts like “surfing” on a Christmas tree pulled behind a car.

The video immediately drew criticism, and now a Twitter user working as a YouTube trusted flagger claims that although users flagged the video, review staff approved it after a manual review. YouTube confirmed that the video violated the platform's policies on graphic content but did not comment on whether it passed an initial manual review.

YouTube confirmed that Paul received a strike against his channel for the incident. Under the strike system, one strike is considered a warning, a second prevents the user from posting for two weeks, and a third terminates the account. Strikes expire after three months.

“Our hearts go out to the family of the person featured in the video,” a YouTube spokesperson said in a statement to Digital Trends. “YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases, it will be age-gated. We partner with safety groups such as the National Suicide Prevention Lifeline to provide educational resources that are incorporated in our YouTube Safety Center.”

While YouTube generally prohibits graphic content, in some cases, such as for educational or documentary purposes, a video is approved but age-restricted. For example, a historical clip of a military conflict may be graphic but could be approved by YouTube’s manual review team as educational.

Paul said that the video was raw and unfiltered, but that he and his team should have put the cameras down and never posted it. “I didn’t do it for views. I get views. I did it because I thought I could make a positive ripple on the internet, not cause a monsoon of negativity,” he wrote in an apology on Twitter. “That’s never the intention. I intended to raise awareness for suicide prevention and while I thought, ‘if this video saves just one life, it’ll be worth it,’ I was misguided by shock and awe, as portrayed in the video.”

The video comes a month after YouTube released an official statement on efforts the platform is taking to curb abuse, including adding more review staff and training the artificial intelligence algorithms to recognize more types of restricted content, including hate speech. At the time, the company said that the software led to review staff removing five times more videos that fell under the “violent extremist” category.

While software algorithms are often blamed for such slips, if the video did indeed pass a staff review, the incident underscores that human reviewers are liable to make mistakes too. It comes just days after ProPublica reported that Facebook’s review staff were inconsistent about which posts flagged for hate speech were removed and which were left alone.

Hillary K. Grigonis