YouTube’s spring cleaning is here: Was your video one of the 8 million removed?

April 25, 2018 6:09 pm


Google-owned YouTube said it took down more than 8 million videos between October and December for violating its community guidelines.

The majority of the videos were spam or attempts to upload “adult content.”

The information was included in YouTube’s first quarterly report on how it’s enforcing its community guidelines.

“This regular update will help show the progress we’re making in removing violating content from our platform,” the video-sharing site said in a blog post, according to CNN.


Most videos were removed before anyone saw them

According to the blog post, computers detect most of the videos that end up getting taken down.

It said 6.7 million videos were first flagged for review by machines, not humans.

Of those, 76% were taken down before receiving any views from users.

Last week, a CNN investigation found ads from over 300 companies and organizations that ran on YouTube channels promoting white nationalists, Nazis, North Korean propaganda and other controversial or extremist content.

In the past, ads have appeared on ISIS videos and other extremist and hateful content.

According to YouTube, ads in the Middle East were not taken down unless they were deemed harmful to the community.

“I do think it’s an underlying issue with YouTube’s business model,” Nicole Perrin, a senior analyst at eMarketer, told CNN last week.


“For years YouTube has encouraged creators to put essentially almost anything on the site. That has led to an explosion of content.”

(Chart courtesy of Statista)

Difficulty in policing the site

YouTube has faced complaints from critics and advertisers who say the company has trouble tackling offensive content on its site.

The website has over a billion users, and those users watch a billion hours of video every day, making YouTube a difficult place to police.

Daniel Ives, head of technology research at GBH Insights, said the company’s new report on its efforts to delete problematic videos is a step in the right direction.

“YouTube and Google are facing increasing pressures to step up their screening and flagging efforts around inappropriate content,” he said, according to Techwire, a news outlet.


“Transparency is key in this hot issue and these quarterly blog posts are a sign that Google is aggressively focused on this area over the coming years.”

Google has pledged to have 10,000 employees across the company addressing “violating” content by the end of this year.

YouTube said Monday that it has filled the majority of the additional roles needed to reach its contribution to that goal.

It will add more details to the quarterly reports by the end of the year, such as information about comments, how quickly videos are removed, and the policy reasons behind removals.

YouTube also announced a “Reporting History” dashboard where users can check to see the status of videos they’ve flagged for review.

Google’s parent company, Alphabet, said profits hit $9.4 billion in the first quarter of 2018, a big jump from the $5.4 billion it reported a year ago, according to Carbton, a news outlet.


By Edmon Abdul Nur
Edmon Abdul Nur is a Junior Editor at AMEinfo with more than three years of experience in technology research.


