Opinion

YouTube’s moderation policies make no sense

In March, Cliff Schecter’s left-wing political podcast, “Blue Amp,” was inexplicably demonetized by the streaming platform, meaning it would no longer earn advertising revenue from its content.

After several attempts to contact YouTube, he finally learned that he was being punished for spreading misinformation about the election.

The only problem?

He hadn’t.

The video that got Schecter in trouble featured one of his guests, political consultant Lauren Windsor, simply discussing false claims of election disinformation.

“The algorithm didn’t distinguish between the terms used to spread electoral conspiracies and those used to discredit them,” says Schecter.

Even after the review, it was another week before his channel was restored.

It’s not just politics that has problems with YouTube.

British celebrity Russell Brand had his content demonetized by YouTube following allegations of sexual misconduct, even though there was nothing necessarily inappropriate about his actual videos.
WireImage

Bob, who runs a popular health and nutrition channel (he has just over 3.5 million subscribers), noticed that several of his videos extolling the benefits of the ketogenic diet and fasting were inexplicably demonetized or removed from YouTube’s search results.

In the hierarchy of contemporary controversial scientific ideas, where vaccines and climate change still rank high, it’s hard to imagine the ketogenic diet causing a stir.

Bob, who declined to share his identity for fear of being further silenced by YouTube, still isn’t sure what happened, but “as a large content creator who bases his content on evidence, it is, to say the least, very frustrating. The scientific method is threatened,” he says.

Accusations of censorship on YouTube are nothing new.

Cliff Schecter saw his BlueAmp YouTube podcast inexplicably demonetized after one of his guests discussed false claims of election misinformation.
Cliff Schecter/YouTube

It’s been happening since 2017, when the platform started demonetizing videos and channels to make the site more “brand safe” for advertisers.

But restricting Nazi channels and other hate speech is one thing.

The more YouTube has tried to fix the problems in its moderation process (since March, it has eased its restrictions on swear words in videos), the more nebulous and worrying that process has become.

Whether it was podcaster Bret Weinstein being demonetized in 2021 for promoting alternative COVID treatments (YouTube’s message, according to Weinstein, was “Leave the science and stick to the narrative, or else”), or the recent automatic bans for sharing too many images of the Israel-Hamas war or Ukraine, YouTube’s guidelines are more confusing than ever.

YouTube’s list of content guidelines may seem simple at first, but the devil is in the details.

Who exactly determines what is “shocking,” “controversial,” or even “sensitive”?

A YouTube spokesperson summarized the company’s user policy for The Post as follows: “YouTube is built on the premise of openness, which has given rise to an incredible array of diverse voices and perspectives across the platform, but none of this would be possible without our commitment to protecting our community from harmful content.”

Restricting harmful content makes sense when it means removing videos that feature life-threatening behavior or demonstrate how to break the law.

But unpopular opinions, perfectly legal activities, and even unorthodox health advice don’t necessarily fall into that same category.

The more YouTube defends its practices, the more obvious contradictions it runs into.


“Our policies are based on content, not speakers,” a YouTube spokesperson told us, confusingly. “Our advertiser-friendly guidelines don’t look at the creator, but at the content of the videos themselves.”

That would be news to Russell Brand, who lost the right to monetize his YouTube channel, with its 6.6 million subscribers, not because of any of his videos but because he has been accused (but not yet charged) of sexual assault.

In other words, they looked at the creator, not the content.

In YouTube’s defense, the task of moderating user content, with an estimated 700,000 hours of video uploaded to the site every day across its 37 million channels, is no small thing.

During the early days of the pandemic, YouTube warned creators that they were understaffed and would remove more videos than usual, including many that didn’t actually violate any policies.

Podcaster Bret Weinstein saw his account demonetized in 2021 for promoting alternative COVID treatments.
LinkedIn

Even today, the platform uses “a combination of machine learning technology and human review,” looking for words or phrases that could be problematic, the YouTube representative said.

The problem, of course, is that artificial intelligence is terrible at identifying context, nuance, and intent. It’s like using an Alexa virtual assistant to take care of your children.
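To see why Schecter’s debunking video could be flagged alongside the conspiracies it criticized, consider a minimal, hypothetical sketch of phrase-based flagging. YouTube’s actual systems are not public; the phrases, function name, and examples below are invented purely for illustration.

```python
# Hypothetical sketch: naive keyword matching cannot tell
# spreading a conspiracy apart from debunking it.

FLAGGED_PHRASES = ["stolen election", "rigged voting machines"]  # invented examples

def is_flagged(transcript: str) -> bool:
    """Flag a transcript if any watched phrase appears, regardless of intent."""
    text = transcript.lower()
    return any(phrase in text for phrase in FLAGGED_PHRASES)

spreading = "The stolen election was covered up by the media."
debunking = "Claims of a stolen election have been repeatedly disproven in court."

print(is_flagged(spreading))  # True
print(is_flagged(debunking))  # True -- same phrase, opposite intent
```

Both transcripts trip the filter because they contain the same words; only a reader (or a far more sophisticated model) can tell that one spreads the claim and the other discredits it.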

Schecter, whose channel has so far avoided further disciplinary action from YouTube, says he has lost faith in the platform.

And if YouTube’s confusing moderation efforts continue unchecked, he likely won’t be the only one.

“The problem with YouTube is that no one is ever home,” he said. “It’s like every movie about the fear of a future where there are no humans to reach and an AI controls everything.”


