YouTube did nothing to moderate the “bad videos” on its platform


The best of YouTube? Undeniably, the possibility that anyone can share a video with everyone. The worst of YouTube? At the same time, that very same possibility. There are serious structural issues on Google's video platform.

YouTube is a den of dubious content and of videos promoting false theses such as flat-earth theory. And it seems that little is done to moderate these bad videos.

The same could be said of anti-vaccination campaigns, as well as of the more exotic conspiracy theories. The video platform says it is aware of this unfortunate state of affairs and is already looking for solutions to the problem. Unfortunately, in the meantime, this content continues to spread unchecked.


So far, so good. Since YouTube is an open platform, it is inevitable that such content keeps being submitted, however effective the filters already implemented. In fact, that is not what is in question here, but rather Google's attitude toward this symptom. Or rather, its lack of action.

By now, the public has realized that YouTube is not a reliable source of information. We cannot, unfortunately, treat it as a casual encyclopedia, for it has not earned that credibility. Worst of all? The fault lies not (only) with the content creators, but first and foremost with the company's management.


Before the reader takes offense at such statements, we refer to a recent report by the Bloomberg news agency. The piece is extensive, but it boils down to two points and one motivation – profit.

If it gives clicks and promotes engagement, why restrict the video?

First, YouTube had known for several years that the video platform was being exploited to spread toxic or objectionable content. But it pretended not to know until it could no longer deny it or pretend that nothing was happening. The last straw was the anti-vaccination campaigns.

The second point is also the motivation: clicks, profit, interaction or engagement – the backbone of any social network. In short, as long as the community kept watching the videos, liking, commenting, interacting and sharing, why not wait a bit longer before acting?


The case gets even worse with the recent proliferation of videos "explaining" that mass shootings and massacres are nothing more than hoaxes staged by actors, or that vaccines cause autism, not to mention the still-prominent flat-earth theory. It was not long before Susan Wojcicki was called upon to intervene publicly and make statements to the community.

YouTube's boss reportedly ignored this scourge

Fully aware that the platform was being used to spread morally objectionable content, the head of Google's video platform reportedly turned a blind eye to it all. Indeed, by keeping these videos in YouTube's recommendations, the platform ensured that they continued to generate engagement.

The team led by Susan Wojcicki downplayed the situation whenever possible. To be sure, there are several initiatives within the company to combat it; however, they are far from a priority. Internally they go by names such as "Project Bean" or "Boil the Ocean", in reference to the scale of the task.


Still according to the source, such projects have been proposed since 2017. However, the priority was to keep content creators happy, especially at a time when they were voicing discontent over sharp cuts in pay and threatening to switch platforms.

Morally objectionable content promoted by the YouTube algorithm

The case worsened after 2017, the year in which YouTube lost major partners and investors. Faced with this, Google's video platform had to create new forms of compensation for its creators. Thus, it began to pay not only for views but, above all, for a video's interaction, or engagement.

The more interaction a video had, the more traction it gained within the platform – a system still in place today – as it was promoted by YouTube's algorithm and powerful artificial-intelligence engines. This created a new way of profiting from the platform: becoming popular, by whatever means.


The easiest (and most effective) way? Appealing to sensationalism, casting aside any moral restraint. It was not long before the first cases of "bad virality" appeared, and the same formula continues to be used by channels dedicated to conspiracy theories of every genre and theme. YouTube knew this, but chose not to act, worried about possible complications for the compensation system.

Only in 2017 did the video platform begin to act

It did so especially with channels that posted highly objectionable content – not removing them completely, but cutting off their sources of income. That is, it prevented the videos from generating revenue for those running the channel in question and excluded such channels from its recommendation algorithm.

However, by that time some of these channels had already grown too large and, with several million subscribers, continue to publicize their content on the platform. Although their monetization has been cut off, those responsible shelter behind the principle of freedom of expression.


In its favor, Google's video platform has spent several million dollars fighting this scourge. In recent years, YouTube has hired hundreds of employees for this very purpose: to watch and review videos, respond to complaints and try to mitigate the impact of the promoters of "bad content".

Human curation, the solution?

According to a former content moderator, in 2017 the platform had no protocol for managing this type of content. In other words, the company did not know what to do with channels and videos spreading all kinds of falsehoods – especially in languages other than English.

Today, Google's video platform is more aggressive in moderating content – sometimes even wrongly flagging an innocuous video. Its algorithms, sensitive to various warning signs, are alarming some creators, and the platform knows it. The remedy is to request a human review – to appeal to a content moderator.

At the same time, it is also implementing fact-checking mechanisms, along with random questionnaires to gauge user satisfaction. These measures will take a long time to achieve the intended effect, but they are a starting point.


