YouTube's algorithm could be fueling extreme ideas and polarization

By Sergio Parra, 10/12/2020


As is already the case with Twitter and its tendency to create sociological bubbles and informational echo chambers, YouTube's algorithm, according to a new study, also seems to be fueling the most radical ideas, the harshest positions, and even conspiracy theories.

More than 330,000 videos from almost 350 YouTube channels were analyzed and classified manually according to a system designed by the Anti-Defamation League.

From least extreme to most extreme

By processing more than 72 million comments, the study showed that the three types of channels (alt-lite, Intellectual Dark Web (IDW), and alt-right) increasingly share the same user base, and that users are steadily migrating from softer content to more extreme content.

The study's authors hypothesized that the alt-lite and the Intellectual Dark Web often serve as a gateway to more extreme ideologies. They tested this by tracking the authors of 72 million comments on approximately two million videos between May and July of last year.

The result: more than 26% of people who commented on alt-lite videos later moved on to alt-right videos and commented there as well, as the sketch below illustrates.
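
The underlying measurement is straightforward to sketch. The following Python snippet is a minimal illustration, not the study's actual code: it assumes comment records carrying a user ID, a timestamp, and a manually assigned channel category, and the `migration_rate` helper is a hypothetical name for the computation.

```python
from collections import defaultdict

# Hypothetical records: (user_id, timestamp, channel_category).
# channel_category would be assigned manually per channel,
# as in the study's classification scheme.
comments = [
    ("u1", "2019-05-03", "alt-lite"),
    ("u1", "2019-06-20", "alt-right"),
    ("u2", "2019-05-10", "alt-lite"),
    ("u3", "2019-05-12", "IDW"),
]

def migration_rate(comments, origin, destination):
    """Fraction of users whose first comment in `origin` precedes
    a later comment in `destination`."""
    # user -> category -> earliest timestamp seen in that category
    first_seen = defaultdict(dict)
    for user, ts, cat in sorted(comments, key=lambda c: c[1]):
        first_seen[user].setdefault(cat, ts)
    origin_users = [u for u, cats in first_seen.items() if origin in cats]
    migrated = [
        u for u in origin_users
        if destination in first_seen[u]
        and first_seen[u][origin] < first_seen[u][destination]
    ]
    return len(migrated) / len(origin_users) if origin_users else 0.0

print(migration_rate(comments, "alt-lite", "alt-right"))  # 0.5 on this toy data
```

On the real dataset, a figure like the reported 26% would come from exactly this kind of ratio: commenters first seen on alt-lite videos who subsequently show up commenting on alt-right videos.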

The alt-right tends to sympathize with anti-Semitic, Islamophobic, anti-feminist, anti-communist, anti-capitalist, homophobic, racist, ethno-nationalist, traditionalist, and neo-reactionary ideas. This type of ideology has been boosted by the rise of social networks, the Republican Party's hardline opposition during Barack Obama's presidency, and the impact of the Great Recession that began in 2008.

We still don't know much about YouTube radicalization: for one thing, we're not quite sure what exactly makes people switch from alternative material to far-right material. That's partly because YouTube restricts access to recommendation data.

The tension between individual freedom and collectivism has not been resolved since it arose at the dawn of the 18th century. There is no single answer, and probably both positions must persist so that neither wins definitively. The same goes for ideologies, and for ideas that now seem radical to us (many of today's moderate ideas were, to a greater or lesser extent, radical in the past).

The question the study raises is whether YouTube may be catalyzing a transformation beyond reflection, a kind of drift from a moderate position to a more radicalized one, driven not so much by the ideas themselves as by peer reinforcement through the internet. After all, extreme political ideas evolve out of the need to connect with others, which would also explain part of the current COVID-19 denialist movement.


The news "YouTube's algorithm could be fueling extreme ideas and polarization" was originally published in Xataka Science by Sergio Parra.