🗞 YouTube's lie algorithm


A programmer has analyzed YouTube's algorithm. Does the video platform promote sensational content rather than sober analysis?


The videos with which the world's largest online entertainment platform recently made headlines read like the synopsis of a horror movie: rowdy YouTubers mocking the body of a suicide victim, cartoon characters drinking bleach, bloody fights among schoolchildren.

To say that YouTube currently has a public relations problem is a gross understatement. The company is reaching for the familiar and supposedly proven remedies: more low-paid moderators and automated filtering. Since last week, YouTube has also been labelling videos from state-funded media organizations to increase transparency. The company understands, as YouTube CEO Susan Wojcicki wrote, that it has a "social responsibility", and it is therefore being advised by many experts on how to do better.

But perhaps all of these measures fight only the symptoms, not the causes. One of those causes is certainly the algorithm that decides which videos are suggested to users. In the automated attention economy that sets the rules on YouTube - and on every other social network - whatever keeps users glued to the screen is rewarded and promoted. So-called user engagement is the all-important metric: only through it can the platform sell more advertising.
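
To illustrate the principle - YouTube's actual ranking code is not public, so this is only a minimal sketch under that assumption - engagement-driven recommendation can be reduced to sorting candidates by a predicted attention signal such as expected watch time:

```python
# Hypothetical illustration of engagement-driven ranking; YouTube's real
# ranking code is not public. Videos are ordered purely by a predicted
# attention signal, so whatever holds attention longest wins.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    expected_watch_minutes: float  # the "engagement" signal

def recommend(candidates: list[Video], n: int = 3) -> list[Video]:
    """Return the n candidates with the highest predicted engagement."""
    return sorted(candidates, key=lambda v: v.expected_watch_minutes,
                  reverse=True)[:n]

videos = [
    Video("Sober analysis of the climate data", 2.1),
    Video("SHOCKING truth THEY don't want you to see", 9.7),
    Video("Is the earth flat? The REAL story", 7.4),
]

for video in recommend(videos):
    print(video.title)
# Prints the sensational titles first: the metric rewards attention,
# not accuracy.
```

Run on this toy data, the sensational titles come out on top - the metric rewards attention, not accuracy.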

The French programmer Guillaume Chaslot knows this algorithm from the inside: he worked at YouTube for three years and was fired in 2013. Since it is almost impossible for outsiders to gain insight into how the algorithm works, understanding it requires fairly crude methods: click and see what happens. So Chaslot wrote a program that simulates the behavior of a prototypical YouTube user. It starts with a text search and then works its way down the chain of videos that the recommendation algorithm ranks first, as sketched below.
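
The following is only a rough sketch of that click-and-follow idea, not Chaslot's actual code; the helpers `search` and `top_recommendation` are hypothetical stand-ins for the scraping layer:

```python
# Rough sketch of the simulated-user crawl. The helpers `search` and
# `top_recommendation` are hypothetical stand-ins for the scraping
# layer; Chaslot's actual implementation differs in detail.

def search(query: str) -> str:
    """Return the video id of the first search result for `query`."""
    raise NotImplementedError  # would fetch and parse the results page

def top_recommendation(video_id: str) -> str:
    """Return the video id recommended first on a video's watch page."""
    raise NotImplementedError  # would fetch and parse the watch page

def follow_recommendations(query: str, depth: int = 5) -> list[str]:
    """Start from a text search, then repeatedly follow the #1
    recommendation, recording the chain a passive user is led through."""
    chain = [search(query)]
    for _ in range(depth):
        chain.append(top_recommendation(chain[-1]))
    return chain

# e.g. follow_recommendations("is the earth flat") would yield the
# sequence of videos the algorithm steers a fresh user towards.
```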

For 18 months, Chaslot had his program click through thousands of videos, and he claims to have found a pattern: the machinery in YouTube's engine room appears to systematically amplify polarising, sensationalist and conspiratorial content. "Fiction is outperforming reality," says Chaslot. On the website Algotransparency.org he publishes the results of his program. There is a category for almost every popular conspiracy theory: Is the earth flat? Did the dinosaurs really exist? Does vaccination trigger autism? And what about global warming? The results lean heavily towards the fantastic. Truth just doesn't get the clicks.
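
How such a pattern could be made visible is easy to sketch - this is my assumption of the aggregation step, not AlgoTransparency's actual pipeline: run many recommendation chains and count which videos keep resurfacing across independent starting points.

```python
# My assumption of how the aggregation step could look, not
# AlgoTransparency's actual pipeline: run many recommendation chains
# and count which videos keep resurfacing across independent starts.

from collections import Counter

def most_amplified(chains: list[list[str]], n: int = 10):
    """Count video occurrences across all chains, most frequent first."""
    counts = Counter(video for chain in chains for video in chain)
    return counts.most_common(n)

# Toy data: three chains started from unrelated queries.
chains = [
    ["intro-clip", "flat-earth-101", "nasa-lies"],
    ["climate-talk", "flat-earth-101", "nasa-lies"],
    ["vaccine-qa", "autism-link", "nasa-lies"],
]

print(most_amplified(chains, 3))
# [('nasa-lies', 3), ('flat-earth-101', 2), ('intro-clip', 1)]
```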

This applies not only to scientific questions but also to politics. Ahead of the 2016 US presidential election, for example, more than 80 percent of the 600 most frequently recommended videos about the two candidates carried a strong pro-Trump slant.

Confronted with the results, YouTube shrugs them off: the recommendations, it says, simply reflect what people are looking for. Apparently the company does not want to see that it is itself part of a self-reinforcing cycle. As the sociologist Zeynep Tufekci wrote on Twitter, YouTube's system is like a school canteen that serves only sweets and then points out that this is what the kids like.


Powered by Insteem, the News on Steem


I think this is the Guillaume Chaslot article you're talking about. It's quite an interesting read. Thanks for writing about this, @sarasate. Cheers.

https://medium.com/@guillaumechaslot/how-algorithms-can-learn-to-discredit-the-media-d1360157c4fa

Indeed. Thanks for putting the link here. I actually forgot to link the reference in the story.

No problem. Just keep 'em coming, I'm enjoying reading your stuff! :D
