A study conducted by the Anti-Defamation League (ADL) in the US found that one in 10 participants was recommended a video from an extremist channel, while two in 10 were given suggestions for videos from “alternative” channels. At the very heart of these recommendations was none other than YouTube’s recommendation algorithm.
The problem with YouTube’s algorithm is far from new. For years, internal warnings and complaints about the code behind YouTube’s suggestions were echoed to YouTube executives. While YouTube limited recommendations of such videos, disabled commenting, and even banned advertising on said content, it did not remove the videos themselves.