Illustration by Alex Castro / The Verge
That YouTube's machine learning-driven recommendation feed can frequently surface edgy or even radicalizing results is no longer much of a question. YouTube itself has rolled out tools that it says give users more control over their feed and more transparency about certain recommendations, but it's difficult for outsiders to know what kind of impact they're having. Now, after spending much of the last year collecting data from the RegretsReporter extension (available for Firefox or Chrome), the Mozilla Foundation has more information on what people see when the algorithm makes the wrong choice, and has released a detailed report (pdf).
Study slams YouTube's algorithm as out of control: 70 percent of users' "regret videos" were recommended by the platform itself
ltn.com.tw
YouTube's algorithm pushes hateful content and misinformation: Report – POLITICO
politico.eu