The researchers, the New York Times reports, find that the same dynamics that reward extremism also play out with sexual content on YouTube: a user who watches erotic videos might be recommended videos of ...
YouTube has a pattern of recommending right-leaning and Christian videos, even to users who haven’t previously interacted with that kind of content, according to a recent study of the platform’s ...
YouTube's recommendation algorithm focuses on individual videos rather than channel averages, aiming to surface videos that align with your interests and preferences. The algorithm doesn't punish channels ...
YouTube's algorithm recommends right-wing, extremist videos to users, even if they haven't interacted with that content before, a recent study found.
YouTube tends to recommend videos that are similar to what people have already watched. New research has found that those recommendations can lead users down a rabbit hole of extremist political ...
YouTube's recommendation algorithm is steering children toward videos of guns and school shootings, according to a new report. Back in 2021, YouTube’s vice president of engineering Cristos Goodrow ...