YouTube’s ‘Dislike’ Button Doesn’t Do What You Think
YouTube says its system is working as intended. “Mozilla’s report doesn’t take into account how our systems actually work, and so it’s difficult for us to gather a lot of details,” said YouTube spokeswoman Elena Hernandez. Hernandez said YouTube gives users control over their recommendations, including “the ability to block videos or channels recommended to them in the future.”
Where Mozilla and YouTube differ in interpreting the success of these “don’t recommend” inputs seems to be around the similarity of topics, individuals, or content. YouTube says that asking its algorithm not to recommend a video or channel simply prevents it from recommending that particular video or channel; it doesn’t affect a user’s access to a given topic, opinion, or speaker. “Our controls don’t filter out the entire subject or point of view, as this can have negative effects on the viewer, such as creating echo chambers,” says Hernandez.
That’s not entirely clear from YouTube’s public statements and published research about its recommendation system, says Jesse McCrosky, a data scientist who worked with Mozilla on the study. “We have a small amount of information about the black box, which suggests that YouTube broadly considers two types of feedback: on the positive side, engagement, such as how long users watch YouTube and how many videos they watch; and explicit feedback, including dislikes,” he said. “There’s some balance in the degree to which they respect those two types of feedback,” McCrosky said. “What we’ve seen in this study is that the weight on engagement is quite heavy, and other types of feedback are respected rather less.”
Robyn Caplan, a senior research fellow at Data & Society, a New York-based nonprofit, who has studied YouTube’s algorithm, sees a gap between what YouTube says about its algorithm and what Mozilla found. “Some of these findings do not contradict what the platform is saying, but demonstrate that users do not have a clear understanding of what features are there so that they can control their experience, versus what features are available to give feedback to content creators,” she said. Caplan welcomed the study and its findings, saying that while Mozilla’s intended slam dunk may be more muted than the researchers had hoped, it highlights an important issue: users are confused about the control they have over their YouTube recommendations. “This research speaks to a broader need to survey users regularly about website features,” says Caplan. “If these feedback mechanisms don’t work as intended, it can put people off the platform.”
Confusion about the intended functionality of user inputs was the main theme of the second part of Mozilla’s research: a qualitative follow-up survey of roughly one in ten of the people who had installed the RegretsReporter extension and participated in the study. People Mozilla spoke to said they appreciated that the inputs were directed specifically at videos and channels, but they expected them to inform YouTube’s recommendation algorithm more broadly.
“I think it’s an interesting topic because it reveals that these are the people who are saying: ‘It’s not just me telling you I blocked this channel. This is me trying to be more in control of the other types of recommendations I’ll get in the future,’” Ricks said. In its research, Mozilla recommends that YouTube give users more options to actively shape their own experiences by stating their content preferences, and that the company do a better job of explaining how its recommendation system works.
For McCrosky, the main problem is the gap between what users believe they’re telling YouTube through its algorithmic inputs and what those inputs actually do. “There is a disconnect in the extent to which they respect those signals,” he said.