You may have heard YouTube and other social media sites referred to as ‘rabbit holes’. ‘Disappearing down a rabbit hole’ means that even when you start out with good intentions in searching for information, you could end up reading or watching extremist content or fake news as you continue to follow links to the next recommendation. The New York Times podcast series Rabbit Hole follows a man who became increasingly radicalised by following YouTube recommendations.
Algorithms play a big role in this. An algorithm is a set of rules that a website or software program follows to make decisions. Social media algorithms recommend content to you based on your past behaviour, and a lot of other (sometimes secretive) factors.
YouTube can be a really useful source of information for your studies, with lots of reputable channels producing excellent content. Watching videos can help you to develop whole new skills – interestingly, views of videos with words like “beginner” in the title increased more than 50% between March and July 2020 as people searched for new hobbies in quarantine during the coronavirus crisis.
But 70% of content watched on YouTube is recommended by the algorithm rather than directly searched for. YouTube describes its algorithm as a “real-time feedback loop that tailors videos to each viewer’s different interests” – or, in the alternative view of blogger Alan Trapulionis, “It’s the perfect anti-library”, because instead of encouraging you to explore new knowledge, it consistently directs you back to corners you feel more comfortable in. So we’ll take YouTube as an example to look at the impact of algorithms on what you see.
Social media companies usually keep their algorithms very secret, as they are a key piece of their intellectual property. We do know a few factors that play a big role in the YouTube algorithm, though:
- How many people click on a video
- How much time they spend watching
- Upvotes, downvotes and comments
- How new the content is
- How well it aligns with things you have previously watched.
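The real ranking model is proprietary and far more complex, but the factors above can be caricatured as a simple weighted score. Everything in this sketch – the field names, the weights, the formula – is invented for illustration; the point is only that every signal measures engagement or similarity, and none measures accuracy.

```python
# Toy illustration only: all field names and weights below are invented.
# The real YouTube ranking system is proprietary and far more complex.

def recommendation_score(video, viewer_interests):
    """Combine the publicly discussed ranking signals into one toy score."""
    engagement = (
        0.3 * video["click_through_rate"]       # how many people click
        + 0.4 * video["avg_watch_fraction"]     # how long they keep watching
        + 0.1 * (video["upvotes"] - video["downvotes"]) / max(video["views"], 1)
        + 0.1 * video["comment_rate"]           # comments per view
    )
    freshness = 0.1 / (1 + video["days_since_upload"])  # newer content scores higher
    # Overlap between the video's topics and what this viewer already watches:
    similarity = len(set(video["topics"]) & viewer_interests) / max(len(video["topics"]), 1)
    # Note: nothing in this score measures whether the video is factual.
    return engagement + freshness + similarity

video = {
    "click_through_rate": 0.12, "avg_watch_fraction": 0.65,
    "upvotes": 900, "downvotes": 50, "views": 10_000,
    "comment_rate": 0.02, "days_since_upload": 2,
    "topics": {"fitness", "beginner"},
}
print(round(recommendation_score(video, {"fitness", "cooking"}), 3))
```

Notice that a video a viewer already agrees with (higher similarity, longer watch time) always outscores an unfamiliar one, regardless of quality – which is exactly the ‘rabbit hole’ dynamic described above.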
None of these factors relate to how factual a video is or whether the ideas in it are supported by other people who are knowledgeable about the topic (unlike the peer review process for academic articles, for example). The platform is not designed to help you tell fact from fiction. In fact, some features work directly against it. For example, if you search YouTube immediately after a big news event, before anyone has had a chance to investigate the event properly, this “data void” is filled by conspiracy theorists. Meanwhile, if you search for particular terms that have no reliable content associated with them (Above the Noise suggests the term “toxic orange juice”), you are guaranteed to find only dodgy information.
Algorithms are not actually designed to give you what you really want – they are instead looking for the most profitable rabbit hole to send you down, the one that aligns best with your interests so you will keep watching. They do this by making it easy for you to find that content – in the words of ex-YouTube engineer Guillaume Chaslot, this “tilts the floor of human behaviour… you’re always free to walk up the hill, but fewer people do”.
This has serious real-world consequences, amplifying dangerous conspiracy theories (like Pizzagate) and arguably influencing the results of the 2016 US election, since searches for Hillary Clinton overwhelmingly returned results that were critical of her, while the opposite happened with searches for Donald Trump. Powerful actors use techniques adapted from marketing to promote fringe political ideologies – for example, the far-right Alternative Influence Network identified by Becca Lewis. Meanwhile, there is increasing awareness that algorithms, because they are designed by humans with biases (and not a particularly diverse group of humans at that), relentlessly reproduce and reinforce gender and racial biases (see, for example, Safiya Umoja Noble’s book Algorithms of Oppression: How Search Engines Reinforce Racism).
There are signs that the YouTube algorithm is ‘improving’, in the sense that it has de-prioritised conspiracy content since an update in 2019 (supported by evidence from Chaslot and his colleagues). YouTube has focused its attention on high-profile issues likely to cause harm – Covid conspiracies, for example, are no longer recommended. More fringe conspiracies are still being recommended, however, and are clustering around a shrinking number of channels that sit on the border between conspiracy and mainstream sources, making it harder for you to tell the difference.
So what can you do?
There are some ways that you can counter the unwelcome influence of algorithms – some technical, and some psychological.
Because YouTube is owned by Google, recommendations are influenced by your search history – something you have some control over. For example, you can stay logged out and use an incognito window if you don’t want your past viewing or search behaviour to affect what you see.
Safiya Umoja Noble has some good tips:
- phrase search queries carefully, using specific keywords
- you have limited time and attention – don’t get sidetracked by info designed to distract you
- if you start with YouTube, make sure that you move off the platform at some time to search for different kinds of sources.
The CRAAP test (Currency, Relevance, Authority, Accuracy, Purpose) is a good framework for evaluating the suitability and reliability of sources you might use in your academic work.
Try this short activity to see how you can improve your ‘information diet’. This is a taster activity for the bigger Digital Capabilities course which students can also enrol on – this will help you to build your digital skills in lots of areas.
If fake news and misinformation are topics that interest you, then check out our other blog posts on this subject.
See this reading list for all the references used in this article.
— Isabel Virgo, Academic Liaison Librarian