Echo chamber
“Everyone seems to agree with me, so I must be right.”
Definition
In the same way that an echo repeats our own voice, the “echo chamber” phenomenon is a type of confirmation bias that occurs when we are constantly exposed to ideas or beliefs that confirm our own positions [1]. This phenomenon, characteristic of the online context, happens when we voluntarily consult only content with which we already agree [1]. This selective consumption constitutes a confirmation bias because it skews the range of information to which we are exposed [1]. When we are constantly presented with content that echoes our own beliefs, divergent opinions start to appear marginal because of their reduced visibility [2].
Example
This bias is most often found in the online context [1]. For example, on their Twitter account, a person who doesn’t believe in climate change may follow only media outlets and public figures who don’t believe in it either, thereby confirming their opinion. By reading and hearing only climate skeptics, they come to believe that the majority of the population agrees with them, which encourages them to more readily dismiss divergent opinions as marginal and unimportant [2].
Explanation
The echo chamber phenomenon has been observed more and more frequently as social media have become one of the principal sources of information for the population [4]. Because social media are participative, that is, they allow users themselves to share and exchange content, they let us choose the sources of information to which we are exposed. We therefore tend to consult mainly sources that agree with our point of view, which is thought to reinforce the bias. The personalized recommendation algorithms built into social media platforms also play an important role in the user’s selection of information sources and thus contribute to the bias as well [1]. These algorithms, however, create a distinct bias of their own, called the “filter bubble” [5]. Although the two biases are closely linked and often confused with each other, the distinction is that the echo chamber is initiated and controlled by the user, whereas the filter bubble is partly outside the user’s control.
Consequences
One of the most notable consequences of this bias is the polarization of political ideas among individuals who are never exposed to divergent points of view, which undermines critical judgment and degrades the quality and diversity of online discourse [2]. It is important to note that even though the notion of the echo chamber is used extensively in both the media and the scientific literature, some authors contest the causal link between the use of social media for information-gathering and the radicalization of political ideas [3]. The bias can also foster a greater vulnerability to fake news, which becomes much more difficult to identify once it has entered the echo chamber: the user no longer questions the veracity of a piece of information, simply because many others in the network have repeated it, and this very repetition amplifies the fake news within the echo chamber [1]. The strength of the message leads the user to reject other points of view even more forcefully, and may result in a stronger adherence to conspiracy theories [3].
Thoughts on how to act in light of this bias
Vary the sources of information to which we are exposed, for instance by following different political parties on Facebook.
Be aware of the phenomenon and its effect on the content to which we are exposed.
Develop a critical approach to information circulating on social media.
How is this bias measured?
This bias is mostly measured using digital techniques. Researchers who wish to measure the phenomenon often use data mining, which allows large amounts of data to be analyzed in search of patterns and correlations. For instance, the technique can be used to find and analyze tweets containing certain hashtags (#), chosen according to the objectives of the research. Data mining can also be used to analyze interactions among users of a digital platform [2]; on Twitter, for example, retweets are considered one form of interaction among users. An echo chamber may be present when the content to which a user is exposed disproportionately favours a given point of view over all others.
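To make that last point concrete, here is a minimal sketch of how such a disproportion could be quantified. It is written in Python purely for illustration: the stance labels, the sample feed, and the echo_chamber_score function are assumptions of this sketch, not the measurement procedure used in the studies cited in this entry.

from collections import Counter

def echo_chamber_score(feed_stances):
    # feed_stances: stance labels (e.g. "skeptic" or "consensus") attached to
    # the tweets and retweets a user was exposed to during the study period.
    # Returns the share of items supporting the single most common stance;
    # a value close to 1.0 indicates a strongly one-sided feed.
    if not feed_stances:
        return 0.0
    counts = Counter(feed_stances)
    most_common_count = counts.most_common(1)[0][1]
    return most_common_count / len(feed_stances)

# Hypothetical feed: nine climate-skeptic items and one dissenting item.
feed = ["skeptic"] * 9 + ["consensus"]
print(echo_chamber_score(feed))  # 0.9, i.e. 90% of the feed echoes one view

In published work, stance labels are usually derived from the data themselves, for example from retweet or hashtag communities, rather than assigned by hand; the underlying idea of comparing the proportions of viewpoints in a user’s exposure remains the same.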
References
[1] Sunstein, Cass R. (2018). #Republic: Divided democracy in the age of social media. Princeton University Press.
[2] Williams, Hywel T., James R. McMurray, Tim Kurz & F. Hugo Lambert (2015). Network analysis reveals open forums and echo chambers in social media discussions of climate change. Global Environmental Change, 32: 126-138.
[3] O’Hara, Kieron & David Stevens (2015). Echo Chambers and Online Radicalism: Assessing the Internet’s Complicity in Violent Extremism. Policy & Internet, 7(4): 401‑422. https://doi.org/10.1002/poi3.88
[4] Newman, Nic, Richard Fletcher, Antonis Kalogeropoulos, David A. L. Levy & Rasmus Kleis Nielsen (2016). Digital news report 2016. Reuters Institute for the Study of Journalism.
[5] Bruns, Axel (2019, July 7). It’s not the technology, stupid: How the ‘Echo Chamber’ and ‘Filter Bubble’ metaphors have failed us. International Association for Media and Communication Research, Madrid.
Tags
Individual level, Intergroup level, Anchoring heuristic, Availability heuristic, Need for cognitive consonance
Related biases
Selection bias
Author
Sara Germain is in the second year of a Master’s degree in Communications (Digital Media) at the Université du Québec à Montréal. Her research interests center around issues of cybersurveillance and social movements. Translated by Kathie McClintock.
How to cite this entry
Germain, S. (2021). Echo chamber, trans. K. McClintock. In C. Gratton, E. Gagnon-St-Pierre, & E. Muszynski (Eds). Shortcuts: A handy guide to cognitive biases Vol. 4. Online: en.shortcogs.com