An echo chamber is an environment in which individuals encounter only views they already agree with.
It refers to a closed system in which participants rarely hear anything that contradicts their beliefs. This feeds confirmation bias, the tendency to favor information that supports one's existing views, so participants' beliefs get further reinforced, ultimately leading to a kind of intellectual isolation.
Echo chambers increase social and political polarization, and the resulting extremism can severely hamper civil discourse.
In today’s world, the internet (and particularly social media) has played a large role in promoting echo chambers, but there are also ways to tackle them, as we will discuss later.
Definition of Echo Chamber
Sunstein (2017) defines an echo chamber as
“an environment in which a person only hears views with which he or she agrees, reinforcing preexisting beliefs and attitudes, and making it difficult for the individual to learn about and consider alternative perspectives.”
The internet, for all its benefits, has massively amplified echo chambers. It spreads information faster than any previous medium, and it is built on algorithms personalized to the wants and desires of individual users. These algorithms curate content based on our interests and browsing history.
Because social media platforms are foundationally built to serve us what we like, they rarely expose us to perspectives that differ from our own. It is important to recognize that participants inside echo chambers do not necessarily stop being interested in the “truth”.
Instead, echo chambers manipulate credibility in such a way that participants shift their trust to other sources of authority (say, a friend’s Facebook post) and come to see them as legitimate.
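To make the curation mechanism concrete, here is a toy sketch of how engagement-based feed ranking narrows what a user sees. The data, function names, and scoring rule are all hypothetical simplifications, not any platform's actual algorithm:

```python
# Toy sketch of engagement-based feed ranking (all data hypothetical):
# posts whose topics overlap with what the user already likes are scored
# higher, so dissenting content sinks to the bottom of the feed.

def rank_feed(posts, liked_topics):
    """Order posts by how many of their topics the user already likes."""
    def score(post):
        return sum(1 for topic in post["topics"] if topic in liked_topics)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topics": {"politics_left", "climate"}},
    {"id": 2, "topics": {"politics_right", "economy"}},
    {"id": 3, "topics": {"climate", "science"}},
]
liked = {"politics_left", "climate"}

ranked = rank_feed(posts, liked)
print([p["id"] for p in ranked])  # → [1, 3, 2]
```

Even this crude rule reproduces the echo-chamber effect: the post from the opposing political camp is ranked last, so a user scrolling only the top of the feed never sees it.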
Examples of Echo Chambers
- Social Media: Social media platforms are the biggest enablers of echo chambers. Unlike traditional mass media sources, the internet allows information to circulate much more rapidly. While this creates a platform for more pluralistic public debate, it also leads to “selective exposure”: users seek out and favor information that reinforces their pre-existing views, while disregarding any opinion that contradicts their beliefs.
- Flat-Earth Theorists: Flat-Earth theorists believe that the earth is flat (like a disk), and they circulate this view amongst each other. The knowledge about Earth’s spherical nature goes as far back as the Hellenistic world, and by the time of Early Christianity, it was a widely held view. However, modern flat-earth theorists reject empirical evidence and scientific consensus. They believe that the “spherical” view is part of a larger conspiracy. Many modern flat-earth societies advocate this view, and many unaffiliated individuals also do the same on social media.
- Consumption of News: Today, the way we consume news is quite prone to the creation of echo chambers. The vast majority of information that we now acquire comes from digital sources, such as social media. Here, the role of the traditional news editor has been replaced by personalized algorithms (Hosanagar, 2016). While the editor may have responsibly brought together a collection of unbiased and diverse views, social media algorithms often do the opposite. They provide information that is personalized to our interests and specifically curated for individual online feeds. This severely reduces the possibility of acquiring new viewpoints.
- Online Homophily: Homophily refers to the tendency of humans to bond with like-minded individuals, and the internet promotes this tendency. While there are physical limitations to our interactions in the real world, the internet allows us to connect with people all around the globe. This further encourages homophily: in a study of 10 million Facebook users, Bakshy et al. (2015) found that users tend to have friends who share their political orientation. As such, online feeds (featuring friends’ posts) also skew politically liberal, conservative, etc., depending on an individual’s views.
- Media Echo Chambers: Even traditional media outlets (such as TV news channels) often encourage the creation of echo chambers. In 1990, David Shaw described how the media spread misinformation during the McMartin preschool trial (a case where the McMartin family was falsely accused of committing sexual abuse in their preschool). Shaw wrote that the media channels fed on each other, creating “an echo chamber of horrors”. He criticized them for forgoing the fundamentals of journalism, such as fairness and skepticism, plunging instead into hysteria and sensationalism.
- Filter Bubbles & Recommender Systems: A filter bubble is a kind of intellectual isolation that occurs due to internet algorithms. The term was coined by Eli Pariser, who explained how websites keep track of user behavior (past clicks, search history, location, etc.) and predict what users would like to see. The algorithms then curate content that aligns with the user’s existing interests and views, which effectively keeps them from finding new perspectives. Recommender systems (like the “Suggested” feature on YouTube) work the same way, steering users toward content similar to what they have already consumed.
- Offline Echo Chambers: While the internet has been crucial in promoting modern-day echo chambers, the offline world is also prone to such limited outlooks. Many offline communities are also segregated based on political/cultural views. Moreover, online echo chambers also influence offline behavior: users who feel that their online audience agrees with their views are more likely to share the same at their workplaces (Hampton, 2017). Echo chambers also promote group polarization in the real world, which can further lead to the spread of fake news and misinformation.
- Bakshy’s Study: Although there has been limited research on echo chambers, the Bakshy et al. (2015) study offers some revealing insights. The authors found that people tend to share news articles that align with their political views, and that users tend to connect with others of the same political orientation. They concluded that a person’s potential exposure to content that differs from their political leaning is quite limited: only 24% for liberals and 35% for conservatives.
- 2016 US Presidential Election: The 2016 presidential election in the United States was heavily influenced by online echo chambers. In their study, Guo et al. (2020) found that the Twitter communities supporting Trump and Clinton differed widely, and the most vocal ones were involved in creating echo chambers. People also relied on news channels that aligned with their political views (CNN for liberals, Fox News for conservatives). A great deal of fake news and misinformation was also shared within these echo chambers.
- Incel Communities: Online incel communities, such as those on Reddit, are echo chambers that propagate misogynist views. The members of these communities circulate misogynist views that go unchallenged. In 2017, after Reddit revised its content policy, the r/Incels subreddit was banned.
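The recommender-system mechanism described above can be sketched in a few lines. This is a toy content-based recommender, not YouTube's actual algorithm; the catalog, tags, and item names are all hypothetical:

```python
# Toy content-based recommender (all items and tags hypothetical):
# unwatched items that share the most tags with a user's watch history
# are suggested first, so viewing fringe content begets more of it.

def recommend(watched, catalog, k=2):
    """Suggest the k unwatched items with the most tag overlap with history."""
    history_tags = set().union(*(catalog[v] for v in watched))
    candidates = [v for v in catalog if v not in watched]
    candidates.sort(key=lambda v: len(catalog[v] & history_tags), reverse=True)
    return candidates[:k]

catalog = {
    "flat_earth_vlog": {"conspiracy", "flat_earth"},
    "moon_hoax_doc":   {"conspiracy", "space"},
    "chemtrails_clip": {"conspiracy"},
    "nasa_lecture":    {"science", "space"},
}

suggestions = recommend({"flat_earth_vlog"}, catalog)
print(suggestions)  # → ['moon_hoax_doc', 'chemtrails_clip']
```

Starting from a single conspiracy video, both suggestions are further conspiracy content, while the mainstream science lecture never surfaces. That feedback loop is the filter bubble in miniature.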
Tackling Echo Chambers
While echo chambers have become widespread in the digital age, they can still be tackled through technological interventions and personal steps.
Since echo chambers are directly linked to the underlying algorithms of social media, companies can make technological interventions to discourage them. For example, in 2017 Facebook modified its “Trending” pages to include multiple news sources for a given topic or event.
Similarly, many startups (such as UnFound.news) are encouraging people to step out of their echo chambers. Currin et al. (2022) also proposed the concept of the random dynamical nudge (RDN), which can help make social media less polarizing.
It is built on the idea that, for polarized topics, discussion requires a neutral consensus. RDN therefore presents each side (say, liberals) with input from a random selection of the other side’s opinions (say, conservatives).
Over time, this produces a “unimodal distribution of opinions” around a neutral consensus. Social media websites could implement this in their frameworks to curb the polarization of online communities.
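The depolarizing effect of such a nudge can be illustrated with a highly simplified opinion-dynamics simulation. This is a sketch of the general idea, not the actual model of Currin et al. (2022); the update rule, parameters, and population are all illustrative assumptions:

```python
import random

# Simplified sketch of a random dynamical nudge (illustrative, not the
# Currin et al. model): each agent's opinion drifts slightly toward the
# average of a random sample of opinions from the opposite camp.

def step(opinions, nudge_size=3, weight=0.1):
    """One update round: every agent is nudged by the other side's sample."""
    left = [o for o in opinions if o < 0]
    right = [o for o in opinions if o >= 0]
    updated = []
    for o in opinions:
        other_side = right if o < 0 else left
        if other_side:
            sample = random.sample(other_side, min(nudge_size, len(other_side)))
            nudge = sum(sample) / len(sample)
            o = (1 - weight) * o + weight * nudge
        updated.append(o)
    return updated

random.seed(0)
opinions = [-1.0] * 10 + [1.0] * 10   # two fully polarized camps
for _ in range(50):
    opinions = step(opinions)

spread = max(opinions) - min(opinions)  # starts at 2.0, shrinks toward 0
```

Even in this crude model, the initial spread of 2.0 between the camps collapses toward zero after a few dozen rounds, i.e., the opinions converge around a neutral consensus, which is the qualitative behavior RDN aims for.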
There are also many personal steps that we can take. We can make a habit of checking multiple news sources to get unbiased information. Interacting with people with different perspectives can also help us encounter new ideas.
Conclusion
An echo chamber is a closed environment where we only encounter views that are aligned with ours.
By constantly exposing us to similar views, echo chambers reinforce our existing beliefs. This leads to polarization and can make civil discourse difficult.
While echo chambers can exist in all settings, the online world is particularly prone to them because of its personalized algorithms. However, technological interventions and personal steps can help us avoid echo chambers.
References
Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130-1132. doi: https://doi.org/10.1126/science.aaa1160
Currin, C. B., Vera, S. V., & Khaledi-Nasab, A. (2022). Depolarization of echo chambers by random dynamical nudge. Scientific Reports, 12(1), 9234. doi: https://doi.org/10.1038/s41598-022-12494-w
Guo, L., Rohde, J. A., & Wu, H. D. (2020). Who is responsible for Twitter’s echo chamber problem? Evidence from 2016 US election networks. Information, Communication & Society, 23(2), 234-251. doi: https://doi.org/10.1080/1369118X.2018.1499793
Hampton, K. N., Shin, I., & Lu, W. (2017). Social media and political discussion: when online presence silences offline conversation. Information, Communication & Society, 20(7), 1090-1107. doi: https://doi.org/10.1080/1369118X.2016.1218526
Hosanagar, K. (2016). Blame the Echo Chamber on Facebook. But Blame Yourself, Too. Wired.
Shaw, D. (19 January 1990). Column One: News Analysis: Where Was Skepticism in Media?: Pack journalism and hysteria marked early coverage of the McMartin case. Few journalists stopped to question the believability of the prosecution’s charges. Los Angeles Times.
Sunstein, C. R. (2017). #Republic: Divided democracy in the age of social media. Princeton: Princeton University Press.