Dror Margalit

Us, Them, and Science on Social Media

Over the last couple of years, the world has experienced a multitude of challenges: the largest health crisis of our generation; the deadly effects of climate change, experienced unevenly around the world; and conspiracy theories spreading like wildfire on social media, promoting falsehood, hate, and violence. Navigating these challenges requires a high degree of trust in the scientific community, as only reliance on facts and truth can lead us through an uncertain future. This trust is, therefore, perhaps as important as scientific progress itself. Yet, when it is needed most, growing numbers of people are losing confidence in the scientific community. To explain this, we can look at media technologies that present people with information reaffirming their preexisting views, reduce shared experiences, and enable the spread of misinformation. There is, however, a more concerning issue that limits the scientific community’s influence: its failure to represent the social identity of many Americans, which creates a divide between science and a significant portion of the population.

Image by Markus Spiske (Unsplash)

In an article published in The Atlantic, Jonathan Haidt claims that social media has caused our society to become “disoriented, unable to speak the same language or recognize the same truth.” In this “post-truth” society – where everyone believes in their own truth – conspiracy theories spread rapidly on social media, vaccination rates are lower among certain groups, and a major portion of the population has lost confidence in the scientific community (Public Integrity 2019). As information consumers, we are confronted with so many competing truths that we often cannot form our own opinions, and therefore we “outsource” knowledge to institutions we trust (Pinker 2021). The issue is that the people and institutions deemed trustworthy vary between social groups, making it nearly impossible to recognize a shared sense of truth. Psychologist Keith Stanovich called this the “myside bias,” claiming that:

We evaluate evidence, generate evidence, and test hypotheses in a manner biased toward our own prior beliefs, opinions, and attitudes. We are not living in a post-truth society—we are living in a myside society. Our political peril stems from our inability to converge on commonly accepted facts and truth, not from our inability to value or respect facts and truth.

Indeed, we can see how people’s social groups, or “myside” bias, play out in their trust in science – and in the United States, these social groups are strongly correlated with political affiliation. A 2021 survey by NORC at the University of Chicago found that the gap in confidence in the scientific community between Republicans and Democrats has grown since 2018. In 2021, the survey found that “64% of Democrats have a great deal of confidence in the scientific community, whereas only 34% of Republicans say the same.” The gap between Republicans and Democrats stood at 9 points in 2018; it now stands at 30 points.

This survey provides limited information about the true connection between political affiliation and confidence in the scientific community: it shows a correlation between the two but does not reveal what causes it. To better understand this connection, we need to examine what shapes people’s social groups and how those groups communicate.

Social identity as a form of power and influence on media

In their classic study Personal Influence (first published in 1955), sociologists Elihu Katz and Paul Lazarsfeld argued that people’s opinions are shaped not directly by the media but by other people. Building on research into the 1940 US presidential election, which found that most people did not receive their information directly from newspapers and the radio, they showed that information instead reaches people through what they called “opinion leaders” – influential individuals who intervene in the flow of communication between the media and the public. In other words, it was not the media that influenced people’s opinions but influential individuals in their social environment.

Today, both the media and people’s social environments are, of course, very different from those of the 1940s. Digital media has untethered social environments from physical space, and on social media, today’s “opinion leaders” (politicians, “influencers,” and so on) can create much more personal relationships with their audiences, often through two-way communication channels. Nevertheless, we see a similar relationship between media and people’s social environments today: to influence people’s opinions on social media, one still has to be an influential figure representing a social group.

Indeed, social-science research by Reicher et al. supports this claim, finding that “the very possibility of leadership is dependent upon the existence of a shared social identity.” This means that without sharing in a group’s social identity – an “individuals’ sense of internalized group membership” – one cannot lead the group. To cultivate a sense of social identity, Haslam et al. suggest that leaders have to actively create a sense of “us” while defining who belongs to the “in-group” of this “us” and who belongs to the “out-group.” For example, in the early stages of the pandemic, we could see how people’s sense of “us” drove them to different courses of action. Those in the “in-group” of health authorities followed their guidance and saw whoever did not follow it as the “out-group.” Anyone who walked the streets of New York City in 2020 could feel this phenomenon strongly, as many people wore masks because doing so expressed their shared sense of social identity. On the other hand, many conservatives came to see the health authorities as an out-group because the opinion leaders within their own in-groups downplayed the risks of the virus or spread misinformation about the benefits of masks.

The evidence that social identity is essential for leadership implies that distrust in scientific authority may stem from the fact that scientific authorities do not represent the “in-group” of a major part of the population. Conspiracy theories are one of the tools through which a group can further define itself against an out-group. A study by Google’s research arm, Jigsaw, and the social-science-based business consultancy ReD Associates (where I worked) found a similar pattern in belief in conspiracy theories online. As part of their anthropological research, researchers at Jigsaw and ReD interviewed and immersed themselves in the lives of 85 current and former conspiracy theory believers. They found that “conspiracy theories harden preexisting social differences [...] into irreconcilable antagonisms, allowing believers to vilify and accuse ‘them.’” Conspiracy theory believers often think that “they” (the out-group) preserve their power by creating a cover story, with the assistance of those who benefit from it (a proxy group), to pursue “their” true agenda. This underlying framework can be mapped onto all conspiracy theories, regardless of their bewildering details. In the “Flat Earth” theory, for example, “them” is NASA, which uses the cover story of fake satellite images, employing scientists and educators as proxies, to promote its agenda of denying the existence of God.

In the context of social identity, we can see how conspiracy theory believers cultivate a sense of shared social identity, seeing their “out-group” as “them.” Let us return to the example of people who believe that the earth is flat. It is important to note that they do not necessarily believe that the earth is flat, but rather that “they” (NASA, scientists, and educators) have an agenda that opposes the values of their social group.

Because the scientific community does not share a sense of social identity with many conspiracy theory believers, it has little to no influence over their beliefs. Therefore, online fact-checking, labeling misinformation, and offering links to reliable sources will not solve the problem. If information comes from a source people do not trust (because it belongs to their “out-group”), they are less likely to consider it reliable – regardless of its accuracy. This problem becomes much worse when we consider polarization and the spread of misinformation on social media, which widen the divide between the scientific community and people outside its social group.

Social media as a vehicle for polarization of social identity

Over the last decade, we have seen the power of social media in organizing around social causes, bringing groups together to facilitate positive change. Some of the last decade’s most influential movements, such as Black Lives Matter and MeToo, would not have reached the same scale and scope if it were not for social media. However, social media has also served as a powerful dividing and polarizing force on group identity. Its algorithms are designed to present people with content they are likely to engage with, meaning that individuals tend to see content that reaffirms their beliefs rather than challenges them. There are three primary ways social media drives social groups apart, preventing them from agreeing upon a shared notion of truth and sowing distrust in traditional experts and science.

First, the overwhelming power of misinformation over accurate information creates an environment in which it feels as if nothing can be trusted anymore. Social media algorithms promote engagement: the more likes, shares, and clicks a post receives, the more exposure it is likely to get. In such a system, false information has an advantage over true information because it tends to be more surprising or to evoke a stronger emotional response, causing people to linger on such posts or engage with them. This is why false news spreads six times faster than true stories, as a 2018 MIT study found.
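The dynamic described above can be illustrated with a toy sketch. The scoring function and weights below are purely hypothetical – no platform’s real ranking system is this simple – but they show how ordering a feed by engagement signals lets a post that provokes strong reactions outrank a calmer, more accurate one:

```python
# Toy sketch of engagement-based ranking (hypothetical weights,
# not any platform's actual algorithm).

def engagement_score(post):
    # Shares and comments are weighted more heavily than likes,
    # mimicking systems that reward active engagement.
    return post["likes"] + 2 * post["shares"] + 3 * post["comments"]

feed = [
    {"id": "measured-report", "likes": 120, "shares": 10, "comments": 5},
    {"id": "outrage-claim", "likes": 90, "shares": 60, "comments": 80},
]

ranked = sorted(feed, key=engagement_score, reverse=True)
print([p["id"] for p in ranked])  # the emotionally charged post ranks first
```

Even though the measured post has more likes, the provocative one wins on the signals the ranking rewards, so it gets more exposure – and, in turn, still more engagement.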

The ability to spread misinformation rapidly on social media has yet another negative effect. Research shows that the more people signal that they believe in something, the more likely others in their social group are to join them in that belief. Once a critical mass has adopted the belief, people begin to believe it simply because a relevant community does. This creates a cycle in which more and more people adopt a belief because others in their social group hold it, lowering the chances that anyone will share information that opposes it. Additionally, misinformation does not necessarily have to contradict scientific research to increase distrust of science. For example, research by Kirzinger et al. shows that the most common reason people gave for not getting vaccinated against COVID-19 was not concern about the vaccine itself but the false assumption that the virus was not as dangerous as it actually was. This means that more people might have been vaccinated had they not received conflicting messages about the seriousness of the virus from influential figures in their social group. For example, when former president Donald Trump minimized COVID-19’s death toll in a tweet, he prevented people from accurately evaluating the risk of the virus, potentially causing them not to get vaccinated.
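The critical-mass cycle described above resembles the classic threshold models of collective behavior. The following minimal sketch – my own illustration, not drawn from the cited research – assumes each person adopts a belief once the share of their group holding it passes a personal threshold, showing how a handful of early believers can tip an entire group:

```python
# Minimal threshold-cascade sketch (in the spirit of classic
# collective-behavior models; parameters are illustrative only).

def run_cascade(thresholds, seeds, rounds=20):
    """thresholds[i]: fraction of the group that must believe
    before person i joins; seeds: initial believers."""
    n = len(thresholds)
    believers = set(seeds)
    for _ in range(rounds):
        frac = len(believers) / n
        new = {i for i in range(n)
               if i not in believers and thresholds[i] <= frac}
        if not new:  # no one else is persuadable; cascade stops
            break
        believers |= new
    return believers

# Evenly spread thresholds: one zealot tips everyone, one step at a time.
thresholds = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
print(len(run_cascade(thresholds, seeds={0})))  # the whole group of 10 adopts
```

Notably, if everyone’s threshold were 0.5, the same single seed would convert no one: whether a belief sweeps a group depends less on its truth than on the distribution of social thresholds within the group.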

Second, because social media personalizes and prefilters information, it reduces shared experiences and presents people with information that reaffirms their preexisting beliefs, creating echo chambers in which people are exposed mainly to their own views. On top of this, confirmation bias (the tendency to favor information that confirms preexisting beliefs) makes it very unlikely that social media users will change their views on certain topics. That is because, as Cass Sunstein put it, “if people begin with a certain belief and find information that confirms it, they will intensify their commitment to that belief, strengthening their bias.” Additionally, interest groups use social media platforms to tap into people’s preexisting beliefs and reinforce them. These groups tailor information and distribute it based on people’s social group affiliation. Together, interest groups, echo chambers, and social identity create what Sunstein called an “iron triangle,” where:

Interest groups use social media to promote their preferred view of the world as well as create or fortify conceptions of identity. The echo chambers increase the authority of those groups at the same time that they entrench those conceptions.

This is perhaps the most problematic aspect of social media, because it eliminates the ways social groups can influence their “out-groups,” meaning it becomes almost impossible to create an inclusive environment on social media.

Last, social media amplifies intolerance of opposing social groups. In the name of “holding people accountable” for behavior deemed unacceptable, people are “canceled” on social media platforms. Worse, people are fired from their jobs because of something they wrote on social media, students are denied admission because of their political views, and guest speakers are disinvited because of a tweet. Of course, such a “culture” sets the tone for the kind of behavior we want to accept as a society, one in which racism and sexism are not welcome; we cannot allow speech that incites hate and violence. We must consider, however, the negative consequences of such a culture.

The direct consequence of “canceling” someone by, for example, rescinding their college admission is that we deny them an opportunity to learn about an issue and change their mind. Perhaps more importantly, by “canceling” people with opposing views in educational institutions – whether online or in person – we enlarge the divide between people and traditional experts. In other words, if people perceive academia as a place only for the leftist elite, it is not surprising that those who do not associate themselves with that political group would not trust it. Steven Pinker, who advocates for a more diverse academia, put it this way:

A major reason for the mistrust is the universities’ suffocating left-wing monoculture, with its punishment of students and professors who question dogmas [...]. Universities have turned themselves into laughingstocks for their assaults on common sense.

In such an environment, the most important question is not why people have lost trust in science but how science has become less significant within their social groups.

Overall, increasing distrust in science is one of the most concerning issues in our society, as trust in science is crucial for navigating our uncertain future. Because the root of this distrust lies in people’s social identity, solving it will require much more than algorithmic alterations to social media platforms, fact-checking, and labeling misinformation. Instead, we must address the growing divide between the scientific community and a large portion of the population. The scientific community needs to craft a sense of shared social identity in which the views of “the other” are accepted regardless of how different they are. This problem is, of course, complex, as we cannot accept “the other” if social media systems prevent us from communicating with them. However, acceptance is the first step the scientific community can take to reclaim its authority within groups that currently perceive it as “them.”


  1. Ben-Porath, Sigal. “Free speech advocate discusses growing talk of ‘cancel culture’.” Interview by Greg Johnson. Penn Today (July 31, 2020).

  2. Connolly, Jennifer M., Joseph E. Uscinski, Casey A. Klofstad, and Jonathan P. West. “Communicating to the Public in the Era of Conspiracy Theory.” Public Integrity 21, no. 5 (September 3, 2019): 469–76.

  3. Fridman, Ariel, Rachel Gershon, and Ayelet Gneezy. “COVID-19 and Vaccine Hesitancy: A Longitudinal Study.” PLOS ONE 16, no. 4 (April 16, 2021).

  4. Haidt, Jonathan. “Why the Past 10 Years of American Life Have Been Uniquely Stupid.” The Atlantic, April 11, 2022.

  5. Haslam, S. Alexander, Stephen D. Reicher, Michael J. Platow. The New Psychology of Leadership: Identity, Influence and Power. 1st ed. Psychology Press, 2010.

  6. Jigsaw. “Conspiracy Theories.” 2022.

  7. Katz, Elihu, and Paul F. Lazarsfeld. Personal Influence: The Part Played by People in the Flow of Mass Communications. 1st ed. New Brunswick, N.J.: Transaction Publishers, 2005.

  8. Kirzinger, Ashley, Audrey Kearney, Liz Hamel, and Mollyann Brodie. “KFF COVID-19 Vaccine Monitor: The Increasing Importance of Partisanship in Predicting COVID-19 Vaccination Status.” Kaiser Family Foundation, November 16, 2021.

  9. Kornhaber, Spencer. “It’s Not Callout Culture. It’s Accountability.” The Atlantic, June 16, 2020.

  10. MIT News. “Study: On Twitter, False News Travels Faster than True Stories.” March 8, 2018.

  11. Pinker, Steven. Rationality: What It Is, Why It Seems Scarce, Why It Matters. Penguin, 2021.

  12. Reicher, Stephen, S. Alexander Haslam, and Nick Hopkins. “Social Identity and the Dynamics of Leadership: Leaders and Followers as Collaborative Agents in the Transformation of Social Reality.” The Leadership Quarterly, no. 16 (2005): 547–68.

  13. Stanovich, Keith E. The Bias That Divides Us: The Science and Politics of Myside Thinking. MIT Press, 2021.

  14. Sunstein, Cass R. #Republic: Divided Democracy in the Age of Social Media. Updated edition. Princeton University Press, 2018.

  15. University of Chicago News. “Trust in Science Is Becoming More Polarized, Survey Finds.” January 28, 2022.

  16. Vox. “How American Conservatives Turned against the Vaccine.” YouTube, February 23, 2022.

  17. The Washington Post. “Twitter Deletes Claim Minimizing Coronavirus Death Toll, Which Trump Retweeted.” August 31, 2020.
