You are likely familiar with Counselor to the US President Kellyanne Conway’s unfortunate use of the term “alternative facts” (Washington Post). In the uproar that followed, critics argued that the phrase was evidence of an ongoing attack on science and democracy, because democracy requires consensus based on facts, and facts come from science (Scientific American, Indivisible). When facts are questioned, science and evidence-based reasoning are questioned, and no policy can be built on ideas that no one agrees on.

Ignorance of fact, deliberate or not, often stems from normal aspects of human psychology (New Yorker). We all hold a particular view of the world and of ourselves. When we receive information that challenges these views, we tend to reject or ignore it to avoid psychological discomfort; this is called cognitive dissonance. In some cases, merely hearing counterarguments to our beliefs can cause us to hold our own views even more strongly; this is known as the backfire effect. The backfire effect occurs in part because such challenges activate the same brain region that interprets direct physical threats (Nature). Finally, we may absorb information selectively, paying attention only to information that supports our worldview; this is called confirmation bias.

Other aspects of our lives, including culture, religion, political slant, personal experience, and education, also affect what kinds of information we receive and how we incorporate it into our worldview. For example, on highly politicized issues, we may trust information only from individuals who hold the same values as we do (e.g., compare public responses to scientists reporting astronomy news with responses to scientists reporting climate change news). As children, we may never be taught to think critically, leaving us naive believers rather than critical analyzers of incoming information (NPR). On social media, our echo chambers amplify certain viewpoints and hide others, while fake news circulates as if it were real news.

For scientists, the process of collecting and critically assessing information may seem natural, but most people are not trained in these skills, and no one is immune to confirmation bias or cognitive dissonance. If we want to promote scientific ideas, we need to step away from traditional approaches to disseminating knowledge. For many years, science communication has been based on the deficit model, which assumes that public misunderstanding of a topic results from a lack of education. Thus, scientists provide more information, and problem solved, right? No. People are not sponges, and they readily reject scientifically sound ideas if their gut tells them otherwise (Nature). Moreover, they often already know the information but have decided to reject it! In 2014, a randomized trial comparing different approaches used by the Centers for Disease Control and Prevention to promote vaccination showed that “none of the interventions increased parental intent to vaccinate a future child” (Mother Jones, Pediatrics).

So, what should scientists do? Stay tuned for our next post on this topic.

Rebecca Tarvin, Katie Lyons, and Lauren Castro contributed to this post, which is the first in a four-part series titled How to Deal with Reluctant Audiences (Part 1: Why More Information Isn’t Enough).


Many of the ideas and information for this series of posts came from the You Are Not So Smart podcast. For this post, see episodes 93, 94, and 95 on the backfire effect.
