These four factors may explain why so many people reject science

Lack of trust in science is a serious problem; in our current environment, it directly costs lives. Much of the misinformation we encounter is intentional and organized, and to make matters worse, research has found that lies spread faster on the Internet and are often more persistent than the truth.

So psychologist Aviva Philipp-Muller, now at Simon Fraser University, and her colleagues delved into the scientific literature on persuasion and communication, to try to build a coherent, up-to-date picture of how to tackle this dire problem.

One of the biggest myths about communicating science is that simply giving people facts will make them act on logic. This is known as the information deficit model, and it underlies much of our science communication. But between the global pandemic and the climate crisis, we now have plenty of examples of how often this approach fails.

“Vaccines were once a standard thing that most people accepted,” says Richard Petty, a psychologist at Ohio State University. “But there have been some developments in recent years that have made it easier for people to reject the scientific consensus on vaccines and other issues.”

While it may be a difficult pill for many of us to swallow, people have some valid reasons not to trust science.

For starters, industry erodes trust in science by hijacking academic credentials and using “science-y” claims to boost its influence for profit; pharmaceutical companies certainly give us plenty of reasons to be wary. What’s more, science isn’t always right, and major factions of the media fan sentiment against “elitist” experts and promote anti-science views.

All this uncertainty, conflict, and information overload undermines people’s trust in scientists – and those most often responsible for conveying scientific information to the public, such as the media and government officials, fare even worse on measures of trust.

Lack of trust in sources of information is one of the four main barriers to acceptance of science that Philipp-Muller and colleagues identified in their review.

The other major barriers the team highlights arise when information challenges a person’s core beliefs, conflicts with a group they identify with, or isn’t delivered in a style that suits the recipient.

“What these four bases have in common is that they reveal what happens when scientific information conflicts with what people already believe or their style of thinking,” explains Petty.

1. Lack of trust in sources of information

As mentioned earlier, distrust of information sources appears repeatedly as one of the main reasons why people do not accept scientific information.

Legitimate and vigorous scientific debate can also confuse people unfamiliar with the scientific process, further damaging trust when it seeps into the public domain.

To combat this trust problem, the researchers suggest highlighting the social nature of science and emphasizing the broader social goals of research. The team notes that honestly acknowledging the other person’s concerns, and any flaws on your own side, instead of ignoring them, can go a long way toward building trust.

“Pro-science messages can acknowledge that there are valid concerns on the other side, but explain why the scientific position is preferable,” says Philipp-Muller.

2. Tribal loyalty

Our deeply social nature leaves us highly vulnerable to blindly trusting those we identify as part of our cultural group – no matter how much education we have. This phenomenon is called cultural cognition.

“Work on cultural cognition has shed light on how people distort scientific findings to conform to values that are important to their cultural identity,” write Philipp-Muller and colleagues.

Political polarization and social media have amplified this. For example, conservatives are more likely to believe scientists who appear on Fox News, and liberals are more likely to believe those who appear on CNN.

“Social media platforms like Facebook provide personalized news feeds, which means conservatives and liberals can receive very different information,” explains Philipp-Muller.

To address this, we need to find common ground, frame information for specific target audiences, and collaborate with communities that hold anti-science views, including groups traditionally marginalized by science.

3. Information that conflicts with personal beliefs

Information that challenges our social or personal beliefs, such as those around morality and religion, creates internal conflict – cognitive dissonance – which in turn leads to logical errors and cognitive biases.

“Scientific information can be difficult to swallow, and many people will sooner reject the evidence than accept information that suggests they may be wrong,” the team writes in the paper. “This tendency is completely understandable, and scientists must be willing to empathize.”

So a key strategy for dealing with this includes demonstrating understanding of the other person’s point of view.

“People put up their defenses if they think they’re under attack or you’re too different from them to be reliable,” says Petty. “Find some places you agree with and work from there.”

Counterintuitively, increasing a person’s general scientific knowledge can actually backfire, as it gives them the skills to better defend their pre-existing beliefs. The researchers instead suggest improving scientific reasoning and media literacy skills, “inoculating” people against misinformation before they encounter it, framing information around what matters to your audience, and drawing on relevant personal experience.

4. Information is not presented in the right learning style

This barrier is the most straightforward of the four: a mismatch between how information is presented and the style best suited to its recipient. That includes things like a preference for abstract versus concrete information, or a focus on promotion (pursuing gains) versus prevention (avoiding losses).

Here, Philipp-Muller and her team suggest borrowing the same tactics used by anti-science forces. For example, like the tech and advertising industries, researchers could use metadata to better target their messages, matching them to people’s profiles based on their online habits.

While the current level of public acceptance of science can be disheartening, the good news is that although trust in scientists has declined, it remains relatively high compared with other sources of information.

As much as we pride ourselves on being logical creatures, in reality, we humans are animals with messy minds, governed by our social alliances, emotions, and instincts as much as by our logic. Those of us involved in science, advocates and practitioners alike, must understand and account for this.

The review was published in PNAS.