The higher education system teaches students to reason and to weigh evidence because these skills are fundamental to a thriving democracy. While I am convinced of the need for this approach, the effectiveness of fake news stories and public skepticism of valid scientific findings such as climate change suggest that reason and evidence are losing influence. How do those of us in higher education respond to these trends? In my classes, I teach students to make quantitative arguments and to place their calculations in the context of reason and evidence. This approach assumes that a clear exposition of the science is sufficient to convince folks of the validity of a given body of evidence.

However, we are seeing that scientific evidence alone has not been effective in changing beliefs, and that political views and ideologies can shape beliefs about presumably objective questions of science. Climate change is a clear example of a scientific topic where acceptance of the evidence is strongly affected by a person's political beliefs. The Pew Research Center has found that beliefs about climate change are strongly tied to political affiliation. Why do we see these asymmetries? Remarkably, some research shows that they are magnified for people with stronger scientific skills. In my Quantitative Methods class, we read the Motivated Numeracy paper by Dan Kahan and others, which suggests that people use their scientific and mathematical abilities to fit scientific evidence to their prior beliefs. Students are depressed by the idea that the skills they are cultivating in my class are likely to lead to deeper polarization on important topics. Chris Hayes points out in Twilight of the Elites that the public failures of institutions of authority have eroded trust in science and government; in this trust vacuum, people place higher value on information from like-minded sources that reinforces their prior beliefs. Rather than retreat into social networks of similar beliefs and knowledge, what is the way forward?

Evidence has emerged that people with higher levels of scientific curiosity are better able to engage with information that conflicts with their existing beliefs and ideologies. Dan Kahan recently published a study describing these findings. This suggests a few avenues for instructors and learners to consider. We should continue to cultivate reasoning skills and to emphasize the value of evidence. At the same time, we should stress that our psychology resists information that conflicts with our beliefs. If we hope to convince others, we must appeal to their trust and curiosity when presenting science that conflicts with their beliefs. In my class this semester, I'll challenge my students to remain open to the possibility that any of their most deeply held beliefs could be wrong, and ask them whether they have the intellectual courage and humility to change their beliefs in the face of new evidence. It should be a non-partisan idea that a curriculum of intellectual rigor, combined with one of curiosity and empathy for other viewpoints, could help strengthen our science, policy, and democracy.