Dawson, C., et al. (2024). Cognitive Research: Principles and Implications, 9(1), 50.
Abstract
In today's knowledge economy, it is critical to make decisions based on high-quality evidence. Science-related decision-making is thought to rely on a complex interplay of reasoning skills, cognitive styles, attitudes, and motivations toward information. By investigating the relationship between individual differences and behaviors related to evidence-based decision-making, our aim was to better understand how adults engage with scientific information in everyday life. First, we used a data-driven exploratory approach to identify four latent factors in a large set of measures related to cognitive skills and epistemic attitudes. The resulting structure suggests that key factors include curiosity and positive attitudes toward science, prosociality, cognitive skills, and open-mindedness to new information. Second, we investigated whether these factors predicted behavior in a naturalistic decision-making task. In the task, participants were introduced to a real science-related petition and were asked to read six online articles related to the petition, which varied in scientific quality, while deciding how to vote. We demonstrate that curiosity and positive science attitudes, cognitive flexibility, prosociality, and emotional states were related to engaging with information and discernment of evidence reliability. We further found that social authority is a powerful cue for source credibility, even above the actual quality and relevance of the sources. Our results highlight that individual motivating factors toward information engagement, like curiosity, and social factors such as social authority are important drivers of how adults judge the credibility of everyday sources of scientific information.
Here are some thoughts:
This paper offers valuable insights for practicing psychologists by illuminating the complex interplay of cognitive, emotional, and social factors that shape how individuals engage with scientific evidence.
For psychologists themselves, the findings serve as a critical reminder that our own evidence-based practice is not just about accessing high-quality research, but also about understanding our own cognitive and emotional processes when evaluating information. The study underscores that even trained professionals can be influenced by heuristic cues like the social authority of a journal or institution. Therefore, we must cultivate active open-mindedness and intellectual humility in our own professional development, consciously seeking out and fairly considering evidence that may challenge our theoretical orientations or treatment preferences. The research also highlights that analytical thinking alone does not guarantee unbiased reasoning; it can be co-opted for motivated reasoning to justify existing beliefs. This necessitates that clinicians engage in regular reflective practice and supervision to scrutinize their clinical decisions, ensuring we are driven by the best available evidence and client needs, rather than cognitive ease or allegiance to familiar models.
When applied to patient care, these insights become a powerful framework for enhancing therapeutic communication and psychoeducation. The finding that individuals vary greatly in their epistemic curiosity, need for closure, and reliance on social authority means that a one-size-fits-all approach to providing information is ineffective. A psychologist working with a vaccine-hesitant client, for example, must first understand whether the client’s stance is driven by a lack of curiosity, a high need for closure, a distrust of scientific institutions, or an over-reliance on alternative authority figures. Interventions can then be tailored accordingly: fostering curiosity and tolerance for uncertainty in one client, while helping another develop skills to critically evaluate source credibility beyond prestigious branding. The strong influence of social authority suggests that presenting information through trusted community figures or relatable personal narratives may sometimes be a more effective conduit for change than data alone, though this must be balanced with efforts to build the patient’s own critical evaluation skills.