Science is persuasive, and it should be; science is the great reality detector. But what appears to be a scientific claim often isn't. Promoters of pseudoscience (claims disguised as science that have little scientific value) are well aware of the strong value placed on science. Attach the term science, or make a claim sound like science, and the value of a claim, product, or service is instantly enhanced. This is also where the problem lies: attaching the word science to a claim doesn't make it science.
Pseudoscience is a major roadblock to rationality. Pseudoscientific beliefs are more costly than people think. Time and money spent on pseudoscience are time and money that could have been spent on beneficial activities. We live in an interconnected society, so the pseudoscientific beliefs of a few people may influence outcomes for many people. As an example, the false belief that autism is caused by or associated with early vaccination has led to decreased immunization rates, more children being hospitalized, and, in some cases, death (Stanovich et al. 2016). As another example, consider the case of Steve Jobs, who "ignored his doctors after being told of his pancreatic cancer and delayed surgery for nine months while he pursued unproven fruit diets, consulted a psychic, and received bogus hydrotherapy" (Stanovich et al. 2016, 192).
The Nonsense Detection Kit
The impetus for writing the Nonsense Detection Kit came from previous suggestions made by Sagan (1996), Lilienfeld et al. (2012), and Shermer (2001). The Nonsense Detection Kit refers to nonsense in the sense of “scientific nonsense”; nonsense as it is used here is synonymous with pseudoscience. There is no single criterion for distinguishing science from pseudoscience, but it is possible to identify indicators, or warning signs. The more warning signs that appear, the more likely the claim is nonsense.
Nonsense Indicator: Claims Haven’t Been Verified by an Independent Source
Promoters of nonsense often claim special knowledge: specific discoveries that only they know about. These findings are often reflected in phrases such as “revolutionary breakthrough,” “what scientists don’t want you to know,” “what only a limited few have discovered,” and so on. These findings are not subject to criticism or replication. When conducting studies, it is imperative that researchers operationalize their variables (provide an operational definition: the precise, observable operations used to manipulate or measure a variable) so the specifics can be criticized and the work replicated. Non-scientists are not concerned with whether others can replicate their findings. If a finding cannot be replicated, that is a big problem, and it is unreasonable to consider a single finding as evidence. It is also problematic when only those who made the original finding have replicated it successfully.
Nonsense Indicator: Claimant Has Only Searched for Confirmatory Evidence
Confirmation bias is a cognitive error (cognitive bias) defined as a tendency to seek out confirmatory evidence while rejecting or ignoring disconfirming evidence (Gilovich 1991). Most people have a tendency to look for supporting evidence while ignoring (or not looking very hard for) disconfirming evidence. An important characteristic of the scientific thinker is the tendency to search for disconfirming evidence. Those promoting nonsense may not even be aware of disconfirming evidence, as they have no interest in that type of information. Science is structured to minimize confirmation bias. The late Richard Feynman (Nobel laureate in physics) suggested that science is a set of processes that detect self-deception (Feynman 1999). Science helps ensure we don’t fool ourselves.
Nonsense Indicator: Personal Beliefs and Biases Drive the Conclusions
Nonsense claims are often heavily influenced by personal biases and beliefs. Scientists recognize their biases and personal beliefs and use scientific processes to minimize their effects. Scientists, much like non-scientists, often find themselves looking for confirmatory evidence. “At some point usually during the peer-review system (either informally, when one finds colleagues to read a manuscript before publication submission, or formally when the manuscript is read and critiqued by colleagues, or publicly after publication), such biases and beliefs are rooted out, or the paper or book is rejected for publication,” noted Michael Shermer (2001, 22). Promoters of nonsense fail to recognize their biases (consciously or unconsciously) and thus make little effort to prevent them from influencing their claims.
Nonsense Indicator: Excessive Reliance on Authorities
Much of what we learn comes from authority figures (teachers, authors, parents, journalists, and others), and such authorities are sometimes wrong. The merit of a claim needs to be considered independently of who is making it. Authority may provide a hint to what’s right, but authorities are fallible. Authorities often assert different beliefs. Which authority is right? Even top-level authorities are susceptible to a range of conscious and unconscious biases; they make mistakes and often have vested interests, just like non-experts. Carl Sagan wrote that “Authorities must prove their contentions like everybody else. This independence of science, its occasional unwillingness to accept conventional wisdom, makes it dangerous to doctrines less self-critical, or with pretensions to certitude” (Sagan 1996, 28). Feynman (1999, 104) adds, “Authority may be a hint as to what the truth is, but is not the source of information. As long as it’s possible, we should disregard authority whenever the observations disagree with it.”
The most important authority in science is evidence.
Nonsense Indicator: Overreliance on Anecdotes
Anecdotes (short personal stories of an event) are rarely enough to conclude that a claim is true. Anecdotes are difficult to verify, are often unrepresentative, have been generated for almost every phenomenon imaginable, and are often constructed to sound meaningful when in fact they are not. Anecdotes may be useful as hypothesis-forming statements and might eventually be considered evidence, but they shouldn't be overused in scientific discussions. I inform students on the first day of class that scientific discussion requires reference to scientific data. Personal experiences and opinions generally carry little to no weight in discussions of science, especially when they can't be reasonably inferred from the scientific literature.
Nonsense Indicator: Use of Excessive 'Science-Sounding' Words or Concepts
In an attempt to accurately distinguish among similar concepts, scientists make use of a specialized vernacular. Sometimes this practice is “abused, especially when the terminology provides unsupported techniques with a cachet of unearned scientific respectability” (Lilienfeld et al. 2012, 27). Promoters of pseudoscience often use technical words so that they sound smart or highly knowledgeable, even when the words are used incorrectly. In contrast to scientific language, this language frequently lacks meaning, precision, or both (Lilienfeld and Landfield 2008). Using scientific-sounding words is a powerful rhetorical device. Scientific-sounding, however, does not necessarily mean scientific.
In conclusion, I offer a big thanks to Sagan, Shermer, and Lilienfeld for their efforts to battle pseudoscience.
References
- Feynman, R. 1999. The Pleasure of Finding Things Out: The Best Short Works of Richard P. Feynman. Cambridge, MA: Basic Books.
- Gilovich, T. 1991. How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life. New York, NY: The Free Press.
- Lilienfeld, S., R. Ammirati, and M. David. 2012. Distinguishing science from pseudoscience in school psychology: Science and scientific thinking as safeguards against human error. Journal of School Psychology 50: 7–36.
- Lilienfeld, S., and K. Landfield. 2008. Science and pseudoscience in law enforcement: A user friendly primer. Criminal Justice and Behavior 35: 1215–1230.
- Sagan, C. 1996. The Demon-Haunted World: Science as a Candle in the Dark. New York, NY: Ballantine Books.
- Shermer, M. 1997. Why People Believe Weird Things. New York, NY: Owl Books.
- ———. 2001. The Borderlands of Science: Where Sense Meets Nonsense. Oxford University Press.
- Stanovich, K., R. West, and M. Toplak. 2016. The Rationality Quotient: Toward A Test of Rational Thinking. Cambridge, MA: The MIT Press.