“It is easy to bunk, but hard to debunk. Anyone who starts bunking will soon fall in with those who deal in claptrap, hokum and plain lies. Between them there may be jealousies, but at bottom there is a fellowship, and the debunker faces a league formed against him. The debunker, on the other hand, finds himself alone, for exposing sham is a lonely business because it has no profit motive.”
E. C. Riegel, Barnum and Bunk, 1928
On the last day of The Amazing Meeting #13 in 2015, in a tunnel-like corridor somewhere under the Tropicana Hotel in Las Vegas, I jogged after the small figure of James “The Amazing” Randi because I wanted to ask him a question.
“Mr. Randi,” I said, after first congratulating him on another successful conference, and thanking him for all the work he had done for science and skepticism, “why don’t you like to use the word debunker?”
“Well, you see,” he said, immediately warming to the subject, “if you go into a situation calling yourself a debunker then it is as if you have prejudged the topic. It’s not neutral or scientific, and it can turn people against you, so I prefer to call myself a skeptic or an investigator.”
I nodded, as that made sense. We chatted a little more and shared the elevator up to our respective floors (“closer to heaven” he joked as he punched the button for the penthouse). I retired to my room, sat on the bed, and pulled out my business cards, wondering if I should get new ones. They read: “Mick West — Debunker”
Words, especially those used as labels, are tricky things. If forced to give ourselves a label, we’d want one that neatly encapsulates what we do, something that you can tell the taxi driver when they ask what you’re in town for. Many of the people reading this will say “I’m a skeptic.” But does that actually describe what you do? “Skeptic,” after all, is a noun, not a verb. It describes what you are but not what you do. What does it mean “to skeptic?”
The Wikipedia article on scientific skepticism offers a bewildering array of definitions, starting with its own “a practical, epistemological position in which one questions the veracity of claims lacking empirical evidence.” Translated into English that’s “a handy way of fact-based thinking where you check claims that don’t have good evidence.”
Like most of the quotes that follow, this suffers from a certain obviousness. “Think good,” it seems to be saying. Everyone feels that they think good. Everyone thinks they use critical thinking. Everyone knows that if a claim has no evidence then you shouldn’t believe it. Of course, not everyone actually “thinks good,” critical thinking is scattered and variable, and unfounded claims wash over the world in tidal waves of bunk, pseudoscience, fake news, and conspiracy theories. When you label yourself a skeptic what you are really telling your taxi driver is that you can think better than them. There’s an unfortunate touch of elitism in labeling yourself a skeptic.
“Skeptic” as a label also suffers a little from being rather abstract and academic sounding. What’s the actual practical process of skepticism? What does it actually do? Consistent answers are surprisingly difficult to find. But in most descriptions the practical part seems to boil down to evaluating claims using critical thinking. This is great, of course, and something that I do all the time. I look at a claim of evidence and see what it’s based on. I check the logic for non sequiturs, and I look for common misunderstandings. But I don’t really feel that “evaluating claims” describes all that I do.
My favorite definition from the Wikipedia article on scientific skepticism comes from Daniel Loxton, from his excellent 2013 essay “Why Is There a Skeptical Movement?” (Loxton, 2013).
“Scientific skepticism [is] the practice or project of studying paranormal and pseudoscientific claims through the lens of science and critical scholarship, and then sharing the results with the public.”
I like this because it’s clear and practical—at least compared to the other definitions, like “provisionally proportioning acceptance of any claim to valid logic.” It also ends with something all the other definitions are missing, a practical next step and measurable outcome, namely: “sharing the results with the public.”
I’d argue that what Loxton is describing here is sufficiently different from the rest that it’s not actually scientific skepticism. It’s debunking.
Note the topics of study listed: “paranormal and pseudoscientific claims.” There’s the prejudging that Randi talked about. But it’s not a bad, unscientific prejudging. We know from a large body of work that paranormal claims of evidence do not hold up to scrutiny. We know that claims science has labeled pseudoscience, such as homeopathy, astrology, healing touch, the idea that vaccines cause autism, and the supposed dangers of GMO food, are likewise based on flimsy, wrong, or non-existent evidence.
The key here is “sharing the results.” While we know that the claims are probably false, and we know we can probably figure out what’s wrong with them, it’s of little practical use if you don’t communicate those results. Most definitions of skepticism lack any focus on this communication aspect (although it’s obviously something many skeptics do all the time). Instead the focus is on using critical thinking to evaluate claims—as if the only use of skepticism is for the isolated individual to use it to figure out if things they read are right or wrong. That’s a useful tool for the individual, and something we should be teaching people, but it’s not really what active skeptics do. What we do is better described as debunking.
When I say: “I’m a debunker,” it’s clear that what I’m saying I’m doing is “debunking.” Debunk is a verb, widely understood to mean “to expose the falseness of a claim.” To expose that falseness you first must find it. That means examining the claim of evidence with critical thinking, which you might think loops back to skepticism, a neutral and rational examination of claims.
But let us not fool ourselves here. While we are, of course, open to having our minds changed by new evidence, the fact of the matter is that we generally have a well-founded expectation of what we are going to find when we examine the supposed new evidence. The focus here is not so much on checking to see if a claim of evidence is true or false; we go into it knowing that the most likely outcome is that it is false. The focus is on figuring out (honestly, with the backing of science, evidence, and logic) where the claim went wrong, where the falsehood is, and then sharing those results with the public.
This applies across the spectrum of claims. When I look at a photo of some mountains that someone tells me proves the Earth is flat, I know with almost total certainty that what I am looking for is an error in their math. When Kenny Biddle, Ben Radford, or Joe Nickell go to scientifically investigate a haunted house, they know that almost certainly it’s not going to be ghosts, but some more worldly phenomenon that is going bump in the night. When Randi tested psychics for the Million Dollar Challenge it was done with scientific rigor, but the expectation (so certain he was willing to risk a million dollars) was that psychic powers would not be found. When SkepDoc Harriet Hall looks at a paper that claims homeopathic oscillococcinum cures the flu, she studies it carefully but does not expect to find it is correct. When scientists try to replicate an experiment that shows information travelling faster than light, or reactionless engines, they are perhaps hopeful that it might be real, but they know they are probably just looking for something like a faulty cable.
In all these cases what happens is the investigator—the skeptic, the debunker—will first examine the claim of evidence to find where the mistake is. Sometimes this will be finding the real cause of a phenomenon; sometimes it will just be finding a mistake in the data, or the logic, or the math. Then, when they have found the mistake, they will write about it, tell someone about it, or make a video about it. They will share the results with the public. They will debunk the claim.
This column is titled “practical debunking.” It’s about debunking as described above—a two-stage process of investigation followed by communication. Both stages offer significant challenges, but both can be helped by avoiding debates and arguments that often boil down to the subjective interpretation of words, and instead focusing on physical experiments and demonstrations wherever possible and on clearly communicated math and science always.
Take homeopathy. It’s very difficult to communicate to people the amount of dilution involved in something such as oscillococcinum. The numbers are so large they are meaningless—phrases such as “one molecule dissolved in an ocean the size of the universe” don’t really get through to most people, as they don’t really know what a molecule is or how big the universe is. Instead a practical demonstration would be to fill a bottle with water, add a drop of chicken blood (or fermented wild duck heart if you have it), then shake the bottle and empty it out. Fill it with water again (no more blood), shake, and dump it out. Repeat this rinsing out of the bottle 200 times (you might want to do a few fewer; they will get the idea). The last time you fill it up, take a drop of the water and then put it on a sugar cube. That’s oscillococcinum. It’s sugar, lactose (you can add some milk to the sugar cube if you like), and a sprinkle of nothing more than magical thinking. You can follow this up with an even more practical demonstration of eating a tube or two of the stuff.
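For readers who want to check the arithmetic behind the demonstration, here is a minimal sketch. It assumes you start with a full mole of extract (Avogadro’s number of molecules) and treats each of the 200 “C” steps as an ideal 1:100 dilution; the expected count is far too small for an ordinary floating-point number, so it works in log base 10:

```python
import math

AVOGADRO = 6.022e23  # molecules in one mole of starting extract (generous assumption)
DILUTION = 100       # each "C" step dilutes 1 part in 100
STEPS = 200          # oscillococcinum is a 200C preparation

# Expected surviving molecules = AVOGADRO * (1/DILUTION)**STEPS.
# (0.01)**200 underflows a float to zero, so compute the exponent directly.
log10_molecules = math.log10(AVOGADRO) - STEPS * math.log10(DILUTION)

print(f"log10(expected molecules) = {log10_molecules:.1f}")
```

The result is about −376: roughly one chance in 10^376 that even a single molecule of the original duck survives, which is why the empty-bottle demonstration is not an exaggeration.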
Ghost hunting is a little different, as every situation varies. The key practical method here is replicating the phenomenon, not just explaining it. When a video of things moving in a hotel room went viral, people were quick to explain it as the guy pulling things around with fishing line, but Kenny Biddle went the extra mile and created a video replicating the effects by pulling things around with fishing line (Biddle, 2017). Once you recreate a piece of evidence with non-supernatural methods, you debunk it as evidence for the supernatural.
Practical debunking takes a bit more time than just thinking and typing about claims. But it’s more effective, and it’s more fun to do. When I was debunking claims about controlled demolition of the World Trade Center, I made my own thermite incendiary devices and blew some things up. When faced with claims that the collapses of the twin towers were impossible because a small part (the top) could not destroy a large part (the bottom), I built an eight-foot stable structure that did exactly that. When they claimed that red/grey layered chips were evidence of nanothermite, I bashed an old wheelbarrow with a hammer until I found some similar chips, collected them with a magnet, and then heated them until they burst into flames. When the chemtrail folk claimed that normal contrails could not persist, I didn’t just show them the Wikipedia page, I collected physical copies of a dozen science books from the 2000s back to the 1940s and I made a video showing the paragraph in each book that debunked that claim.
Practical debunking is not simply doing physical demonstrations; it’s also about nailing things down in irrefutable manners that remove the need for more analysis. Many times, when presented with a photo or a video of a “suspicious” plane (or, sometimes, a UFO), the Skydentify forum on Metabunk.org has managed to track down the ADS-B and radar tracks of the exact plane, shown it was just on a normal flight, and then demonstrated with a 3D reconstruction what it looked like from the perspective of the camera.
There’s a famous photo in 9/11 “Truth” culture showing a column (at the World Trade Center “Ground Zero” site) that has been cut at an angle. For well over a decade, arguments have gone back and forth, with the Truthers saying it’s evidence of controlled demolition, and the skeptics pointing out quite reasonably that it’s not evidence because it looks like it was cut during cleanup. After seeing this photo come up almost daily for several years, I decided to address it conclusively. With the help of others, I tracked down the exact location of the column and not only proved that it was buried under a huge pile of debris immediately after the collapse but also found photos of the column uncut six weeks after the collapse, proving incontrovertibly that it was not evidence of controlled demolition (West, 2018).
So, I’m not going to get new business cards. I’m a skeptic and I’m a debunker; I do practical debunking and it works great. It’s also fun and interesting. In future columns, I’m going to share some of what I’ve done and what I’m working on. I want to share my experiences, my successes, and my failures. Truth is important, and increasing the amount of truth in the world and decreasing the amount of bunk can only be a good thing.
I want to share specifically with the skeptical community because I want to encourage people to do more with their skepticism than simply examine claims critically. I do not mean to belittle skeptics in any way. Skepticism, critical thinking, evaluating sources, spotting fallacies, and examining evidence are all wonderful qualities to practice in your own thoughts and they are things we should teach our children. But I also want to encourage you to take that skepticism and add communication—the creation of useful resources that expose the falsehood in claims by effectively conveying the results of your skeptical inquiry. I want to encourage you to be a debunker.
Bibliography
Biddle, K. (2017). Haunted Hotel Room Recreation. Retrieved from https://www.youtube.com/watch?v=UJx4S8ciDto.
Loxton, D. (2013). Why Is There a Skeptical Movement? Retrieved from https://www.skeptic.com/downloads/Why-Is-There-a-Skeptical-Movement.pdf.
West, M. (2018). Debunked: The WTC 9/11 Angle Cut Column [Not Thermite, Cut Later]. Retrieved from https://www.metabunk.org/debunked-the-wtc-9-11-angle-cut-column-not-thermite-cut-later.t9469/.