Having occasionally emerged from under my rock, I’ve noticed that most of the people and communities I know are concerned about fake news, and are beseeching others not to be fooled. It’s about time that I share my approach to dealing with the timeless phenomenon of fake news.
Over the course of this article, I’ll introduce three and a half major themes and some associated questions that I ask myself as a matter of habit. You can make a checklist of these questions to go through whenever you find yourself believing, disbelieving, or having emotions about any news you see.
Facts: Evidence Versus Inferences
You can start your checklist with these three questions:
1. What does the author believe, or want me to believe?
2. What invisible assumptions are present in this piece?
3. What other evidence or possible inferences are missing from this piece?
The answers to these questions are often easy to find if you look for them. Persuasive authors typically appeal to facts to support their own agenda, whatever it may be. However, there are two different concepts that are referred to as “fact”, and authors frequently blur the line between them, consciously or unconsciously.
One type of “fact” is raw data that anyone can experience. Sights, sounds, and structures are raw facts. The other type of “fact” is an inference or conclusion that someone has drawn from that data. A causal relationship between two events is an inferred fact: it is based on observation of raw facts. Inferred facts are just as important as the raw facts they are derived from, but they are not necessarily as obvious or universally agreed on. They are two different levels of observation.
The difference between a raw fact and an inferred fact is the same as the difference between symptoms and a disease. Symptoms are readily measurable. A disease is what we infer based on the symptoms. While diseases are real, and can and should be diagnosed accurately, it’s still possible for two doctors who see the same symptoms to legitimately disagree on what the disease is. Even though there is a single right answer, asserting that the symptoms make the diagnosis “obvious” does nothing to help us learn to diagnose the disease more accurately.
You see this happen often in politics. The “doctors” (everyone with an opinion) assume that their inferred facts (such as the character of a politician or the efficacy of a policy) are just as obvious as raw facts (a politician’s words and actions, or raw statistical data before and after a policy is implemented), and that therefore the other “doctors” are either incompetent or actively trying to harm the “patient” (society).
One doctor might accuse the other doctor of being either incompetent, or actively malicious, but that accusation is a consequence of the assumption that the first doctor’s diagnosis is more accurate. The accusation doesn’t establish that the assumption is justified. If the second doctor is sincere, they may feel the same way about the first doctor, and neither will discover they are wrong unless they actively look for it. Discovering the most accurate inferences is not easy, and it’s often made more difficult by confirmation bias.
Dealing with Confirmation Bias
Here are the next two questions I ask myself:
4. How do I feel about the idea that this statement may be true?
5. What evidence would lead me to conclude that I am wrong?
The habit of seeking out and interpreting evidence to support what you already believe is called confirmation bias.
Humans tend to be bad at testing their beliefs unless they have been trained to do so. When given the opportunity to gather evidence that could falsify their assumptions (and which could lend credence to their beliefs if it failed to falsify them), people instead tend to seek evidence that is in line with their thinking, and which does not risk proving them wrong. This appears to hold even when there is nothing important at stake.
For example, in a study published in 1960 by Peter Wason, people were given the sequence “2, 4, 6” and told to guess the secret rule it followed. They were allowed to provide other number sequences to test the rule. The participants tended to latch onto an overly specific hypothesis and guess sequences that followed it, such as, “middle number is the average of the other two,” or, “numbers increase by two,” rather than sequences that broke the pattern and could tell them if their rule was wrong. The answer was simply “ordered from smallest to largest.” The differences between the numbers were irrelevant, but you’d never figure that out unless you guessed a sequence that didn’t fit your hypothesis, to see if it still followed the secret rule.
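The 2-4-6 task is easy to simulate. Here is a minimal Python sketch (the function names are mine, not from the study) showing why confirmatory guesses never expose an overly specific hypothesis:

```python
# The experimenter's hidden rule: numbers in strictly increasing order.
def secret_rule(seq):
    return all(a < b for a, b in zip(seq, seq[1:]))

# A participant's overly specific hypothesis: "numbers increase by two."
def my_hypothesis(seq):
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

# Confirmatory tests: sequences chosen to FIT the hypothesis.
confirming = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
# Every one passes the secret rule, so the hypothesis looks "confirmed"...
assert all(secret_rule(s) for s in confirming)

# A disconfirming test: a sequence that BREAKS the hypothesis.
probe = (1, 2, 3)
# ...but the probe still satisfies the secret rule, revealing that
# the hypothesis was too narrow all along.
print(my_hypothesis(probe), secret_rule(probe))  # False True
```

Only the probe that violates your own hypothesis carries any information about whether the hypothesis is too narrow; the confirming sequences, no matter how many you try, can never tell you.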
Confirmation bias is infamous in rationality communities, but the mere knowledge of it does not inoculate a person against it. The habits necessary to counteract confirmation bias often rely on humility, a difficult practice.
Alternatively, you can do what I do and take great pride in seeking out your own errors and admitting them when they are pointed out to you. I push myself towards intellectual discomfort, because I’m not worried about having to change my beliefs or my behavior if I find out I’m wrong. Doing the legwork to correct myself cannot be worse than being wrong and not acknowledging it, and I find it’s always much more useful. When I find myself disagreeing with somebody, or even agreeing too much with them, I actively look for evidence that contradicts the position I find myself leaning towards.
For example, a few months ago I read an article about mercury in the Arctic ice cap potentially being released by global warming. I was immediately suspicious of this particular assertion, because I had read other evidence (that I consider credible) that some climate change activists have overstated the certainty of their inferred facts in order to convince the general populace of the seriousness of the situation, rather than convincing them by explaining the nuances of their discipline. The idea that “global warming will poison us all with mercury” seemed a bit too dramatic, too simplistic, and too convenient for the intended message of urgent action. It also didn’t match my assumptions about how mercury worked.
Because I felt myself disagreeing with the article, I went looking for articles explaining the mechanism for mercury accumulating at the poles. Here’s one I found explaining that trace amounts of mercury exist as compounds in the atmosphere, and that plants absorb it and fix it into the ground, from which it can leach into water, which becomes ice. (Yes, there are apparently sufficient plants in the Arctic.) The mercury often leaves the ground again through chemical reactions, but those reactions take place much more slowly at cold temperatures, which is why mercury accumulates in the Arctic. I find this explanation of the physical mechanism satisfactory for accepting the assertion, or at least not rejecting it. Learning that my initial impression of the article was wrong was a pleasant reward for my anti-confirmation bias habit, and bolstered my self-image as a true skeptic.
(If I had found evidence that reinforced my initial impression of the mercury article as possible propaganda, I would have drawn no conclusions about climate change in general from that. The presence of spurious articles promoting a viewpoint is not hard evidence against that viewpoint, but it does call into question why people feel they have to resort to fraud if there are any good arguments they can use instead.)
If you want to get started developing your own anti-confirmation bias habits, you can go to Wikipedia and look at the “Criticism” sections of the articles about your favorite ideologies and public figures. To help you practice falsifying your hypotheses, I recommend everyone learn to play the game Zendo (or an equivalent), which is more or less pure inductive reasoning and hypothesis testing.
Another thing you’ll need to counteract confirmation bias is a heaping dose of nuanced thought.
Here are two questions to help you think with nuance:
6. Is it possible that this statement could be partially true and partially false?
7. What other possible options are missing from this piece?
Confirmation bias is often powered by cognitive dissonance, the discomfort you feel when your beliefs or value judgments seem to contradict each other. If you believe that a policy is both harmful to some people and necessary for society, it’s tempting to interpret the evidence to convince yourself that it is either harmless or unnecessary. However, doing so can be disastrous. Falsely believing a policy is harmless will lead you to avoid seeking better alternatives, while falsely believing it is unnecessary will lead you to neglect the needs of those you seek to protect.
Therefore, it’s essential to acknowledge that the vast majority of situations have both good and bad aspects. They are nuanced, not simple. To understand such situations, you need nuanced thought. You must admit that not everything an adversary does is wrong, nor is everything you and your friends do right.
Moreover, it is vital that you take responsibility for paying respect to helpful deeds and criticizing harmful ones, no matter who commits them. Otherwise you do not stand for “right”, but rather, “your team.” If you’re worried that owning up to honest mistakes will cause people to desert you, you never had steadfast allies in the first place. You need people who will stand up for constructive goals rather than attaching themselves to strong or popular people.
If you’re concerned that people will not agree with a policy that has some flaws, do not try to convince them that it is one hundred percent good. If you truly believe it is a worthy cause, then work to convince people that the flaws are worth the benefit. Better yet, work with them to help mitigate those flaws. Otherwise you will face justified opposition for lying about the policy’s flaws and utterly disregarding the concerns of those who would be harmed. Nuanced thought will prevent you from fighting to hide your policy’s flaws from the people you are trying to help.
A lack of nuance also manifests in the form of a false dichotomy, where two approaches to a situation are presented as the only possible options.
Imagine two doctors have a very ill patient. One of them suggests doing nothing, while the other suggests bloodletting. One of these options may be strictly better than the other, but neither is actually helpful. A real solution would look different from both of those ideas, but it would be harder to figure out. The doctors can each make themselves look competent to their followers by only comparing themselves to each other and not actually focusing on achieving results. Authors and politicians seeking to persuade people often compare the positive aspects of their position to the negative aspects of a rival position (without acknowledging the reverse) in order to make themselves look better, so they don’t have to do the hard work of presenting a constructive solution based on the virtues of both.
Be extra suspicious of an assumed choice between two options, no matter who is presenting it.
If you want to figure out a constructive alternative to the choices given you, it’s often helpful to reserve judgment and form provisional conclusions.
Here are four final questions you can ask yourself:
8. What does the author expect me to do after reading this piece?
9. What would I do differently if the piece were not true?
10. What can I do constructively as a response that doesn’t require me to trust this piece?
11. What action can I take that gives me more information to falsify my conclusion?
I once had a fascinating conversation with someone who believed in ghosts, i.e. active spirits of humans whose bodies have stopped working. She recounted some experiences she had had that led her to believe these spirits were influencing the physical world. By the end, I concluded that I didn’t believe that her inference based on her experiences was the correct one, but I also didn’t disbelieve that the raw experiences happened. So far, I don’t believe in ghosts, but that’s a provisional conclusion, pending further evidence. I acknowledge that I won’t truly know for sure whether ghosts exist until I embark on a project that relies on the existence or nonexistence of ghosts.
(For the record, my most compelling argument against ghosts isn’t that it would introduce an aspect of reality our scientific tools haven’t detected yet. There have been hundreds of those—that’s how scientific progress works. No, I draw my current conclusion because as far as I know, none of the people who believe they can interact with ghosts has successfully exploited such knowledge to create ground-breaking, world-changing technology. I’ll elaborate on that in a future article.)
What makes me different from most people who don’t believe in ghosts is that my conclusion isn’t supposed to take me to the end of the line. It’s only supposed to last until I learn more or until someone else challenges me on it—though it admittedly will influence how much more data I actively seek out on this topic.
When people read political news, fake or otherwise, there is a typical pattern to what they do as a response. They form (or reinforce) a conclusion, and then they go out and act on their conclusion. Acting usually means complaining about the news to their friends, yelling at other people in real life or on the Internet, and voting a particular way. None of those activities is likely to result in them learning any more about their conclusion. (Well, except if they bother to listen to the people they’re yelling at. However, people being yelled at tend not to provide the best quality information, either.)
Below is a diagram of what is going on in their heads:
This method is a terrible way to ensure you arrive at your destination.
Imagine a ship navigating at sea. As long as long-distance nautical ventures have existed, ships have relied on expert navigational practices to avoid getting lost. In the days before global positioning systems, they would use the stars (and accurate clocks, once those were invented) to determine their location and direction. By simple geometry, a tiny error in the initial direction amounts to a huge error in location over long distances. No self-respecting captain would set off in a direction and simply sail on until they ran into their destination. That, however, is what most people do when they hold onto a stale conclusion.
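The magnification of a small heading error can be sketched with basic trigonometry (the numbers below are illustrative, not real navigation math, which must also account for the curvature of the Earth):

```python
import math

# Cross-track offset produced by a small, uncorrected heading error
# held over a long straight-line run.
def offset_km(distance_km, heading_error_deg):
    return distance_km * math.sin(math.radians(heading_error_deg))

# Being just one degree off over a 1,000 km voyage:
print(round(offset_km(1000, 1.0), 1))  # 17.5 -- kilometers off target
```

An error too small to notice at the dock grows into one that misses the harbor entirely, which is exactly why the heading was rechecked every night.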
To be sure, a ship’s navigator would try to save time by measuring and sailing on their heading as accurately as possible from day one. Fewer course corrections mean less distance to travel. However, in addition to the magnification of small deviations over large distances, the ocean waves and wind are constantly pushing the ship off course. The heading had to be recalculated each night to be sure the ship was still on the right course. Likewise, since real life is also constantly changing, you will need to regularly reevaluate your own course.
Am I advocating that you yo-yo between different positions whenever you find a conflicting data point? Certainly not. There’s no reason to wholly commit yourself to a conclusion when the next batch of data might lead to a different one. On the other hand, it’s often necessary to be decisive based on the information you currently have, even though you know you don’t have the full picture. How can you effectively plan for learning you are wrong, while still taking real action?
Here are the steps you can take:
First, you form a provisional conclusion. This conclusion is a working understanding of reality that will last just long enough for you to take the next step in whatever you’re doing.
Second, you decide on an action that is constructive even if your conclusion is partially wrong. This is often difficult, because the whole point of forming conclusions is that you need an accurate understanding of reality to determine what actions are useful in the first place. However, a ship can go in roughly the right direction even if it isn’t perfect. When it comes to politics, there are many constructive things you can do that don’t require you to trust or support any politicians or parties. For instance, as Daryl Davis can tell you, one of the most effective ways to deal with an adversarial group is to connect with and befriend them rather than trying to ostracize or legislate them out of existence. When in doubt, learn to understand and empower people to overcome obstacles rather than trying to unilaterally destroy structures that may serve some purpose, at least until you’ve learned more about their nuances.
Third, the action you take should also help you collect more data for your next conclusion. It’s important that you can tell whether the action you take actually helps accomplish your goals. You may even be able to figure out a better alternative from what you learn. That’s another point in favor of going out and listening to the concerns of people who disagree with you.
Fourth, after you take the action, the conclusion expires and the cycle repeats. If there isn’t enough new data, you can double-check the original evidence and your process for forming the conclusion in the first place. It also helps to get other people’s perspectives when rederiving the conclusion, to make sure you’re not overlooking something. Decisive by Chip and Dan Heath also deals with setting triggers for reevaluating your course if it’s not working the way you expected.
If you find the previous conclusion no longer applies or was partially wrong to begin with, that’s okay. You score no points for the new conclusion being the same as the old one. The important thing is that the new conclusion is as accurate as you can reasonably make it for now. That’s why every conclusion needs to expire: to prompt you to form a fresh one at each important juncture.
The conclusion should also expire when someone challenges you on it. In order to find out the truth between the two of you, you need to walk through the steps that you took to get to your current conclusion. You’re not an expert if you can’t rederive what you know when seriously questioned. We’ll see more about that when we get to collaborative truth-seeking.
Usually when I see news organizations decrying fake news from their rivals, it comes off to me as a redundant and ad hominem message. “Don’t believe their lies, because they’re your enemies!” I infer an implicit corollary to that message: “Do believe our lies, because we’re your friends!”
The hardest thing about fake news isn’t finding out which path is righteous, but rather forging a worthy path yourself without having to trust the voices trying at all costs to get you to join a side against their adversaries. The propaganda surrounding contentious issues can be a noxious quagmire of arrogance and contempt.
Nevertheless, if you develop the habits of recognizing raw and inferred facts, combating confirmation bias, applying nuanced thought, and forming and testing provisional conclusions and constructive approaches, you can be confident that you won’t be deceived into serving a destructive agenda, no matter how well-intentioned it may be or who else believes it.
For help with collaborative truth-seeking, I recommend literally every person read the book Difficult Conversations by Douglas Stone, Bruce Patton, and Sheila Heen. It is one of very few books to which I give this distinction. This book both describes and shows how to establish understanding and effective communication on subjects which are tied to strong emotions and personal identities.
To get into the mood of nuanced thought and avoiding confirmation bias, I recommend listening to Angels or Demons? by the band I Fight Dragons (from their album DEMOlition).
Finally, if you want to join more people in holding sources and spreaders of fake news accountable, and showing politicians you value honesty more than lavish promises, you may be interested in signing up for the Pro-Truth Pledge, created by Intentional Insights. Full disclosure: I was on the Intentional Insights board of directors for a time and still work with them.
I hope you all find this article useful. Take care, and have fun!