Overdue Fake News Article

“At last” implies the newspaper or its readers had previously expressed expectations that Martians would visit Earth. “The Martians” implies anyone knew they existed in the first place. That’s fake news even if Martians are landing.

Having occasionally emerged from under my rock, I’ve noticed that most of the people and communities I know are concerned about fake news, and are beseeching others not to be fooled. It’s about time that I share my approach to dealing with the timeless phenomenon of fake news.

Over the course of this article, I’ll introduce three and a half major themes and some associated questions that I ask myself as a matter of habit. You can make a checklist of these questions to go through whenever you find yourself believing, disbelieving, or having emotions about any news you see.

Facts: Evidence Versus Inferences

You can start your checklist with these three questions:

1. What does the author believe, or want me to believe?

2. What invisible assumptions are present in this piece?

3. What other evidence or possible inferences are missing from this piece?

The answers to these questions are often easy to find if you look for them. Persuasive authors typically appeal to facts to support their own agenda, whatever it may be. However, there are two different concepts that are referred to as “fact”, and authors frequently blur the line between them, consciously or unconsciously.

One type of “fact” is raw data that anyone can experience. Sights, sounds, and structures are raw facts. The other type of “fact” is an inference or conclusion that someone has drawn from that data. A causal relationship between two events is an inferred fact: it is based on observation of raw facts. Inferred facts are just as important as the raw facts they are derived from, but they are not necessarily as obvious or universally agreed on. They are two different levels of observation.

The difference between a raw fact and an inferred fact is the same as the difference between symptoms and a disease. Symptoms are readily measurable. A disease is what we infer based on the symptoms. While diseases are real, and can and should be diagnosed accurately, it’s still possible for two doctors who see the same symptoms to legitimately disagree on what the disease is. Even though there is a single right answer, insisting that the symptoms make it “obvious” doesn’t help anyone learn to diagnose the disease more accurately.

Pictured: blood pressure reading. Not pictured: invisible factors that may be affecting the patient’s blood pressure.

You see this happen often in politics. The “doctors” (everyone with an opinion) assume that their inferred facts (such as the character of a politician or the efficacy of a policy) are just as obvious as raw facts (a politician’s words and actions, or raw statistical data before and after a policy is implemented), and that therefore the other “doctors” are either incompetent or actively trying to harm the “patient” (society).

One doctor might accuse the other doctor of being either incompetent, or actively malicious, but that accusation is a consequence of the assumption that the first doctor’s diagnosis is more accurate. The accusation doesn’t establish that the assumption is justified. If the second doctor is sincere, they may feel the same way about the first doctor, and neither will discover they are wrong unless they actively look for it. Discovering the most accurate inferences is not easy, and it’s often made more difficult by confirmation bias.

Dealing with Confirmation Bias

Here are the next two questions I ask myself:

4. How do I feel about the idea that this statement may be true?

5. What evidence would lead me to conclude that I am wrong?

The habit of seeking out and interpreting evidence to support what you already believe is called confirmation bias.

Humans tend to be bad at testing their beliefs unless trained to do so. When given the opportunity to gather evidence that could falsify their assumptions (and which could lend credence to their beliefs if it failed to falsify them), people instead tend to seek evidence that is in line with their thinking, and which does not risk proving them wrong. This appears to hold even when there is nothing important at stake.

For example, in a study published in 1960 by Peter Wason, people were given the sequence “2, 4, 6” and told to guess the secret rule it followed. They were allowed to provide other number sequences to test the rule. The participants tended to latch onto an overly specific hypothesis and guess sequences that followed it, such as, “middle number is the average of the other two,” or, “numbers increase by two,” rather than sequences that broke the pattern and could tell them if their rule was wrong. The answer was simply “ordered from smallest to largest.” The differences between the numbers were irrelevant, but you’d never figure that out unless you guessed a sequence that didn’t fit your hypothesis, to see if it still followed the secret rule.
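The dynamic of the 2-4-6 task is easy to sketch in a few lines of Python. The secret rule below is the one from the study; the particular guesses are my own illustrations of confirming versus falsifying tests:

```python
# The hidden rule from Wason's 1960 "2, 4, 6" task: any strictly
# increasing sequence satisfies it.
def secret_rule(seq):
    return all(a < b for a, b in zip(seq, seq[1:]))

# Confirmatory guesses: all fit the narrow "numbers increase by two"
# hypothesis, and all pass, so they can never falsify that hypothesis.
confirming = [(8, 10, 12), (1, 3, 5), (20, 22, 24)]
assert all(secret_rule(s) for s in confirming)

# A falsifying guess: breaks "increase by two" but still passes,
# revealing that the differences between the numbers are irrelevant.
assert secret_rule((1, 2, 100))

# A decreasing sequence fails, confirming that order is what matters.
assert not secret_rule((6, 4, 2))
```

The point is visible in the asserts: only the guesses that break your working hypothesis can tell you anything new about the secret rule.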

Confirmation bias is infamous in rationality communities, but the mere knowledge of it does not inoculate a person against it. The habits necessary to counteract confirmation bias often rely on humility, a difficult practice.

Alternatively, you can do what I do and take great pride in seeking out your own errors and admitting them when they are pointed out to you. I push myself towards intellectual discomfort, because I’m not worried about having to change my beliefs or my behavior if I find out I’m wrong. Doing the legwork to correct myself cannot be worse than being wrong and not acknowledging it, and I find it’s always much more useful. When I find myself disagreeing with somebody, or even agreeing too much with them, I actively look for evidence that contradicts the position I find myself leaning towards.

For example, a few months ago I read an article about mercury in the Arctic ice cap potentially being released by global warming. I was immediately suspicious of this particular assertion, because I had read other evidence (that I consider credible) that some climate change activists have overstated the certainty of their inferred facts in order to convince the general populace of the seriousness of the situation, rather than convincing them by explaining the nuances of their discipline. The idea that “global warming will poison us all with mercury” seemed a bit too dramatic, too simplistic, and too convenient for the intended message of urgent action. It also didn’t match my assumptions about how mercury worked.

This picture of Mercury with a twirlable turn-of-the-century handlebar mustache also challenged my assumption that Mercury wouldn’t tie me to railroad tracks while cackling.

Because I felt myself disagreeing with the article, I went looking for articles explaining the mechanism for mercury accumulating in the poles. Here’s one I found explaining that trace amounts of mercury exist as compounds in the atmosphere, and that plants absorb it and fix it into the ground, which can then leach into water, which becomes ice. (Yes, there are apparently sufficient plants in the Arctic.) The mercury often leaves the ground again through chemical reactions, but those reactions take place much more slowly at cold temperatures, which is why mercury accumulates in the Arctic. I find this explanation of the physical mechanism satisfactory for accepting the assertion, or at least not rejecting it. Learning that my initial impression of the article was wrong was a pleasant reward for my anti-confirmation-bias habit, and bolstered my self-image as a true skeptic.

(If I had found evidence that reinforced my initial impression of the mercury article as possible propaganda, I would have drawn no conclusions about climate change in general from that. The presence of spurious articles promoting a viewpoint is not hard evidence against that viewpoint, but it does call into question why people feel they have to resort to fraud if there are any good arguments they can use instead.)

If you want to get started developing your own anti-confirmation-bias habits, you can go to Wikipedia and look at the “Criticism” sections of the articles about your favorite ideologies and public figures. To help you practice falsifying your hypotheses, I recommend everyone learn to play the game Zendo (or an equivalent), which is more or less pure inductive reasoning and hypothesis testing.

Another thing you’ll need to counteract confirmation bias is a heaping dose of nuanced thought.

Nuanced Thought


Here are two questions to help you think with nuance:

6. Is it possible that this statement could be partially true and partially false?

7. What other possible options are missing from this piece?

Confirmation bias is often powered by cognitive dissonance, the discomfort you feel when your beliefs or value judgments seem to contradict each other. If you believe that a policy is both harmful to some people and necessary for society, it’s tempting to interpret the evidence to convince yourself that it is either harmless or unnecessary. However, doing so can be disastrous. Falsely believing a policy is harmless will lead you to avoid seeking better alternatives, while falsely believing it is unnecessary will lead you to neglect the needs of those you seek to protect.

Therefore, it’s essential to acknowledge that the vast majority of situations have both good and bad aspects. They are nuanced, not simple. To understand such situations, you need nuanced thought. You must admit that not everything an adversary does is wrong, nor is everything you and your friends do right.

Moreover, it is vital that you take responsibility for paying respect to helpful deeds and criticizing harmful ones, no matter who commits them. Otherwise you do not stand for “right”, but rather, “your team.” If you’re worried that owning up to honest mistakes will cause people to desert you, you never had steadfast allies in the first place. You need people who will stand up for constructive goals rather than attaching themselves to strong or popular people.

If you’re concerned that people will not agree with a policy that has some flaws, do not try to convince them that it is one hundred percent good. If you truly believe it is a worthy cause, then work to convince people that the flaws are worth the benefit. Better yet, work with them to help mitigate those flaws. Otherwise you will face justified opposition for lying about the policy’s flaws and utterly disregarding the concerns of those who would be harmed. Nuanced thought will prevent you from fighting to hide your policy’s flaws from the people you are trying to help.

A lack of nuance also manifests in the form of a false dichotomy, where two approaches to a situation are presented as the only possible options.

As Chip and Dan Heath point out in Decisive, if your options are either “do this thing” or “don’t” then you’re not comparing “this thing” to any alternatives. You’re really considering only one option.

Imagine two doctors have a very ill patient. One of them suggests doing nothing, while the other suggests bloodletting. One of these options may be strictly better than the other, but neither is actually helpful. A real solution would look different from both of those ideas, but it would be harder to figure out. The doctors can each make themselves look competent to their followers by only comparing themselves to each other and not actually focusing on achieving results. Authors and politicians seeking to persuade people often compare the positive aspects of their position to the negative aspects of a rival position (without acknowledging the reverse) in order to make themselves look better, so they don’t have to do the hard work of presenting a constructive solution based on the virtues of both.

Be extra suspicious of an assumed choice between two options, no matter who is presenting it.

If you want to figure out a constructive alternative to the choices given you, it’s often helpful to reserve judgment and form provisional conclusions.

Provisional Conclusions

Here are four final questions you can ask yourself:

8. What does the author expect me to do after reading this piece?

9. What would I do differently if the piece were not true?

10. What can I do constructively as a response that doesn’t require me to trust this piece?

11. What action can I take that gives me more information to falsify my conclusion?

I once had a fascinating conversation with someone who believed in ghosts, i.e. active spirits of humans whose bodies have stopped working. She recounted some experiences she had had that led her to believe these spirits were influencing the physical world. By the end, I concluded that I didn’t believe that her inference based on her experiences was the correct one, but I also didn’t disbelieve that the raw experiences happened. So far, I don’t believe in ghosts, but that’s a provisional conclusion, pending further evidence. I acknowledge that I won’t truly know for sure whether ghosts exist until I embark on a project that relies on the existence or nonexistence of ghosts.

(For the record, my most compelling argument against ghosts isn’t that it would introduce an aspect of reality our scientific tools haven’t detected yet. There have been hundreds of those—that’s how scientific progress works. No, I draw my current conclusion because as far as I know, none of the people who believe they can interact with ghosts has successfully exploited such knowledge to create ground-breaking, world-changing technology. I’ll elaborate on that in a future article.)

What makes me different from most people who don’t believe in ghosts is that my conclusion isn’t supposed to take me to the end of the line. It’s only supposed to last until I learn more or until someone else challenges me on it—though it admittedly will influence how much more data I actively seek out on this topic.

When people read political news, fake or otherwise, there is a typical pattern to what they do as a response. They form (or reinforce) a conclusion, and then they go out and act on their conclusion. Acting usually means complaining about the news to their friends, yelling at other people in real life or on the Internet, and voting a particular way. None of those activities is likely to result in them learning any more about their conclusion. (Well, except if they bother to listen to the people they’re yelling at. However, people being yelled at tend not to provide the best quality information, either.)

Below is a diagram of what is going on in their heads:

Overdue Fake News Article diagram 1
Figure 1: Conclusion originates from a formative influence (e.g. family, a role model, or some inspirational event or work of fiction). News and updated data feed into the conclusion, but the conclusion itself continues in the same direction regardless.

This method is a terrible way to ensure you arrive at your destination.

Imagine a ship navigating at sea. For as long as long-distance nautical ventures have existed, ships have relied on expert navigational practices to avoid getting lost. In the days before global positioning systems, they would use the stars (and accurate clocks, once those were invented) to determine their location and direction. By simple geometry, a tiny error in the initial direction amounts to a huge error in location over a long distance. No self-respecting captain would set off in a direction and simply sail on until they ran into their destination. That, however, is what most people do when they hold onto a stale conclusion.
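The geometry is easy to check. As a rough sketch (assuming a flat ocean and a straight-line course, which is good enough to make the point), even a one-degree heading error leaves a ship far off target by the end of a long voyage:

```python
import math

distance_km = 5000          # length of the voyage
heading_error_deg = 1.0     # a tiny initial error in direction

# The off-course distance is roughly distance * sin(error).
off_course_km = distance_km * math.sin(math.radians(heading_error_deg))
print(round(off_course_km))  # roughly 87 km off target
```

A one-degree mistake on day one costs nearly ninety kilometers by journey’s end, which is why navigators recheck their heading every night rather than trusting the first measurement.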

Jenkins, you idiot! Lumbricus the Worm is supposed to be off our port stern! You’ve let Saturn’s tides lift us into the mountains!

To be sure, a ship’s navigator would try to save time by measuring and sailing on their heading as accurately as possible from day one. Fewer course corrections means less distance to travel. However, in addition to the magnification of small deviations over large distances, the ocean waves and wind are constantly changing the course of the ship. The heading had to be recalculated each night to be sure the ship was still on course. Likewise, since real life is also constantly changing, you will need to regularly reevaluate your own course.

Overdue Fake News Article diagram 2
Figure 2: Note that each conclusion lasts as far as the next set of data, at which point it disappears and a new conclusion forms with a different direction.

Am I advocating that you yo-yo between different positions whenever you find a conflicting data point? Certainly not. There’s no reason to wholly commit yourself to a conclusion when the next batch of data might lead to a different one. On the other hand, it’s often necessary to be decisive based on the information you currently have, even though you know you don’t have the full picture. How can you effectively plan for learning you are wrong, while still taking real action?

Here are the steps you can take:

First, you form a provisional conclusion. This conclusion is a working understanding of reality that will last just long enough for you to take the next step in whatever you’re doing.

Second, you decide on an action that is constructive even if your conclusion is partially wrong. This is often difficult, because the whole point of forming conclusions is that you need an accurate understanding of reality to determine what actions are useful in the first place. However, a ship can go in roughly the right direction even if it isn’t perfect. When it comes to politics, there are many constructive things you can do that don’t require you to trust or support any politicians or parties. For instance, as Daryl Davis can tell you, one of the most effective ways to deal with an adversarial group is to connect with and befriend them rather than trying to ostracize or legislate them out of existence. When in doubt, learn to understand and empower people to overcome obstacles rather than trying to unilaterally destroy structures that may serve some purpose, at least until you’ve learned more about their nuances.

Third, the action you take should also help you collect more data for your next conclusion. It’s important that you can tell whether the action you take actually helps accomplish your goals. You may even be able to figure out a better alternative from what you learn. That’s another point in favor of going out and listening to the concerns of people who disagree with you.

Fourth, after you take the action the conclusion expires and the cycle repeats again. If there isn’t enough new data, you can double-check the original evidence and your process for forming the conclusion in the first place. It also helps to get other people’s perspectives for the process of rederiving the conclusion, to make sure you’re not overlooking something. Decisive by Chip and Dan Heath also deals with setting triggers for reevaluating your course if it’s not working the way you expected.
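The four steps above amount to an update loop. Here is a toy Python sketch of the shape of the cycle, with a coin-bias estimate standing in for a real-world conclusion; the coin, the numbers, and the scenario are purely illustrative assumptions of mine, since the real cycle runs on judgment rather than computation:

```python
import random

random.seed(0)

# A toy version of the cycle: estimating a coin's bias. The coin stands
# in for any question where acting also gathers new data.
true_bias = 0.7           # the truth, hidden from the decision-maker
heads, total = 0, 0
estimate = 0.5            # initial provisional conclusion: "probably fair"

for batch in range(4):
    # Steps 2-3: take an action that yields new data either way
    # (here, simply observing 50 more flips).
    flips = [random.random() < true_bias for _ in range(50)]
    heads += sum(flips)
    total += len(flips)
    # Step 4: the old conclusion expires; form a fresh one (step 1)
    # from all the evidence gathered so far.
    estimate = heads / total

# After 200 flips the provisional conclusion has drifted away from the
# initial 0.5 toward the truth, without ever being declared final.
```

Note that the estimate is recomputed from scratch each cycle rather than defended; the old conclusion gets no points for matching the new one.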

If you find the previous conclusion no longer applies or was partially wrong to begin with, that’s okay. You score no points for the new conclusion being the same as the old one. The important thing is that the new conclusion is as accurate as you can reasonably make it for now. That’s why every conclusion needs to expire: to prompt you to form a fresh one at each important juncture.

The conclusion should also expire when someone challenges you on it. In order to find out the truth between the two of you, you need to walk through the steps that you took to get to your current conclusion. You’re not an expert if you can’t rederive what you know when seriously questioned. We’ll see more about that when we get to collaborative truth-seeking.


Usually when I see news organizations decrying fake news from their rivals, it comes off to me as a redundant and ad hominem message. “Don’t believe their lies, because they’re your enemies!” I infer an implicit corollary to that message: “Do believe our lies, because we’re your friends!”

The hardest thing about fake news isn’t finding out which path is righteous, but rather forging a worthy path yourself without having to trust the voices trying at all costs to get you to join a side against their adversaries. The propaganda surrounding contentious issues can be a noxious quagmire of arrogance and contempt.

And the majority of those participating in the quagmire have feet of clay.

Nevertheless, if you develop the habits of recognizing raw and inferred facts, combating confirmation bias, applying nuanced thought, and forming and testing provisional conclusions and constructive approaches, you can be confident that you won’t be deceived into serving a destructive agenda, no matter how well-intentioned it may be or who else believes it.

Further Reading

For help with collaborative truth-seeking, I recommend literally every person read the book Difficult Conversations by Douglas Stone, Bruce Patton, and Sheila Heen. It is one of very few books to which I give this distinction. This book both describes and shows how to establish understanding and effective communication on subjects which are tied to strong emotions and personal identities.

To get into the mood of nuanced thought and avoiding confirmation bias, I recommend listening to Angels or Demons? by the band I Fight Dragons (from their album DEMOlition).

Finally, if you want to join more people in holding sources and spreaders of fake news accountable, and showing politicians you value honesty more than lavish promises, you may be interested in signing up for the Pro-Truth Pledge, created by Intentional Insights. Full disclosure: I was on the Intentional Insights board of directors for a time and still work with them.


I hope you all find this article useful. Take care, and have fun!


An Introduction to Developing Powerful Skills

Hello. If you’re reading this article, you are probably interested in acquiring abilities which will improve your life. You may even be interested in making the world a better place in a major way. I’m here to help with that.

First, let’s take a look at the status quo. Do you ever find yourself despairing that you can’t do something that you want to do, maybe something that everyone else seems to be able to do? Do you ask yourself, “Why can’t I plan ahead?” “Why can’t I use computers?” “Why can’t I save money?” “Why can’t I keep up with my peers?” “Why can’t I handle stress?” “Why can’t I understand people, or get them to like me?”

Do you feel like this?


Do you want to accomplish something that only a few others have? If you’re particularly globally-minded, maybe you’ve asked yourself, “Why hasn’t this problem been solved yet? Why does poverty still exist? What about war? Corruption? Oppression? Are these problems impossible? Are humans just too stupid or too flawed to solve them?”

I’ve got good news for you.

…Well, it’s not really news, actually. The knowledge and wisdom to help you build the life and world you want have already been discovered and articulated, in many cases decades or centuries ago.

The reason these issues still exist isn’t anything inherently wrong with you or humanity in general. The problem is we’ve all been forced to learn how to be people almost from scratch.

The keys you need to succeed are somewhere in here. Good luck!

With the exception (if you’re lucky) of some basic guidance from family, friends, fiction, and mentors, most people grow up with only the skills they’ve picked up from dealing with their childhood environment.

Furthermore, one person may live decades without developing the skill to handle a situation they deal with every day, while another person learns the skill immediately from the experience. Why the difference?

The answer is paradigms.

(Pronounced “para-dimes”, because it was decided that a word should be spelled according to how it was pronounced in the original Greek.)

You k’now, this mi’gh’t be a good time to ta’l’k about silent letters.

A paradigm is how you see the world. It’s what you notice and what you assume. It’s what you care about and how you fit everything together into a model that makes sense. You may have many different paradigms, each one for a different situation. Though you may not have words to describe it, a paradigm is how you think a situation works.

Why are paradigms so important? Imagine that at the beginning of your life you have a hammer. Maybe you’re born with it, or maybe your parents gave it to you, because they were given hammers by their parents. You go through life being good at pounding in nails and being terrible at driving screws. No matter how many screws you encounter, you’re not going to get better at it. Nails are all you can deal with.

You may not even recognize a screw or a screwdriver when you see one, until someone else points it out. After all, as the saying goes, when all you have is a hammer, everything looks like a nail. People with screwdrivers might look like magicians to you, except when they try to drive a nail, at which point you show them the superiority of a good, old-fashioned hammer.

Also, don’t make the mistake of thinking you can turn nails or screws with a wrench. That’s nuts.

What if someone handed you a screwdriver and showed you how to use it, though? You’d still be bad at driving screws, at least for a little while. However, you would get better with practice. You’d become at least competent, if perhaps not a master. Importantly, you would also be able to understand and judge the skills of other screwdriver users, instead of being limited to saying, “that one is more powerful”.

As a more concrete example, imagine a little boy has fallen off his bicycle and skinned his knee, and the bike chain has come loose.

Not pictured: Actual bicycle accident.

Someone with a person-oriented paradigm might notice the child’s emotions and comfort him. Someone with a health-related paradigm might notice the injury, inspect it, and want to apply disinfectant. Someone with a mechanics-based paradigm might look at the bike and know how to fix it. Someone with a social-order paradigm might see that the boy was riding in a forbidden area and move to scold him. All of these are valid approaches to dealing with different aspects of the same situation. These people are starting from different premises about what is important or relevant, and from different knowledge of how things work.

What happens when a paradigm meets a situation it doesn’t know about, though? Would the medic know how to fix the bike? Would the mechanic know how to comfort the child? Would the comforter know how to discipline him?

No, they wouldn’t.

But could they learn?

They say people learn from experience, but that’s not completely true. Watching television in another language or living in another country can help you learn the language, but mere exposure doesn’t work for everyone. Being allowed to play around with a piano doesn’t mean a person will automatically learn how to play music, but no one learns to play music without practicing. Experience is necessary for learning, but it is not sufficient on its own. Paradigms are necessary as well.

If a person doesn’t have a paradigm to help them gather what is important about their experience, then experience won’t do them much good. That’s why people who are decades older than you aren’t necessarily any wiser about things you’d expect them to pay attention to. They never had access to the paradigms that would have allowed them to learn from their experiences, or they considered the paradigms unimportant.

If that describes you, it’s never too late to start learning. Or to stop being so obnoxiously arrogant.

The paradigms exist, though. There are people out there who know how to interact with people, how to build good habits, how to learn technical skills, how to take smart risks and avoid stupid ones. Their paradigms even guide them in seeking out new experiences to learn from. Still, there are a few reasons why other people haven’t been able to find the paradigms they need:

  • They don’t know what they need to know.
  • They don’t know the paradigm exists or could help them.
  • There are just too many paradigms to sift through to find what they need.
  • They don’t know how to recognize a useful paradigm from a flawed one.
  • They think mere knowledge is the same as a paradigm. (Knowledge becomes obsolete, but a paradigm helps you keep up-to-date.)
  • They don’t know how to generalize a paradigm to solve multiple similar problems.

Often, people give up on looking for the paradigms they need and try to brute-force their way through life with the paradigms they already have. They are resigned to the idea that they’ve either got it or they haven’t. It may be true that some people take to a paradigm easily while others need more help, but there’s no reason to limit your learning to the paradigms that come naturally to you. Doing so cripples your learning in every direction, including what you do best—a paradigm can only take you so far without support from other key paradigms.

What does this all mean for our world, and for you in particular?

All the hard work in the world won’t help if you don’t have the right tools, but if you do have the right tools, you have a decent chance at almost anything. You can do things you can be proud of, and even change your world. Furthermore, you’re not stuck with the tools you already have.

Where can you find more tools? That’s where I come in.

Useful for plying your trade.

I’ve identified and cataloged all the fundamental tools (more or less), and I can point you to some good places to pick them up and learn to use them. Many articles in this blog are and will be about what these tools are, how they work, what they can do, and how to obtain them. Once you have them, using them is up to you.

Take care, and have fun.