All Hail the Self

  • From the standpoint of critical thinking, we have taken self-centered thinking too far when we accept claims for no good reason. In the service of our almighty selves, we distort our judgment and raise our risk of error, which, ironically, is a risk to ourselves.
  • Self-interested thinking takes several forms. We may decide to accept a claim solely on the grounds that it advances, or coincides with, our interests. Or we may be tempted to accept claims for no other reason than that they help us save face.
  • To overcome the excessive influence of your own needs, watch out when things get very personal, be alert to ways that critical thinking can be undermined, and ensure that nothing has been left out.

All Hail My Group

  • Group pressure to accept a statement or act in a certain way has several overlapping subtypes. When the pressure to conform comes from your peers, it’s called peer pressure. When the pressure comes from the mere popularity of a belief, it’s known as an appeal to popularity. When the pressure comes from what groups of people do or how they behave, it’s called an appeal to common practice. In all cases, the lapse in critical thinking comes from the use of group pressure alone to try to support a claim.
  • The assumption that your group is better than others is at the heart of prejudice, a negative or adverse belief about others without sufficient reasons. It is dislike or intolerance based on no good evidence.
  • We all have certain beliefs not because we have thought critically about them but because our parents raised us to believe them or because the conceptual push and pull of our social or political group has instilled them in us. That is, we may believe what we believe—and assume that our beliefs are better than anyone else’s—merely because we were born into a family or society that maintains such views. This endemic pressure can lead to wishful thinking, rationalization, self-deception, and—worst of all—violence. Group thinking of this kind can also easily generate narrow-mindedness, resistance to change, and stereotyping (classifying individuals into groups according to oversimplified or prejudiced attitudes or opinions).
  • For critical thinkers, the only way to counter the outsize influence of the group is to adopt an impartial stance and to proportion belief to the strength of reasons. Both actions take courage, dedication, and practice.

The Toughest Mental Obstacles

  • In its most general sense, evidence is something that makes a statement more likely to be true. It does not mean “something that I feel or perceive is true.” The mere fact that you strongly believe a statement, or have a friend who strongly believes it, or have read Twitter posts by people swearing that it’s true, or hear from your favorite radio or TV personality that it’s so—such things do not, by themselves, constitute evidence.
  • An all-too-human tendency is to try to deny or resist evidence that flies in the face of our cherished beliefs. We may deny evidence, or ignore it, or reinterpret it so it fits better with our prejudices. Denying evidence may be psychologically comforting (for a while, anyway), but it thwarts any search for knowledge and stunts our understanding.
  • We often not only resist conflicting evidence but also seek out and use only confirming evidence—a phenomenon known as confirmation bias. When we go out of our way to find only confirming evidence, we can end up accepting a claim that’s not true, seeing relationships that aren’t there, and finding confirmation that isn’t really there.
  • Motivated reasoning is reasoning for the purpose of supporting a predetermined conclusion, not to uncover the truth. It’s confirmation bias in overdrive. It’s a way of piling up evidence that agrees with our preferred conclusion and of downplaying, ignoring, or devaluing evidence that supports the contrary view. We set out to prove our point, not to determine whether the point is justified.
  • We commit the availability error when we rely on evidence not because it’s trustworthy but because it’s memorable or striking—that is, psychologically available. In such cases, we put stock in evidence that’s psychologically impressive or persuasive, not necessarily logically acceptable.

Your Brain on Social Media

  • The mere exposure effect is the idea that just being exposed repeatedly to words or images (even without registering them consciously) can induce a favorable or comfortable feeling toward them, whether or not there is any good reason for that feeling.
  • The illusion-of-truth effect is a phenomenon in which you come to believe that a false claim is actually true simply because it is familiar. But, of course, familiarity is no guarantee of truth. The worrisome part is that the illusion-of-truth effect can happen even when we know better—that is, even when we have the opportunity to draw on our store of knowledge.
  • The false consensus effect is the tendency to overestimate the degree to which other people share our opinions, attitudes, and preferences. We like to think that most people agree with us (on a single issue or all issues), believe what we believe, have the same values, and look at the world the same way we do. The problem is that we are often wrong about how widely our beliefs and attitudes are shared by others.
  • The Dunning–Kruger effect is the phenomenon of being ignorant of how ignorant we are. It is made worse by the tendency of many to believe that because they know a little something about a subject, they are experts, and that because they have read a book or a few Internet pages on a topic, they are as much an expert as any Ph.D.

Philosophical Obstacles

  • Subjective relativism is the view that truth depends solely on what someone believes—a notion that may make critical thinking look superfluous. But subjective relativism leads to some strange consequences. For example, if the doctrine were true, each of us would be infallible. Also, subjective relativism has a logical problem—it’s self-defeating: if it were true, that fact would itself be a truth independent of what anyone believes, so its truth implies its falsity. There are no good reasons to accept this form of relativism.
  • Social relativism is the view that truth is relative to societies—a claim that would also seem to make critical thinking unnecessary. But this notion is undermined by the same kinds of problems that plague subjective relativism.
  • Philosophical skepticism is the doctrine that we know much less than we think we do. One form of philosophical skepticism says that we cannot know anything unless the belief is beyond all possible doubt. But this is not a plausible criterion for knowledge. To be knowledge, claims need not be beyond all possible doubt, but beyond all reasonable doubt.