An important critical thinking tip is to study other views, to debate frequently, and to avoid living inside an echo chamber. While these might seem like three separate tips, they're closely related enough that it's worth including them under the umbrella of one post.
Inhabiting an echo chamber is something that many of us do, at least to a certain degree: We listen to political commentators we generally agree with; we visit news sites or read books that confirm our worldview or conform to our biases; we curate social media feeds that could very well be tailored to our particular views; and so forth. Despite how commonplace this lifestyle is, I would argue that living this way is intellectually constricting and damaging.
If you're routinely exposed to only one side of an argument, only one side's analysis of a particular subject, you may find that side convincing. But what if the other side has a perfectly reasonable and utterly convincing rebuttal to your position that you haven't thought of? How would you know, if you were only ever exposed to that one side?
You might say, "Ah, well I don't have to worry about this, because the political commentators I listen to and the websites I frequent very often do discuss the position of the other side." My response would be: Can we trust people to consistently highlight the best counter-arguments to their position? Insofar as they do discuss the opposing viewpoint and the counter-arguments they receive, can we really trust them to accurately represent the opposing side's arguments in their most convincing form? This is something John Stuart Mill discussed in his book On Liberty. As he wrote,
"Nor is it enough that he should hear the arguments of adversaries from his own teachers, presented as they state them, and accompanied by what they offer as refutations. That is not the way to do justice to the arguments or bring them into real contact with his own mind. He must be able to hear them from persons who actually believe them, who defend them in earnest and do their very utmost for them. He must know them in their most plausible and persuasive form; he must feel the whole force of the difficulty which the true view of the subject has to encounter and dispose of, else he will never really possess himself of the portion of truth which meets and removes that difficulty
. . . So essential is this discipline to a real understanding of moral and human subjects that, if opponents of all-important truths do not exist, it is indispensable to imagine them and supply them with the strongest arguments which the most skillful devil's advocate can conjure up."
Source: On Liberty, by John Stuart Mill, p. 35–36.
I would even apply this logic to situations where a person directly quotes the opposing side, because perhaps he's leaving key parts of the argument out of the quotation, or perhaps he's quoting a relatively uninformed representative of the opposing viewpoint. So even with direct quotations, we should still be asking ourselves, "Could there be more to the other side's position? Are there additional arguments and key pieces of evidence that I might currently be unaware of?"
The most surefire way to learn about particular viewpoints is simply to explore them yourself. Read the material straight from the horse's mouth. Investigate the evidence. Study the arguments in their original form from their original sources. Watch back-and-forth, real-time debates between proponents of the different viewpoints to see how they handle their views being challenged, to see how they respond to certain counter-arguments, and to see whether their position can withstand scrutiny. Only after doing this do I think a person is in a good position to make a balanced, informed, rational assessment of where the truth lies on a particular topic.
And after studying all sides, you may sometimes discover that the truth is not a black-and-white matter of Side A being completely wrong and Side B being completely right; depending upon the subject, one of the competing sides could be completely right, mostly right, partially right, or completely wrong. Sometimes the many different sides might all have sound arguments and correct views embedded within their positions, with the truth being more an amalgamation of these different perspectives than the exclusive property of any one side.
Some issues are very complex and can't be distilled into a simple matter of "I'm completely right and you're completely wrong." But in other cases, it really is this simple. When fundamentalist Christians argue, for example, that the earth is 6,000 years old, this is a very specific factual matter on which they're flatly mistaken. The truth isn't somewhere in the middle on this question; this isn't a matter of different opinions being equally valid; one side is just completely wrong, whether they recognize it or not.
So the camp in which the truth resides will vary on a case-by-case basis. And the way to determine the truth isn't just to listen to the one side you traditionally agree with and mindlessly adopt and regurgitate its views; it is to actually explore the evidence, do the research, listen to the arguments made by proponents of different viewpoints, and watch them debate each other and put their views to the test.
You might be asking at this point: What about studying beliefs that are plainly absurd? How is there ANY value in that? How is it anything more than a waste of time?
I would start by asking: How do you know that these beliefs are absurd? If you haven't examined the arguments put forth by their proponents, if you haven't researched whether there's supporting or refuting evidence for that view, how do you know that it's worth dismissing? One man's absurd belief is another man's deeply held conviction.
But yes, obviously not all beliefs are on a level playing field, and some are so ridiculous that it doesn't require an enormous expenditure of time and thought to see how unjustified they are. The belief in a flat earth is probably the go-to example: There's such an overwhelmingly large body of evidence against this view that it seems laughable to even consider it. The conspiracy that would be required to manufacture the belief that the earth is spherical when it's actually flat would be so immense, so pointless, and so devoid of a plausible motive that the viewpoint scarcely merits consideration. But insofar as people actually believe it, I think it does merit refutation. And in order to be in the best position to refute a person's belief, we need to understand the arguments and the evidence that underlie it. So this is a key part of the value of studying even the most ridiculous viewpoints.
As Christopher Hitchens put it in a lecture on free speech (watch from 3:45–5:47, or just watch the whole thing; it's a great speech).
Some believe that it's just not worth their time to debate certain people, but I'm a firm believer that it's almost always worth trying to convince somebody of the folly of their beliefs. If I held a belief that was false, I would want somebody to change my mind. Regardless of how silly the viewpoint may seem to us, we are doing these people a favor by attempting to persuade them of how wrong they are. And then there's the fact that many beliefs impact other people and the world at large; for example, if a person believes that vaccines cause autism, or that vaccines are part of some kind of sinister government plot, this isn't just a harmless delusion; this belief can get people killed. So there's much more to debating people than just chalking up another victory in our debate ledger. Because holding false beliefs is undesirable in and of itself, and because many beliefs have real-world consequences, I consider debating people we disagree with to be our intellectual civic duty.
I also reject the commonly held, dismissive view that many people are simply beyond the pale, so deeply entrenched in their belief system that it would be a fruitless waste of time to even engage with them. You'll often hear people sprinkle in a quote in support of this stance that goes something like, "You can't reason people out of a belief that they weren't reasoned into."
The reason I reject this position is that it's demonstrably untrue. People do become persuaded to reject views that they formerly held. People do listen to arguments made by the other side, and if they find them convincing, they sometimes do change their minds. How many people do you know who are currently atheists but once held fundamentalist religious beliefs? Matt Dillahunty, Dan Barker, Tracie Harris, Ayaan Hirsi Ali—these are people who were once die-hard believers, but they changed their minds when they encountered and considered arguments against their position.
So this is where the value of studying opposing views comes into play: When you know not just what it is that the other side believes, but the arguments and the evidence that they put forth in support of their position, you are better equipped to refute their arguments and convince them that they're on the wrong side of the battlefield. I'm sure we've all been in a position where we're talking with somebody whose view on a subject we know is incorrect, and then they hit us with an argument or throw some evidence at us that we've never previously encountered, and we just can't think of a good on-the-spot refutation because this is new material to us. The more exposed you are to arguments made by the other side—the more you seek them out, study them, and debate them—the easier it will be to shut them down when you encounter them. And chipping away at their supporting arguments in this way contributes to the eventual implosion of the core belief itself. So you're not just benefiting yourself when you follow this critical thinking tip; you're also helping to make other people more rational.
But here's an important thing to consider about debating: It takes two to tango. If people don't seek out debates or study different views, but instead live within an echo chamber, then regardless of how prepared you are to debunk their viewpoints, chances are they're never going to encounter you or give you the opportunity to do so. Religious beliefs, probably more than anything, showcase how much of a problem inhabiting an echo chamber is, because as Pew Research data show, they have a very recognizable geographical distribution.
That is to say, a good predictor of a person's religious beliefs is simply the region of the globe they inhabit. And if you think about it, this is no surprise: If your parents, friends, and surrounding community predominantly believe in one particular religion, you're going to grow up in an environment that takes those religious beliefs for granted, so of course you're likely to adopt them and carry them into adulthood. If the people around you don't challenge your beliefs, but instead confirm them and amplify your confidence in them, you're much less likely to doubt them or seriously consider their folly.
As Richard Dawkins pointed out in The God Delusion (p. 129), "Sociologists studying British children have found that only about one in twelve break away from their parents' religious beliefs." How likely are your beliefs to be true when you've adopted them largely because they predominate within your surrounding intellectual environment? Wouldn't it be much more reasonable to adopt beliefs only after a careful consideration of their merits against competing ideas?
It's easy to see how living within an echo chamber can apply geographically, but walling ourselves off, metaphorically, from opposing viewpoints and insulating ourselves within an intellectual environment of our own choosing can have a very similar effect. If you create for yourself an echo chamber where certain views are taken for granted and constantly reiterated, the end result, at the level of the mind, will be little different than if you lived in a small village where everyone around you shared the same views on certain topics. Living in a diverse society and having access to technology is no guaranteed antidote to an intellectually hermetic life.
Something worth noting is that we should be cognizant of where and how we have our debates, because certain exchanges are going to be more worthwhile than others. For example, the typo- and insult-riddled YouTube comment, devoid of substance and posted by somebody who is clearly deranged, probably isn't going to be my top choice for a productive dialogue.
On the other hand, certain exchanges that might seem like a complete waste of time could actually be some of the most fruitful. For example, the raving street preacher yelling at college students on campus, or the pair of Jehovah's Witnesses knocking on your door, are probably worth debating. Why? Because these people are so confident in their beliefs that they're willing to shout in the streets like a maniac or go knocking on strangers' doors in an attempt to disseminate their beliefs. These people, much more so than your average Christian on the street, know their Bible cover to cover, know their arguments top to bottom, and have prepared responses to common counter-arguments. Their preparedness and spiritedness are likely to lead to a much more challenging and interesting debate than one with somebody who's nominally a Christian but doesn't take their beliefs very seriously or even know much about their holy book.
We also need to beware of having debates within an echo chamber, because this can provide the illusion of balanced discourse when in reality it's anything but. This brings to mind a Noam Chomsky quote:
"The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum—even encourage the more critical and dissident views. That gives people the sense that there's free thinking going on, while all the time the presuppositions of the system are being reinforced by the limits put on the range of the debate."
Source: How the World Works, by Noam Chomsky, p. 234.
Obviously he's going in a different direction than I am here, since his is more a criticism of the media, but the basic point I'm making is this: Having debates within a restricted spectrum can give the impression of open-mindedness, of holding views that are justified because they withstand scrutiny; but if that spectrum were unchained and widened, we might quickly discover that our beliefs don't actually match up with reality.
Furthermore, debates within a thin spectrum may be about the fine details, the minutiae, while the core ideas themselves go completely unchallenged and taken for granted. We can imagine, for example, a Christian internet forum where participants bicker about the true pathway to heaven, or the correct interpretation of a certain Bible passage, while the foundational ideas (whether heaven even exists, or whether the Bible is even worth taking seriously) are taken for granted by all participants. So these people are having debates about their religion, but debates of a very restricted kind.
Another good example of debating within an echo chamber comes from my personal life. When I was younger, I was a major conspiracy theorist. I thought that 9/11 was an inside job, that aliens were visiting earth and our governments were covering it up—the whole nine yards. I adopted these views in large part because I was an almost daily visitor to online forums that were little more than a nexus and haven for conspiracy theories. The vast majority of the people posting, commenting, and upvoting on these websites agreed with these beliefs and reinforced them constantly; conversely, the rare dissenting voices would be drowned out by the torrent of commenters disagreeing with them. Any counter-argument, any challenge to the conspiratorial belief, would be dogpiled by the community and completely smothered with rebuttals. To the uncritical person living within the echo chamber, this created the impression that the dissenting viewpoints were without merit.
I think part of the problem was that my critical thinking skills weren't what they are today, as I was fairly young at this point, probably between the ages of 14 and 17. But another part of the problem was that I was living within a textbook echo chamber, where the community was heavily slanted to one side.
So how did I escape the bubble, you might ask? Well, it's very simple: I broke free from the bubble by breaking free from the bubble. I started reading and listening to a more diverse collection of viewpoints from a larger range of sources; I discovered that there were counter-arguments I hadn't seen or seriously considered; and ultimately, I realized there were good reasons to doubt, or even outright reject, some of the views I held. So when it comes to echo chambers, you could say that I am living proof of both the problem and the solution.
To spell it out plainly, why should we follow this critical thinking tip? First, because inhabiting an echo chamber limits the number and quality of ideas you're exposed to and ultimately restricts your intellectual growth. Echo chambers produce a distorted and biased image of ideas and of the world. Living within an echo chamber almost guarantees that you'll hold false beliefs for longer than you otherwise would, because you're less likely to encounter counter-arguments, contradictory viewpoints, and pieces of evidence presented in their most convincing form.
Studying other views and debating frequently will make you a more persuasive advocate of your views and will also make it easier to convince other people of the folly of their views. Knowing the specific justifications for opposing viewpoints, hearing the arguments and seeing the evidence presented to support these views, and hearing the rebuttals leveled against your position, time and time again, will put you in a better position to respond to these things when people bring them up. And if you're more effective at changing people's minds, this means that they're going to spend less time believing false claims that very well could cause tangible harm to themselves or to other people.