
DEVELOPING CRITICAL THINKING

How to Think Clearly When Everyone Else Is Manipulated

4 November 2025

Most men aren't stupid. They're manipulated. The information environment is designed to bypass critical thinking by exploiting emotional triggers, social pressure, and cognitive shortcuts. Headlines are optimised for reaction rather than accuracy. Experts present credentials without demonstrating sound reasoning. Consensus is manufactured through social pressure rather than evidence.

Intelligence does not protect you. Intelligent people frequently believe foolish things because clear thinking is a skill, not an innate trait. It demands deliberate practice, systematic questioning, and the humility to recognise when you are being deceived.

Critical thinking is not cynicism or the act of questioning everything. It is the discipline of systematically evaluating claims before accepting them; the ability to detect manipulation while remaining open to the truth; and the courage to reach unpopular conclusions when the evidence demands it.

Why Smart People Believe Dumb Things

Intelligence makes rationalisation easier, not harder. The smarter you are, the better you can justify positions adopted for non-rational reasons. Tribal affiliation, emotional appeal, and social pressure bypass reasoning regardless of IQ.

Fear bypasses intelligence entirely. During COVID I watched highly intelligent people accept glaring inconsistencies because fear made questioning seem irresponsible. Lockdowns that lacked epidemiological justification. Policies that contradicted one another. Data presented selectively. None of this prompted critical evaluation because fear had been deliberately instilled through government advertising campaigns designed to terrify. When you are frightened, tribal conformity feels like safety, and questioning feels like a risk. The manipulation succeeded precisely because the threat felt sufficiently real to render scepticism dangerous.

Expertise in one domain does not necessarily transfer to others. The accomplished surgeon may believe economic fallacies. The brilliant physicist might accept historical myths. The successful businessman often swallows political propaganda. Competence is typically domain-specific. Critical thinking must be deliberately developed across multiple fields.

The smarter you are, the more perilous your blind spots become. You trust your reasoning ability, while emotional manipulation and social conformity hijack it. Intelligence gives you the confidence that you are thinking, when in fact you are merely believing.

The Five Critical Questions

Critical thinking requires systematic questioning. While not every issue demands deep analysis, claims that could alter your thinking or behaviour require careful evaluation. Five key questions help distinguish manipulation from legitimate reasoning.

Who Benefits?

Cui bono? The oldest tool of critical thinking. Follow the incentives, not the stated motivations. Who gains money, power, or status from this claim being believed? Why am I being told this now, and by this person? What does the messenger gain from my acceptance?

A pharmaceutical company funds research that endorses its drug. A politician promotes policies that expand their power. An academic defends theories that justify their grant funding. A journalist writes stories that generate clicks. These incentives reveal more than the stated reasons.

This does not mean that every motivated claim is false; rather, it means that motivated reasoning requires closer scrutiny. When someone stands to benefit from your belief in something, examine the evidence more carefully.

What is the Counter-Argument?

Can you present the opposing view more effectively than its advocates? If you cannot clearly explain why intelligent people disagree, you do not fully understand the issue. Steel man, not straw man. The strongest version of the opposing argument exposes weaknesses in your own.

I test my understanding by explaining opposing positions to those who hold them. If they say "yes, that's exactly what I think," I understand the disagreement. If they say "no, that's not what I believe," I realise I am arguing with a caricature.

Refusing to engage with the strongest counter-arguments is intellectual cowardice. Defeating arguments that no one actually makes proves nothing. The measure of your reasoning is how well it handles the best opposing case, not the weakest.

What Would Change My Mind?

If you cannot articulate what evidence would change your position, you are not thinking critically. Unfalsifiable claims do not constitute reasoning. Can you specify what would prove you wrong?

This question distinguishes conviction based on evidence from conviction based on identity. Evidence-based positions can specify what would change them, whereas identity-based positions cannot. Saying, "I believe X because I am the sort of person who believes X" is not reasoning.

I have changed my mind on significant issues because evidence contradicted my positions. It is not comfortable, but it is necessary. The willingness to be proven wrong by evidence is what distinguishes critical thinking from tribalism.

Am I Thinking or Rationalising?

Did you reach this conclusion through reasoning, or did you inherit it from your tribe? Are you evaluating the evidence objectively, or merely defending a position? Would you accept this reasoning if it led to the opposite conclusion?

The honest answer reveals whether you are thinking critically or merely rationalising. If you evaluate the same evidence differently depending on which conclusion it supports, you are not reasoning. Similarly, if you apply standards selectively based on whether they support or undermine your position, you are rationalising.

Your emotional response is a warning signal, not a guide. A strong emotional reaction to a claim often indicates manipulation. Outrage, fear, and moral certainty bypass critical thinking. When you experience strong emotions about an idea, that is when evaluation matters most.

What Is Being Assumed?

Every argument rests on unstated premises. Identifying hidden assumptions exposes weak foundations. Question the framework, not just the content. The assumptions you overlook govern your thinking.

A policy debate about "fairness" assumes a particular definition of fairness. An economic argument presupposes certain human behaviours. A moral claim relies on shared values. Examining these assumptions often reveals that the actual disagreement lies there, rather than in the surface argument.

I ask, "What must be true for this argument to work?" The answer reveals assumptions that can be examined independently. Often, the explicit argument is sound if its hidden assumptions hold. However, the assumptions do not hold.

Detecting Emotional Manipulation

Outrage is used primarily to bypass critical thinking. If someone's opening tactic is to provoke anger, they are manipulating you. Genuine arguments do not rely on emotional manipulation; they stand on their own merits. A reply to an article I wrote recently began: "That Russian shill is the one spreading heresy." The emotion was palpable.

"How can you even question this?" is always a tell. It shuts down inquiry through moral language, making scepticism itself seem immoral. Legitimate claims should welcome examination.

When moral language replaces reasoning, it bypasses your critical faculties. The statement "This is about basic human decency" may be true, but it does not constitute an argument. Rather, it attempts to portray disagreement as indecent.

Evaluating Expert Claims

"Trust the experts" functions as a thought-terminating cliché. Which experts? Experts often disagree. On what basis are you accepting one expert over another? Is it simply because this expert confirms what you already believe?

Credentials indicate training, not truth. A PhD holder might be knowledgeable in their field or might be incompetent. True expertise requires demonstrated reasoning, not merely letters after your name. Show me why you believe this; don't just tell me who believes it.

When writer Douglas Murray debated comedian and podcaster Dave Smith on The Joe Rogan Experience regarding American policy towards Israel, Murray ultimately abandoned substantive argument in favour of credentialism, stating, "You're a comedian. You're not a historian." This revealed two things: he had lost the argument on its merits, and he misjudged the audience. Rogan's millions of followers have lost faith in credentialed expertise, valuing logical argument over academic pedigree.

Expert disagreement demonstrates that expertise does not always yield clear answers. When experts are divided on a question, appeals to expertise resolve nothing. Instead, you choose which expert to believe based on factors other than expertise, usually depending on which conclusion you prefer.

I evaluate expert claims by examining their reasoning, not their credentials. Does the argument make sense? Does the evidence support the conclusion? Would I accept this reasoning in another context? Credentials earn attention; reasoning earns acceptance.

Resisting Social Pressure

Consensus is more often manufactured than discovered. The appearance of agreement creates pressure to conform. People are told "everyone believes X", when many privately doubt X but fear speaking out.

The spiral of silence describes how individuals holding minority opinions remain silent, making their views appear even less common. This, in turn, leads more people to stay quiet. Consequently, a manufactured consensus becomes perceived as genuine consensus, driven by social pressure rather than evidence.

Your personal experience matters more than what you are told everyone believes. If you observe X but experts tell you Y, examine both carefully. Do not automatically dismiss your experience simply because authorities disagree. Sometimes, authorities are wrong. Sometimes, you are noticing what they are incentivised not to see.

It takes courage to say, "I don't see that" when everyone else claims they do. Not an arrogant certainty that you are right, but an honest admission that the emperor appears naked to you, despite assurances about his fine clothes.

Teaching Your Children

My children learned critical thinking by observing me (and others) think. Not through lectures on logical fallacies, but through witnessing the questioning process. They saw me ask, "who benefits?" when evaluating claims. They heard me present opposing arguments fairly. They watched me change my mind when the evidence demanded it.

When my children were growing up, I often reminded them to "ask the second question." They would tell me something, and I would ask a question that encouraged them to think more deeply about what they had said. The first answer is usually what they have been told or what feels right. The second question pushes beyond that surface to genuine reasoning. This habit develops over time. They learned that stopping at the first answer is not thinking; it is merely accepting.

A couple of years ago, I spent time examining the ideas of flat earth advocates. I then showed one of my sons interesting claims they were making and various experiments they felt supported their views. He was not happy. He felt frustrated that I was even considering for a second that they might have something to say.

But what I was really doing was practising the suggestions I set out here. Encouraging him not to write off a view simply because it seems outrageous or because no intelligent person would believe it. It's important to be able to hold unsettling ideas temporarily without immediately closing them down. You can examine an argument without accepting its conclusion. You can understand why people believe something without believing it yourself.

This is the difference between critical thinking and tribal conformity. Tribal thinking dismisses claims without examination because the right people reject them. Critical thinking examines claims on their merits, then rejects them based on evidence. The first requires no thought. The second requires discipline.

Teaching this discipline means focusing on process, not conclusions. I ask them, "Why do you think that?" rather than, "What do you think?" The reasoning matters more than the conclusion. They can hold incorrect opinions if their thinking is clear. I would rather they reason well towards incorrect conclusions than adopt correct conclusions through poor reasoning.

The distinction between healthy scepticism and paranoid conspiracy thinking is important. Scepticism involves evaluating claims and accepting evidence, whereas conspiracy thinking dismisses all evidence as part of the conspiracy. One represents critical thinking; the other is its opposite.

I try to model intellectual humility by saying "I don't know" more often than expressing false certainty. I admit when I have changed my mind and acknowledge when evidence contradicts my preferences. They learn that clear thinking requires the courage to be wrong. And when I forget this modelling, I am quickly reminded by them that I thought something completely different last year.

The Connection to the Margin

Critical thinking requires cognitive space. When you are exhausted from constant reactivity, manipulation succeeds easily. Developing the capacity to ignore most information creates the space needed to evaluate what truly matters. Deliberate filtering enables you to focus on what passes through the filter.

Your information discipline, as discussed in the article on acceleration, enhances your critical thinking. They work in tandem. Filtering creates space, which critical thinking then fills with evaluation rather than mere acceptance.

Looking Ahead

Develop the habit of systematic questioning. Not everything warrants deep analysis, but claims that could change your thinking or behaviour deserve the five questions: Who benefits? What is the counter-argument? What would change my mind? Am I thinking critically or rationalising? What assumptions are being made?

Practice evaluating reasoning rather than merely accepting conclusions. Ask how someone arrived at their position, not just what position they hold. Examine the path leading to the conclusion, not just the conclusion itself.

Recognise emotional manipulation when you come across it. If you feel a strong emotion in response to a claim, take it as a signal to slow down and think carefully. Outrage, fear, and moral certainty are more often signs of manipulation than of truth.

If you have children, model critical thinking for them. Let them see you question claims, evaluate evidence, and change your mind. Teach them how to think, not what to think. The greatest gift you can give them is the ability to think clearly in a world designed to manipulate them.

Build the margin that makes thinking possible. Critical thinking requires cognitive space; protect that space deliberately. Filter information ruthlessly and create capacity for evaluation.

The world will continue to try to manipulate you. Let it try. You understand that clear thinking is a skill, that intelligence does not protect you, and that systematic questioning is the only defence against deliberate manipulation.

Most men will continue to accept claims uncritically. They may be intelligent yet manipulated, educated yet deceived, confident yet mistaken. You, however, will think clearly, evaluate systematically, and reach conclusions based on evidence rather than emotion or allegiance.

Build the habit. Maintain it. Think clearly. This is how capable men navigate an environment designed to circumvent their reasoning.

Richard Morrissey
