Chapter Four

The Tools We Built: Critical Thinking and the Scientific Method

In this episode I want to tell you about the toolkit. Not the romantic version of the scientific method you may remember from a high school poster — “observe, hypothesize, experiment, conclude” — but the actual, lived, hard-won toolkit that human beings spent thousands of years inventing in order to manage uncertainty without making things up.

The story of human progress is, in no small part, the story of learning to manage uncertainty more honestly.

For most of our species’ existence, the tools were crude. When the crops failed, we needed an explanation adequate to the terror of starvation. When disease swept through the village, we needed urgently to understand why — and what to do about it. When someone died before their time, we needed to make sense of a loss that could otherwise destroy the community.

In the absence of tools to investigate these things empirically, the most available explanations were supernatural. The gods were angry. The spirits were disturbed. The harvest god required appeasement. These were not stupid explanations. Given the knowledge available, they were the most logical responses to genuine uncertainty. They provided a framework for action, a basis for communal ritual, and a vocabulary for grief. They had real value. I want to acknowledge that clearly, because a recurring mistake in arguments like mine is to treat every expression of religious life as if it were straightforwardly a product of fear or manipulation. It is not. The complexity of human religious experience runs far deeper than that.

What changed the calculus — slowly, unevenly, against fierce resistance — was the development of a different set of tools for managing uncertainty. Tools that did not require supernatural explanation. That were testable. Revisable. And in principle available to anyone, regardless of status or belief.

Critical thinking began as something modest. The recognition that some claims are better supported than others, and that it is possible to evaluate claims systematically rather than simply accepting the ones that come from the most authoritative source.

The Greek philosophical tradition was one of the first sustained attempts to apply this principle to the great questions of existence. Socrates was not primarily a philosopher of answers. He was a philosopher of questions — specifically, the uncomfortable questions that destabilize false certainties. His method, the Socratic method, was essentially a procedure for exposing the limits of what any of us actually knows. His famous claim that the wisest man is the one who knows what he does not know is not a counsel of despair. It is a methodological proposition. The recognition of the limits of our knowledge is the precondition for expanding those limits.

Eratosthenes calculated the circumference of the Earth, to within about two percent of the right value, in 240 BCE. He used a stick, some sunlight, and the kind of careful reasoning that’s available to anyone willing to look. He worked with shadow angles and camel-day estimates of the distance between two Egyptian cities. He got it right. And it still took nearly two thousand years to become socially acceptable.
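The arithmetic behind that result is simple enough to reproduce. The noon sun cast a shadow at Alexandria at an angle of about 7.2 degrees while casting none at Syene, so the distance between the two cities must span 7.2/360 of the full circle. As a sketch, using the commonly cited historical figures (the exact length of a stadion is uncertain; the conversion below is one standard estimate):

```python
# Eratosthenes' reasoning, with commonly cited (approximate) figures.
shadow_angle_deg = 7.2     # noon shadow angle at Alexandria; no shadow at Syene
distance_stadia = 5000     # estimated Alexandria-Syene distance (paced by camel caravan)
stadion_km = 0.1575        # one common modern estimate of the stadion, in kilometers

# The shadow angle is the fraction of a full 360-degree circle
# subtended by the arc between the two cities.
circumference_stadia = distance_stadia * 360 / shadow_angle_deg
circumference_km = circumference_stadia * stadion_km

print(circumference_stadia)  # 250,000 stadia
print(circumference_km)      # roughly 39,000 km; the modern value is about 40,000 km
```

A stick, two numbers, and one proportion: the whole measurement lands within a few percent of the modern value, which is the point of the story.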

That was 1,800 years before Copernicus. The evidence that the Earth was a sphere, that it moved, that the human senses and mathematical reasoning could be trusted over inherited doctrine — all of it was sitting in the library at Alexandria, patiently waiting. People were still arguing about it in the seventeenth century.

I sometimes wonder what a conversation with Copernicus would be like, if I could explain that today’s Church still holds enough authority to sway social and political discourse and influence every facet of modern society. From what we learn in school, to who we love at home. From the healthcare we need to live, to the suffering we endure to die. In my mind’s eye, I can see Copernicus belly-laughing, incredulous, before his smile fades into shock and horror as he realizes I’m not joking. I don’t even want to think about what a conversation with Eratosthenes would be like, considering flat earthers still exist more than two thousand years after he scienced the heck out of that one.

The period from 1500 to 1700 is called the Scientific Revolution for good reason. It represented a revolutionary shift in thinking. Long-held beliefs about anatomy, biology, astronomy, and the natural world were shattered, challenging even the most sacred tenets of Holy Scripture. The early scientists who led this revolution sought truth not through the interpretation of doctrine, but through curiosity and critical thinking. They built the foundation that Sir Francis Bacon ultimately synthesized into his scientific method.

Bacon’s insight, in his 1620 *Novum Organum*, was deceptively simple. Instead of beginning with general principles and deducing particular conclusions — the method of scholastic philosophy — begin with particular observations and induce general principles from them. Check the principles against further observations. Revise them when they fail. Repeat.

This sounds obvious now. It was not obvious in 1620. The intellectual tradition Bacon was challenging had held that the highest form of knowledge was deduction from universal principles revealed by authority. The authority of Aristotle. The authority of the Church. The authority of the classical tradition. Bacon argued that this tradition had produced, over centuries, an impressive edifice of speculation and a remarkably thin accumulation of knowledge about the actual world. The remedy was the deliberate, systematic subordination of theory to observation. Not the abandonment of theory. Its perpetual accountability to evidence.

In the four centuries since Bacon’s proposal, the application of the inductive method to the natural world has produced vaccines that eliminated smallpox and nearly eliminated polio. Antibiotics that transformed bacterial infections from frequently fatal to routinely curable. An understanding of matter that made semiconductor technology, medical imaging, and nuclear energy possible. A theory of evolution that explains the diversity of life on Earth. A cosmology that places our solar system in a galaxy of two hundred billion stars in a universe of two hundred billion galaxies. None of this was available in 1620. All of it became available through the systematic application of Bacon’s method.

The question I am asking in this book — and in this episode — is why we have not applied this method to the questions of social and political life with the same rigor and the same willingness to revise. Why do we accept, in the governance of our communities, a standard of evidence that would be immediately recognized as inadequate in any science laboratory?

Before I go further, let me anchor some terms, because they have been so systematically muddied in public discourse that recovering their meaning feels, at this point, like an act of resistance.

The scientific method is a process. You observe something. You form a hypothesis about why it happens. You test that hypothesis against evidence in ways that could in principle prove it wrong. You revise your understanding based on what you find. Then you invite others — ideally people who disagree with you — to try to break your conclusion. If it survives, it earns provisional acceptance. The best current explanation, held until better evidence or a better framework arrives. This is not an institution. It is not a credential. It is a procedure available to anyone, for distinguishing what is actually true from what merely feels true.

Critical thinking is the application of this discipline beyond the formal laboratory. The everyday practice of asking: What is the evidence for this claim? What assumptions am I making? What would convince me I was wrong? Is this source reliable, and how do I know? It is not cynicism, and it is not the reflexive rejection of everything. A critical thinker holds conclusions in proportion to the evidence and updates them honestly when better evidence arrives.

An objective truth is a claim about the world that does not depend on who is making it. The Earth orbits the Sun. The mean global surface temperature has risen by approximately 1.1 degrees Celsius since pre-industrial times. Children who receive gender-affirming care show substantially lower rates of depression and suicidal ideation than those who are denied it. These things are true or false independent of anyone’s feelings about them.

A subjective truth is a claim about inner experience — one that is real without being externally verifiable. When my son told me, at not yet three, that something was wrong with how he was being asked to present himself to the world, that was a subjective truth. I could not measure it the way I measure a temperature. But it was no less real.

Subjective truths are not lesser truths. The category I’m identifying as dangerous is not subjective truth itself — it’s the conflation of subjective truth with objective truth. The claim that personal certainty constitutes evidence. That what I feel deeply must therefore be factually true about the external world.

Logic is the formal structure of valid inference — what follows from what. Reason is the broader capacity to apply logical thinking to real questions. To connect evidence to conclusions without leaping. To identify the points where an argument breaks down. You’re applying logic when you notice that an argument proves too much — that if its principle were true, it would also justify things its proponents clearly would not accept. You’re applying reason when you hold two competing explanations up to the evidence and ask which one fits better. Neither requires a philosophy degree. Both require practice. Both can be learned.

The branch of philosophy that studies these concepts is called epistemology — the study of how we know what we know, and what separates justified belief from mere opinion. We are not, at bottom, fighting about policy. We are fighting about what counts as knowledge, and who gets to decide.

These are not abstractions. They are the operational foundation of every institution we have built to constrain arbitrary power. The legal system of every functioning democracy rests on exactly these commitments. A verdict must be reached based on evidence, argued through logic, evaluated by reason. The rule of law is, at its core, an epistemological commitment.

Karl Popper gave us the technical version of this same insight. A claim is scientific, Popper argued, if and only if there is some conceivable evidence that could in principle prove it wrong. Falsifiability. That criterion doesn’t just separate science from pseudoscience. It separates any honest claim from any merely rhetorical one. When someone tells you something and you ask “what evidence would change your mind?” and the answer is “nothing” — you now know something important about what kind of claim you are dealing with.

That test is worth running on yourself. I run it on myself regularly. If I cannot name, specifically, what evidence would cause me to revise a belief, I treat that belief with suspicion. Not contempt. Just the awareness that I am holding it for some reason other than the evidence.

Embracing uncertainty is not the same as abandoning the possibility of knowledge. The point is not that nothing can be known, or that all claims are equally valid, or that we must forever suspend judgment on everything. That position — sometimes called radical relativism — is not what I’m advocating. I’m advocating something simpler. The degree of confidence we place in a claim should reflect the quality of the evidence supporting it. When the evidence is weak or absent, hold the claim lightly. When the evidence is strong, consistent, and independently verified, you can and should act on it — while remaining open to revision.

The scientific consensus on human-caused climate change is not a matter of uncertain opinion. It is supported by converging lines of evidence from multiple independent disciplines, accumulated over decades, affirmed by the vast majority of researchers in the relevant fields. Acting on that consensus is not a failure of epistemic humility. It is an exercise of it.

Similarly, the medical consensus on gender-affirming care is not ideological advocacy. It is the product of decades of clinical research, longitudinal studies, and the accumulated expertise of every major medical and mental health organization in the United States and internationally. Calling that consensus uncertain is not intellectual honesty. It is the manufacture of false doubt in service of a predetermined conclusion.

This is the distinction that runs through the rest of the series. Honest uncertainty versus manufactured uncertainty. The genuine acknowledgment of what we do and don’t know — versus the cynical deployment of the language of doubt to protect false certainties from scrutiny.

Michael Tomasello, the developmental and comparative psychologist whose work I will return to in the next episode, has shown that the deepest cognitive capacity that distinguishes human beings is what he calls shared intentionality. The ability to jointly attend to a common reality, share a goal, coordinate toward it. This capacity is the foundation of both science and democracy. It is what allows us to compare notes about the world. To build on each other’s observations. To correct each other’s errors. To arrive, slowly and imperfectly, at a more accurate picture than any individual could construct alone.

Science is not the work of geniuses working in isolation. It is the work of a community committed to a shared set of norms about evidence, argument, and the willingness to be wrong. Democracy, when it works, is the same.

I want to close this episode with a single sentence that holds the whole argument together. When the model doesn’t fit the data, you revise the model. That is the rule that keeps the bridge standing. That is the rule that lets a parent meet a three-year-old where the three-year-old actually is. That is the rule a society needs in order to govern itself by something other than its loudest fears.

It is not a complicated rule. It is, however, an unpopular one. The following episodes are about why.

Sign up on this site to receive updates on the soon-to-be-published book, “Hello, World, I’m the Dad of a Trans Kid: The Case for Curiosity in a World Addicted to Certainty.” If podcasts are your thing, please check out the podcast series of the same name.