Most people read conclusions. The conclusion is the least important part of any document.
A study’s conclusion tells you what the authors want you to believe. A regulator’s summary tells you what the agency decided. An earnings transcript tells you what management wants priced in. A news article tells you what the journalist understood, filtered through what the editor approved, shaped by what the audience expects.
None of these are what actually happened.
What actually happened is in the methodology. In the footnotes. In the reconciliation table. In the sentence starting with “excluding” or “adjusted for.” The conclusion is the building. The methodology is the foundation. Most people never look down.
The skill
Reading is not the skill. Thinking is the skill. Reading is the input channel.
Every document is a presentation of an underlying structure. The presentation is optimised for an audience. The structure is what’s actually there. The gap between them is where almost everyone gets lost, and where almost everything worth knowing lives.
A reader asks “what did they find?” A thinker asks “what did they actually test, and is that the same thing they claim to have answered?” A reader asks “did they beat estimates?” A thinker asks “what was excluded to produce this number?”
The practice: ignore the presentation. Find the structure. Check whether the structure supports what’s built on top of it.
Reverse the transformation
Every claim you encounter started as a complex reality that someone compressed into a presentable output. Your job is to run the process backwards.
A company reports adjusted EBIT growth of 15%. That’s the output. Open the reconciliation table: GAAP operating income declined 3%. The presentation and the reality are moving in opposite directions. The reconciliation table is the story. Everything else is decoration.
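The arithmetic behind that divergence is worth seeing once. A minimal sketch with hypothetical figures (chosen to reproduce the +15% / −3% pattern above; real reconciliations list each add-back line by line):

```python
# Hypothetical reconciliation: how add-backs can flip a declining GAAP
# number into double-digit "adjusted" growth. All figures invented.
gaap_oi_prior, gaap_oi_now = 100.0, 97.0     # GAAP operating income: down 3%
addbacks_prior, addbacks_now = 10.0, 29.5    # "one-time" charges, stock comp, etc.

adj_prior = gaap_oi_prior + addbacks_prior   # adjusted EBIT, prior year
adj_now = gaap_oi_now + addbacks_now         # adjusted EBIT, current year

gaap_growth = gaap_oi_now / gaap_oi_prior - 1
adj_growth = adj_now / adj_prior - 1

print(f"GAAP growth: {gaap_growth:+.0%}, adjusted growth: {adj_growth:+.0%}")
# → GAAP growth: -3%, adjusted growth: +15%
```

The sign flip is entirely a function of how fast the add-backs grew, which is exactly why the reconciliation table, not the headline, is the story.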
A study reports harmful effects at a given dose. That’s the output. Open the methods: the substance was artificially dispersed into a state that doesn’t exist outside a laboratory, delivered through a mechanism that doesn’t resemble human consumption, into subjects chemically pretreated to amplify the effect being measured. Other researchers tested the same substance under realistic conditions at orders of magnitude higher doses and found nothing. The study didn’t test what it claimed to test.
A supplier reports record production and the customer’s stock rises. That’s the output. Open the allocation: the production serves multiple customers, spare pools, and an aftermarket the supplier itself projects will triple. The headline number can’t simultaneously justify the supplier’s growth story and the customer’s. At least one of them is wrong.
Same error every time. Someone looked at the presentation and stopped.
Find the equivalences
This is where the skill goes from competent to rare.
Most people evaluate claims one at a time. The real leverage is recognising when two things that look different are structurally identical, or when two things that look identical are structurally different.
A regulator bans a substance. Another reviews the same evidence and doesn’t. These look like conflicting conclusions. They’re not. One applies a precautionary framework where “cannot rule out harm” is sufficient to act. The other applies a risk-based framework where “no demonstrated harm at realistic exposure” is sufficient to permit. They agree on the evidence. They disagree on the decision rule. Treating one as right and the other as wrong misidentifies the disagreement entirely.
A company’s revenue grows 12% and its share price drops. Contradictory, until the segment data shows growth came entirely from the low-margin division while the high-margin division contracted. Revenue grew. The earnings engine shrank. Same number, opposite meaning.
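The mix effect is simple arithmetic once the segments are separated. A sketch with invented figures (two segments, fixed margins, matching the 12% revenue growth above):

```python
# Hypothetical two-segment company: total revenue rises while profit falls,
# because growth comes entirely from the low-margin segment. All figures invented.
segments_prior = {"low_margin": (800.0, 0.05), "high_margin": (200.0, 0.40)}  # (revenue, margin)
segments_now   = {"low_margin": (950.0, 0.05), "high_margin": (170.0, 0.40)}

def totals(segments):
    revenue = sum(r for r, m in segments.values())
    profit = sum(r * m for r, m in segments.values())
    return revenue, profit

rev0, prof0 = totals(segments_prior)   # 1000.0 revenue, 120.0 profit
rev1, prof1 = totals(segments_now)     # 1120.0 revenue, 115.5 profit

print(f"revenue growth: {rev1 / rev0 - 1:+.1%}")  # positive
print(f"profit growth:  {prof1 / prof0 - 1:+.1%}")  # negative
```

Same top-line number, opposite meaning: the 12% is real, and so is the shrinking earnings engine underneath it.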
When two things are treated as equivalent, check whether they are. When two things are treated as contradictory, check whether they’re measuring the same thing.
Check who disagrees
Any time someone presents a conclusion as settled, ask who looked at the same evidence and reached a different answer.
If the answer is “several major regulatory bodies across multiple continents,” you don’t have a consensus. You have a disagreement being presented as a consensus by someone showing you one side.
Consensus is not what the majority of sources you’ve encountered say. It’s what the majority of qualified evaluators who have examined the evidence conclude. The internet makes minority positions look unanimous through volume.
Follow the money
Conflict of interest doesn’t invalidate a finding. It tells you where to apply scrutiny.
This cuts in every direction. The industry-funded study arguing safety deserves scrutiny. So does the influencer whose business model requires the audience to be frightened. The academic whose career depends on publishing novel alarming findings has a conflict. Everyone has a conflict.
The question is never “does this person have a conflict?” It’s “does the evidence stand independent of the conflict?” Answering that requires reading the evidence, which brings you back to the structure underneath.
Read what isn’t said
In filings: if a company derives 15-20% of revenue from a single region and doesn’t break it out, the absence is the disclosure.
In risk factors: read this year’s against last year’s. Additions, deletions, and changed language are never accidental.
In studies: what comparisons weren’t run, what controls weren’t included, what alternative explanations weren’t addressed. The methods section tells you what was done. The gaps tell you what was avoided.
Hold the distinction
“No evidence of harm” is not “evidence of no harm.” “Harm cannot be ruled out” is not “harm is occurring.” Precautionary conclusions and empirical findings are different things with different standards of proof. When someone presents one as the other, in either direction, that substitution is the error.
“We don’t know” is almost always the correct conclusion. It is never the popular one.
The incentive
A post that says “this is destroying you” gets shared. A post that says “one contested study using unrealistic methods found a potential mechanism at doses not relevant to human exposure, and five regulatory bodies disagreed” does not.
Every platform selects for confidence. The person who says “this will kill you” and the person who says “this is completely safe” both sound like they know something. The person who says “it’s complicated” sounds like they don’t.
The person who says it’s complicated is almost always the one who read the file.
The practice
Find the structure underneath the presentation. Check whether it supports what’s built on top. Find the equivalences nobody stated. Find the divergences nobody noticed. Follow the money. Read what isn’t said. Hold the uncertainty.