Logical Fallacies

The Observatory Almanac - Section 19

A logical fallacy is an error in reasoning: an argument that seems valid but isn't. Recognizing them helps you think more clearly, argue more effectively, and resist manipulation.

Format for each entry:
- Name / Also Known As
- Definition
- Example (contemporary, relatable)
- Why it's persuasive (why smart people fall for it)
- How to counter it


1. Ad Hominem

Also known as: Personal attack, attacking the person

Definition: Attacking the person making the argument rather than the argument itself.

Example: "You shouldn't listen to Dr. [X]'s advice on nutrition; they're clearly overweight."

Why it's persuasive: We naturally evaluate sources, and a person's trustworthiness sometimes does matter. The attack feels like relevant information even when it isn't.

How to counter it: "That observation about the person may or may not be true, but it doesn't tell us whether the argument is correct. What's wrong with the reasoning itself?"


2. Straw Man

Also known as: Misrepresentation

Definition: Misrepresenting someone's argument to make it easier to attack; arguing against a distorted version of what they actually said.

Example: Person A: "I think we should increase the education budget." Person B: "So you think throwing money at problems is always the solution?"

Why it's persuasive: The reconstructed argument is usually extreme or absurd, making it genuinely easy to refute. Audiences may not notice the bait-and-switch.

How to counter it: "That's not what I said. My actual position is [restate clearly]. Can you respond to that?"


3. Appeal to Authority

Also known as: Argument from authority, ipse dixit

Definition: Claiming something is true because an authority figure says so, without evaluating the evidence.

Example: "This diet must work; a celebrity trainer endorsed it."

Note: Citing genuine experts is reasonable. The fallacy occurs when (a) the authority isn't expert in this specific domain, (b) there's significant expert disagreement being ignored, or (c) the authority is used as a substitute for evidence rather than a supplement.

Why it's persuasive: We correctly use expert consensus as evidence, and distinguishing genuine expertise from inappropriate appeal is cognitively demanding.

How to counter it: "What evidence supports this? Does this person have relevant expertise, and do most experts in the field agree?"


4. False Dichotomy

Also known as: False dilemma, either/or fallacy, black-and-white thinking

Definition: Presenting only two options when more exist.

Example: "You're either with us or against us." / "If you don't support this policy, you must not care about children."

Why it's persuasive: Binary choices are cognitively simple. Nuanced positions require more mental effort to articulate and evaluate.

How to counter it: "Those aren't the only two options. A third possibility is [X], and there's also [Y] to consider."


5. Slippery Slope

Also known as: Thin end of the wedge, the camel's nose

Definition: Asserting that one event will inevitably lead to extreme consequences without evidence for the causal chain.

Example: "If we allow same-day marijuana sales, next they'll be selling heroin at elementary schools."

Note: Not all slippery slope arguments are fallacious; sometimes there really are incremental effects. The fallacy occurs when the chain of causation is assumed rather than demonstrated.

Why it's persuasive: Fear of extreme outcomes is visceral. The claimed chain feels intuitive even without evidence.

How to counter it: "What evidence shows that step A leads to step B? Each step in that chain needs independent justification."


6. Red Herring

Also known as: Irrelevant conclusion, ignoratio elenchi

Definition: Introducing irrelevant information to distract from the actual argument.

Example: (Debate about healthcare costs) "But what about the problems with the VA system? We need to talk about that first."

Why it's persuasive: The diversion often touches on a real issue, which makes it feel relevant. Audiences lose track of the original question.

How to counter it: "That's an interesting point but it's separate from the question we're discussing, which is [X]. Let's return to that."


7. Tu Quoque

Also known as: Appeal to hypocrisy, "you too," whataboutism's close cousin

Definition: Deflecting criticism by pointing out that the person criticizing you does the same thing.

Example: "You say I shouldn't smoke, but you smoked for 20 years."

Why it's persuasive: Hypocrisy is genuinely relevant to trust and credibility. The problem is that hypocrisy doesn't make a claim false: a hypocritical doctor can still give correct medical advice.

How to counter it: "Whether I've been consistent is a separate question. The argument stands on its own merits regardless."


8. Bandwagon

Also known as: Appeal to popularity, ad populum

Definition: Arguing something is true or correct because many people believe it.

Example: "Millions of people have tried this supplement; it must work."

Why it's persuasive: Social proof is a powerful cognitive shortcut. When something is widely accepted, it's often because it's true, so the heuristic usually works, which makes the exceptions hard to catch.

How to counter it: "Popularity isn't evidence of truth. What's the actual evidence? Lots of people once believed the earth was flat."


9. Appeal to Emotion

Also known as: Argumentum ad passiones, emotional manipulation

Definition: Using emotional appeals (fear, pity, outrage) to substitute for logical argument.

Example: (Showing a sad child) "Think of the children. You must support this policy."

Note: Emotions are legitimate in argumentation; empathy and values matter. The fallacy occurs when emotion replaces evidence rather than accompanying it.

Why it's persuasive: Emotion is processed faster and more powerfully than logic. We feel before we think.

How to counter it: "I understand this is emotionally compelling, but what's the actual evidence that this policy produces the outcome we care about?"


10. Circular Reasoning

Also known as: Begging the question, petitio principii

Definition: Using the conclusion of an argument as one of its own premises.

Example: "The Bible is true because it says so in the Bible." / "I'm a great manager because I make excellent management decisions."

Why it's persuasive: Circular arguments can be very hard to spot when the circle is large. The premises and conclusion may be worded differently, obscuring the loop.

How to counter it: "Your argument assumes the thing it's trying to prove. What evidence supports the premise independently of the conclusion?"


11. Hasty Generalization

Also known as: Overgeneralization, fallacy of the lonely fact

Definition: Drawing a broad conclusion from an insufficient number of examples.

Example: "I've met three rude people from that city; everyone there is unfriendly."

Why it's persuasive: Vivid personal experience feels convincing. We're wired to learn from individual cases, which works fine for immediate environments but fails at scale.

How to counter it: "That's a small sample. What does the broader data show? Is this consistent across many observations?"


12. False Cause

Also known as: Post hoc ergo propter hoc ("after, therefore because"), confusing correlation with causation

Definition: Assuming that because B followed A, A caused B.

Example: "I wore my lucky jersey and the team won. The jersey makes them win."

Why it's persuasive: Temporal sequence is the most natural evidence of causation. We're pattern-matchers, and before-after is the most basic pattern.

How to counter it: "Correlation isn't causation. What mechanism would explain A causing B? What else might explain the pattern?"


13. Equivocation

Also known as: Semantic ambiguity

Definition: Using the same word in two different senses within an argument.

Example: "The sign said 'fine for parking here.' So I parked there because it was fine."

Why it's persuasive: Ambiguous language is everywhere, and spotting the shift in meaning requires careful attention to words rather than flow.

How to counter it: "Let's define the key terms precisely. In premise 1, 'X' seems to mean [A]. In the conclusion, it seems to mean [B]. Which do you mean?"


14. Loaded Question

Also known as: Complex question, leading question

Definition: Asking a question that contains a hidden assumption that must be accepted to answer.

Example: "Have you stopped cheating on your exams?" (Either yes or no accepts that you were cheating.)

Why it's persuasive: Questions feel less threatening than accusations. The embedded assumption slips by because the mind focuses on the explicit question.

How to counter it: "I'd like to challenge the assumption in that question before answering it. I haven't been cheating."


15. No True Scotsman

Also known as: Appeal to purity

Definition: Redefining a group to exclude counterexamples that challenge a claim about that group.

Example: "No true [group member] would do that." (Used to dismiss counterexamples.) Person A: "Some Christians support [policy]." Person B: "No true Christian would."

Why it's persuasive: It's unfalsifiable; any counterexample can be excluded by definition. It feels like clarification.

How to counter it: "You're changing the definition of the group to exclude the counterexample. That makes your claim unfalsifiable."


16. Moving the Goalposts

Also known as: Raising the bar, special pleading

Definition: Demanding additional evidence after an initial standard has been met, rather than accepting the conclusion.

Example: "Show me one scientific study." (Study provided.) "Well, that's just one study. Show me ten." (Ten provided.) "Well those aren't from top journals..."

Why it's persuasive: Skepticism is a virtue. Each individual demand for more evidence seems reasonable, so the pattern of perpetual demand goes unchallenged.

How to counter it: "What specific evidence would satisfy you? If you can state the standard clearly, we can see whether it's been met."


17. Appeal to Nature

Also known as: Naturalistic fallacy (in this sense)

Definition: Arguing that something is good because it's natural, or bad because it's unnatural.

Example: "This supplement is safe; it's completely natural!" (Arsenic and botulinum toxin are natural too.)

Why it's persuasive: Nature carries associations of health, purity, and balance, associations that sometimes have merit but are not universally true.

How to counter it: "Natural and synthetic are categories, not value judgments. Many natural things are harmful, many synthetic things are beneficial. What's the actual evidence of safety/efficacy?"


18. Sunk Cost

Also known as: Sunk cost fallacy, Concorde fallacy

Definition: Continuing a course of action because of past investment (time, money, effort) rather than future value.

Example: "I've already spent $5,000 on this failing business. I have to keep going."

Why it's persuasive: Quitting feels like waste and loss. The past investment is emotionally real even though it's economically irrelevant to future decisions.

How to counter it: "The money is already spent regardless of what we do. The only question is: looking forward, is continuing or stopping the better choice?"
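The forward-looking comparison can be sketched in a few lines. All dollar figures here are invented for illustration; the point is only that the sunk amount appears in neither option:

```python
# The $5,000 already spent is a sunk cost: no choice made now can
# recover it, so it appears in neither option. (Numbers are hypothetical.)
spent_so_far = 5_000

# Expected *future* value of each option, ignoring the past entirely.
options = {
    "continue": -1_200,  # projected further losses if the business keeps running
    "stop": 0,           # walk away: no further gain or loss
}

best = max(options, key=options.get)
print(best)  # prints "stop": the sunk cost never entered the comparison
```

Whatever the projected numbers turn out to be, the decision rule is the same: compare futures, not pasts.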


19. Whataboutism

Also known as: Tu quoque extended, deflection, false equivalence through distraction

Definition: Deflecting criticism of one party by pointing to similar behavior in another, without addressing the original criticism.

Example: (Criticizing one country's human rights record) "But what about what [other country] does?"

Why it's persuasive: It introduces real information (the other party's behavior) and exploits our sense of fairness. If we're going to criticize X, we should criticize Y too, right?

How to counter it: "Whether [other party] also engages in this behavior is a separate conversation. Does that make the original criticism wrong? Let's address it directly."


20. Burden of Proof

Also known as: Onus probandi, shifting the burden

Definition: Claiming that the person who doubts a claim must disprove it, rather than the person making the claim proving it.

Example: "Prove that ghosts don't exist." / "You can't prove this product doesn't work."

Why it's persuasive: Demanding disproof is an effective rhetorical move. Absence of disproof feels like evidence.

How to counter it: "The burden of proof lies with the person making the claim. I don't need to disprove it; you need to prove it. What evidence do you have?"


21. Composition / Division

Composition Fallacy: What's true of the parts must be true of the whole. Example: "Every player on this team is excellent, so this team must be excellent." (Great individuals often form dysfunctional teams.)

Division Fallacy: What's true of the whole must be true of each part. Example: "This team is world-class, so every player must be world-class."

Why it's persuasive: We intuitively aggregate and disaggregate. The fallacy occurs when properties don't transfer this way.

How to counter it: "Group properties don't necessarily apply to members, and vice versa. What's the specific evidence about [the part/whole in question]?"


22. Texas Sharpshooter

Also known as: Data dredging, cherry-picking

Definition: Selecting data that supports a conclusion while ignoring contradictory data; the reverse of proper hypothesis testing (like shooting at a barn and drawing bullseyes around where the bullets hit).

Example: "This stock analyst correctly predicted 7 market movements this year!" (They made 30 predictions and you're hearing about the 7 hits.)

Why it's persuasive: Selected evidence looks like strong evidence. Without knowing the full dataset, there's no way to detect the selection.

How to counter it: "How many total predictions/data points were there? What was the full dataset before selecting these results?"
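The selection effect is easy to reproduce. In this sketch, 100 hypothetical analysts each make 30 predictions with zero skill (coin flips); reporting only the best-looking record makes pure chance resemble expertise:

```python
import random

random.seed(0)

# 100 "analysts" each make 30 skill-free predictions: each one is
# right with probability 0.5, like flipping a coin.
n_analysts, n_predictions = 100, 30
hits = [sum(random.random() < 0.5 for _ in range(n_predictions))
        for _ in range(n_analysts)]

# Cherry-picking: report only the luckiest analyst's record.
best = max(hits)
print(f"best analyst: {best}/{n_predictions} correct, by pure chance")
```

The expected score per analyst is 15/30, but the best of 100 will typically land well above that, which is exactly why "how many total predictions were there?" is the right question.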


23. Anecdotal Evidence

Also known as: Anecdote as evidence

Definition: Using a personal story or single instance as evidence for a broad claim.

Example: "My grandmother smoked until 95 and was perfectly healthy; smoking can't be that bad."

Why it's persuasive: Personal stories are vivid, emotionally resonant, and immediately available. They feel more real than abstract statistics.

How to counter it: "One example can't overturn population-level data. It might represent an exception. What does the systematic evidence show?"


24. Middle Ground

Also known as: False compromise, argument to moderation, splitting the difference

Definition: Assuming the truth lies between two positions, simply because they are two positions.

Example: "Scientists say vaccines are safe; some people say they cause harm. The truth must be somewhere in the middle."

Why it's persuasive: Moderation and balance are virtues. "Both sides must have a point" feels fair.

How to counter it: "The middle between a true position and a false one is still false. What's the actual evidence, regardless of who's claiming what?"


25. Gambler's Fallacy

Also known as: Monte Carlo fallacy (the hot hand fallacy is its inverse)

Definition: Believing that past independent random events influence future probabilities.

Example: "It's been heads 8 times in a row; it has to be tails next." (Each coin flip is independent.)

Why it's persuasive: Our brains are expectation machines that seek balance and patterns. Random streaks feel unsustainable.

How to counter it: "Each event is independent. Past outcomes don't change the probability of future outcomes in random systems."
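The independence claim can be checked empirically. This sketch simulates a long run of fair coin flips and records the flip that follows every streak of 8 heads; if past flips influenced the future, tails would show up more than half the time, but the fraction stays near 50%:

```python
import random

random.seed(42)

# Simulate a long run of fair coin flips.
flips = [random.choice("HT") for _ in range(1_000_000)]

# Collect the flip that immediately follows each run of 8+ heads.
next_after_streak = []
streak = 0
for f in flips:
    if streak >= 8:
        next_after_streak.append(f)
    streak = streak + 1 if f == "H" else 0

tails_rate = next_after_streak.count("T") / len(next_after_streak)
print(f"flips following 8+ heads: {len(next_after_streak)}")
print(f"fraction tails: {tails_rate:.3f}")  # hovers around 0.5, not higher
```

Runs of 8 heads are rare (roughly 1 in 256 positions), but across a million flips there are thousands of them, and the coin shows no memory of any of them.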


26. Appeal to Tradition

Also known as: Argumentum ad antiquitatem, "we've always done it this way"

Definition: Arguing that something is good or correct simply because it's old or traditional.

Example: "Marriage has always been defined this way โ€” we shouldn't change it."

Why it's persuasive: Tradition often encodes tested wisdom. It's not always wrong to value it. The fallacy occurs when tradition is used as a substitute for evaluation.

How to counter it: "The fact that something has been done a certain way doesn't tell us whether it should continue to be. What are the actual reasons for this practice?"


27. Genetic Fallacy

Also known as: Origin fallacy

Definition: Evaluating an argument by its source rather than its content; either dismissing good arguments because of their origin or accepting bad ones for the same reason.

Example: "That idea came from a political opponent, so it must be wrong." / "That came from our side, so it must be right."

Why it's persuasive: We use source heuristics because they're often reliable shortcuts. The fallacy is treating them as decisive rather than preliminary.

How to counter it: "Where an idea comes from doesn't determine whether it's correct. What are the actual merits of the argument?"


28. Moral Equivalence

Also known as: False equivalence in the moral domain

Definition: Treating two morally very different actions as equivalent based on superficial similarity.

Example: "Criticizing a government policy is just as bad as the genocide they're committing."

Why it's persuasive: Pointing out that both sides do something bad feels like balanced analysis. Calibrating the magnitude of wrongs requires careful thought.

How to counter it: "Both X and Y may have flaws, but the severity, scale, and nature of those flaws matter. Treating them as equivalent obscures important differences."


29. Dunning-Kruger Effect

(Technically a cognitive bias, but practically essential)

Also known as: Mount Stupid

Definition: People with limited knowledge in a domain tend to overestimate their competence, while experts tend to underestimate theirs.

The curve: Novices gain a little knowledge → peak confidence → learn more → confidence crashes ("I don't know anything") → continue learning → calibrated expertise.

Example: The person who spent a weekend reading about vaccines and is now certain the medical establishment is wrong. The expert immunologist who hedges everything with "the evidence suggests" and lists caveats.

Why it's relevant: Overconfident novices sound more certain than genuine experts. Certainty ≠ correctness.

How to counter it: Look for calibrated uncertainty in experts ("the evidence suggests," "in most cases"). Treat uncaveated certainty from non-experts as a red flag, not as a mark of competence.


30. Confirmation Bias

(Cognitive bias, not strictly a fallacy, but essential)

Also known as: Myside bias

Definition: The tendency to seek out, interpret, and remember information in ways that confirm existing beliefs while ignoring contradictory evidence.

Example: Reading only news sources that agree with your political views. Remembering the predictions you got right and forgetting the ones you got wrong.

Why it's universal: It's not a failure of intelligence; it's a feature of how all human brains process information. No one is immune.

How to counter it:
1. Actively seek out the strongest version of the opposing view
2. Ask "What would change my mind?" and take the answer seriously
3. Track your predictions and evaluate them honestly
4. Read outside your echo chamber deliberately


Quick Reference Index

Ad Hominem: Attacking the person, not the argument
Straw Man: Arguing against a distorted version
Appeal to Authority: Citing authority instead of evidence
False Dichotomy: Only two options when more exist
Slippery Slope: Unproven chain of consequences
Red Herring: Irrelevant distraction
Tu Quoque: "You do it too"
Bandwagon: True because popular
Appeal to Emotion: Emotion instead of evidence
Circular Reasoning: Conclusion assumed in premise
Hasty Generalization: Too few examples, too broad a claim
False Cause: After doesn't mean because
Equivocation: Same word, two meanings
Loaded Question: Question embeds false assumption
No True Scotsman: Redefine group to exclude counterexamples
Moving the Goalposts: Perpetual demand for more evidence
Appeal to Nature: Natural = good; unnatural = bad
Sunk Cost: Past investment driving future choices
Whataboutism: Deflect by pointing to others
Burden of Proof: Demanding disproof instead of providing proof
Composition/Division: Part = whole (or reverse)
Texas Sharpshooter: Cherry-picking data
Anecdotal: One story overriding data
Middle Ground: Truth must be between two positions
Gambler's Fallacy: Past independent events affect future odds
Appeal to Tradition: Old = correct
Genetic Fallacy: Judge by origin, not content
Moral Equivalence: Very different things treated as equal
Dunning-Kruger: Novice overconfidence, expert underconfidence
Confirmation Bias: Seeking evidence that confirms existing beliefs

Knowing these fallacies doesn't make you right; it makes you better at figuring out what's right. Use them to evaluate arguments (including your own), not to win debates.