
BUSINESS LOGIC
LOGICAL FALLACIES

PROF. SHEKINAH S. GALUNA


WHAT IS A LOGICAL FALLACY?

A logical fallacy is an error in reasoning common enough to warrant a fancy name.


Knowing how to spot and identify fallacies is a priceless skill. It can save you time,
money, and personal dignity. There are two major categories of logical fallacies,
which in turn break down into a wide range of types of fallacies, each with its
own unique way of trying to trick you into agreement.

AD HOMINEM
• In logic and rhetoric, a personal attack is called an ad
hominem. Ad hominem is Latin for “against the man.” Instead
of advancing good sound reasoning, an ad hominem replaces
logical argumentation with attack-language unrelated to the
truth of the matter.
• More specifically, the ad hominem is a fallacy of relevance
where someone rejects or criticizes another person’s view on
the basis of personal characteristics, background, physical
appearance, or other features irrelevant to the argument at
issue.
• An ad hominem is more than just an insult. It's an insult used
as if it were an argument or evidence in support of a
conclusion. Verbally attacking people proves nothing about the
truth or falsity of their claims. Use of an ad hominem is
commonly known in politics as "mudslinging." Instead of
addressing the candidate's stance on the issues, or addressing
his or her effectiveness as a statesman or stateswoman, an ad
hominem focuses on personality issues, speech patterns,
wardrobe, style, and other things that affect popularity but have
no bearing on their competence.

Ad hominem is an insult used as if it were an argument or
evidence in support of a conclusion.
STRAWMAN
ARGUMENT
The strawman argument is aptly named after a harmless, lifeless scarecrow. In the
strawman argument, someone attacks a position the opponent doesn't really hold.
Instead of contending with the actual argument, he or she attacks the equivalent of
a lifeless bundle of straw, an easily defeated effigy, which the opponent never
intended to defend anyway.

The strawman argument is a cheap and easy way to make one's position look
stronger than it is. Using this fallacy, opposing views are characterized as
"non-starters," lifeless, truthless, and wholly unreliable. By comparison, one's own
position will look better for it. You can imagine how strawman arguments and ad
hominem fallacies can occur together, demonizing opponents and discrediting their
views.

• This fallacy can be unethical if it's done on purpose, deliberately
mischaracterizing the opponent's position for the sake of deceiving others. But
often the strawman argument is accidental, because the offender doesn't realize
they are oversimplifying a nuanced position, or misrepresenting a narrow, cautious
claim as if it were broad and foolhardy.

With the strawman argument, someone attacks a position the opponent doesn't
really hold.
APPEAL TO IGNORANCE
(ARGUMENTUM AD IGNORANTIAM)

An appeal to ignorance isn’t proof of anything except that you don’t


know something.
• Any time ignorance is used as a major premise in support of an argument, it’s liable to be a fallacious appeal to
ignorance. Naturally, we are all ignorant of many things, but it is cheap and manipulative to allow this unfortunate
aspect of the human condition to do most of our heavy lifting in an argument.
• Interestingly, appeal to ignorance is often used to bolster multiple contradictory conclusions at once. Consider
the following two claims:

• “No one has ever been able to prove definitively that extra-terrestrials exist, so they must not be real.”
• “No one has ever been able to prove definitively that extra-terrestrials do not exist, so they must be real.”
• If the same argument strategy can support mutually exclusive claims, then it’s not a good argument strategy.

• An appeal to ignorance isn’t proof of anything except that you don’t know something. If no one has proven the
non-existence of ghosts or flying saucers, that’s hardly proof that those things either exist or don’t exist. If we
don’t know whether they exist, then we don’t know that they do exist or that they don’t exist. Appeal to ignorance
doesn’t prove any claim to knowledge.
FALSE DILEMMA/FALSE DICHOTOMY
Dilemma-based arguments are only fallacious when, in fact, there are
more than the stated options.
• This fallacy has a few other names: “black-and-white fallacy,” “either-or fallacy,” “false dichotomy,” and “bifurcation
fallacy.” This line of reasoning fails by limiting the options to two when there are in fact more options to choose
from. Sometimes the choices are between one thing, the other thing, or both things together (they don’t exclude
each other). Sometimes there is a whole range of options, three, four, five, or a hundred and forty-five. However it
may happen, the false dichotomy fallacy errs by oversimplifying the range of options.

• Dilemma-based arguments are only fallacious when, in fact, there are more than the stated options. It’s not a
fallacy however if there really are only two options. For example, “either Led Zeppelin is the greatest band of all
time, or they are not.” That’s a true dilemma, since there really are only two options there: A or non-A. It would be
fallacious however to say, “There are only two kinds of people in the world: people who love Led Zeppelin, and
people who hate music.” Some people are indifferent about that music. Some sort of like it, or sort of dislike it, but
don’t have strong feelings either way.
• The false dilemma fallacy is often a manipulative tool designed to polarize the audience, heroicizing one side and
demonizing the other. It’s common in political discourse as a way of strong-arming the public into supporting
controversial legislation or policies.
SLIPPERY SLOPE
FALLACY
• You may have used this fallacy on your parents as a teenager:
“But, you have to let me go to the party! If I don’t go to the party,
I’ll be a loser with no friends. Next thing you know I’ll end up alone
and jobless living in your basement when I’m 30!” The slippery
slope fallacy works by moving from a seemingly benign premise
or starting point and working through a number of small steps to
an improbable extreme.
• This fallacy is not just a long series of causes. Some causal
chains are perfectly reasonable. There could be a complicated
series of causes that are all related, and we have good reason for
expecting the first cause to generate the last outcome. The
slippery slope fallacy, however, suggests that unlikely or ridiculous
outcomes are likely when there is just not enough evidence to
think so.
• It’s hard enough to prove one thing is happening or has
happened; it’s even harder to prove a whole series of events will
happen. That’s a claim about the future, and we haven’t arrived The slippery slope fallacy suggests that
there yet. We, generally, don’t know the future with that kind of
certainty. The slippery slope fallacy slides right over that difficulty
unlikely or ridiculous outcomes are 7
by assuming that chain of future events without really proving likely when there’s just not enough
their likelihood. evidence to think so.
CIRCULAR
ARGUMENT
(PETITIO PRINCIPII)
• When a person’s argument is just repeating what they already
assumed beforehand, it’s not arriving at any new conclusion.
We call this a circular argument or circular reasoning. If
someone says, “The Bible is true; it says so in the Bible”—
that’s a circular argument. They are assuming that the Bible
only speaks truth, and so they trust it to truthfully report that it
speaks the truth, because it says that it does. It is a claim
using its own conclusion as its premise, and vice versa, in the
form of "A is true because B is true; B is true because A is
true." Another example of circular reasoning is, "According to
my brain, my brain is reliable." Well, yes, of course we would
think our brains are in fact reliable if our brains are the ones
telling us that our brains are reliable.

• Circular arguments are also called Petitio principii, meaning
"Assuming the initial [thing]" (commonly mistranslated as
"begging the question"). This fallacy is a kind of presumptuous
argument where it only appears to be an argument. It's really
just restating one's assumptions in a way that looks like an
argument. You can recognize a circular argument when the
conclusion also appears as one of the premises in the
argument.
HASTY GENERALIZATION
• A hasty generalization is a general statement without sufficient evidence to support it. A hasty generalization is made out of a rush to have a
conclusion, leading the arguer to commit some sort of illicit assumption, stereotyping, unwarranted conclusion, overstatement, or exaggeration.
• Normally we generalize without any problem; it is a necessary, regular part of language. We make general statements all the time: “I like going
to the park,” "Democrats disagree with Republicans,” "It’s faster to drive to work than to walk," or "Everyone mourned the loss of Harambe, the
Gorilla.”
• Indeed, the above phrase “all the time” is a generalization — we aren’t literally making these statements all the time. We take breaks to do
other things like eat, sleep, and inhale. These general statements aren’t addressing every case every time. They are speaking generally, and,
generally speaking, they are true. Sometimes you don’t enjoy going to the park. Sometimes Democrats and Republicans agree. Sometimes
driving to work can be slower than walking if the roads are all shut down for a Harambe procession.
• Hasty generalization may be the most common logical fallacy because there’s no single agreed-upon measure for “sufficient” evidence. Is one
example enough to prove the claim that, "Apple computers are the most expensive computer brand?" What about 12 examples? What about if
37 out of 50 Apple computers were more expensive than comparable models from other brands?
• There’s no set rule for what constitutes “enough” evidence. In this case, it might be possible to find reasonable comparison and prove that
claim is true or false. But in other cases, there’s no clear way to support the claim without resorting to guesswork. The means of measuring
evidence can change according to the kind of claim you are making, whether it’s in philosophy, or in the sciences, or in a political debate, or in
discussing house rules for using the kitchen. A much safer claim is that "Apple computers are more expensive than many other computer
brands.”
• A simple way to avoid hasty generalizations is to add qualifiers like “sometimes,” "maybe," "often," or "it seems to be the case that . . . ". When
we don’t guard against hasty generalization, we risk stereotyping, sexism, racism, or simple incorrectness. But with the right qualifiers,
9 we can
often make a hasty generalization into a responsible and credible claim.
RED HERRING
FALLACY
(IGNORATIO ELENCHI)
• A “red herring fallacy” is a distraction from the argument typically
with some sentiment that seems to be relevant but isn’t really on-
topic. This tactic is common when someone doesn’t like the
current topic and wants to detour into something else instead,
something easier or safer to address. A red herring fallacy is
typically related to the issue in question but isn’t quite relevant
enough to be helpful. Instead of clarifying and focusing, it
confuses and distracts.
• The phrase “red herring” refers to a kippered herring (salted
herring-fish) which was reddish brown in color and quite pungent.
According to legend, this aroma was so strong and delectable to
dogs that it served as a good training device for testing how well
a hunting dog could track a scent without getting distracted. Dogs
aren’t generally used for hunting fish so a red herring is a
distraction from what he is supposed to be hunting.
• A red herring fallacy can be difficult to identify because it's not
always clear how different topics relate. A "side" topic may be
used in a relevant way, or in an irrelevant way. In the big meaty
disagreements of our day, there are usually a lot of layers
involved, with different subtopics weaving into them. We can
guard against the red herring fallacy by clarifying how our part of
the conversation is relevant to the core topic.

A red herring fallacy can be difficult to identify because it's not
always clear how different topics relate.
TU QUOQUE
FALLACY
• The “tu quoque,” Latin for “you too,” is also called the “appeal to
hypocrisy” because it distracts from the argument by pointing out
hypocrisy in the opponent. This tactic doesn’t solve the problem, or
prove one’s point, because even hypocrites can tell the truth.
Focusing on the other person’s hypocrisy is a diversionary tactic. In
this way, using the tu quoque typically deflects criticism away from
yourself by accusing the other person of the same problem or
something comparable. If Jack says, “Maybe I committed a little
adultery, but so did you Jason!” Jack is trying to diminish his
responsibility or defend his actions by distributing blame to other
people. But no one else’s guilt excuses his own guilt. No matter who
else is guilty, Jack is still an adulterer.
• The tu quoque fallacy is an attempt to divert blame, but it really only
distracts from the initial problem. To be clear, however, it isn’t a fallacy
to simply point out hypocrisy where it occurs. For example, Jack may
say, “yes, I committed adultery. Jill committed adultery. Lots of us did,
but I’m still responsible for my mistakes.” In this example, Jack isn’t
defending himself or excusing his behavior. He’s admitting his part
within a larger problem. The hypocrisy claim becomes a tu quoque
fallacy only when the arguer uses some (apparent) hypocrisy to
neutralize criticism and distract from the issue.
CAUSAL FALLACY
• The causal fallacy is any logical breakdown when identifying a cause. You can think of the causal fallacy as a parent category for several
different fallacies about unproven causes.
• One causal fallacy is the false cause or non causa pro causa ("not the cause for a cause") fallacy, which is when you draw a conclusion about a cause
without enough evidence to do so. Consider, for example, “Since your parents named you ‘Harvest,’ they must be farmers.” It’s possible that
the parents are farmers, but that name alone is not enough evidence to draw that conclusion. That name doesn’t tell us much of anything
about the parents. This claim commits the false cause fallacy.
• Another causal fallacy is the post hoc fallacy. Post hoc is short for post hoc ergo propter hoc ("after this, therefore because of this"). This
fallacy happens when you mistake something for the cause just because it came first. The key words here are “post” and “propter” meaning
“after” and “because of.” Just because this came before that doesn’t mean this caused that. Post doesn’t prove propter. A lot of superstitions
are susceptible to this fallacy. For example: “Yesterday, I walked under a ladder with an open umbrella indoors while spilling salt in front of a
black cat. And I forgot to knock on wood with my lucky dice. That must be why I’m having such a bad day today. It’s bad luck.”
• Now, it’s theoretically possible that those things cause bad luck. But since those superstitions have no known or demonstrated causal power,
and “luck” isn’t exactly the most scientifically reliable category, it’s more reasonable to assume that those events, by themselves, didn’t cause
bad luck. Perhaps that person’s "bad luck" is just their own interpretation because they were expecting to have bad luck. They might be
having a genuinely bad day, but we cannot assume some non-natural relation between those events caused today to go bad. That’s a Post
Hoc fallacy. Now, if you fell off a ladder onto an angry black cat and got tangled in an umbrella, that will guarantee you one bad day.
• Another kind of causal fallacy is the correlational fallacy also known as cum hoc ergo propter hoc (Lat., “with this therefore because of this").
This fallacy happens when you mistakenly interpret two things found together as being causally related. Two things may correlate without a
causal relation, or they may have some third factor causing both of them to occur. Or perhaps both things just, coincidentally, happened
together. Correlation doesn’t prove causation. Consider for example, “Every time Joe goes swimming he is wearing his Speedos. Something 12
about wearing that Speedo must make him want to go swimming.” That statement is a correlational fallacy. Sure it’s theoretically possible that
he spontaneously sports his euro-style swim trunks, with no thought of where that may lead, and surprisingly he’s now motivated to dive and
swim in cold, wet nature. That’s possible. But it makes more sense that he put on his trunks because he already planned to go swimming.
FALLACY OF SUNK COSTS
We are susceptible to this errant behavior when we crave that sense of
completion or a sense of accomplishment.
• Sometimes we invest ourselves so thoroughly in a project that we’re reluctant to ever abandon it, even when it turns out
to be fruitless and futile. It’s natural and usually not a fallacy to want to carry on with something we find important, not
least because of all the resources we’ve put into it. However, this kind of thinking becomes a fallacy when we start to
think that we should continue with a task or project because of all that we’ve put into it, without considering the future
costs we’re likely to incur by doing so. There may be a sense of accomplishment when finishing, and the project might
have other values, but it’s not enough to justify the cost invested in it.
• “Sunk cost” is an economic term for any past expenses that can no longer be recovered. For example, after watching the
first six episodes of Battlestar Galactica, you decide the show isn’t for you. Those six episodes are your “sunk cost.” But,
because you’ve already invested roughly six hours of your life in it, you rationalize that you might as well finish it. All
apologies to Edward James Olmos, but this isn’t "good economics" so to speak. It’s more cost than benefit.

• Psychologically, we are susceptible to this errant behavior when we crave that sense of completion or a sense of
accomplishment, or we are too comfortable or too familiar with this unwieldy project. Sometimes, we become too
emotionally committed to an "investment," burning money, wasting time, and mismanaging resources to do it.
APPEAL TO AUTHORITY
(ARGUMENTUM AD VERECUNDIAM)
• This fallacy happens when we misuse an authority. This misuse of authority can occur in a number of ways. We can cite only authorities — steering
conveniently away from other testable and concrete evidence as if expert opinion is always correct. Or we can cite irrelevant authorities, poor
authorities, or false authorities. The argumentum ad verecundiam (“argument from respect”) can be hard to spot. It’s tough to see, sometimes,
because it is normally a good, responsible move to cite relevant authorities supporting your claim. It can’t hurt. But if all you have are authorities, and
everyone just has to “take their word for it” without any other evidence to show that those authorities are correct, then you have a problem.
• Often this fallacy refers to irrelevant authorities — like citing a foot doctor when trying to prove something about psychiatry; their expertise is in an
irrelevant field. When citing authorities to make your case, you need to cite relevant authorities, but you also need to represent them correctly, and
make sure their authority is legitimate.
• Suppose someone says, “I buy Hanes™ underwear because Michael Jordan says it’s the best.” Michael Jordan may be a spokesperson, but that
doesn’t make him a relevant authority when it comes to underwear. This is a fallacy of irrelevant authority.
• Now consider this logical leap: “four out of five dentists agree that brushing your teeth makes your life meaningful.” Dentists generally have expert
knowledge about dental hygiene, but they aren’t qualified to draw far-reaching conclusions about its existential meaningfulness. This is a fallacy of
misused authority. For all we know, their beliefs about the “meaning of life” are just opinions, not expert advice.
• Or take the assumption that, “I’m the most handsome man in the world because my mommy says so.” Now, while I might be stunningly handsome,
my mom’s opinion doesn’t prove it. She’s biased. She’s practically required to tell me I’m handsome because it’s her job as a mother to see the best
in me and to encourage me to be the best I can be. She’s also liable to see me through “rose-colored glasses.” And, in this case, she’s not an expert
in fashion, modeling, or anything dealing in refined judgments of human beauty. She’s in no position to judge whether I’m the most handsome man in
the world. Her authority there is illusory (sorry mom).
• There’s another problem with relying too heavily on authorities: even the authorities can be wrong sometimes. The science experts in the 16th century
thought ADD
the AEarth was the center of the solar system (geocentrism). Turns out they were wrong. The leading scientists in the 19th century14thought that
FOOTER
the universe as we know it always existed (steady state theory). They too were wrong. For these reasons, it’s a good general rule to treat authorities
as helpful guides with suggestive evidence, but even authorities deserve a fair share of skepticism since they can make mistakes, overstep their
expertise, and otherwise mislead you.
EQUIVOCATION
(AMBIGUITY)
• Equivocation happens when a word, phrase, or sentence is used
deliberately to confuse, deceive, or mislead by sounding like it’s
saying one thing but actually saying something else. Equivocation
comes from the roots “equal” and “voice” and refers to two-
voices; a single word can “say” two different things. Another word
for this is ambiguity.

• When it’s poetic or comical, we call it a “play on words.” But when


it’s done in a political speech, an ethics debate, or in an
economics report, for example, and it’s done to make the
audience think you’re saying something you’re not, that’s when it
becomes a fallacy. Sometimes, this is not a “fallacy” per se, but
just a miscommunication. The equivocation fallacy, however, has
a tone of deception instead of just a simple misunderstanding.
Often this deception shows up in the form of euphemisms,
replacing unpleasant words with "nicer" terminology. For example,
a euphemism might be replacing "lying" with the phrase "creative
license," or replacing my "criminal background" with my "youthful
indiscretions," or replacing "fired from my job" with "taking early
retirement." When these replacement words are used to mislead
people, they become an equivocation fallacy.
APPEAL TO PITY
(ARGUMENTUM AD MISERICORDIAM)

Truth and falsity aren’t emotional categories, they are factual categories
• Argumentum ad misericordiam is Latin for “argument to compassion.” Like the ad hominem fallacy above, it is a fallacy of relevance. Personal
attacks, and emotional appeals, aren’t strictly relevant to whether something is true or false. In this case, the fallacy appeals to the compassion and
emotional sensitivity of others when these factors are not strictly relevant to the argument. Appeals to pity often appear as emotional manipulation.
For example,
• “How can you eat that innocent little carrot? He was plucked from his home in the ground at a young age and violently skinned, chemically treated,
and packaged, and shipped to your local grocer, and now you are going to eat him into oblivion when he did nothing to you. You really should
reconsider what you put into your body.”
• Obviously, this characterization of carrot-eating is plying the emotions by personifying a baby carrot like it’s a conscious animal, or, well, a baby. By
the time the conclusion appears, it’s not well-supported. If you are to be logically persuaded to agree that “you should reconsider what you put into
your body,” then it would have been better evidence to hear about unethical farming practices or unfair trading practices such as slave labor, toxic
runoffs from fields, and so on.
• Truth and falsity aren’t emotional categories, they are factual categories. They deal in what is and is not, regardless of how one feels about the
matter. Another way to say it is that this fallacy happens when we mistake feelings for facts. Our feelings aren’t disciplined truth-detectors unless
we’ve trained them that way. So, as a general rule, it’s problematic to treat emotions as if they were (by themselves) infallible proof that something is
true or false. Children may be scared of the dark for fear there are monsters under their bed, but that’s hardly proof of monsters.
• To be fair, emotions can sometimes be relevant. Often, the emotional aspect is a key insight into whether something is morally repugnant or
praiseworthy, or whether a governmental policy will be winsome or repulsive. People’s feelings about something can be critically important data when
planning a campaign, advertising a product, or rallying a group together for a charitable cause. It becomes a fallacious appeal to pity when the
emotions are used in substitution for facts or as a distraction from the facts of the matter.

• It’s not a fallacy for jewelry and car companies to appeal to your emotions to persuade you into purchasing their product. That’s an action, not a claim,
so it can’t be true or false. It would however be a fallacy if they used emotional appeals to prove that you need this car, or that this diamond bracelet
will reclaim your youth, beauty, and social status from the cold clammy clutches of Father Time. The fact of the matter is, you probably don't need them.
BANDWAGON FALLACY
• The bandwagon fallacy assumes something is true (or right, or good) because other people agree with it. A couple different
fallacies can be included under this label, since they are often indistinguishable in practice. The ad populum fallacy (Lat., “to the
populous/popularity”) is when something is accepted because it’s popular. The concensus gentium (Lat., “consensus of the
people”) is when something is accepted because the relevant authorities or people all agree on it. The status appeal fallacy is
when something is considered true, right, or good because it has the reputation of lending status, making you look “popular,”
“important,” or “successful.”

• For our purposes, we’ll treat all of these fallacies together as the bandwagon fallacy. According to legend, politicians would parade
through the streets of their district trying to draw a crowd and gain attention so people would vote for them. Whoever supported that
candidate was invited to literally jump on board the bandwagon. Hence the nickname “bandwagon fallacy.”

• This tactic is common among advertisers. “If you want to be like Mike (Jordan), you’d better eat your Wheaties.” “Drink Gatorade
because that’s what all the professional athletes do to stay hydrated.” “McDonald’s has served over 99 billion, so you should let
them serve you too.” The form of this argument often looks like this: “Many people do or think X, so you ought to do or think X too.”

• One problem with this kind of reasoning is that the broad acceptance of some claim or action is not always a good indication that
the acceptance is justified. People can be mistaken, confused, deceived, or even willfully irrational. And when people act together,
sometimes they become even more foolish — i.e., “mob mentality.” People can be quite gullible, and this fact doesn’t suddenly
change when applied to large groups.
the end.
