Commentary/Searle: Consciousness and cognition

This discussion is the upshot of the application of two principles. Always ask yourself: What do you know for sure? and: What facts are supposed to correspond to the claims you are making? Now, as far as the inside of the skull is concerned, we know for sure that there is a brain and that at least sometimes it is conscious. With respect to those two facts, if we apply the second principle to the discipline of cognitive science, we get the results I have tried to present.

ACKNOWLEDGMENT
I am indebted to a very large number of people for helpful comments and criticisms on the topics discussed in this article. I cannot thank all of them, but several deserve special mention; indeed many of these patiently worked through entire drafts and made detailed comments. I am especially grateful to David Armstrong, Ned Block, Francis Crick, Hubert Dreyfus, Vinod Goel, Stevan Harnad, Marti Hearst, Elisabeth Lloyd, Kirk Ludwig, Irvin Rock, Dagmar Searle, Nathalie van Bockstaele, and Richard Wollheim.

NOTES
1. Chomsky, Noam (1976): "Human action can be understood only on the assumption that first-order capacities and families of dispositions to behave involve the use of cognitive structures that express systems of (unconscious) knowledge, belief, expectation, evaluation, judgment, and the like. At least, so it seems to me" (p. 24). "These systems may be unconscious for the most part and even beyond the reach of conscious introspection" (p. 35). Among the elements that are beyond the reach of conscious introspection is "universal grammar" and Chomsky says: "Let us define universal grammar (UG) as the system of principles, conditions, and rules that are elements or properties of all human languages not merely by accident but by necessity — of course, I mean biological, not logical, necessity" (p. 29).
2. The argument here is a condensed version of a much longer development in Searle (1989). I have tried to keep its basic structure intact; I apologize for a certain amount of repetition.
3. I am indebted to Dan Ruderman for calling my attention to this article.
4. For these purposes I am contrasting "neurophysiological" and "mental," but in my view of mind/body relations, the mental simply is neurophysiological at a higher level (see Searle 1984a). I contrast mental and neurophysiological as one might contrast humans and animals without thereby implying that the first class is not included in the second. There is no dualism implicit in my use of this contrast.
5. Specifically, David Armstrong, Alison Gopnik, and Pat Hayes.

Commentaries submitted by the qualified professional readership of this journal will be considered for publication in a later issue as Continuing Commentary on this article. Integrative overviews and syntheses are especially encouraged.

Consciousness and accessibility

Ned Block
Department of Linguistics and Philosophy, Massachusetts Institute of Technology, Cambridge, MA 02139
Electronic mail: [email protected]

Searle's Connection Principle says that unconscious mental states must be in principle accessible to consciousness. If deep unconscious rules, representations, states, and processes are not in principle accessible to consciousness, then they are not mental. I don't think that many in the cognitive science community care whether these phenomena are mental or not; the important point is that they are representational. But since Searle's argument applies as well to representationality as to mentality, we can move on to the real issues.

What does Searle mean by "accessibility in principle"? One of the real issues of which I speak is what "in principle" comes to. (Another, to be discussed later, is what consciousness is.) Searle clarifies his stance by describing people (I will call them the Less Conscious People) who have a desire for water, despite there being a "blockage" that prevents this desire from having any disposition to become conscious. What makes the Less Conscious People's desire in principle accessible to consciousness? The answer, as I read Searle, is that the Less Conscious People have the same brain state that is the "I want water" configuration in us. We have the "I want water" brain state just in case we want water, and we satisfy the Connection Principle since we can become conscious of our desire for water. So the presence of the "I want water" brain state in them justifies ascribing the desire for water to them. Though they are unable to become conscious of their desire for water, we should think of their desire as in principle accessible to consciousness because its brain state gives rise to awareness of the desire for water in us. As Searle notes, similar reasoning would justify ascribing visual knowledge to blind-sighted patients.1

Once Searle's point is set out in this way, it becomes clear that it is susceptible to a straightforward objection. For we can imagine a species of More Conscious People who bear the same relation to us that we bear to the Less Conscious People. The point calls for a concrete example.

Consider the dialect of English in which there is a difference between the pronunciation of the "ng" in "finger" and in "singer." In the dialect I have in mind, the "ng" in "finger" might be said to be hard, whereas the "ng" in "singer" is soft. (Actually, the "g" is deleted in "singer," and the "n" is velar.) In this dialect there is a rule that at least describes the phenomenon. For our purposes, we can take it as: Pronounce the "ng" as soft in "nger" words derived from "ng" verbs — otherwise hard (see Chomsky & Halle 1968, pp. 85-87; see Halle 1990 for other examples). One bit of weak evidence in favor of the hypothesis that such an internally represented rule (note, incidentally, that like many other cognitivists, I say "internally," not "mentally") actually governs our behavior is that this hypothesis predicts certain facts about the pronunciation of new words. If you tell a member of the dialect group in question that "to bling" is to look under tables, asking what you call one who blings, they say "blinger" with a soft "g." This result rules out the hypothesis that "nger" word pronunciation is simply a matter of a memorized list, and it also rules out certain alternative hypotheses of rules governing behavior. Nonetheless, I concede that there is no strong evidence for the hypothesis that an internal representation of the mentioned rule governs our behavior. But we need not tarry over this matter, since Searle's quarrel is with the very idea of the "deep unconscious," not with the strength of the empirical evidence.

To return to the point, we can now imagine a species of people, the aforementioned More Conscious People, some of whom speak the dialect of English just mentioned, and are also conscious of using the "nger" rule mentioned. Let us further

596 BEHAVIORAL AND BRAIN SCIENCES (1990) 13:4



suppose that in the More Conscious People, the use of this rule is coextensive with a certain brain state, call it the applying-the-"nger"-rule brain state. To complete the analogy, let us now suppose that you and I also have the applying-the-"nger"-rule brain state just in case we belong to the dialect group that makes the mentioned distinction between "singer" and "finger." Here is the punch line: The very reason that Searle gives for postulating blockage in the Less Conscious People applies to us. We have a blockage that keeps us from becoming conscious of our application of the "nger" rule. Since a similar story can be told for any "deep unconscious" phenomenon, this point can be used to legitimize any of the cognitivist's favorite rules, representations, states, or processes. Thus Searle's clarification of his notion of in principle accessibility undermines his overall claim against the deep unconscious.

What does Searle mean by "conscious"? One way of stating Searle's argument for the Connection Principle is this: Mentality requires aspectual shape, but there is no matter of fact about aspectual shape without (potential) consciousness; hence mentality requires (potential) consciousness. The main line of cognitivist reply should be to challenge the second premise, arguing for a different theory of aspectual shape, namely, a language of thought theory. This line of reply will be no surprise to Searle. I prefer to follow a less traveled path, taking seriously Searle's point that consciousness is a neglected topic.

The word "conscious" is notoriously ambiguous. In one sense of the term, for a state to be conscious there must be something "it is like" to have it. This is the sense that figures in the famous inverted spectrum hypothesis: Perhaps, what it is like for me to look at things that we agree are red is the same as what it is like for you to look at things that we agree are green.

There are many other senses of "consciousness," including Jaynes's (1977) "internal soliloquy" sense in which consciousness was actually discovered by the ancient Greeks.2 There is one sense of "consciousness" that is particularly relevant for our concerns, one in which a state is conscious to the extent that it is accessible to reasoning and reporting processes. In connection with other states, it finds expression in speech. Something like this sense is the one that is most often meant when cognitive science tries to deal in a substantive way with consciousness, and it is for this reason that consciousness is often thought of in cognitive science as a species of attention (see Posner 1978, Chapter 6, for example).

Now there is some reason to take Searle's notion of consciousness to be the last sense, the accessibility sense. It is only in this sense that the phenomena postulated by Freud and by cognitive scientists are clearly and obviously unconscious. However, in this last sense of "conscious," Searle's Connection Principle is implausible and the argument for it is question-begging. (I am assuming here that Searle will tighten his notion of "in principle accessibility" to avoid the conclusion of the last section that all of the "deep unconscious" is in principle accessible.) If consciousness is simply a matter of access to reasoning and reporting processes, then two states could be exactly alike in all intrinsic properties, yet differ in that one is situated so that reasoning and reporting processes can get at it, whereas the other is not. Yet the state that is badly situated with respect to reasoning and reporting processes might be well situated with respect to other modules of the mind, and may thus have an important effect on what we think and do. Searle would have to say that the first (well situated) state is mental, whereas the second is not. But who would care about such a conception of mentality? This is why I say that according to the present sense of "conscious," the Connection Principle is implausible.

Recall that Searle's argument for the Connection Principle involves the premise that there is no matter of fact about aspectual shape without consciousness. But what reason would there be to believe this premise if all that consciousness comes to is a relation to reasoning and reporting processes? Two states might have exactly the same aspectual shape despite the fact that one can be detected by certain mechanisms and the other can be detected only by different mechanisms.

Further, for the access sense of "conscious," the metaphor of the fish that Searle rejects is quite appropriate. Just as the same type of fish can be below or above the water, the same type of mental state (with the same aspectual shape) can be either accessible or inaccessible to mechanisms of reasoning and reporting. If Searle's argument is to get off the ground, he must take consciousness to be an intrinsic property of a conscious state, not a relational property.

It is time to move to the obvious candidate, the "what it is like" sense. Understanding Searle this way, we enter deep and muddy waters where Searle cannot so easily be refuted, but where he cannot so easily make his case either. I think we can see that his argument depends on a point of view that cognitive scientists should not accept, however. An immediate problem for Searle with this sense of consciousness is this: How does Searle know that there is nothing it is like to have the rules and representations he objects to? That is, how does he know that what he calls the "deep unconscious" really is unconscious in the what-it-is-like sense? Indeed, how does he know that there is nothing it is like to have Freudian unconscious desires? (Recall that in the present sense of "unconscious" there is nothing it is like to be in any unconscious state.) Our reasoning and reporting mechanisms do not have direct information about these states, so how are we to know whether there is anything it is like to have them? Suppose you drive to your office, finding when you arrive that you have been on "automatic pilot," and recall nothing of the trip. Perhaps your decisions en route were not available to reasoning and reporting processes, but that does not show that there was nothing it was like to, say, see a red light and decide to stop. Or consider a "deep unconscious" case. If subjects wear headphones in which different programs are played to different ears, they can obey instructions to attend to one of the programs. When so doing, they can report accurately on the content of the attended program, but can report only superficial features of the unattended program, for example, whether it was a male or a female voice. Nonetheless, information on the unattended program has the effect of favoring one of two possible readings of ambiguous sentences presented in the attended program. (See Lackner & Garrett 1973.) Does Searle know for sure that there is nothing it is like to understand the contents of the unattended program? Let us be clear about who has the burden of proof. Anyone who wants to reorient cognitive science on the ground that the rules, representations, states, and processes of which it speaks are things that there is nothing it is like to have must show this.

The underlying issue here depends on a deep division between Searle and the viewpoint of most of cognitive science. Cognitive science tends to regard the mind as a collection of semiautonomous agencies — modules — whose processes are often "informationally encapsulated," and thus inaccessible to other modules (see Chomsky 1986; Fodor 1983; Gazzaniga 1985; and Pylyshyn 1984). Though as Searle says, cognitivists rarely talk about consciousness (and to be sure, many cognitivists — Dennett, Harman, and Rey, for example — explicitly reject the what-it-is-like sense), the cognitivist point of view is one according to which it is perfectly possible that there could be something it is like for one module to be in a certain state, yet that this should be unknown to other modules, including those that control reasoning and reporting. Searle will no doubt disagree with this picture, but his conclusion nonetheless depends on a view of the organization of the mind that is itself at issue between him and cognitivists.

Suppose Searle manages to refute the point of the last paragraph by showing that there is nothing it is like to be in states that are unavailable to reasoning and reporting. That is, suppose that the access sense and the what-it-is-like sense of "conscious"




apply to exactly the same things. Still, the issue arises as to which is primary. Searle will no doubt say that our states are accessible to reasoning and reporting mechanisms precisely when and because there is something it is like to have them. But how could he know that this is the way things are, rather than the reverse, that is, there being something that it is like to have a state is a byproduct of accessibility of the state to reasoning and reporting mechanisms. If the latter is true, once again the metaphor of the fish would be right. For an unconscious thought would have its aspectual shape, whether or not reasoning and reporting processes can detect it; it would be only when and because they detect it that there would be anything it is like to have the thought.

The upshot is this: If Searle is using the access sense of "consciousness," his argument doesn't get to first base. If, as is more likely, he intends the what-it-is-like sense, his argument depends on assumptions about issues that the cognitivist is bound to regard as deeply unsettled empirical questions.

NOTES
1. It is worth mentioning that it is easy to arrange experimental situations in which normal people act like blind-sighted patients in that they give behavioral indications of possessing information that they say they do not have. See the discussion below of the Lackner & Garrett (1973) experiment. Thus in one respect we are not very different from the Less Conscious People.
2. See Block, 1981, for a critique of Jaynes (1977), and Dennett, 1986, for a ringing defense of the importance of Jaynes's notion of consciousness.

Intention itself will disappear when its mechanisms are known

Bruce Bridgeman
Professor of Psychology, Clark Kerr Hall, University of California, Santa Cruz, CA 95064
Electronic mail: [email protected]

The problem of mentalistic explanation is both more and less than meets the eye. It is less because some of Searle's as-if examples were never meant to be taken literally. The problem is more serious than Searle implies, however, because intentional language for brain processes is always metaphorical; intention is a result, not a process or state.

Searle gives a particularly clear explanation of the contrast between intentionalistic and mechanical/functional explanations with his examples of anthropomorphized plants and the successful mechanical/functional explanation of the VOR (vestibular ocular reflex). But he exaggerates in assigning intentionality wherever cognitive scientists use intentionalistic language. Searle admits that "We often make . . . metaphorical attributions of as-if intentionality to artifacts," but he maintains that the as-if character is lost when descriptions of brain processes are concerned. Not so — in fact, the contrast between functional and as-if explanation is quite explicit in the neurosciences, and is taught to students of physiological psychology. A recent textbook (with which I am particularly familiar) makes this clear in words that almost echo Searle's:

  Biologists often speak, in a kind of verbal shorthand, as though traits were evolved purposefully, using statements such as: "Fish evolved complex motor systems to coordinate their quick swimming movements." This is the intentionalistic statement. The textbook goes on: "What they really mean, though, is that the fish that by chance happened to have a few more neurons in the motor parts of their brains (Searle's mechanical hardware explanation). . . survived in greater numbers and had more offspring than those that happened to have fewer neurons or less effectively organized ones (Searle's functional explanation). . . . The shorthand of purposeful language will sometimes be used in this book, though the more biologically valid interpretation should always stand behind it. (Bridgeman 1988, p. 10, parenthetical comments added)

Thus purposeful language in neuroscience explanations should always be taken metaphorically, a colorful and compact means of exposition that can always be unpacked to the two-step mechanical/functional argument by the informed student.

In this context, Rock (1984) receives somewhat of a "bum rap" from Searle. The processes of perception work as if they were intelligent, and Rock makes clear even in the quoted passage that the intelligence describes processes in brains, not conscious insights. The perceptual parts of the brain merely process information in ways that in other contexts are interpreted as intelligent.

The second part of my argument, that the intentionalistic explanation is more of a problem than Searle makes it out to be, comes from a closer look at the processes that Searle is still willing to describe in intentionalistic terms, that is, those explicitly identified as conscious.

Perhaps the primary problem with the concept of unconscious mental state is not the "unconscious" or the "mental," but the assumption of a static "state." The state, of course not Searle's invention, is a problem because it derives ultimately from introspection and nothing else. At the start of his target article, Searle questions the relatively small role of consciousness in cognitive science, noting that mention of the term was completely suppressed in respectable circles until recently. This was not always so, however — psychology as a separate discipline was in fact founded on the basis of introspection, or careful examination of the contents of consciousness. Implicit in this effort was the assumption that mental life was indeed accessible to consciousness. The assumption turned out to be false. Freud formalized the insight that some aspects of brain function were unconscious, and neurophysiology has revealed more and more nonconscious processing in the brain. Everywhere we look, nonconscious processes such as early vision, parsing of sensory tasks to different cortical areas even within a modality, or coding of different sorts of memory dominate the brain.

Psychology has been understandably wary of returning to consciousness, and has done so only with an array of new techniques. But it is already clear that the role of consciousness in mental life is very small, almost frighteningly so. The aspects of mental life that require consciousness have turned out to be a relatively minor fraction of the business of the brain, and we must consider consciousness to be a brain system like any other, with particular functions and properties. It looms large only in our introspections.

More specifically, whenever we examine an aspect of what seems to be conscious it turns out to be made of simpler parts. The process of seeing, for instance, is made of a great cascade of neural processing based on a welter of relatively simple algorithms. What had seemed like visual intelligence turns out to be only processing after all, when we look empirically at how it works. (A similar fate has befallen seemingly intelligent AI efforts.) Wherever we look for intentionality, we find only neurons, as Searle laments. We can predict that intentionality will evaporate in the twenty-first century, as certainty did in the twentieth.

My final comment is on the problem of aspectual shape. The useful resolution, seen from psychology, is that just the fact that a conscious manifestation has aspectual shape is no reason that the memories on which it is based should have any such property. Why not construct the aspectual shape during the process of bringing memory from storage? With apologies to Searle, the computer analogy applies here, in a kind of lilies-of-the-field argument — if the humble computer has a given capacity or property, why not us? The information stored on my computer's disc has no margins, lines or paragraphs; it looks nothing like the form it will have when it is displayed on my terminal. When it is needed, the bare-bones disc information is recoded, formatted,



Response/Searle: Consciousness and cognition
Young in various ways question my notion of consciousness; and several others, specifically Chomsky, Limber, Piattelli-Palmarini and Mey, think I am relying in some way on the notion of introspection.

By consciousness I simply mean those subjective states of awareness or sentience that begin when one wakes in the morning and continue throughout the period that one is awake until one falls into a dreamless sleep, into a coma, or dies, or is otherwise, as they say, unconscious. On my account, dreams are a form of consciousness (this answers the queries of Young and Hodgkin & Houston), though they are of less intensity than full blown waking alertness. Consciousness is an on/off switch: You are either conscious or not. Though once conscious, the system functions like a rheostat, and there can be an indefinite range of different degrees of consciousness, ranging from the drowsiness just before one falls asleep to the full blown complete alertness of the obsessive. There are lots of different degrees of consciousness, but door knobs, bits of chalk, and shingles are not conscious at all. (And you will have made a very deep mistake if you think it is an interesting question to ask at this point, "How do you know that door knobs, etc., are not conscious?") These points, it seems to me, are misunderstood by Block. He refers to what he calls an "access sense of consciousness." On my account there is no such sense. I believe that he, as well as Uleman & Uleman, confuses what I would call peripheral consciousness or inattentiveness with total unconsciousness. It is true, for example, that when I am driving my car "on automatic pilot" I am not paying much attention to the details of the road and the traffic. But it is simply not true that I am totally unconscious of these phenomena. If I were, there would be a car crash. We need therefore to make a distinction between the center of my attention, the focus of my consciousness on the one hand, and the periphery, on the other. William James and others often use the notion of consciousness to mean what I am referring to as the center of conscious attention. This usage is different from mine. There are lots of phenomena right now of which I am peripherally conscious, for example, the feel of the shirt on my neck, the touch of the computer keys at my finger tips, and so on. But as I use the notion, none of these is unconscious in the sense in which the secretion of enzymes in my stomach is unconscious.

A remarkably large number of commentators think that introspection plays some role in my account. They are mistaken. I make no use of the notion of introspection at all. Chomsky in particular thinks that I assign some special epistemic priority to introspection, that I am committed to the view that we have some special knowledge of our own mental states by what Chomsky calls an "inner eye." That is not my view at all. Except when quoting Chomsky, I was very careful never to use the word "introspection" in the course of my article, because, strictly speaking, I do not believe there is any such thing. The idea that we know our conscious mental states by introspection implies that we spect intro, that is, that we know them by some inner perception. But the model of perception requires a distinction between the act of perceiving and the object perceived, and that distinction cannot in general be made for our own conscious states. This point was already implicit in our discussion of Objection 3.

I assign no epistemic privilege to our knowledge of our own conscious states. By and large, I think our knowledge of our own conscious states is rather imperfect, as the much cited work of Nisbett and Wilson (1977) shows. My points about consciousness, to repeat, had little or nothing to do with epistemology. They were about the ontology of consciousness and its relation to the ontology of the mental.

Chomsky, by the way, is mistaken in thinking that the discussion in this article is the same as our dispute of several years ago. That issue was genuinely epistemic. This one is not.

IV. Program explanations: Reply to Objection 5

One of the most fascinating things in this discussion is the extent of the disagreement among the objectors about the nature of cognitive science explanations. Most of the commentators accept my claim that cognitive science typically postulates deep unconscious rules and that as if intentionality explains nothing. But several have a different conception of cognitive science explanation. They see the computational paradigm not as an implementation of intrinsic intentionality but as an alternative to it or even a rejection of it. Matthews, Higginbotham, Glymour and, to some extent, the EDITORIAL COMMENTARY take it that computational forms of explanation might be a substitute for intentionalistic explanations, and Hobbs and McDermott think we have already superseded intentionalistic explanations, that the ascriptions of intentionality in cognitive science are entirely as if but that this does not matter because the causal account is given by the program explanation. We substitute an algorithmic explanation in terms of formal symbol manipulation for the intentionalistic explanation.

The logical situation we are in is this: We are assuming that the Chinese Room Argument shows that the program level is not sufficient by itself for intentionality and that the Connection Principle argument shows that there is no deep unconscious intentionality. Well, all the same, the question remains open whether or not the brain processes might still be in part computational.

And the underlying intuition is this: As a matter of plain fact, there are a lot of physical systems out there in the world that are digital computers. Now maybe, as a matter of plain fact, each brain is one of those. And if so, we could causally explain the behavior of the brain by specifying its programs the same way we can causally explain the behavior of this machine by specifying its programs. I can, for example, causally explain the behavior of the machine on which I am writing this by saying that it is running the vi program and not the emacs program.

Hobbs and McDermott, more strongly than the others, concede my points about the nonexistence of the deep unconscious and think that cognitive scientists in general should also, but they think the Darwinian Inversion could be avoided because they think we might just discover that the brain is a biological computer.

Notice that this point is logically independent of the main argument in the article. I could, in principle, just concede it as irrelevant to the present issue but I want to discuss it, at least briefly, because I think the hypothesis of computationalism as a causal explanatory model of cognition has some real difficulties.

