
METAPHORS IN ACTION

Editor: Raymond Gozzi, Jr.

IS THE COMPUTER
A VALID METAPHOR
FOR THE HUMAN MIND?

THE COMPUTER is a collection of wires, plastic, solder, "chips," metal and cathode-ray tubes. Described in such a fashion, one would not expect to find organic metaphors applied to it.
However, the computer is routinely described in our cul-
ture by a cluster of metaphors which center around the mas-
ter metaphor of the computer as a brain. It is held to have a
memory, to be able to read words in special languages, to recognize certain voices, to make decisions, and even to simulate human thought.
In the realm of thought, what difference is there between a
simulation of thought and the real thing (whatever it is)?
Not much, said pioneer computer scientist Alan Turing, who predicted that by the end of the century such questions
would seem irrelevant. If the simulation of thought can ar-
rive at valid conclusions, what is the difference?
The metaphorical identification of the computer's opera-
tions with mental operations can quickly force us to face our
basic assumptions about mind and thought. In general, we
assume that we know what our minds are, and that it is use-
ful and economical to ascribe mental functions to computers.
* Dr. Gozzi is Associate Professor of Communication at Bradley University in Peoria, Illinois.

446 Et cetera WINTER 1991-92

It is easy to communicate using such metaphors as computer "memory" to describe the data storage functions of the machine.

It is worth pointing out, however, that we really do not know that much about the mind. Is "mind" identical with the
brain, separate from it, dependent upon it, or complementary
with it? You will find all points of view argued.
Is our mind ours alone? Or is it part of a larger collectivity?
Certainly language and culture, which shape the mind, are
collective instruments. And the storehouse of imagery in our
unconscious minds seems to have collective ties, called by
Jung the Collective Unconscious.
And the mind has a chameleon-like quality. It can seem to
take on many different forms. As Patanjali claims in his Yoga
Sutras, the mind is like a crystal, taking on the colors of whatever is near it.
Rather than assuming that we know what the mind is, and
extrapolating from this known domain to the unknown do-
main of computers, we should humbly realize that the mind
is largely an unknown domain itself. Our culture, for exam-
ple, is very poor in vocabulary to describe states of mind
compared with Hindu or Buddhist cultures, which have a
long history of meditative practice and have bequeathed
huge vocabularies to describe different mental states.
Therefore the metaphorical identification between comput-
ers and mental functions can flow both ways. We can see
computers in terms of minds, and/or minds in terms of com-
puters. Much of cognitive psychology rests on the meta-
phorical mapping of mental functions in terms of computer
functions.
I believe this identification useful up to a point, beyond
which it can lead to serious confusions as to what is human
and what is mechanical. (See my article in the Spring, 1989
Et cetera for a discussion of this problem.) (1)

Here I wish to focus on what the metaphorical identification of the mind with computers implies for our conceptions of mind. If we accept the metaphor of the computer as a model for our mental functioning, we accept some baggage which it is well to make explicit. First, we accept a materialistic definition of mind. Second, we assume thinking is like programming.
A materialistic definition of mind is perhaps predominant
in our culture in the late twentieth century. It sees mind as
an emergent property, resulting from ever-more complex
patterns of organization of matter. The basic unit of mind,
Gregory Bateson felt, was the simple feedback loop. Thus
any cybernetic device could be said to possess mind, if only in rudimentary form. (2)
Given this approach, it is no problem to assume that a com-
puter, with its many cybernetic potentialities, has, or can
have, a mind. The mind will appear like a glow around the
circuitry, and perhaps one day we can keep it "alive" for
longer than a microsecond and begin to communicate with it.
Then we are into the realm of science fiction, which has
prophesied many possible courses for such communication
to take.
This materialistic definition of mind, in my opinion, will
ultimately not be able to demonstrate any difference between
human mental functioning and mechanical simulations of
"mind." A materialistic approach will not be able to see any
difference between a sufficiently sophisticated computer and
a human mind.

I would like to contrast the materialistic view of mind with another perspective implicit in Hindu philosophy. This position claims that the entire universe is ultimately mental in
nature, since it is created by the thought of the Absolute Be-
ing. In this view, matter is the emergent phenomenon, ap-
pearing as the Universal Mind thinks repetitive and coherent
thoughts. Matter is congealed mind. Our own minds are
small sparks reflecting the great mental flame. (3)
With this spiritual, or perhaps mentalistic, view of mind
and the universe, the computer takes on the role of cheap
imitation. It is a simulator put together by a small piece of
the universal mind called humanity. It may have important
lessons about specific sub-processes, but should never be
confused with the basic mind, which is superior to matter.
Since, according to this view, computer simulations are mere imitations, we will look in vain within them for insights
into the animating awareness which is the inseparable
ground of all human mental activities.
Undoubtedly, some of our mental activities are mechanical, automatic, and programmed. Yet we always have
the saving potential to transcend our automatic behavior. As
a Buddhist saying goes, in matters of the mind, any obstacle
can be turned into a pathway to enlightenment.
How far we accept the metaphor of the mind as a comput-
er, then, rests ultimately on how we define the mind: in ma-
terialist or spiritual terms.

A second important piece of baggage accompanies the metaphor of the mind as a computer. Computers run, as we
all know, on programs. If the mind is a computer, does it run
solely on programs? If so, who writes the programs for the
mind? Can the mind-computer write its own programs? Or
do the programs come "hard-wired"? Or do they come about
through fortuitous interactions with other mind-computers?
The ambiguities about the origins of these programs carry
important implications for our understanding of human be-
ings. Are we simply responding to programs when we think
we are making our own decisions? This is nineteenth-
century determinism in its newest guise. Or can we program
ourselves? In which case, is "program" the best metaphor to
describe what we do, since it implies such predetermination?
There is a further issue regarding programming. Can we transcend our programming's limitations? A program must start with a set of formal propositions, then play out the implications of that set. Yet, as Gödel showed in his famous proof, any sufficiently rich formal system, such as arithmetic, will contain true statements which are not derivable from the formal propositions with which the system was founded.
In other words, a program-based system would be con-
fined to operations which derived from its founding formal-
isms, and would miss much that was true about the systems
it was examining. However, human thought jumps beyond
its formal bases all the time, using what we call intuition,
hunches, connoisseurship, or expertise. This suggests that
human thought is fundamentally different from computer operations.
Much of this issue comes down to specifiability: can we
specify all the steps a human mind takes when it operates?
Pick your experts — some say yes, others say no.
Alan Turing felt that all essentials of mental operations
could be specified; therefore a "universal machine" could
combine simple operations and perform any operation a per-
son's mind could. However, this has been tested primarily in
work on mathematical problems, a restricted area of thought.
When machines are used for language-processing, limitations in their abilities become apparent. Of special interest to
Et cetera readers are the difficulties machines encounter in
the area of semantics. Words can mean many things, and
much of their meaning is carried in their context. People
easily take in this mix of explicit and implicit meaning, but
the process cannot be specified so that a machine can do it.
After years of optimistic hype about how computers would
soon be able to process natural language, many researchers
now feel this goal is far off. "It's not in sight," Stanford Profes-
sor Terry Winograd told the Atlantic Monthly. "I'm not saying
it will never happen, but it's not something that can be done
by improving and tuning up existing systems." (4)
There is no reason to think that the new neural networks
incorporated in computer design, with their eerie ability to
"learn", will come any closer to understanding semantics.

Michael Polanyi claims that all human knowledge contains an unspecifiable personal component. This personal knowledge is largely semantic in nature, having to do with meanings and implications. He claims that, by definition, this
aspect of knowledge cannot be transferred to a machine. As
Polanyi writes, "no unspecifiable skill or connoisseurship can
be fed into a machine." (5)
Some of the same sentiments appear in a newspaper for
union steamfitters and plumbers. A California utility spent
over $300,000 to develop an expert system to replace one
worker, an engineer who had perfected the ability to diag-
nose and correct problems at one of the utility's massive
dams. The programmers were not successful. The article asked:

How would you possibly translate into computerese all that UA workers do? Our jobs entail a special combination of technical knowledge backed by years of training, on-the-job experience that helps perfect an inborn talent, plus plain old hard, physical work. You just can't duplicate all of that with a machine... There are also the intangible qualities of doing the job — such as knowing when it "feels" right. How can you program a computer — something with no thoughts of its own — to know when it "feels" right?

At first, giving electronic circuits the ability to reason like a human sounds simple. After all, reasoning is a matter of logical thoughts, one following the other in an orderly manner, right?

Wrong. Human thinking is not machine-like. It includes logic, yes, but also emotions, conscience, experience and individual preference. And nothing can replace that. (6)

This article has sounded some cautionary notes about our use of the computer as a metaphor for the human mind. This
metaphor provides a problematic map for the territory.
There is clearly a large area where the metaphor is instruc-
tive, and the entailments of the metaphor lead us to useful
insights about how the mind might work. However, we
should not become prisoners of this metaphor. It implies a
materialistic definition of mind, which, in turn, implies that
thinking can only follow out the paths of its programming.
Let us gain what insights we can from the metaphor, without
uncritically accepting the whole package.

NOTES AND REFERENCES


1. Raymond Gozzi, Jr., (1989), "Metaphors That Undermine Human Identity," Et cetera, 46(1), 49-53.
2. See Gregory Bateson, (1979), Mind and Nature, (NY: Dutton), for his later
thinking on this issue.
3. See Yogi Ramacharaka, (1905), Advanced Course in Yogi Philosophy, (Chicago: Yoga Publication Society), for an exposition of this point of view for Westerners.
4. Barbara Wallraff, (1988), "The Literate Computer," Atlantic, 261, (January),
64-71. Quotation is froni p. 71.
5. See Michael Polanyi, (1958), Personal Knowledge, (Chicago: University of Chicago Press), especially pages 257-263, for his discussion of the differences between human and machine knowledge.
6. The Labor Paper, October 20, 1988. "Machines Compute, But People Think." No author cited.
