
Lexical semantics

added on Feb. 25th, 2007

Traditional lexical semantics (structuralist and earlier, including anthropological linguistics) and early semantics within generative grammar (Fodor/Katz, Jackendoff) took lexical meanings as the starting point for a theory of semantics: the most common paradigms involved "decomposing" lexical meaning into hypothesized "semantic primitives", allegedly atomic units of meaning. This technique actually turned out to be applicable to only a small part of the lexicon, and it never yielded an adequate theory of compositional semantics.
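
To make the decompositional idea concrete, here is a minimal sketch (my own toy encoding, not drawn from any one theory) of how such analyses were often written: a word meaning is represented as a structure built from a small stock of hypothesized primitives.

```python
# A toy rendering (my own encoding) of classic decompositional analyses:
# word meanings as structures over hypothesized semantic primitives.

# 'bachelor' as the often-cited feature bundle [+HUMAN, +MALE, -MARRIED]
bachelor = {"HUMAN": True, "MALE": True, "MARRIED": False}

# 'kill' in the style CAUSE(x, BECOME(NOT(ALIVE(y)))), as a nested tuple
kill = ("CAUSE", "x", ("BECOME", ("NOT", ("ALIVE", "y"))))

print(bachelor, kill, sep="\n")
```

Definitions of this shape work neatly for a handful of words like 'bachelor', which is part of why the approach covered so little of the lexicon.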
Starting in the 1970's and 1980's, linguistic semantics came under the influence of Richard Montague's work, which took an account of truth conditions (which hypothetical states of affairs a sentence would correctly describe) and denotation (what objects words could refer to) as the basic goal of semantics---this is the conception of semantics that logicians had employed for their formal logical languages for decades, but it had never been seriously applied to natural languages before Montague. In linguistics, this approach is most commonly called "model-theoretic semantics".
The most useful linguistic semantic data, under this theory, is 'entailment'---that is, if a certain sentence is true in a situation, what other sentences must necessarily also be true. For example, if "John doesn't need either a car or a house" is true, then "John doesn't need a car and John doesn't need a house" must also be true; when this is the case, the first sentence is said to "entail" the second. If you examine enough of the entailments of a sentence you want to analyze (and contrast this with what kinds of sentences it does NOT entail), and come up with a good theory of these, then it is surprising how much of its meaning you have analyzed---eventually, all of it comes under entailment of one kind or another. Logicians had worked out the meanings that "logical words" have ('and', 'or', 'not', 'if', 'some', 'every', 'none', 'possible', 'only', etc.) but had not devoted attention to "non-logical words" (most nouns, verbs, adjectives, etc.).
The primary strength of this kind of theory, and the one first developed, is its account of compositional semantics---how the meaning of a sentence is determined from the meanings of its parts. This question can be studied without first providing specific, complete meanings for the individual words, because one can study compositionality just by hypothesizing the class of all possible denotations each word might have.
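
As a rough illustration of the compositional idea (my own toy fragment, far simpler than Montague's actual system), the meaning of a sentence can be computed by applying the function denoted by one part to the denotation of the other:

```python
# A toy model: the individuals who sleep in it.
sleepers = {"john"}

# Word denotations: a proper name denotes an individual; an intransitive
# verb denotes a function from individuals to truth values.
john = "john"
mary = "mary"
def sleeps(x):
    return x in sleepers

# Compositional rule: [S NP VP] is true iff [[VP]]([[NP]]) is true.
print(sleeps(john))  # True:  "John sleeps" is true in this model
print(sleeps(mary))  # False: "Mary sleeps" is false in this model
```

Note that nothing here depends on a complete analysis of what 'sleeps' means; only its type of denotation (a function from individuals to truth values) matters to the compositional rule.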
However, linguists in the 1970's and 1980's began to apply the technique to some classes of these words---how certain classes of verbs differ from other verbs, for example. One interesting thing about all this is that the second approach studies semantics "from the top down": that is, given what we know about the meaning of a WHOLE sentence (from its entailments), we can ask what the meanings of subparts of the sentence, including individual words, will have to be in order to contribute correctly to the entailments of the whole sentence (given the principles of how word meanings are combined compositionally to give sentence meanings). This, then, is an example of the kind of study in which, to analyze the meaning of a word, you must determine what role it plays in the meanings of the sentences in which it occurs.
Most commonly, the result of this study is not an analysis of the complete meaning of any word (except the 'logical words'), but of how one large class of words differs from another (for example, adjectives like 'former', 'future', 'false', 'alleged', and 'apparent' differ from most other adjectives---including 'red', 'Canadian', 'feline', and 'prime'---in a quite general way that can be characterized precisely in model-theoretic semantics). How a large number of classes of word meanings differ must be understood before you could ever claim to have analyzed the meaning of any one single word.
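
The adjective contrast can be put in entailment terms. In the minimal sketch below (the sets and the two-argument treatment of 'former' are my own simplifications), an intersective adjective like 'Canadian' denotes a set intersected with the noun's denotation, so 'Canadian senator' entails 'senator'; 'former' instead maps the noun's denotation to a new one, and 'former senator' does not entail 'senator'.

```python
# A toy model (hypothetical individuals).
senators = {"ann", "bob"}        # current senators
past_senators = {"carol"}        # held the office before, not now
canadians = {"bob", "carol"}

# Intersective: [[Canadian senator]] = [[Canadian]] & [[senator]].
canadian_senators = canadians & senators           # {"bob"}

# Non-intersective: 'former N' is computed from N's history, and its
# members need not be in [[N]] at all.
def former(current, past):
    return past - current

former_senators = former(senators, past_senators)  # {"carol"}

# The entailment pattern that distinguishes the two classes:
print(canadian_senators <= senators)  # True:  a Canadian senator is a senator
print(former_senators <= senators)    # False: a former senator need not be
```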
This approach has led to very productive and very interesting analyses of word meaning in natural languages, as well as an ever better understanding of compositional semantics. (Idioms and collocations present interesting analytic problems, but they are a very small part of the study of lexical and compositional semantics as a whole.)
The first approach was a "bottom up" approach---start analyzing meaning by first trying to devise a theory of word meaning, then only afterward construct a theory of how word meanings are put together to form phrases and sentences. Ironically, this never led to a very enlightening or interesting theory of word meaning, and a real theory of compositional semantics never arose from this approach at all.
The most recent development is in the theory of pragmatics---how the meaning of a sentence is affected by the discourse context in which it is used (who is speaking; what the time, place, and audience are; what the content of the previous discourse has been; and what knowledge the speaker assumes that the others in the conversation already have). And the model-theoretic method has proved to be the most effective way to study pragmatics as well as semantics.
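
One simple instance of this context-dependence (my own toy example, in the spirit of standard treatments of indexicals) is that words like 'I' and 'now' denote nothing until a context of utterance supplies a speaker and a time:

```python
# A context of utterance (hypothetical values).
context = {"speaker": "mary", "time": "noon", "place": "Paris"}

# Indexicals denote functions from contexts to ordinary denotations.
def i(ctx):          # 'I' denotes the speaker of the context
    return ctx["speaker"]

def now(ctx):        # 'now' denotes the time of the context
    return ctx["time"]

print(i(context), now(context))  # mary noon
```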
In America, and in many countries in Europe, model-theoretic semantics has become the dominant approach to semantic theory within linguistics, though structural semantics in various forms is still widely practiced too. But quite aside from theoretical semantics of either of these two sorts, compositional and word semantics is increasingly studied as part of computational linguistics (including the practical goal of making computers interact with users via (limited use of) natural languages), as well as psycholinguistics.
