
Cognitive linguistics and language structure

Abstract
Cognitive linguists agree that language is handled mentally by general cognitive structures and processes rather than by a dedicated mental module. However, in spite of remarkable progress in some areas, cognitive linguists have generally paid little attention to the possible implications of this cognitive assumption for the theory of language structure: how language is organised and what structures we assign to utterances. Cognitive Grammar has avoided formalization, and the various versions of construction grammar have adopted rather conservative views on grammatical structure. The exception is Word Grammar, which offers a radical alternative view of language structure. This paper defends the structural claims of Word Grammar on the grounds that most of them follow logically from the cognitive assumption (though a few need to be revised). The paper starts by breaking this assumption into a number of more specific tenets relating to learning, network structures, recycling, inheritance, relations, activation and chunking. It then shows how these tenets support various claims that distinguish Word Grammar in cognitive linguistics. According to this argument, morphology and syntax are distinct levels, so language cannot consist of nothing but symbols or constructions. Moreover, in syntax the main units, and possibly the only units, must be words, not phrases, so the basic relation of syntax is the dependency between two words, not the relation between a phrase and its part. In other words, sentence structure is dependency structure, not phrase structure: a network, as expected in cognition, not a tree. One of the benefits of this analytical machinery is in the treatment of the various patterns that have been called constructions, which benefit from the flexibility of a network structure.
This kind of structure is also very appropriate for semantic analysis, illustrated here by two examples: the distinction between universal and existential quantifiers, and a detailed structural analysis of the meaning of the how about X? construction, covering its illocutionary force as well as deictic binding. Finally, the paper discusses a formal property, the order of words, morphs and so on, arguing that the constraints on order are expressed in terms of the 'landmark' relation of Cognitive Grammar, while the actual ordering requires the more primitive relation found in any ordered string, here called 'next'. The paper explains how landmark relations can be derived from word-word dependencies in both simple and complex syntactic patterns, and why the words in a phrase normally stay together.

1. Introduction

Now that cognitive linguistics (CL) has established itself as a valid and productive approach to the study of language, it is reasonable to ask what progress it has made on one of the traditional questions of linguistic theory: how language is structured. In particular, how is the answer to this question affected, if at all, by what we might call the Cognitive Assumption in (1)?

(1) The only cognitive skills used in language are domain-general skills which are also used outside language.

This unifying belief of all cognitive linguists has been expressed more or less pithily by others: 'knowledge of language is knowledge' (Goldberg 1995:5), we should 'derive language from non-language' (Lindblom and others 1984:187, quoted in Bybee 2010:12), 'language is not an autonomous cognitive faculty' (Croft and Cruse 2004:1). What difference does the Cognitive Assumption make to our ideas about how language is organised, compared with the alternative views in which language is seen either as having nothing to do with cognition, or as a separate module of cognition? The natural place to look for an answer is in the theoretical packages that address this question directly. The Oxford Handbook of Cognitive Linguistics lists three models of grammar (Geeraerts and Cuyckens 2007): Cognitive Grammar, construction grammar (without capitals) and Word Grammar. Cognitive Grammar has not yet been developed into a sufficiently formal system because 'the cost of the requisite simplifications and distortions would greatly outweigh any putative benefits' (Langacker 2007:423). Whatever the merits of this strategic decision, it means that we cannot expect a precise account of how language is organised, comparable to the accounts that we find in non-cognitive theoretical packages. When written without capitals, 'construction grammar' has sometimes been identified simply with 'the cognitive linguistics approach to syntax' (Croft and Cruse 2004:225). In his 2007 survey, Croft divides construction grammar into four versions (including Cognitive Grammar). The Fillmore/Kay Construction Grammar (with capitals) is formally explicit, but makes very similar claims about language structure to the non-cognitive model Head-Driven Phrase Structure Grammar (HPSG; Pollard and Sag 1994). The Lakoff/Goldberg version is much less formally explicit, but offers syntactic analyses that are very similar to those of Construction Grammar (Croft 2007:486). Finally, Croft's own Radical Construction Grammar does comprise original claims about language structure, but the arguments for these claims are only loosely related to the Cognitive Assumption, and indeed, I shall suggest in section 5 that they are incompatible with it.
Construction grammarians agree in rejecting the distinction between grammar and lexicon. This is certainly an innovation relative to the recent American tradition (though Systemic Functional Grammar has recognised the grammar-lexicon continuum for decades under the term 'lexicogrammar' - Halliday 1961, Halliday 1966). Otherwise, however, Cognitive Grammar and the other versions of construction grammar make assumptions about language structure which are surprisingly conservative considering their radical criticisms of main-stream linguistic theories. It would probably be fair to describe the assumed model of syntax as little more sophisticated than Zwicky's 'plain vanilla syntax' (Zwicky 1985), and more generally the assumed grammatical model is little different from the American structuralism of the 1950s. The aim of this paper is to question some of these assumptions on the grounds that they conflict with the Cognitive Assumption. (There are also good linguistic reasons for questioning them, but these arguments will be incidental to the main thrust of the paper.) The third model of grammar recognised by the Oxford Handbook is Word Grammar. Not all cognitive linguists recognise Word Grammar as a part of cognitive linguistics; for instance, neither of the articles about the other two models mentions Word Grammar, nor is it mentioned at all in the 830 pages of Cognitive Linguistics: An Introduction (Evans and Green 2006), and although Croft and Cruse 2004 mention it once, they do not regard it as an example of cognitive linguistics. However, if the Cognitive Assumption is critical, then Word Grammar definitely belongs to cognitive linguistics. Since the theory's earliest days its publications endorse this assumption in passages such as the following:

... we should assume that there is no difference between linguistic and non-linguistic knowledge, beyond the fact that one is to do with words and the other is not (Hudson 1984:36-7)

To reinforce the link to early cognitive linguistics, this is coupled with an approving reference to Lakoff's view: 'For me, the most interesting results in linguistics would be those showing how language is related to other aspects of our being human' (Lakoff 1977). By 1990, cognitive linguistics existed as such and is cited with approval in the next book about Word Grammar (Hudson 1990:8) in connection with 'cognitivism', one of the theory's main tenets. By 2007 it was possible to write of cognitive linguistics that Word Grammar 'fits very comfortably in this new tradition' (Hudson 2007:2), and in 2010: 'Like other cognitive linguists, I believe that language is very similar to other kinds of thinking' (Hudson 2010:1). On the one hand, then, Word Grammar incorporates the same Cognitive Assumption as other cognitive theories. Moreover, it has been heavily influenced by the work of other cognitive linguists, such as Lakoff's work (mentioned earlier) on prototypes, Langacker's analyses of construal in languages such as Cora (Casad and Langacker 1985), Fillmore's analyses of English lexical fields such as commercial transactions and risk (Fillmore 1982, Fillmore and Atkins 1992) and his joint work on constructions (Fillmore and others 1988, Kay and Fillmore 1999), and Bybee's work on learning (Bybee and Slobin 1982). On the other hand, one of the distinctive characteristics of Word Grammar is its focus on questions of language structure: formal questions about the formal properties of language. Unfortunately, formalism is associated in the literature of cognitive linguistics with Chomskyan linguistics (Taylor 2007:572), and as noted earlier, Cognitive Grammar has positively resisted formalisation as a dangerous distraction.
Measured solely in terms of insights into the formal structure of language, Chomsky has a point when he claims that cognitive linguistics (as he understands it) has achieved 'almost no results' (Chomsky 2011). But there is no inherent incompatibility between the Cognitive Assumption and formalisation. After all, an interest in formal cognitive structures is the hallmark of Artificial Intelligence, and we have already noticed the similarity between Construction Grammar and the very formal theoretical work in HPSG. As in other kinds of linguistics, work is needed on both formal and informal lines, and progress depends on fruitful interaction between the two. This paper focuses on general formal questions about language structure, to argue that the Cognitive Assumption actually leads to quite specific answers which are different from the 'plain vanilla syntax' which is generally taken for granted, and that these answers generally coincide with the claims of Word Grammar (though some revision is needed). The next section analyses the Cognitive Assumption into a number of more specific tenets that are relevant to language, and the following sections apply these assumptions to a number of questions about language structure: the nature of linguistic units, the relations between morphology and syntax, the status of constructions and their relation to dependencies in syntax, the nature of meaning and the ordering of words. The last section draws some general conclusions.

2. The Cognitive Assumption unpacked

If cognition for language shares the properties of the general-purpose cognition that we apply in other domains, the first question for cognitive linguistics is what we know about the latter. Of course, cognitive scientists know a great deal about cognition, so

the immediate question is what they know that might have consequences for the theory of language structure. Most of the following tenets are recognised in any undergraduate course on cognitive psychology or AI as reasonably mainstream views, even if they are also disputed; so I support them by reference to textbooks on cognitive psychology (Reisberg 2007) and AI (Luger and Stubblefield 1993). These tenets are also very familiar to any reader of this journal, so little explanation or justification is needed.

The first relevant tenet of cognitive psychology might be called the learning tenet and consists of a truism:

(2) The learning tenet: We learn concepts from individual experiences, or exemplars.

One conclusion that can be drawn from experimental results is that we learn by building prototype schemas on the remembered exemplars, but without deleting the latter from memory (Reisberg 2007:321); and another is that schemas have statistical properties that reflect the quantity and quality of the experiences on which they are based. Thus it is not just language learning, but all learning, that is usage-based.

The network tenet, also called 'the network notion' (Reisberg 2007:252), is this claim:

(3) The network tenet: The properties of one concept consist of nothing but links to other concepts, rather than features drawn from some separate vocabulary.

In this view, concepts are atoms, not bundles or boxes with an internal structure. A concept is simply the meeting point of a collection of links to other concepts. Consequently, the best way to present an analysis of some area of conceptual structure is by drawing a network for it. Moreover, since a concept's content is carried entirely by its links to other concepts, labels carry no additional information and are simply mnemonics to help researchers to keep track of the network models they build; so in principle, we could remove all the labels from a network without losing any information (Lamb 1998:59).
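The network tenet can be given a concrete form in code. The following Python fragment is purely illustrative (it is not part of Word Grammar's formalism, and the concept names are invented): each concept is an atomic label, and its entire content is a set of labelled links to other concepts.

```python
# The network tenet as data: nothing but atomic nodes and labelled links.
# A concept's "content" is exactly the set of links leaving it.
network = {
    ("dog", "isa"): "animal",
    ("dog", "part"): "tail",
    ("dog", "sound"): "bark",
}

def properties(node):
    """Return every link leaving a node, i.e. its entire conceptual content."""
    return {rel: target for (src, rel), target in network.items() if src == node}
```

Replacing the mnemonic labels ('dog', 'animal') with arbitrary identifiers would leave the link structure, and hence the information content, unchanged, which is Lamb's point about labels.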
An important corollary of the network tenet is that when we learn a new concept, we define it as far as we can in terms of existing concepts. This is such an important idea that we can treat it as a separate tenet:

(4) The recycling tenet: Existing concepts are recycled wherever possible as properties of other concepts.

The recycling tenet explains the often observed fact that social and conceptual networks tend to be 'scale-free', meaning that they have more densely-linked nodes than would be expected if links were distributed randomly (Barabasi 2009). It also has important implications for the formal structure of language which we shall consider in relation to morphology (section 3) and semantics (section 6).

Next, we have the inheritance tenet:

(5) The inheritance tenet: We build taxonomies of concepts linked to one another by the special relation 'isa', which allow generalisations to spread down the taxonomy from more general to more specific concepts.

The isa relation is widely recognised as a basic relation (Reisberg 2007:270), and at least in AI, this process of generalisation is called inheritance (e.g. Luger and Stubblefield 1993:386), so that more specific concepts can inherit properties from more general ones. Many AI researchers accept the need for inheritance to allow exceptions, so that properties are inherited only 'by default'. This logic is variously called 'default inheritance', 'normal inheritance' or 'normal mode inheritance', in contrast with 'complete inheritance' which forbids exceptions. If our basic logic is

default inheritance, this explains one part of the learning tenet above: how we can accommodate both prototypical and exceptional members in the same category.

The fifth relevant claim of cognitive psychology and AI concerns the classification of relations, so we can call it the relation tenet:

(6) The relation tenet: Most relations are themselves concepts which we learn from experience.

On the one hand it is obvious that the links in a network are of different types; for instance, the links from the concept 'dog' to 'animal', 'tail' and 'bark' are fundamentally different from each other. On the other hand, we cannot assume that these different types are all built in, part of our inherited conceptual apparatus (Reisberg 2007:270). A few of them must be built in, because they underlie our most basic logical operations, the clearest example being the isa relation mentioned above; but most of them must be learned from experience just like ordinary concepts. One solution to this dilemma is to recognise these learned relations as a sub-type of concept: 'relation', contrasted with 'entity'.

The sixth tenet that is relevant to language structure is the activation tenet:

(7) The activation tenet: Retrieval is guided by activation spreading from node to node.

Indeed, the strongest evidence for the network tenet is the evidence for spreading activation, notably the evidence from priming experiments and from speech errors (Reisberg 2007:254). In any search, the winner is the most active relevant node, and, all being well, this will also turn out to be the target node. A node's activation level is influenced in part by previous experience (hence the statistical differences between schemas noted earlier) but partly by the immediately current situation to which the person concerned is paying attention.

Finally, we note the chunking tenet:

(8) The chunking tenet: We understand experience by recognising chunks and storing these in memory as distinct concepts (Reisberg 2007:173).
For present purposes, the most important part of this claim is that we create new concepts for new experiences; so given a series of digits to memorize, we create a new concept for each digit-token as well as for each chunk of digits. This is evidence for node-creation on a massive scale, though of course most of the nodes created in this way have a very short life. Those that survive in our memories constitute the exemplars of the learning tenet, and once in memory they are used in future experience as a guide to further node-creation.

These seven elementary tenets of cognitive science are closely interconnected. Take, for example, the temporary exemplar nodes that are created under the chunking tenet. What the network tenet predicts is that these temporary nodes must be part of the conceptual network, since that is all there is in conceptual structure; so (by the recycling tenet) the only way in which we can understand an experience is by linking it to pre-existing nodes of the conceptual network. The activation tenet predicts that these nodes are highly active because they are the focus of attention. Moreover, the inheritance tenet predicts an isa relation between each temporary node and some pre-existing node in the network from which it can inherit properties; this follows because the aim of classifying exemplar nodes is to inherit unobservable properties, and isa is the relation that allows inheritance. As for the exemplar's observable properties, these must also consist of links to pre-existing concepts; but according to the relation tenet, these links must themselves be classified, so we distinguish the very general isa relation from much more specific relational concepts.
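The default-inheritance logic described above can be sketched in a few lines of Python. This is a hypothetical illustration, not an implementation from the Word Grammar literature; the penguin/bird taxonomy is the stock AI example, not the paper's.

```python
# Default inheritance over an isa taxonomy: a concept inherits a property
# from its nearest ancestor unless it overrides the default locally.
isa = {"penguin": "bird", "bird": "animal"}          # the isa taxonomy
props = {
    "bird": {"flies": True, "has_wings": True},      # defaults for birds
    "penguin": {"flies": False},                     # exception: overrides the default
}

def inherit(node, prop):
    """Walk up the isa chain; the most specific stored value wins."""
    while node is not None:
        if prop in props.get(node, {}):
            return props[node][prop]
        node = isa.get(node)                         # move to the more general concept
    return None                                      # nothing to inherit
```

So a penguin fails to fly (its own exceptional value blocks the default) but still inherits wings from 'bird', which is exactly how default inheritance accommodates exceptional members within a category.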

To make these rather abstract ideas more concrete before we turn to specifically linguistic structures, consider a scene in which you see an object and recognise it as a cat. Although recognition is almost instantaneous, it must consist of a series of interconnected steps, with considerable feedback from logically later steps to earlier ones:

- node-creation: you create highly active temporary nodes for all the objects you can see, including not only the cat's parts but also the chunk that will eventually turn out to be a cat.
- classification: you link each temporary node to the most active pre-existing node in the network which classifies it as a paw, a tail, a cat and so on.
- enrichment: you enrich your knowledge of the cat by inheriting from 'cat' through highly active relation nodes; for example, if you want to know whether to stroke it, this is how you guess an answer.

Each of these processes is covered by some of the seven tenets: node-creation by the learning, activation and chunking tenets; classification by the learning, recycling, network, relation and activation tenets; and enrichment by the learning and inheritance tenets. The rest of the paper will explore the consequences of these rather elementary ideas about general cognition for the theory of language structure, following much the same logical route as Langacker's (Langacker 1987) but starting from general cognitive psychology rather than Gestalt psychology and ending up with more specific claims about language structure. Whether or not similar bridges can be built from psychology to other theories of language structure I leave to the experts in those theories.
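The classification step (link the exemplar to the most active pre-existing node that fits it) can be caricatured as follows. The activation values and feature sets are invented for illustration; a realistic model would be far richer.

```python
# Toy model of classification guided by activation: among the stored
# categories whose features cover the observed exemplar, the most active wins.
activation = {"cat": 0.9, "dog": 0.4, "fox": 0.2}    # prior + contextual activation

category_features = {
    "cat": {"tail", "paws", "whiskers"},
    "dog": {"tail", "paws"},
    "fox": {"tail", "paws", "whiskers"},
}

def classify(observed, categories):
    """Attach an exemplar to the most active category that covers its features."""
    candidates = [c for c, feats in categories.items()
                  if observed <= feats]               # category must fit the exemplar
    return max(candidates, key=lambda c: activation[c], default=None)
```

An exemplar with a tail, paws and whiskers fits both 'cat' and 'fox', but 'cat' wins because it is the more active node, which is the sense in which retrieval is guided by activation rather than by fit alone.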

3. Morphology and syntax

One of the issues that divides grammatical theories concerns the fundamental question of how the patterning of language can be divided into 'levels' or 'strata', such as phonology, grammar and semantics. The basic idea of recognising distinct levels is uncontroversial: to take an extreme case, everyone accepts that a phonological analysis is different from a semantic analysis. Each level is autonomous in the sense that it has its own units (consonants, vowels, syllables and so on for phonology, and people, events and so on for semantics) and its own organisation; but of course they are also closely related so that variation on one level can be related in detail to variation on the other by correspondence or realisation rules. A much more controversial question concerns the number of levels that should be distinguished between phonology and semantics. Two answers dominate cognitive linguistics: none (Cognitive Grammar), and one (construction grammar). Cognitive Grammar recognises only phonology, semantics and the symbolic links between them (Langacker 2007:427), while construction grammar recognises only constructions, which constitute a single continuum which includes all the units of syntax, the lexicon and morphology (Evans and Green 2006:753, Croft and Cruse 2004:255). In contrast, many theoretical linguists distinguish two intervening levels: morphology and syntax (Aronoff 1994, Sadock 1991, Stump 2001); and even within cognitive linguistics this view is represented by both Neurocognitive Linguistics (Lamb 1998) and Word Grammar. I shall now argue that the Cognitive Assumption in (1) supports the distinction between morphology and syntax.

Consider first the chunking tenet (8). This effectively rules out reductionist analyses which reduce the units of analysis too far; so a reductionist analysis of the sequence 19452012 recognises nothing but the individual digits, ignoring the significant dates embedded in it (1945, 2012). Psychological memory experiments have shown that human subjects are looking for significant higher units and it is these units, rather than the objective string, that they recognise in memory experiments. The higher units are the chunks which we actively seek, and which create a higher level of representation by 'representational redescription' (Karmiloff-Smith 1992, Karmiloff-Smith 1994). A chunk based on a collection of lower units is not evidence against the cognitive reality of these units, but coexists with them in a hierarchy of increasingly abstract levels of representation; so the individual digits of 1945 are still part of the analysis, but the sequence is not merely a sequence of digits: it is more than the sum of its parts. Chunking is really just one consequence of the recycling tenet (4), which says that a concept's properties are represented mentally by links between that concept and other existing concepts. Rather obviously, this is only possible if the concepts concerned are represented by single mental nodes (even if these nodes are themselves represented at a more concrete level by large neural networks). To pursue the previous example, I must have a single mental node for 1945 because otherwise I couldn't link it to the concept 'date' or to events such as the end of WWII. But once again it is important to bear in mind that the existence of the concept 1945 does not undermine the mental reality of the concepts 1, 9 and so on, which are included among its properties.
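The chunk hierarchy for the digit sequence can be sketched as data. The point of this purely illustrative Python fragment is that a chunk is a node in its own right, with part links down to the units it recycles, while those units remain real nodes at a lower level of representation.

```python
# Each chunk is a node of its own; its parts are links to lower-level nodes.
# The digits are not replaced by the chunk: they coexist with it.
chunks = {
    "19452012": ["1945", "2012"],       # the remembered sequence
    "1945": ["1", "9", "4", "5"],       # a significant date, itself a chunk
    "2012": ["2", "0", "1", "2"],
}

def flatten(node):
    """Expand a chunk down to its bottom-level units."""
    if node not in chunks:
        return [node]                   # a bottom-level unit (a digit)
    return [unit for part in chunks[node] for unit in flatten(part)]
```

Flattening 19452012 recovers exactly the eight digits, but the intermediate nodes 1945 and 2012 are what carry the links to concepts like 'date' or 'the end of WWII'.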
Bearing chunking in mind, consider now Langacker's claim that language consists of nothing but phonology, semantics and the links between them:

..., every linguistic unit has both a semantic and phonological pole... Semantic units are those that only have a semantic pole ... phonological units ... have only a phonological pole. A symbolic unit has both a semantic and a phonological pole, consisting in the symbolic linkage between the two. These three types of unit are the minimum needed for language to fulfill its symbolic function. A central claim ... is that only these are necessary. Cognitive Grammar maintains that a language is fully describable in terms of semantic structures, phonological structures, and symbolic links between them. (Langacker 2007:427)

This claim ignores the effects of chunking. Take a simple example such as cat, pronounced /kat/. True, this is phonology, but it is not mere phonology. The symbolic unit needs a single phonological pole, and in this simple case we might assume this is the syllable /kat/ (a single phonological unit), but meaning-bearing units need not correspond to single phonological units. Apart from the obvious challenge of polysyllabic words, the literature of linguistics is full of examples where phonological boundaries are at odds with word boundaries, so that (unlike the case of cat) words cannot be identified as a single phonological unit. An extreme case of this mismatch arises in cliticization; for example, the sentence You're late clearly contains a verb, written as 're, and has the same meaning and syntax as You are late. And yet, in a non-rhotic pronunciation there is no single phonological unit that could be identified as the phonological pole of the symbolic unit corresponding to are. The word you're sounds different from you, but, given the absence of /r/, the difference lies entirely in the quality of the vowel.
No doubt the phonological analysis could be manipulated to provide a single unit, such as an abstract feature, but this would be to miss the point, which is that phonological analysis pays attention to completely different qualities from grammatical analysis. In short, chunking creates mental nodes which are only indirectly related to phonological structure; so we must recognise units corresponding not only to syllables such as /kat/, but also to sequences of syllables such as /katəgri/ (category) or parts of syllables such as the vowel quality of /jɔː/ (you're). These more abstract units, like the dates discussed earlier, are mapped onto more concrete units by properties; so just as 1945 is mapped onto 1 and so on, symbolic units can be mapped onto phonological structures such as /kat/, /katəgri/ or /jɔː/. But this is very different from saying that the chunk is a phonological structure. In fact, this view is just the same as the traditional view that words mediate between phonology and meaning. The chunk is the word, and it exists precisely because it serves this mediating function: it brings together a recognisable stretch of phonology and a recognisable bit of meaning which systematically cooccur. And it brings sound and meaning together by having both of them as its properties; so in traditional terms, the word cat is pronounced /kat/ and means 'cat'. But of course once we have recognised the category of words, and allowed them to have two properties (a pronunciation and a meaning), there is nothing to stop us from treating them just like any other type of concept, with multiple properties. So just as linguists have been saying for centuries, words can be categorized (as nouns, verbs and so on), words have a language (such as English or French), words have a spelling, they can be classified as naughty or high-falutin', and they even have a history and an etymology. This view of words as collecting points for multiple properties is an automatic consequence of chunking combined with the network principle. None of this is possible if a symbolic unit is simply a link between a semantic pole and a phonological pole.
A link is a property, but it cannot itself have properties; so even if there is a link between /kat/ and 'cat', it cannot be extended to include, say, a spelling link; the only way to accommodate spelling in this model would be to add another symbolic link between 'cat' and the spelling <cat>, as though /kat/ and <cat> were mere synonyms. In contrast, recognising words allows us to treat spellings and pronunciations, correctly, as alternative realisations of the same unit. Moreover, if meanings and pronunciations belong to a much larger set of properties, we can expect other properties to be more important in defining some words or word classes. Rather surprisingly, perhaps, Langacker himself accepts that some grammatical classes may have no semantic pole:

At the other end of the scale [from basic grammatical classes] are idiosyncratic classes reflecting a single language-specific phenomenon (e.g. the class of verbs instantiating a particular minor pattern of past-tense formation). Semantically the members of such a class may be totally arbitrary. (Langacker 2007:439)

Presumably such a class is a symbolic unit (albeit a schematic one), but how can a symbolic unit have no semantic pole, given that a unit without a semantic pole is by definition a phonological unit (ibid:427)? My conclusion, therefore, is that Langacker's symbolic units are much more than a mere link between meaning and sound: they are units with properties (concepts), defined like all other concepts by their links to other concepts. Following the network tenet that structures can be represented as networks of atomic nodes, this conclusion can be presented as a rejection of the first diagram in Figure 1 in favour of the second:

[Figure 1 here: on the left, a bare symbolic link connecting the meaning 'cat' directly to /kat/; on the right, a word node CAT with separate links to the meaning 'cat' and to /kat/.]

Figure 1: Symbolic units as concepts rather than links

Suppose, then, that symbolic units are like the unit labelled CAT. If so, they are indistinguishable from constructions as defined by construction grammar. This brings us to the second issue of this section: how well does the notion of construction accommodate the traditional distinction between morphology and syntax? As mentioned earlier, one of the claims of construction grammar is that constructions comprise a single continuum which includes all the units of syntax, the lexicon and morphology. Croft describes this as one of the fundamental hypotheses of construction grammar: 'there is a uniform representation of all grammatical knowledge in the speaker's mind in the form of generalized constructions' (Croft 2007:471). According to Croft, the syntax-lexicon continuum includes syntax, idioms, morphology, syntactic categories and words. This generalisation applies to Cognitive Grammar as well as construction grammar: 'the only difference between morphology and syntax resides in whether the composite phonological structure ... is smaller or larger than a word' (Langacker 1990:293). Like other cognitive linguists I fully accept the idea of a continuum of generality from very general grammar to a very specific lexicon; but the claim that syntax and morphology are part of the same continuum is a different matter. To make it clear what the issue is, here is a diagram showing an analysis of the sentence Cats miaow in the spirit of construction grammar. The diagram uses the standard box notation of construction grammar, but departs from it trivially in showing meaning above form, rather than the other way round.

[Figure 2 here: nested boxes pairing each form with its meaning: the sentence cats miaow, its parts cats (in turn containing cat and the plural 's') and miaow.]

Figure 2: Cats miaow: a construction analysis in box notation

We must be careful not to give too much importance to matters of mere notation. According to the network tenet (3), all conceptual knowledge can be represented as a network consisting of nothing but links and atomic nodes; this rules out networks whose nodes are boxes with internal structure, such as the one in Figure 2. However, a box is just a notation for part-whole relations, so it will be helpful to remind ourselves of this by translating this diagram into pure-network notation, with straight arrows pointing from wholes to their parts. Bearing this convention in mind, the next figure presents exactly the same analysis as Figure 2:

[Figure 3 here: the same analysis in network notation: atomic nodes for cats miaow, cats, cat, the plural 's' and miaow, with arrows from wholes to their parts and vertical links from each form to its meaning.]

Figure 3: Cats miaow: a construction analysis in network notation

As before, the vertical lines show the symbolic link between each form and its meaning, and the diagram illustrates Croft's generalisation about constructions providing a single homogeneous analysis for the whole of grammar, including the morphology inside cats as well as the syntax that links cats to miaow. His claim fits well with the 'plain vanilla' American Structuralist tradition in which morphology is

simply synta" within the word a tradition represented not only by pre-Chomskyans $Harris 0230, Hockett 0234% but also by Chomsky himself in his famous analysis of English au"iliaries as two morphemes $e.g. beKing%, and by =istributed (orphology $Halle and (arant 022:%. 7ut how well does it mesh with the Cognitive 'ssumption $and indeed, with the linguistic facts%1 & shall now present an obCection to it based on the prevalence of homonymy. !he argument from homonymy is based on the recycling tenet $5% and goes like this. #hen we first meet a homonym of a word that we already know, we dont treat it as a completely unfamiliar word because we do know its form, even though we dont know its meaning. ,or instance, if we already know the adCective B)F.= $as in a round table%, its form is already stored as a morph a form which is on a higher abstraction level than phonology@ so when we hear o round the corner, we recognise this form, but find that the e"pected meaning doesnt fit the conte"t. 's a result, when creating a new word-concept for the preposition B)F.= we are not starting from scratch. 'll we have to do is to link the e"isting form to a new word. 7ut that means that the e"isting form must be conceptually distinct from the word in other words, the morph LroundM is different from the words that we might write as B)F.=adC and B)F.=prep. #herever homonymy occurs, the same argument must apply* the normal processes of learning force us to start by recognising a familiar form, which we must then map onto a new word, thereby reinforcing a structural distinction between the two levels of morphology $for forms% and synta" $for words%, both of which are different from phonology and semantics. !he proposed analysis for the homonymy of round is sketched in ,igure 5. semantics

[Figure 4 shows four levels: the meanings 'rotund' and 'around' (semantics) are linked to the words ROUNDadj and ROUNDprep (syntax); both words are realized by the single morph {round} (morphology), which is in turn realized by the phonological segments /r/, /au/, /n/, /d/ (phonology).]

Figure 4: Homonymy and the levels of language
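The four-level analysis of Figure 4 lends itself to an explicit network encoding. The following is a minimal sketch, not part of Word Grammar's own formalism: nodes are plain strings, links are labelled, and the node and relation names are illustrative assumptions of mine.

```python
# A sketch of Figure 4 as a network of atomic nodes and labelled links.
# Node names (ROUND/adj, {round}, etc.) are illustrative assumptions.
from collections import defaultdict

links = defaultdict(set)          # (node, relation) -> set of related nodes

def add(node, relation, target):
    links[(node, relation)].add(target)

# Semantics: two unrelated meanings.
add("ROUND/adj", "meaning", "'rotund'")
add("ROUND/prep", "meaning", "'around'")

# Syntax-to-morphology: both words are realized by the SAME stored morph,
# which is the heart of the homonymy argument.
add("ROUND/adj", "realization", "{round}")
add("ROUND/prep", "realization", "{round}")

# Morphology-to-phonology: the morph is realized by a string of segments.
add("{round}", "realization", ("/r/", "/au/", "/n/", "/d/"))

def realized_by(form):
    """All words whose stored form is `form` (the inverse of 'realization')."""
    return {node for (node, rel), targets in links.items()
            if rel == "realization" and form in targets}

# One shared form, two distinct words: learning the preposition only
# requires linking the existing form to a new word node.
print(sorted(realized_by("{round}")))   # both homonyms
```

The point of the sketch is simply that homonymy falls out as two word nodes sharing one realization target, with no part-whole relation between word and form.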

It might be thought that this conclusion could be avoided simply by linking the form {round} directly to its two different meanings. But this won't do for purely linguistic reasons: the different meanings are also associated with very different syntactic properties, so distinct concepts are needed to show these correlations. The problem is especially acute in the case of bilingual individuals, who may link the same meaning to words from different languages, where the syntactic and morphological differences are even greater. The fact is that round is not mere phonology, because it is a listed, and recognised, English word, i.e. a word-form. But this word-form is actually shared by (at least) two distinct words, each with its different syntactic properties and each distinct from the meaning that it expresses. In short, we need to distinguish the forms of morphology from the words of syntax. Moreover, the relation between a word and its form is different from the part-whole relations that are recognised within both morphology and syntax. It makes no sense to say that {round} is a part of ROUND, for the simple reason that, if we can measure size at all, they are both the same size, at least in the sense that they both map onto the same number of phonological segments. Rather, the relation between form and word is the traditional relation called realization: the form realizes the word (by making it more 'real', or more concrete).

The psychological reality of morphological form is evident in another area of language-learning: folk etymology, where we try to force a new form into an existing pattern, regardless of evidence to the contrary. For instance, our word bridegroom developed out of Old English bryd-guma when the word guma 'man' fell out of use, leaving the guma form isolated. The form groom had the wrong meaning as well as the wrong phonology, but it had the great virtue of familiarity: it already existed in everybody's mind as an independent concept, so it was pressed into service (recycled) for lack of a better alternative.
The fact that folk etymology has affected so much of our vocabulary, and continues to do so, is clear evidence of our enthusiasm for recycling existing forms, so that every word consists of familiar forms, regardless of whether this analysis helps to explain its meaning.

This conclusion about morphology and syntax undermines the main distinctive claim of construction grammar. According to the construction-based analysis of Cats miaow in Figure 3, the relation of the morphs cat and s to the word cats is just the same as the latter's relation to the entire sentence, namely a part-whole relation. But if the argument from homonymy is correct, the morphs {cat} and {s} exist on a different level of analysis from the word cats (which we might write, to avoid ambiguity, as CAT:plural); and the morphs' relation to the word is different from the word's relation to the rest of the sentence. In short, grammatical structure within the word is not merely a downwards extension of grammatical structure above the word; as morphologists have often pointed out since Robins (Robins 2001), morphology is not just syntax within the word.

It could be objected that languages vary much more than we might expect (if morphology and syntax are fundamentally different) in the sheer amount of morphology that they have, with analytical languages such as Vietnamese showing virtually none. How can a language have no inflectional morphology at all if morphology is a logical necessity in any language? This objection misses the point of the argument, which made no mention of the morphological processes or patterns that we associate with inflectional morphology (or, for that matter, with derivational morphology). The only claim is that if a language has homonyms, then it will also distinguish words from the forms that realize them, however simple or complex those realization relations may be.
We can push the argument a step further by questioning the constructional claim that every unit of grammar has a meaning. This claim is explicit in Figure 3, where the form s means 'plural'. In contrast, the two-level analysis of Figure 4 has no direct link between the morph {round} and either meaning, and a similar analysis of cats would only link {s} to 'plural' via the word CAT:plural. This separation of form from meaning follows from the homonymy argument: if meanings are correlated (as they are) with syntactic behaviour, they must be a property of words, not forms; so homonyms must be words that share the same form, not meanings that (directly) share the same form.

So far, then, the argument seems to support a rather traditional four-level analysis of language in which semantic structures relate to words, which relate to morphs, which relate to phones (or whatever the units of phonological structure may be). However, one of the weaknesses of the traditional view of language architecture is its restrictiveness: it not only claims that meanings are related to words, but also that they cannot be related to morphs or to phones. This is psychologically very implausible, for the simple reason that we learn by recording correlated patterns; so if a morph correlates with a particular meaning, there is nothing to prevent us from learning the correlation as a link between the two. Thus if the morph {un} correlates with the meaning 'not', it seems likely that we will record this link in addition to any more specific link that we may establish between 'not' and words such as UNTIDY. There is no reason to believe that we avoid redundancy; in fact, redundancy is to be expected in any adaptive system, so we can assume considerable direct linkage between morphs and meanings. Moreover, one of the continuing embarrassments for the traditional model has always been the existence of phonesthemes and other kinds of sound symbolism, such as the meaning 'indolence or carelessness' shared by words such as slack, slattern, sleazy, slob and slut (Reay 2006). Such examples seem to show a direct connection between phonological patterns and meanings, a connection which is even more obvious in the case of intonation, where neither morphs nor words are available to mediate the link to meaning.
The conclusion of this argument, then, is that homonymy automatically pushes learners to create separate mental representations for morphs and for words. Typically, it is words that are directly linked to meanings, while morphs only have meanings indirectly, by virtue of the words that they help to realize, and phones are even more indirectly related to meanings, via morphs and words; but exceptions can be found in which morphs, or even phones, have meanings. This is a very different model from construction grammar, in which a construction is by definition a pairing of a meaning with a form, and words and morphs co-exist on the same level as forms. Nor, on the other hand, is it quite the same as published versions of Word Grammar (Hudson 2007, Hudson 2010), which all assume that only words can have meanings.

Encouragingly, the argument from homonymy takes us to exactly the same conclusion that many linguists reached in the 1950s by simply looking at the facts of language. In those days the choice was between the American Structuralist approach called Item and Arrangement (with its process-based alternative called Item and Process) and the European approach called Word and Paradigm (Robins 2001). The argument centred on the place of the word in grammatical analysis, with the Americans tending to deny it any special place and the Europeans making it central; for the Americans, the grammar within the word was simply a downwards extension of syntax, whereas for the Europeans it was different. The European argument centred on languages such as Latin, in which it is very easy to show that morphs have no direct or simple relation to meanings; for example, in the Latin verb amo 'I love', the suffix {o} expresses person, number and tense all bundled up together. The conclusion is that, unlike words, morphs have no meaning in themselves; instead, they help to realize word-classes (such as 'first person singular present tense'), and it is these that have meaning. Of course this is no longer a debate between Europe and America, as many leading American morphologists accept the same conclusion (Aronoff 1994, Sadock 1991, Stump 2001); but it is very noticeable that the literature of construction grammar follows the American Structuralist path rather than engaging with the debate.

Equally encouragingly, the separation of phonology, morphology and syntax is confirmed by psycholinguistic priming experiments, which offer a tried and tested way to explore mental network structures (Reisberg 2007:257). In a priming experiment, the dependent variable is the time (in milliseconds) taken by a person to recognise a word that appears on a screen in front of them, and to make some decision about it which depends on this recognition; for example, as soon as the word doctor appears, the subject may have to decide whether or not it is an English word and then press one of two buttons. The independent variable is the immediately preceding word, which may or may not be related to the current word; for instance, doctor might follow nurse (related) or lorry (unrelated). What such experiments show is that a related word 'primes' the current word by making its retrieval easier, and therefore faster; and crucially, they allow us to distinguish the effects of different kinds of priming:

- semantic (nurse - doctor: Bolte and Coenen 2002, Hutchison 2003)
- lexical (doctor - doctor: Marslen-Wilson 2006)
- morphological (factor - doctor: Frost and others 2000)
- phonological (nurse - verse: Frost and others 2003)

Once again, therefore, we have evidence for a three-level analysis which breaks down the general notion of 'form' into three kinds of structure, each with its own distinctive units: phonology, morphology and syntax (whose units are words). This is the architecture claimed in Word Grammar (and many other theories of grammar), but it conflicts with construction grammar, and even more so with Cognitive Grammar.
One possible objection to this line of argument is that it could be the thin end of a very large wedge which would ultimately reduce the argument to absurdity. Why stop at just the three formal levels of phonology, morphology and syntax? Why not recognise four or five levels, or, for that matter, forty or fifty? This question requires a clear understanding of how levels are differentiated: in terms of abstractness, and not in terms of either size or specificity. For instance, the only difference between the word DOG, the morph {dog} and the corresponding syllable is their abstractness, because they all have the same size and the same degree of specificity. DOG has properties such as its meaning and its word-class which are more abstract than those of {dog}, which in turn are more abstract than those of the syllable. The question, therefore, is whether three levels of abstractness is in some way natural or normal. In relation to English, might we argue for further levels? Are there other languages where the evidence points to many more levels? Maybe, but it seems unlikely, given that we already have evidence from massively complicated morphological systems that a single level of morphology is enough even for them (Sadock 1991).

4. Phrases or dependencies

Another structural issue which has received very little attention in the cognitive linguistics literature is the choice between two different views of syntax. Once again, cognitive linguistics is firmly located in the tradition of American Structuralism rather than in its European rival. The American tradition dates from the introduction of immediate constituent analysis, which inspired modern Phrase Structure Grammar (PSG; Bloomfield 1933, Percival 1976). In contrast, the European tradition of Dependency Grammar (DG) dates back a great deal further, certainly at least to the Arabic grammarians of the 8th century (Percival 1990, Owens 1988), and provided the basis for a great deal of school grammar on both sides of the Atlantic. Among the theories of grammar that are aligned with cognitive linguistics, Word Grammar is the only one that even considers Dependency Grammar as a possible basis for syntax.

The essential difference between the two approaches lies in their assumptions about what relations can be recognised in syntax. For PSG, the only possibility is the very elementary relation of meronymy: the relation between a whole and its parts. This restriction follows automatically from the definition of a phrase-structure tree as equivalent to a bracketed string. Brackets are very elementary devices whose purpose is simply to distinguish parts from non-parts; for instance, the brackets in a [b c] show that b and c are both parts of the larger unit b c, but a is not. A bracketed string is inherently incapable of giving information about the relations between parts and non-parts (e.g. between a and b). In contrast, DG focuses on the relations between individual words, recognising traditional relations such as subject, complement and modifier. To take a very simple example, consider Small birds sing. PSG recognises the phrase small birds as well as the whole sentence, but it recognises no direct relation at all between small and birds, or between birds and sing. In contrast, the individual words and their direct relations are all that DG recognises, although the phrase small birds is implicit in the link between small and birds. Of course, there are versions of PSG which include the traditional relations as an enrichment of the basic part-whole relation, so that small birds is recognised explicitly as the subject of the sentence.
This is true of Lexical Functional Grammar and Head-Driven Phrase Structure Grammar, as well as of other functional theories such as Relational Grammar and Systemic Functional Grammar. More relevantly here, it is also a characteristic of construction grammar (except Radical Construction Grammar). But this compromise should not obscure the fact that the relations concerned are still basically part-whole relations. All these versions of PSG, including those found in cognitive linguistics, still exclude direct relations between words, such as those between small and birds and between birds and sing.

To clarify the issues it will again help to consider concrete diagrammatic structures, so here are three diagrams for the syntactic structure of Small birds sing. (A) is pure PSG, without function labels; (B) is a compromise analysis which is at least within the spirit of construction grammar in terms of the information conveyed, even if tree notation is not popular in the cognitive linguistics literature; and (C) is an example of DG enriched with function labels. The stemma notation in (C) was introduced by the first DG theorist (Tesnière 1959) and is widely used in DG circles. It has the advantage of comparability with the tree notation of PSG, but I shall suggest a better notation for dependencies below. All three diagrams show syntax without semantics, but this is simply because the present topic is syntactic relations. This is also why the nodes are unclassified.

[Figure 5 shows three diagrams for Small birds sing: (A) a pure PSG tree in which small birds and sing combine into the sentence; (B) the same tree enriched with the function labels subject (on small birds) and modifier (on small); (C) a DG stemma with sing at the top, birds below it labelled subject, and small below birds labelled modifier.]

Figure 5: Phrase structure or dependency structure

The issue can be put in concrete terms as two related questions about the example Small birds fly: what is the subject, and what is it the subject of? In the European dependency tradition, the subject is birds, and it is the subject of fly. In contrast, the American PSG tradition takes small birds as the subject of the sentence Small birds fly. The PSG tradition has had such an impact on grammatical theory that most of its followers take it as obviously true, but it has serious weaknesses, particularly if we start from the Cognitive Assumption, so it is especially problematic for cognitive linguistics. This is already recognised in Cognitive Grammar: 'Symbolic assemblies exhibit constituency when a composite structure ... also functions as component structure at another level of organization. In Cognitive Grammar, however, grammatical constituency is seen as being variable, nonessential and nonfundamental.' (Langacker 2007:442)

I shall start with the specifically cognitive weaknesses, before turning to more familiar, purely linguistic ones. From a cognitive point of view, PSG has two weaknesses, to do with relations and with tokens. The first weakness is the extreme poverty of its assumptions about possible relations, in contrast with the assumption, accepted by almost every psychologist, that cognitive structure is an associative network (Ferreira 2005). By excluding all relations except that between a whole and its parts, it excludes a great deal of normal cognitive structure, not least the whole of social structure, based as it is entirely on relations such as 'mother' and 'friend'. The relation tenet (6) asserts that relations of many different types can be learned and distinguished from one another, so there is no reason to prioritise the part-whole relation. And if other relations are possible, then we can freely relate any object to any other object.
So just as we can relate one human to another in our social world, in the realm of syntax we can relate one word directly to another. Although Bloomfieldian Immediate Constituent Analysis had roots in German psychology, Chomsky's formalisation in terms of bracketed strings and rewrite rules removed any semblance of psychological plausibility.

The second cognitive weakness of PSG is its assumption about tokens, i.e. about the sentences that a grammar generates. The symbols in a sentence structure are seen as mere copies of those used in the grammar; so if the sentence structure contains, say, the symbol N (for noun), this is the same symbol as the N in the grammar, and the symbol birds in the sentence is the same as the one in the grammar. Once again we find a very impoverished view of cognitive capacity, in which the only operation that we can perform is to copy symbols and structures from the grammar onto the sentence structure. In contrast, the chunking tenet (8) says that when we encounter a new experience, we build a new node for it, and use relevant parts of our stored knowledge to enrich this node. In this view, the nodes for tokens are much more than mere copies of those in the grammar for the relevant types; although the tokens inherit most of the properties of the types, they also have a great many other properties, reflecting the particularities of their context. If the sentence contains the word birds, this is not just the stored word birds, less still the lexeme BIRD or the category 'plural'; instead, it is a concept distinct from all of these, with properties that include the other words in the sentence.

Suppose, then, that we take the relation and chunking tenets seriously. What mental structure would we expect for the sentence Small birds sing? First, small and birds are clearly related, so we expect a direct relation between these words; equally clearly, the relation between small and birds is asymmetrical, because small modifies the meaning of birds rather than the other way round. In traditional terminology, small is the modifier and depends on birds, which is the implied phrase's head. But if the meaning of birds is modified by small, it follows that birds (i.e. this particular token of birds) does not mean 'birds' but 'small birds'; so there is no need to postulate a larger unit small birds to carry this meaning. In other words, the meaning of a phrase is carried by its head word, so the phrase itself is redundant.
Furthermore, the link between small and birds explains why they have to be positioned next to each other and in that order. We return to questions of word order in section 7, but we can already see that direct word-word links will play an important part in explaining word order.

Similar arguments apply to the subject link. Once again, this is a direct asymmetrical link between two single word-tokens, birds (not small birds) and sing, bearing in mind that birds actually means 'small birds'. As in the first example, the dependent modifies the head's meaning, though in this case the kind of modification is controlled by the specifics of the dependency, traditionally called 'subject'. Given, then, that birds is the subject of sing, and that birds means 'small birds', it follows that the word-token sing doesn't just mean 'sing', but 'small birds sing'. So once again, the head word (sing) carries the meaning of the whole phrase, and no separate phrasal node is needed. In this case, the phrase is the entire sentence, so we conclude that the sentence, as such, is not needed. The proposed analysis is presented in Figure 6.

[Figure 6 shows the stemma of diagram (C) with meaning links added: sing is linked to the meaning 'small birds sing', and birds, its subject, to the meaning 'small birds', with small as the modifier of birds.]
Figure 6: Small birds sing: syntax and semantics

This diagram includes the earlier stemma notation for syntactic relations alongside a more obvious notation for the meaning relation. This is inconsistent, as the meaning relation and the syntactic dependency relations look more different than they should. After all, part of the argument above in favour of dependency analysis is that our conceptual apparatus already contains plenty of relations of various kinds in domains outside language, so the same apparatus can be applied to syntax. The trouble with stemma notation is that it is specially designed for syntax, so it obscures similarities to other domains. In contrast, the notation for the meaning relation is simply a general-purpose notation for relations as defined by the relation tenet (6). This general-purpose notation works fine for syntax, and indeed avoids problems raised by the rather rigid stemma notation, so we now move to the standard Word Grammar syntactic notation, as in Figure 7, with arrows pointing from a word to its dependents.

[Figure 7 shows the Word Grammar notation for Small birds sing: an arrow labelled 'subject' points from sing to birds, an arrow labelled 'modifier' points from birds to small, and 'meaning' links connect birds to 'small birds' and sing to 'small birds sing'.]

Figure 7: Small birds sing: Word Grammar notation
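The claim that a head word carries the meaning of its implied phrase can be made concrete in code. The sketch below is an illustrative toy, not Word Grammar machinery: the relation labels follow Figure 7, and the recursive meaning function is my own assumption about how head-carried meanings could be computed.

```python
# Dependency analysis of "Small birds sing": each dependency is a labelled
# link between two word TOKENS, and the meaning of a phrase is carried by
# its head word, so no phrasal nodes are needed.
tokens = ["Small", "birds", "sing"]

# (dependent, relation, head): the only syntactic relations in the analysis.
dependencies = [
    ("Small", "modifier", "birds"),
    ("birds", "subject", "sing"),
]

def dependents_of(head):
    return [dep for dep, _, h in dependencies if h == head]

def phrase_meaning(word):
    """The meaning carried by a head word: itself plus, recursively, the
    contributions of its dependents, returned in sentence order."""
    words = [word] + [w for dep in dependents_of(word)
                      for w in phrase_meaning(dep)]
    return sorted(words, key=tokens.index)

# The token birds doesn't just mean 'birds' but 'small birds', and the token
# sing carries the meaning of the whole sentence.
print(phrase_meaning("birds"))   # ['Small', 'birds']
print(phrase_meaning("sing"))    # ['Small', 'birds', 'sing']
```

Note that the phrase small birds is never represented as a node; it is implicit in the modifier link, exactly as the prose argues.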

This notation should appeal to cognitive linguists looking for an analysis which treats syntax as an example of ordinary conceptual structure. After all, it makes syntax look like a network, which (according to the network tenet (3)) is what ordinary conceptual structure is like; in contrast, the notation used for syntax in the cognitive linguistics literature is hard to relate to general conceptual structure. Admittedly, the structure in Figure 7 may not look obviously like a network, but more complicated examples certainly do. The sentence in Figure 8 makes the point, since every arrow in this diagram is needed in order to show well-known syntactic patterns such as complementation, raising or extraction. This is not the place to justify the analysis (Hudson 2007, Hudson 2010), but it is worth pointing to one feature: the mutual dependency between which and do. If this is indeed the correct analysis, then the case for dependency structure is proven, because mutual dependency is impossible in PSG. (And even assuming dependency structure, it is impossible in stemma notation.) In the diagram, 's' and 'c' stand for subject and complement. The notational convention of drawing some arrows above the words and others below will be explained in the discussion of word order in section 7.

[Figure 8 shows the dependency network for Which birds do you think sing best?, with arrows labelled extractee, predicative, subject (s), complement (c), object and adjunct linking the words, including the mutual dependency between which and do.]

Figure 8: A complex syntactic network
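The structural point about mutual dependency can be checked mechanically. The dependency list below is a rough, partial reconstruction of the figure's analysis (my own assumption, not the paper's full set of arrows), and the tree test is a standard graph check: a dependency analysis is a tree only if head links never loop back.

```python
# A network with the mutual dependency between 'which' and 'do' cannot be a
# tree, so it cannot be encoded as a phrase-structure bracketing.
# Triples are (dependent, relation, head); an illustrative reconstruction.
dependencies = {
    ("which", "extractee", "do"),    # 'which' depends on 'do' ...
    ("do", "complement", "which"),   # ... and 'do' depends on 'which'
    ("birds", "complement", "which"),
    ("you", "subject", "do"),
    ("think", "complement", "do"),
    ("sing", "complement", "think"),
    ("best", "adjunct", "sing"),
}

def is_tree(deps):
    """True only if no word has two heads and no chain of head links
    ever returns to its starting word (i.e. no cycles)."""
    heads = {}
    for dep, _, head in deps:
        if dep in heads:             # a second head -> not a tree
            return False
        heads[dep] = head
    for word in heads:               # follow head links, watching for cycles
        seen, w = set(), word
        while w in heads:
            if w in seen:
                return False
            seen.add(w)
            w = heads[w]
    return True

print(is_tree(dependencies))   # False: the which/do cycle makes it a network
```

Removing either half of the mutual dependency restores a tree, which is why the which/do link alone is enough to decide between the two architectures.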

Another way to bring out the network-like nature of syntactic structure is to go beyond the structure of the current tokens in order to show how this structure is derived from the underlying grammar. Returning to the simpler example, Small birds sing, we know that birds can be the subject of sing because the latter requires a subject, a fact which is inherited from the 'verb' node in the grammar; and similarly, small can be the dependent of birds because it is an adjective, and adjectives are allowed to modify nouns. The word tokens inherit these properties precisely because they are part of the grammar, thanks to isa links to selected word types. This is all as predicted by the inheritance tenet (5), and can be visualised in the diagram below, where the small triangle is the Word Grammar notation for the isa relation. In words: because the token small isa the lexeme SMALL, which isa adjective, and because an adjective is typically the modifier of a noun (represented by the left-hand dot, which isa noun), it can be predicted (by inheritance) that small is also the modifier of a noun, which (after some processing) must be birds. A similar logic explains the subject link from sing to birds. The main point is that the syntactic structure for the sentence is a small part of a much larger network, so it must itself be a network.

[Figure 9 shows the tokens small, birds and sing linked by isa triangles to the lexemes SMALL, BIRD (with Plural) and SING, which isa adjective, noun and verb respectively; in the grammar, a modifier arrow links adjective to a dot which isa noun, and a subject arrow links verb to a dot which isa noun.]
Figure 9: Small birds sing and its underlying grammar

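The inheritance step can be sketched as a walk up the isa chain. The node names and stored properties below are illustrative assumptions of mine; the setdefault line implements the 'default' part of default inheritance, letting more specific nodes override more general ones.

```python
# Word tokens inherit properties through isa links, as in Figure 9.
# All node names and properties are illustrative assumptions.
isa = {                      # node -> the node it 'isa'
    "small": "SMALL", "SMALL": "adjective",
    "birds": "BIRD:plural", "BIRD:plural": "BIRD", "BIRD": "noun",
    "sing": "SING", "SING": "verb",
}

properties = {               # properties stored directly on each node
    "adjective": {"modifies": "noun"},
    "verb": {"has-subject": "noun"},
    "BIRD": {"meaning": "'bird'"},
}

def inherit(node):
    """Collect properties up the isa chain; nearer nodes take priority."""
    collected = {}
    while node is not None:
        for key, value in properties.get(node, {}).items():
            collected.setdefault(key, value)   # specifics are not overridden
        node = isa.get(node)
    return collected

# The token 'small' inherits, via SMALL and 'adjective', the fact that it
# modifies a noun, which (after some processing) must be 'birds'.
print(inherit("small"))   # {'modifies': 'noun'}
print(inherit("sing"))    # {'has-subject': 'noun'}
```

The token nodes themselves store nothing; everything they need comes from the grammar network they are attached to, which is the sense in which the sentence structure is a small part of a much larger network.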

The main conclusion of this section is that syntax looks much more like general conceptual structure if we allow words to relate directly to one another, as in dependency analysis, than if we apply a rigid PSG analysis. Dependency structure is very similar to many other areas of conceptual structure, such as social structure, whereas it is hard to think of any other area of cognition which allows nothing but part-whole relations. To make a rather obvious point, it is arguably PSG that has encouraged so many of our colleagues to accept the idea that language is unique; but PSG is simply a method of analysis, not a demonstrated property of language. The only way to prove that PSG is in fact correct is to compare it with the alternatives, especially dependency analysis; but this has hardly happened in the general literature, let alone in the cognitive linguistics literature. Quite apart from the cognitive arguments for direct dependencies between words, it is easy to find purely linguistic evidence; for example, most lexical selection relations involve individual words rather than phrases. Thus the verb DEPEND selects the preposition ON, so there must be a direct relation between these two words, a relation which is impossible to show directly in PSG. In the next section I shall show how this advantage of dependency structure applies in handling constructions.

However, it is important once again not to lurch from one extreme position to its opposite. I have objected to PSG on the grounds that it rules out direct links between words, a limitation which is arbitrary from a cognitive point of view; but it would be equally arbitrary to rule out chunking in syntax. In our simple example, maybe we recognise small birds as a chunk as well as recognising the relations between its parts. Given that we clearly do recognise part-whole relations in other areas of cognition, it would be strange if we were incapable of recognising them in syntax. Even in the area of social relations, we can recognise collectivities such as families or departments as well as relations among individuals. And in syntax there do seem to be some phenomena which at least at first sight are sensitive to phrase boundaries. One such is mutation in Welsh (Tallerman 2009), which seems to be triggered by the presence of a phrase boundary and cannot be explained satisfactorily in terms of dependencies. As we have already seen, there is no reason to think that cognition avoids redundancy; indeed, it is possible that gross redundancy is exactly what we need for fast and efficient thinking. This being so, we cannot rule out the possibility that, even if syntactic structure includes direct word-word dependencies, it also includes extra nodes for the phrases that these dependencies imply (Rosta 1997, Rosta 2005).

5. Idioms and constructions

I have argued for a rather traditional view of language structure in which words are central, both by virtue of being the units that define a level of language which is distinct from phonology and morphology, and also as the main (and perhaps only) units on that level. My main argument was based on the Cognitive Assumption and its tenets, but I also showed that the conclusion is required by more traditional types of evidence. We now consider how this model of structure accommodates the grammatical patterns that are so familiar from the constructional literature: idioms (9), (10), clichés or formulaic language (11), meaning-bearing constructions (12), and non-canonical constructions (13).

(9) He kicked the bucket.
(10) He pulled strings.
(11) It all depends what you mean.
(12) Frank sneezed the tissue off the table.
(13) How about a cup of tea?

All such examples show some kind of irregularity or detail that supports the idea of usage-based learning (rather than mere triggering of an inbuilt propensity), which in turn is predicted by the learning tenet (2) and the inheritance tenet (5). But if the proposed view of language structure makes these patterns relatively easy to accommodate, this will also count as further purely linguistic evidence for this view. The following comments build on a number of earlier discussions of how Word Grammar might handle constructions (Gisborne 2009, Gisborne 2011, Holmes and Hudson 2005, Hudson 2008, Sugayama 2002).

One unifying feature of all the patterns illustrated in these examples is that they involve individual words rather than phrases. This is obvious in most of the examples; for instance, the meaning 'die' is only available if the verb KICK is combined with BUCKET. An apparent counterexample is (12), from Goldberg 1995:152, where it is given as an example of the caused-motion construction. According to Goldberg, the semantic interpretation cannot plausibly be attributed to the main verb, but this is not at all obvious.
After all, it is precisely because the verb sneezed describes an action that it can be turned into a causative, and it is because it needs no object in its own right that an extra one can be added. I shall argue below that this is a straightforward example of word-formation, in which the explanation for the syntactic pattern and its meaning revolves around one word, the verb. If this is so, then individual words play a crucial role in every pattern that has been claimed to require constructions; and crucially, phrase structure, as such, plays no such role. We shall now work through the various types of construction listed above, starting with idioms.

The main challenge of idioms is that the idiomatic meaning overrides the expected literal meaning, so we need an analysis that includes the literal meaning as well as the idiomatic one, while also showing that the literal meaning is merely potential. A system that generated only one analysis would miss the point as much as an analysis of a pun which showed only one of the two interpretations. This view is supported by the psycholinguistic evidence that literal word meanings become active during idiom production (Sprenger and others 2006), and that the syntactic analysis of an idiom proceeds as normal (Peterson and others 2001). For example, in processing He kicked the bucket, we activate all the normal syntactic and semantic structures for KICK and BUCKET as well as the meaning 'die'. A cognitive network is exactly what is needed to explain a multiple analysis like this, because it accommodates the effects of spreading activation: the processes by which activation from the word token kicked spreads to the lexeme KICK and then, as activation converges from the and bucket, focuses on the sub-lexeme KICK-die, as found in kick the bucket. The diagram in Figure 10 shows the end-state of this processing, but in principle a series of diagrams complete with activation levels could have shown the stages through which a speaker or hearer passes in reaching this end-state. In words, the word token kicked isa KICK-die, which in turn isa KICK. According to the inheritance tenet (5), kicked inherits the properties of both these nodes, so it inherits from KICK-die the property of requiring the bucket as its object; and thanks to the workings of default inheritance, the lower node's meaning overrides the default.
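The workings of default inheritance down the isa chain kicked, KICK-die, KICK can be made concrete with a small sketch. The encoding below is my own illustration, not Word Grammar's formal notation, and property names such as "object" are simplified placeholders: properties are collected along the isa chain, and the most specific node's value wins.

```python
# Toy isa hierarchy: the token "kicked" isa KICK-die, which isa KICK.
NODES = {
    "KICK":     {"isa": None,       "meaning": "kick", "object": "any noun"},
    "KICK-die": {"isa": "KICK",     "meaning": "die",  "object": "the bucket"},
    "kicked":   {"isa": "KICK-die", "tense": "past"},
}

def inherit(node):
    """Collect properties up the isa chain; lower (more specific) nodes win."""
    chain = []
    while node is not None:
        chain.append(NODES[node])
        node = NODES[node]["isa"]
    props = {}
    for level in reversed(chain):          # general first, specific last
        props.update(kv for kv in level.items() if kv[0] != "isa")
    return props
```

Asking for the properties of "kicked" then yields the idiomatic meaning 'die' and the fixed object "the bucket", overriding KICK's defaults, exactly the override described in the prose.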

[Figure 10 is a network diagram linking the sentence He kicked the bucket to the lexemes KICK, THE and BUCKET, their sub-lexemes KICK-die, THE-die and BUCKET-die, and the two meanings 'kick' and 'die'.]

Figure 10: An idiom and its literal counterpart
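The spreading-activation story told for Figure 10 can be simulated in a few lines. This is only an illustrative toy, not the authors' implementation: the network fragment, decay rate and step count are invented. The point it demonstrates is convergence: activation fed in by the word tokens reaches the idiomatic meaning 'die' from three directions, so 'die' out-competes the literal meaning only when all three words of the idiom are present.

```python
from collections import defaultdict

# Directed links: token -> lexeme -> sub-lexeme -> idiomatic meaning,
# plus the lexeme's ordinary literal meaning (written kick').
LINKS = {
    "kicked": ["KICK"], "the": ["THE"], "bucket": ["BUCKET"],
    "KICK": ["KICK-die", "kick'"],
    "THE": ["THE-die"], "BUCKET": ["BUCKET-die"],
    "KICK-die": ["die"], "THE-die": ["die"], "BUCKET-die": ["die"],
}

def spread(tokens, steps=6, decay=0.5):
    """Accumulate activation spreading outward from the word tokens."""
    act = defaultdict(float, {t: 1.0 for t in tokens})
    for _ in range(steps):
        new = defaultdict(float, act)
        for node, level in act.items():
            for neighbour in LINKS.get(node, []):
                new[neighbour] += level * decay
        act = new
    return act

idiom = spread(["kicked", "the", "bucket"])   # the whole idiom is heard
alone = spread(["kicked"])                    # only "kicked" is heard
```

With the full idiom, 'die' ends up more active than the literal kick'; with kicked alone, the literal meaning wins, which is the behaviour the prose attributes to converging activation.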

The key notion in this analysis of idioms is the sub-lexeme, one of the distinctive ideas of Word Grammar. Sub-lexemes are common throughout the lexicon, and not just in handling idioms; for example, the lexeme GROW carries important shared properties such as irregular morphology, but its transitive and intransitive uses combine different syntactic patterns with different meanings, so each requires a different sub-lexeme of GROW. This analysis reveals the similarities of morphology as well as the differences of syntax and meaning. The same apparatus seems well suited to the analysis of idioms, which combine special syntax and meaning with a close link to the literal pattern. Unfortunately, Jackendoff thinks otherwise:

'Another solution would be to say that kick has a second meaning, "die", that only can be used in the context of the bucket. Coincidentally, in just this context, the and bucket must also have second meanings that happen to be null. Then the meaning of the idiom can in fact be introduced with a single word. The difficulty with this solution is its arbitrariness. There is no nontheory-internal reason to concentrate the meaning in just one of the morphemes.' (Jackendoff 2008)

His objection is strange, given the rather obvious fact that the idiom's meaning 'die' is the literal meaning of a verb, so the verb is the obvious word to receive the idiom's meaning. Moreover, the analysis that he himself offered ten years earlier did concentrate the meaning in just one of the morphemes, by linking the meaning 'die' directly to the verb kick (Jackendoff 1997:169). The apparatus of sub-lexemes and default inheritance allows us to model different degrees of irregularity in idioms. The classic discussion of idioms (Nunberg and others 1994) distinguished just two types: idiomatic phrases such as kick the bucket, and idiomatically combining expressions such as pull strings, which allow a great deal more syntactic flexibility (e.g. Strings have been pulled; Strings are easy to pull; He pulled a lot of strings). However, the historical development of idioms argues against such a clear division.
Today's metaphors tend to turn into tomorrow's idioms, which become increasingly opaque as the original metaphor vanishes from sight. It seems much more likely that there is a continuum of irregularity, with kick the bucket at one end and pull strings near the other end. Whereas kick the bucket overrides the entire meaning of KICK, pull strings builds on a living metaphor, so the idiomatic meaning isa the literal one: if I pull strings for my nephew, this is presented as a deviant example of literally pulling strings. This explains the syntactic freedom. In between these two examples, we find idioms such as cry wolf, which derives its meaning from a fairy story which some people know, but whose meaning is related in such a complicated way to the story that it shows virtually no syntactic freedom.

Turning next to formulaic language such as It all depends (on) what you mean, usage-based learning guarantees that we store a vast number of specific exemplars, including entire utterances as well as individual word tokens. Every time we hear an example of a stored utterance, its stored form becomes more entrenched and more accessible, which encourages us to use it ourselves, thereby providing a feedback loop which maintains the overall frequency of the pattern in the general pool of utterances. Word Grammar provides a very simple mechanism by which formulaic expressions are stored: the concepts that represent the word tokens do not fade away as most tokens do, but persist and become permanent (Hudson 2007:54). For instance, imagine someone hearing (or saying) our earlier example: Small birds sing. Immediately after the event, that person's mind includes a structure like the one shown in Figure 9, but the nodes for the word tokens are destined to degenerate and disappear within seconds. That is the normal fate of word tokens, but sometimes the tokens are memorable and are remembered, which means that they are available if the same tokens occur again. This process is logically required by any theory which accounts for the effects of frequency:

'While the effects of frequency are often not noted until some degree of frequency has accumulated, there is no way for frequency to matter unless even the first occurrence of an item is noted in memory. Otherwise, how would frequency accumulate?' (Bybee 2010:18)

In short, formulaic language is exactly what we expect to find, in large quantities, in the language network; and it is represented in exactly the same way as the utterances from which it is derived.

Meaning-bearing constructions such as the caused-motion construction are more challenging, precisely because they go beyond normal usage. If anyone actually said Frank sneezed the tissue off the table, they would certainly be aware of breaking new linguistic ground, which is the exact opposite of the situation with idioms and formulaic language. The standard constructional analysis of such cases was presented by Goldberg (Goldberg 1995:152-79), so we can take this analysis as our starting point. Figure 11 shows Goldberg's diagram (from page 54) for the transitive use of SNEEZE. This diagram is the result of unifying two others: the one in the middle line for ordinary intransitive SNEEZE, and the one for the caused-motion construction (which accounts for the top and bottom lines). The letter R stands for the relation between these two patterns, which at the start of the middle line is explained as 'means', expressing the idea that sneezing is the means of the motion (rather than, say, its cause or manner).

[Figure 11 reproduces Goldberg's box notation: a semantic line, Sem CAUSE-MOVE < cause goal theme >, is linked by R: means to SNEEZE < sneezer >, and mapped onto the syntactic line Syn V SUBJ OBL OBJ.]

Figure 11: Caused motion in constructional notation

The constructional notation is unhelpful in a number of respects, but the most important is its semantic rigidity. The trouble is that it requires a one-to-one match between words and semantic units, so one verb can only express one semantic unit, whose arguments are expressed by the verb's various dependents. This is a problem because the sentence Frank sneezed the tissue off the table actually describes two separate events: Frank sneezing, and the tissue moving off the table. Frank is the sneezer, but only in relation to the sneezing; and the tissue is the theme of the moving, but not of the sneezing. Similarly, if Frank (rather than the sneeze) is a cause, it is in relation to the moving, and not the sneezing; and off the table describes the direction of the moving, and not of the sneezing. Collapsing these two events into a single semantic structure is at best confusing, and arguably simply wrong.

In contrast, the network notation of Word Grammar in Figure 12 provides whatever flexibility is needed. The analysis keeps as close as possible to Goldberg's, and the example is simplified, in order to focus on the benefits of flexibility in the semantic structure. The dotted lines link the words to their meanings, and as usual the little triangles show isa relations. The main benefit of this network notation is the possibility of separating the node labelled 'Frank sneeze it off' from the one labelled 'it off'. The former is the meaning of the verb token sneezed, which, as usual, shows the effects of all the dependents that modify the verb. The latter is a single semantic entity which is contributed jointly by the object and directional adjunct (Goldberg's 'oblique'). In itself it isa motion, with a theme and a direction, but in relation to the sneezing, it is the result of the verb's action. Notice how the two-event analysis avoids all the problems noted above; so Frank is the sneezer, but plays no role at all in the movement, and contrariwise it and off define the movement but have nothing directly to do with the sneezing.

[Figure 12 is a network for Frank sneezed it off: the node 'Frank sneeze it off' isa 'sneeze', with Frank as its sneezer and the node 'it off' as its result; 'it off' isa 'caused-move', with its own theme and source.]

Figure 12: Caused motion in Word Grammar notation
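The two-event analysis can be stated very directly as data. The sketch below is my own encoding, with node and role names simplified from the prose, not Word Grammar's formal notation; it shows that each semantic role attaches to exactly one of the two events, which is the point the figure makes.

```python
# Two event nodes, each with its own isa link and its own roles.
network = {
    "frank-sneeze-it-off": {        # the meaning of the verb token "sneezed"
        "isa": "sneeze",
        "sneezer": "Frank",
        "result": "it-off",         # the second event, supplied by object + adjunct
    },
    "it-off": {                     # a single motion event: 'it off'
        "isa": "caused-move",
        "theme": "it",
        "source": "off",
    },
}

def roles_of(node):
    """Semantic roles attached directly to one event node."""
    return {r: v for r, v in network[node].items() if r != "isa"}
```

Querying the two nodes separately confirms the prose: Frank is the sneezer of the sneezing event but fills no role in the movement, and the theme "it" belongs only to the movement.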

The last kind of construction is represented here by How about a cup of tea? Such examples have always been central to Word Grammar (Hudson 1990:5-6), where I call them 'non-canonical', but they also play an important part in the literature of constructions, where the classic discussion calls What's X doing Y a 'non-core' construction because the normal 'core' rules are suspended (Kay and Fillmore 1999). To change examples, the core rules require a sentence to have a finite verb; but there are exceptions without finite verbs, such as the how about pattern in How about a cup of tea?. Once again, Word Grammar provides the necessary amount of flexibility, thanks to the focus on word-word dependencies and the possibility of sub-lexemes mentioned above. Indeed, it seems that these constructions all consist of a continuous chain of dependencies, a claim which cannot even be expressed in terms of phrase structure, so dependency structure is more appropriate than phrase structure (Holmes and Hudson 2005). The diagram in Figure 13 shows how the sub-lexemes HOW-x and ABOUT-x relate to each other and to their super-lexemes, and hints at a semantic analysis paired with the syntactic one. (Note how the notation allows us to ignore the vertical dimension when this is convenient; in this case, the isa triangles face upwards, reversing the normal direction.) This semantic sketch will be developed in the next section as an example of what is possible in a network-based analysis of meaning.

[Figure 13 shows the dependency chain in how about x?: HOW-x takes ABOUT-x as its complement, and ABOUT-x takes x; HOW-x isa HOW and ABOUT-x isa ABOUT, and the whole pattern means 'how about x?'.]

Figure 13: How about x? - dependency syntax
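The claim that such constructions form a continuous chain of dependencies is easy to state as a check over dependency arcs. The sketch below is my own illustration, not part of the theory's formal apparatus: it tests whether every word of the pattern hangs, directly or indirectly, off the top word.

```python
def forms_chain(top, deps):
    """deps: set of (head, dependent) arcs. True if every word in the
    pattern is reachable from `top` by following dependency arcs."""
    words = {top} | {w for arc in deps for w in arc}
    reached, frontier = {top}, [top]
    while frontier:
        head = frontier.pop()
        for h, d in deps:
            if h == head and d not in reached:
                reached.add(d)
                frontier.append(d)
    return reached == words

# How about a cup of tea?  --  how > about > cup > a, and cup > of > tea
arcs = {("how", "about"), ("about", "cup"), ("cup", "a"),
        ("cup", "of"), ("of", "tea")}
```

Here forms_chain("how", arcs) holds, while deleting the arc from about to cup breaks the chain; phrase-structure trees have no comparable way of stating the requirement, since it is a condition on word-word links rather than on constituents.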

This section has shown how easily Word Grammar accommodates the idiosyncratic patterns that lie at the heart of what I have called constructional analyses: idioms such as kick the bucket and pull strings, clichés or formulaic language such as It all depends what you mean, meaning-bearing constructions such as the caused-motion construction, and non-canonical constructions such as how about X?. The key pieces of theoretical apparatus are sub-lexemes, default inheritance, token-based learning, flexible networks with classified relations and, of course, dependency structure in syntax. But how does this argument leave the notion of 'construction'? It all depends what you mean by 'construction' (and 'construction grammar'), and as we have already seen, this varies a great deal from theory to theory, and from person to person. If construction grammar is simply the same as cognitive linguistics, then nothing changes. Since Word Grammar is definitely part of cognitive linguistics, it must also be an example of construction grammar (though it is hard to see what is gained by this double naming). Similar conclusions follow if construction grammar is a grammatical theory that rejects the distinction between grammar and lexicon, and recognises very specific grammatical patterns alongside the very general ones. Here too Word Grammar is an ordinary example of construction grammar (Gisborne 2008). However, the debate becomes more interesting if we give 'construction' a more precise meaning, and define construction grammar as a grammar that

recognises nothing but constructions. For many authors, a construction is by definition a pairing of a form (some kind of formal pattern, whether phonological, morphological or syntactic) with a meaning:

'The crucial idea behind the construction is that it is a direct form-meaning pairing that has sequential structure and may include positions that are fixed as well as positions that are open.' (Bybee 2010:9)

This definition implies a very strong claim indeed: that every pattern that can be recognised at the levels of syntax or morphology can be paired with a single meaning. We have already seen (in section 3) that this claim is untenable if morphology and syntax are recognised as distinct levels, because (in this view) morphological structures are typically not directly linked to meaning. Since Word Grammar does recognise morphology and syntax, the grammatical patterns that it recognises cannot be described as constructions. And yet Word Grammar can accommodate all the idiosyncratic patterns that are often quoted as evidence for constructions. In this sense, then, Word Grammar is a radical departure from construction grammar, as radical as Croft's Radical Construction Grammar, which departs in exactly the opposite direction. For Word Grammar, the basic units of syntax are words and the dependency relations between them. In contrast, Croft believes that the basic units are meaning-bearing constructions:

'Radical Construction Grammar ... proposes that constructions are the basic or primitive elements of syntactic representation and defines categories in terms of the constructions they occur in. For example, the elements of the Intransitive construction are defined as Intransitive Subject and Intransitive Verb, and the categories are defined as those words or phrases that occur in the relevant role in the Intransitive construction.' (Croft and Cruse 2004:284, repeated as Croft 2007:496)

Croft's example of a non-construction that would not be recognised is 'verb'. Furthermore, the term 'Intransitive Subject' is presumably merely shorthand for something more abstract, such as the noun that expresses the actor, or less abstract, such as the noun before the verb, because there are no syntactic relations in Radical Construction Grammar: such relations are redundant if morphosyntactic clues are related directly to semantic relations (ibid:497). As I commented earlier, the claims of Radical Construction Grammar are not derived from the Cognitive Assumption; indeed, they conflict directly with some of the tenets, including the principle for learning that Croft himself expressed so well. In discussing the choice between fully general analyses without redundancy and fully redundant listing of specific examples of general patterns, he concludes: 'grammatical and semantic generality is not a priori evidence for excluding the more specific models' (Croft 1998). This is generally accepted as one of the characteristics of usage-based learning, so we can assume that any construction is stored not only as a single general pattern, but also as a collection of individual exemplars that illustrate the pattern. Indeed, the learning tenet requires the exemplars to be learned before the general pattern, because these are the material from which the general pattern is learned. But if exemplars can be mentally represented separately from the general construction, and if they are even represented before the construction, how can the construction be more basic? In short, the basic tenets of the Cognitive Assumption support the more traditional approach which Croft criticizes as 'reductionist', in which more abstract and general patterns are built out of more concrete and specific patterns.

".

#emantic$encyclopedic structures

If the Cognitive Assumption (1) is right, it follows that there can be no boundary between linguistic meaning and general conceptual structure, and therefore no boundary between 'dictionary' meaning and encyclopedic information. The typical meaning of a word or a sentence is simply the part of general conceptual structure that is activated in the mind of the speaker and hearer. This view of meaning is one of the tenets of cognitive linguistics (including Word Grammar), in contrast with the more classical or 'objectivist' approaches to semantics that have dominated linguistic semantics. Cognitive linguistics cannot match the massive apparatus of formal logic that these approaches bring to bear on the analysis of meaning, but once again the Cognitive Assumption may be able to guide us towards somewhat more formal analyses than have been possible so far. The most relevant consequence of the Cognitive Assumption is the recycling tenet (4), the idea that each new concept is defined in terms of existing concepts. This immediately rules out any boundary between dictionary and encyclopedia, because any dictionary entry is bound to recycle an encyclopedic entry. For example, take a child learning the word CAT and its meaning: the child stores the word and looks for a potential meaning in general memory, where the (encyclopedic) concept 'cat' is the obvious candidate. Recycling guarantees that this concept is the one that the child uses as the meaning of CAT. Recycling also rules out a popular approach to lexical semantics in which lexical meanings are defined in terms of a pre-existing metalanguage such as the Natural Semantic Metalanguage suggested by Wierzbicka (Wierzbicka 1996). The argument goes like this (Hudson and Holmes 2000): once a concept has been created, it is available as a property of other concepts, and should be recycled in this way whenever it is relevant.
But new concepts cannot be used in this way if the only elements permitted in a definition are drawn from the elementary semantic metalanguage. To take a concrete example, consider Wierzbicka's definition of a bicycle (Wierzbicka 1985:112), which refers to the pedals in at least three places: as a part of the structure, as the source of power, and as the place for the rider's feet. The problem is that 'pedal' is not part of the metalanguage, so a circumlocution ('parts for the feet') has to be used, obscuring the fact that each reference is to the same object. In contrast, the recycling tenet requires us to recognise the concept 'pedal' and to name this concept whenever it is relevant; but this of course is totally incompatible with any attempt to define every concept solely in terms of a fixed list of primitives.

Another tenet highly relevant to semantic structure is the network tenet (3), which requires every scrap of information to be expressed in terms of network structures. This means that network notation has to be available for every analysis that can be expressed in other notations such as the predicate calculus. Take, for example, the universal and existential quantifiers, which distinguish semantically between sentences such as the following:

(14) Everyone left. ∀x: person(x) → left(x) (For every x, if x is a person then x left)
(15) Someone left. ∃x: person(x) ∧ left(x) (There is an x such that x is a person and x left)

The sentences undeniably have different meanings, and the linear notation of formal semantics distinguishes them successfully, but the challenge is to translate the linear notation into a network. Thanks to the inheritance tenet (5), the solution is surprisingly easy (Hudson 2007:33-4). Universal quantification is simply inheritance,

because any instance of a category automatically inherits all of that category's properties (unless of course they are overridden). Consequently, we can represent the meaning of Everyone left as shown in the first diagram of Figure 14. According to this diagram, the typical person left, so one of the properties to be inherited by any example of 'person' is that they left. In contrast, the diagram for Someone left shows that some particular person (represented by the dot) left, so leaving is not a property of 'person' and therefore cannot be inherited by other people.

[Figure 14 contains two small networks: in the one for Everyone left, the category node 'person' itself is the leaver of 'left'; in the one for Someone left, an unnamed instance of 'person' (shown as a dot) is the leaver.]

Figure 14: Everyone left and someone left.
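The idea that universal quantification is simply default inheritance can be sketched in a few lines. The class below is an invented toy, not Word Grammar notation: 'Everyone left' stores the property on the category node, so every instance inherits it by default (with exceptions allowed), while 'Someone left' stores it on a single instance, so nothing is inherited by anyone else.

```python
class Node:
    """A concept node with default inheritance up its isa chain."""
    def __init__(self, isa=None, **props):
        self.isa, self.props = isa, props

    def get(self, prop):
        if prop in self.props:                # most specific value wins
            return self.props[prop]
        return self.isa.get(prop) if self.isa else None

# Everyone left: 'left' is a property of the category itself.
person = Node(left=True)
mary = Node(isa=person)                       # inherits 'left' by default
jo = Node(isa=person, left=False)             # an exception, overriding it

# Someone left: 'left' belongs to one particular person (the dot).
person2 = Node()
someone = Node(isa=person2, left=True)
other = Node(isa=person2)                     # inherits nothing about leaving
```

As the prose goes on to note, the exception jo does not overturn the generalisation; it merely overrides the inherited default, which is just what human reasoning seems to allow.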

This simple and natural analysis of universal and existential quantification shows the benefit of starting from the Cognitive Assumption; and of course this assumption also leads to an analysis which is cognitively more plausible than traditional logic, because default inheritance allows exceptions. As in ordinary conversation, the generalisation that everyone left is not, in fact, overturned completely if it turns out that a few people did not leave; exceptions are to be expected in human reasoning. Word Grammar offers structural analyses for many other areas of semantics (Gisborne 2010; Hudson 1984:131-210; Hudson 1990:123-166; Hudson 2007:211-248; Hudson 2010:220-241), all of which are informed by the Cognitive Assumption. The example of universal and existential quantification illustrates a general characteristic of these analyses: patterns that other theories treat as special to semantics turn out to be particular cases of much more general cognitive patterns that are therefore found in other areas of language structure. We shall consider another example in the discussion of word order (section 7), where I argue that word order rules build on two relations also found in semantics: the landmark relation expressed by spatio-temporal prepositions, and the temporal relations expressed by tense. This sharing of patterns across linguistic levels is exactly as expected given the Cognitive Assumption, but it is rarely discussed in other theories.

This article is not the place to summarise all the possibilities of Word Grammar semantics, but it may be helpful to illustrate them through the analysis of one concrete example. In the previous section I gave a syntactic analysis of how about X? in Figure 13 as an example of a non-canonical construction, with a promise of a fuller semantic analysis which I can now redeem. The meaning of the syntactic pattern how about X? is given as a node labelled 'how about x?', an analysis which does at least show that x is defined by the complement of about, though it leaves the rest of the meaning completely unanalyzed and undefined. But how might we define a notion like this? Constructional analyses generally leave semantic elements without definition; for example, Kay and Fillmore's analysis of the meaning of the WXDY construction recognises an element called 'incongruity-judgment' made by a

pragmatically-identified judge called 'prag' about the entity defined by the x word. That is as far as the semantic analysis goes. But according to the network tenet (3), concepts are defined by their links to other concepts, so any semantic element can be defined by links to other concepts. The first challenge in the analysis of how about X? is its illocutionary force. If I say How about a cup of tea?, I am asking a yes-no question, just as in Is it raining? The only oddity is that this yes-no question is introduced by a wh-word, how (or what). Even if the construction originated in a wh-question such as What do you think about ...?, the meaning has now changed as much as the syntax (just as it has in how come ...?). How, then, can a semantic network indicate illocutionary force? This is a rather fundamental question for any model, but especially for a usage-based model in which all the contextual features of utterances are part of the total analysis; but cognitive linguistics has so far produced few answers. In contrast, Word Grammar has always had some suggestions for the structural analysis of illocutionary force. The earliest idea was that it might be defined in terms of how knowledge was distributed among participants (Hudson 1984:186-197), and knowledge is clearly part of the analysis. However, I now believe that the recycling principle (4) points to a simpler first step: linking to the notions 'ask' and 'tell', which are already needed as the meanings of the lexemes ASK and TELL. This is just like the 'performative' hypothesis of Generative Semantics (Ross 1970), except that the performative structure is firmly in the semantics rather than in syntax (however deep). And as in the performative analysis, the speaker is the asker or teller, the addressee is the askee or the tellee, and the content of the sentence is what we can call the 'theme': the information transferred from one person to the other.

For most theories, this analysis would be very hard to integrate into a linguistic structure because of the deictic semantics involved in 'speaker' and 'addressee', which link a word token to a person, thereby bridging the normal gulf between language and non-language; but for Word Grammar there is no problem, because the Cognitive Assumption rejects any boundary between language and non-language. The analysis of a word token is a rich package which includes a wide range of deictic information specific to the token: its speaker, its addressee, its time and place, and its purpose, what the speaker was trying to achieve by uttering it. In the case of the sentence-root, which carries the meaning of the entire sentence, its purpose may be to give information or to request it; in other words, the token's purpose is its illocutionary force (Gisborne 2010). This treatment of illocutionary force is applied to How about X? in Figure 15, which is based in turn on Figure 13 above. The one relation which is not labelled in this diagram is that between 'how about x?' and x. We might be tempted to label it 'theme', but we should resist this temptation because x is not in fact the thing requested; for example, How about John? is not a request for John. I develop this point below.

[Figure 15 adds illocutionary force to Figure 13: the meaning 'how about x?' isa 'asking', whose asker is bound to the word token's speaker and whose askee is bound to its addressee, via the 'purpose' relation of the token.]

Figure 15: The meaning of How about x? - illocutionary force

!he ne"t challenge, therefore, is to decide what the theme of the +uestion is. &f, for e"ample, 'ow about .ohn? is not a re+uest for Uohn, what does it want as an answer1 Clearly, it is a re+uest for either yes or no, information about the truth of some proposition which we can call simply p@ but what is p1 !his varies with the situation as illustrated by the following scenarios* $0A% #e need someone strict as e"aminer for that thesis, so how about Uohn1 $06% Oou say you dont know any linguists, but how about Uohn1 $04% &f you think (ary is cra y, how about Uohn1 ?imilar variation applies to 'ow about a cup o% tea?, but this is so entrenched and conventional that it needs no linguistic conte"t. &n every case, then, the " of 'ow about ,? is suggested as a possible answer to a currently relevant +uestion of identity the identity of a possible e"aminer in $0A%, of a linguist in $06% and of someone cra y in $04%. #hat is needed in the structural analysis of 'ow about ,?, therefore, is an e"tra sub-network representing this identity-+uestion, combined with a representation of " as a possible answer and the truth value of the answer. !his supplementary network has two parts* one part which relates p to the theme of the +uery, the choice between true and false, and another part which relates p to ". ?tarting with the choice, this involves the #ord Grammar treatment of truth in terms of the primitive relation +uantity, whose values range over numbers such as 9 and 0 $Hudson 8996*885-4%. ' nodes +uantity indicates how many e"amples of it are to be e"pected in e"perience@ so 0 indicates precisely one, and 9 none. !his contrast applies to nouns as e"pected@ so a boo" has a referent with +uantity 0, while the referent of no boo" has +uantity 9@ but it also applies to finite verbs, where it can be interpreted in terms of truth. 
,or e"ample, the verb snowed in (t snowed refers to a situation with +uantity 0, meaning that there was indeed a situation where it snowed@ whereas the root-word in (t did not snow has a referent with +uantity 9, meaning that no such situation e"isted. ?een in this light, a yes;no +uestion presents a choice between 0 $true% and 9 $false% and asks the addressee to choose one of them. !he +uantity relation is labelled X in diagrams, so ,igure 0A shows that the proposition p has three +uantities* 1, 0 and 9.

[Figure 16 extends Figure 15 with the content of the question: the theme of the asking is the quantity '?' of the proposition p, where '?' is a member of the set {1, 0, ?}; p relates a concept q, bound to x, to a node e and a relationship r which must be bound to currently active nodes.]

Figure 16: The meaning of How about x? - content

!he mechanics of choice in #ord Grammar are somewhat complicated because they involve two further primitive relations called or and binding@ these have a special status alongside isa and a small handful of other relations. ' choice is defined by a set that includes the alternatives in the mutually e"clusive or relation, and a variable labelled 1 which is simply a member $Hudson 8909*55-56%. #hen applied to the choice between 0 and 9, we recognise a set L0, 9, 1M which contains 0 and 9 as its mutually-e"clusive or members as well as an ordinary member called 1 which can bind to either of them. !he or relation is shown by an arrow with a diamond at its base while binding is represented by a double arrow@ so the subnetwork at the top of ,igure 0A shows that 1, the theme of how about ", is either 0 or 9. #e now have an analysis which shows that 'ow about , means & am asking you whether p is true, where p has some relation to ". !he remaining challenge is to e"plain how p relates to ". &t will be recalled that p is a proposition $which may be true or false%, but of course everything in this network must be a concept $because that is all we find in conceptual networks%, so propositions must be a particular kind of concept. &n this case, the proposition is the state of affairs $>ollard and ?ag 0225*02% in which two arguments, labelled simply a and b, are identical* the proposition a Y b. ,or instance, in $0A% the proposition p is the e"aminer we need Y Uohn. !he identity is once again shown by the primitive binding relation introduced above, which is shown in ,igure 0A as binding a concept + to ". )f these two concepts, we already know " as the referent of word ,@ for e"ample, in 'ow about .ohn?, " is Uohn. !he other concept, +, is more challenging because it is the variable concept, the hypothetical e"aminer, linguist or cra y person in $0A%, $06% and $04%. #hat these concepts have in common is that they have some currently active relationship to some

other currently active entity. The entity and relationship may be explicit, as in (16) (examiner of that thesis), but How about a cup of tea? shows that they need not be. The analysis in Figure 16 shows the connection to currently active structures by means of the binding procedure (which triggers a search for the currently most active relevant node); so node e at the top of the diagram needs to be bound to some active entity node, illustrating how the permanent network can direct processes that are often considered to be merely pragmatic. Most importantly, however, the same process applies to the relationship node labelled r, binding it to an active relationship; so relationships and entities have similar status in the network and are subject to similar mental operations. This similarity of relationships and entities is exactly as required by the relation tenet (6), which recognises relationships (other than primitives) as a particular kind of concept; and it is built into the formal apparatus of Word Grammar. I am not aware of any other theory that treats relationships in this way. This completes the semantic analysis of What about X?, showing that it means something like: 'I am asking you whether it is true that x is relevantly related to the relevant entity', where relevance is defined in terms of activation levels. The main point, of course, is not the correctness of this particular analysis, but the formal apparatus that is needed, and that Word Grammar provides. The main facilities were binding, relational concepts, quantities, mutually exclusive or-relations and network notation (with the possibility of adding activation levels). Exactly as we might expect given the Cognitive Assumption, none of these facilities is unique to semantics.
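The binding procedure described above can be sketched in code. The following Python fragment is purely illustrative and not Word Grammar's actual formalism: the node names, activation values and data structures are invented for the example. It shows the key point of the passage: binding selects the currently most active node of the required kind, and works the same way whether that node is an entity or a relationship.

```python
# Illustrative sketch: binding as a search for the most active node of a
# suitable type in a conceptual network. All names and values are invented.

def bind(candidates, activations, required_type, typing):
    """Return the most active candidate whose type matches required_type."""
    suitable = [c for c in candidates if typing.get(c) == required_type]
    if not suitable:
        return None  # nothing of the right kind is active
    return max(suitable, key=lambda c: activations.get(c, 0.0))

# A toy context for "How about John?": the speakers have just been discussing
# examiners, so the 'examiner-of-thesis' relationship is the most active.
activations = {"examiner-of-thesis": 0.9, "author-of-book": 0.2, "john": 0.8}
typing = {"examiner-of-thesis": "relationship",
          "author-of-book": "relationship",
          "john": "entity"}

# Binding the relationship node r picks out 'examiner-of-thesis'.
r = bind(activations.keys(), activations, "relationship", typing)
```

The same call with `required_type="entity"` would bind the entity node instead, mirroring the claim that relationships and entities are subject to similar mental operations.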

7. Order of words, morphs etc.

In this section we return to the formal levels of syntax, morphology and phonology, but we shall find that they share some of the formal structures found in semantics. The question is how our minds handle the ordering of elements: word order in syntax, morph order in morphology and phone order in phonology. Once again the choice of notation is crucial. The standard notation of plain vanilla syntax (or morphology or phonology) builds strongly on the conventions of our writing system, in which the left-right dimension represents time; so it is easy to think that a diagram such as Figure 2, the box-notation constructional analysis of Cats miaow, already shows word order adequately. But the network tenet requires every analysis to be translated into network notation, and the fact is that a network has no 'before' or 'after', or left-right dimension. In a network, temporal ordering is one relationship among many and has to be integrated with the whole. But of course the ordering of elements presents many challenges for linguistic theory beyond questions of notation, so what we need is an analysis which will throw some light on these theoretical questions. Another issue that arises in all questions of ordering is whether the ordering is spatial or temporal. Once again, given our experience of left-right ordering in writing, we all tend to think in spatial terms, and indeed linguists often talk about 'leftward movement' or even 'fronting' (assuming that the 'front' of a sentence is its leftmost part). But given the logical primacy of spoken language, this spatial orientation is actually misleading, so we need to think of words and sounds as events in time; and that means that ordering is temporal, rather than spatial. Admittedly, this choice is blurred by the tendency for temporal relations to be described metaphorically as though they were spatial; but for consistency and simplicity the terminology will be temporal from now on.
Temporal ordering of behaviour, including language, requires analysis at two levels of abstraction. On the one hand we have the surface ordering, which simply

records the order of elements in a chain. This is the only kind of ordering there is if the chain is arbitrarily ordered, as in a telephone number (or a psychological memory experiment); so in a series such as 3231, all we can say is that a 3 is followed by a 2, which is followed by another 3, which is followed by a 1. The same is true in any ordered series such as the digits, the alphabet or the days of the week. The only relation needed to record this is 'next' (which, to judge by the difficulty of running a series backwards, seems to spread activation in only one direction, so it may be a primitive relation). In network notation we might show this relation as a straight but dotted arrow pointing towards the next element, as in Figure 17. Notice in this diagram how the default 'next' relations are overridden by the observed ones, and how much easier it would have been to remember the series in the default order: 1 2 3 4.

Figure 17: The 'next' relation in a series of numbers
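The one-directional character of 'next' can be illustrated with a small sketch. The code below is an invented toy model, not a claim about mental representation: following 'next' links forwards costs a single lookup per step, while going backwards forces a search of the whole series, mirroring the difficulty of reciting a memorised series in reverse.

```python
# A minimal sketch of 'next' as a one-way link. The dict encodes the
# default order of the digits 1..4 from the example in the text.

next_of = {"1": "2", "2": "3", "3": "4"}

def forwards(start, steps):
    """Follow 'next' links forwards: cheap, one lookup per step."""
    item = start
    for _ in range(steps):
        item = next_of[item]
    return item

def backwards(item):
    """Going backwards has no direct link: we must search every entry."""
    for prev, nxt in next_of.items():
        if nxt == item:
            return prev
    return None  # no predecessor: the start of the series
```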

Like any other organised behaviour, then, a string of words has a surface ordering which can be described simply in terms of the 'next' relation; and for some purposes, the 'next' relation is highly relevant to language processes. This is most obviously true in phonology, where adjacency is paramount, as when a morph's final consonant assimilates to the first consonant of the next morph. However, cognitive science also recognises a deep ordering behind the surface order. The point is that behaviour follows the general patterns which have sometimes been analyzed in terms of 'scripts' for scenarios such as birthday parties or cleaning one's teeth (Cienki 2007). Whereas the 'next' relation records what actually happens, these deeper analyses explain why events happened in this order rather than in other possible permutations. For example, tooth-cleaning is organised round the brushing, so picking up the brush and putting paste on it precedes the brushing, and rinsing and putting away the brush follow it. A general feature of organised behaviour seems to be its hierarchical structure, with problems (e.g. brushing one's teeth) generating sub-problems (e.g. picking up the brush and putting it down again); and this structure produces a typical ordering of events which can be described as a series linked by 'next'. But notice that the deep ordering of problems and sub-problems is more abstract than the 'next' relation into which they will eventually be translated; for example, although picking up the toothbrush precedes brushing, we cannot say that brushing is the 'next' of picking up the brush, because they may be separated by other

events such as applying toothpaste. In short, deep ordering mediates between abstract hierarchical relations and surface ordering. Returning to language, we find the Cognitive Assumption once again pushing us towards a particular view of language structure, in which deep ordering mediates between abstract hierarchical relations and surface ordering. This deep ordering is especially important in syntax, unsurprisingly perhaps, because the ordering of words is the greatest leap in the chain of levels linking a completely unordered network of meanings to a completely ordered chain of sounds. For instance, in Cats miaow, the deep ordering mediates between the relation 'subject' and the surface relation 'next': cats is the subject of miaow; a typical verb follows its subject; therefore, miaow is the 'next' of cats. What we need, then, is a suitable relation for describing deep ordering. Cognitive Grammar has already defined the relation that we need: the relation between a trajector and its landmark, as in the book (trajector) on the table (landmark) (Langacker 2007:436). Langacker recognises that this relation applies to temporal relations, giving the example of a trajector event occurring either before or after its landmark event. This kind of analysis is normally reserved for semantics, where it plays an important part in defining the meaning of prepositions (such as before and after) and tenses, where a past-tense verb refers to an event whose time is before the verb's time of utterance. Word Grammar, however, extends trajector/landmark analysis right into the heart of grammar, as the basis for all deep ordering.
,or e"ample, in the day a%ter Christmas, the word a%ter takes its position from the word day, on which it depends@ so the relation between the word day $landmark% and a%ter $traCector% is an e"ample of the same relation as that between Christmas $landmark% and the day after $traCector%, because in both cases the traCector $the day after or the word a%ter% follows its landmark $Christmas or the word day%. )ne of the salient +ualities of the traCector;landmark relation is that it is asymmetrical, reflecting the ine+uality of the figure $traCector% and its ground $landmark%. ,or e"ample, describing a book as on the table treats the table as in some sense more easily identified than the book as indeed it would be, given that tables are usually much bigger than books@ it would be possible to describe a table as under the book, but generally this would be unhelpful. !he same is true for events such as the parts of a tooth-cleaning routine. !he main event is the brushing, and the associated sub-events are subordinate so we naturally think of them taking their position from the brushing, rather than the other way round. 'nd of course the same is even more obviously true in synta", especially given the dependency view advocated in section 5 in which the relations between words are inherently asymmetrical. )nce again, then, we find general cognitive principles supporting the #ord Grammar version of synta" as based on word-word dependencies, and not on phrase structure. &t could of course be obCected that another kind of asymmetrical relation is meronymy, the relation between a whole and its parts@ and at least in synta" this may help to e"plain why the words in a phrase tend to stay together $Bosta 0225% though we shall see below that even this tendency can largely be e"plained without invoking phrases. 
But meronymy cannot in itself be the abstract relation which determines deep ordering, because a whole has the same relation to all its parts; so the part-whole relation, in itself, cannot determine how the parts are ordered among themselves. The only relation which can determine this is a relation between the parts, such as word-word dependencies.

Let us suppose, then, that word-word dependencies determine deep ordering; in other words, that a word acts as landmark for all its dependents. Precisely how does this determine the ordering of words? Saying that cats has miaow as its landmark does not in itself guarantee any particular ordering of these two words, so how can we combine the trajector/landmark relation with ordering rules? At this point we should pay attention to the relation tenet (6), which allows relations to be classified and (therefore) sub-classified. Given the more general relation 'landmark', any particular specification of the relation gives a sub-relation linked by 'isa' to 'landmark'; so in this case we can recognise just two sub-relations, 'before' and 'after', each of which isa 'landmark'. In this terminology, if X is the landmark of Y, and more precisely X is the 'before' of Y, then Y is before X (or, perhaps more helpfully, X puts Y before it). Thus miaow is the 'before' of cats, meaning that cats is before miaow. This analysis is shown in Figure 18, where miaow is the 'next' of cats because it is the 'before' of cats, because cats is its subject.


Figure 18: Cats miaow, with deep and surface ordering
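The step from dependency to surface order shown in Figure 18 can be made concrete. The Python fragment below is a hedged sketch under invented assumptions (the rule format and function names are mine, not Word Grammar notation): a dependent declared as a 'before' of its head is placed before it, and the surface 'next' chain then simply links adjacent words in the resulting order.

```python
# Sketch: translating 'before'/'after' sub-relations of 'landmark' into a
# surface order and then into 'next' links. Data structures are invented.

def linearize(head, dependents, position_of):
    """Order a head and its dependents by their declared position."""
    before = [d for d in dependents if position_of[d] == "before"]
    after = [d for d in dependents if position_of[d] == "after"]
    return before + [head] + after

# Cats miaow: the subject 'cats' is the 'before' dependent of 'miaow'.
order = linearize("miaow", ["cats"], {"cats": "before"})

# The surface ordering is just the chain of 'next' links between neighbours.
next_pairs = list(zip(order, order[1:]))
```

So the 'before' relation (deep ordering) determines the list, and the 'next' relation (surface ordering) falls out of adjacency in that list.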

One of the challenges of analysing general behaviour is that events can have complicated relations to one another. Consider your tooth-cleaning routine once again. Suppose you are a child being supervised by a parent, who wants to see you using the paste. In this case you have two goals: to be seen using the paste, as well as actually to brush your teeth. You pick up the paste and then the brush (the reverse of your normal order), so that when you are ready to put paste on the brush, it is already in your hand. Notice how it is the dominant goal, being seen, that determines the ordering of events, and that overrides the normal ordering. This simple example allows us to predict similar complexities within language, and especially in syntax, where, of course, we find them in spadefuls. For example, Figure 8 above gives the structure for sentence (19):

(19) Which birds do you think sing best?

Several words have multiple dependencies, in the sense that they depend on more than one other word; for example, which is the extractee of do (from which it takes its position), but it is also the subject of sing, and you is the subject not only of do but also of think. (The evidence for these claims can be found in any introduction to syntax.) This is like the multiple dependencies between picking up the paste and the two goals of being seen and of applying paste to the brush; and as in the tooth-cleaning example, it is the dominant (highest) dependency that determines the

surface order; so which takes its position from do, and so does you, because do is the sentence-root, the highest word in all the chains of dependencies. The conclusion, then, is that a word may depend on more than one other word, and that some of these dependencies may determine its position, while others are irrelevant to word order. In Word Grammar notation the two kinds of dependency are distinguished by drawing the dependency arcs that are relevant to word order above the word-tokens, leaving the remainder to be drawn below the words; the former, but not the latter, are paired with trajector/landmark relations, which can therefore be left implicit. (This convention can be seen in Figure 8.) This distinction is exactly as we might expect given the logic of default inheritance. By default, a word takes as its landmark the word on which it depends; but if it depends on more than one word, it takes the highest in the dependency chain. In other words, 'raising' is permitted, but 'lowering' is not. But even this generalisation has exceptions, in which a word depends on two words but takes the lower one as its landmark. One such exception is the German pattern illustrated by (20) (Hudson 2007:144):

(20) Eine Concorde gelandet ist hier nie.
     a Concorde landed is here never
     'A Concorde has never landed here.'

No doubt there are many other exceptions. The main point of this discussion is that general cognitive principles support Word Grammar in its claim that word order can, and should, be handled as a separate relation, rather than simply left implicit in the left-right notation of plain vanilla syntax. However, word order raises another fundamental issue where Word Grammar is different from other versions of cognitive linguistics.
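The default rule just described, that a word with several parents takes the highest of them as its landmark, can be sketched as follows. The data and helper functions are invented for illustration and ignore the exceptional cases such as the German example; depth is measured along the first-listed parent chain, which is a simplification.

```python
# Sketch of the default landmark rule: among a word's parents, choose the
# one closest to the sentence-root ('raising' permitted, 'lowering' not).
# Dependency data for "Which birds do you think sing best?" (simplified).

parents = {"which": ["do", "sing"],   # extractee of do, subject of sing
           "you": ["do", "think"],    # subject of both do and think
           "think": ["do"],
           "sing": ["think"],
           "birds": ["which"]}

def depth(word, root):
    """Distance from the sentence-root along the first-listed parent chain."""
    d = 0
    while word != root:
        word = parents[word][0]
        d += 1
    return d

def landmark(word, root):
    """Default landmark: the highest (closest-to-root) parent."""
    return min(parents[word], key=lambda p: depth(p, root))
```

On this toy data both which and you take do, the sentence-root, as their landmark, matching the discussion of (19).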
Why do the words of a phrase tend so strongly to stay together? If syntactic structure consists of nothing but dependencies between individual words (as I argued above in section 4), what is the glue that holds the words of a phrase together? For example, why is example (21) ungrammatical?

(21) *She red likes wine.

In dependency terms, this structure should be permitted because all the local word-word dependencies are satisfied, as can be seen in Figure 19. In particular, red should be able to modify wine because the order of these two words is correct (just as in red wine), and similarly she should be able to act as subject of likes, just as in She likes red wine. And yet the sentence is totally ungrammatical.


Figure 19: *She red likes wine and its dependencies

The notation allows an obvious explanation in terms of crossing lines (and indeed, this is very easy to explain when teaching syntactic analysis), but why should this be relevant? After all, if a network has no left-right dimension, then the intersection of lines is just an artefact of our writing-based notation. Moreover, intersecting arrows don't matter at all when (following the Word Grammar convention explained above) they are written below the words, so why should they

matter above the words? Phrase-structure explanations face similar objections when confronted with the need to be translated into network notation: if word order is shown, network-fashion, by a relation such as 'next', it is meaningless to ban mother-daughter lines that intersect. The explanation that we need, therefore, must not depend on any particular notation for linear order. The Word Grammar explanation is based on the Best Landmark principle (Hudson 2010:53), which is a principle for general cognition, and not just for syntax. When we want to define the location of something, we have to choose some other object as its landmark, and we always have a wide range of possibilities. For instance, imagine a scene containing a church, a tree and a bench. As a landmark for the bench, the church would be better than the tree because it is more prominent: larger, more permanent, more important for the local community and, because of that, more accessible in most people's minds. But the preferred landmark also depends on distance, because the closer things are, the more likely it is that their relationship is distinctive and has already been recorded mentally; so if the bench was next to the tree but a long way from the church, the tree would make a better landmark. The best landmark, therefore, is the one that offers the best combination of prominence and nearness. These considerations may in fact reduce to a single issue, identifiability: a prominent landmark is easy to identify in itself, while nearness makes the landmark's relation easy to identify. By hypothesis, this principle applies to memory by guiding our choice of landmarks when recording properties; so in remembering the bench in the scene just described, one of its properties would be its physical relationship to either the church or the tree, depending on which qualified as the best landmark.
The principle also applies to communication, so when we describe the bench to someone else we only call it the bench by the tree if the tree is the best landmark; and, when we hear this phrase, we assume that the tree is the best landmark (at least in the speaker's mind). Returning to syntax, the Best Landmark principle also explains why the words in a phrase hang together. As hearers, we always assume that the speaker has chosen the best landmark for each word, and as speakers, we try to satisfy this expectation. In syntax, prominence can easily be defined in terms of dependency, so the most prominent word for a word W is always the word on which W depends, its parent. Distance can be defined in terms of the number of intervening words, so W's parent will always be separated from W by as few words as possible, given the needs of other words to be near to their respective landmarks. Consequently, any word W should always be as close as possible to its parent, and the only words which can separate the two words are other words which depend (directly or indirectly) on W's parent. Now consider the example *She red likes wine. The troublesome word is red, which depends on wine, so wine should be its landmark; but it is separated from wine by likes, which does not need to be there. This order is misleading for a hearer, because it implies that likes must be the best landmark for red; so a speaker would avoid using it. In short, the ungrammaticality of *She red likes wine has just the same explanation as the infelicity of She sat on the bench by the church, if the bench is in fact much closer to the tree.
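The consequence stated above, that the only words which can separate a word from its parent are words which depend, directly or indirectly, on that parent, can be checked mechanically. The sketch below uses invented function names and encodes only this one condition; on the examples from the text it accepts She likes red wine and rejects *She red likes wine.

```python
# Illustrative check of the adjacency consequence of the Best Landmark
# principle: interveners between a word and its parent must themselves
# depend (directly or indirectly) on that parent.

def depends_on(word, ancestor, parent_of):
    """True if 'word' depends directly or indirectly on 'ancestor'."""
    while word in parent_of:
        word = parent_of[word]
        if word == ancestor:
            return True
    return False

def well_placed(sentence, parent_of):
    """Check every word's interveners against its parent."""
    for w in sentence:
        if w not in parent_of:
            continue  # the sentence-root has no parent
        p = parent_of[w]
        i, j = sorted((sentence.index(w), sentence.index(p)))
        between = sentence[i + 1:j]
        if any(not depends_on(b, p, parent_of) for b in between):
            return False
    return True

# she -> likes (subject), red -> wine (modifier), wine -> likes (object).
deps = {"she": "likes", "red": "wine", "wine": "likes"}
good = well_placed(["she", "likes", "red", "wine"], deps)
bad = well_placed(["she", "red", "likes", "wine"], deps)
```

In the bad ordering, likes intervenes between red and its parent wine, but likes does not depend on wine, so the check fails, just as the prose explanation predicts.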

8. Conclusion

The most general conclusion of this paper is that the Cognitive Assumption has important consequences for language structure. This is hardly surprising considering the history of our ideas about language structure. Our present ideas rest on four thousand years of thinking about language structure, very little of which was driven

by a desire to e"plore mental structures, or was informed by reliable information about general cognition. )ther influences were much more powerful, and not least the influence of writing. !he teaching of literacy has always been the driving force behind a lot of thinking about language structure, so it is hardly surprising that the technology of writing has profoundly influenced our thinking. !his influence is one of the themes of this paper, emerging in at least two areas* the difficulty of conceptually separating types and tokens, and the temptation to treat word order in terms of the left-right conventions of writing. 'nother maCor source of influence was the reCection of /atinbased grammar which encouraged many 'merican structuralists to look for simplified models in analysing e"otic languages@ the effect of this was the plain-vanilla synta" of the 'merican structuralists. &n some ways these two influences pull in opposite directions, but their common feature is that neither of them is concerned at all with cognition. ?imilarly, standardisation, another driving force behind the study of language, encourages us to think of language out there in the community rather than in any individuals mind@ and the publishers distinction between grammars and dictionaries suggests a similar distinction in language structure. !hese non-cognitive pressures have had a predictable impact on widely accepted views of how language works, but this is the tradition on which we all build $and within which most of us grew up%. 'nd as with any cultural change, it is all too easy to include unwarranted old beliefs in the new order. !o simplify, we have seen two different cognitive movements in the last few decades. &n 02A3, Chomsky launched the idea that a grammar could model competence, while also declaring language uni+ue, which in effect made everything we know about general cognition irrelevant to the study of language. 
As a result, Chomsky's claims about the nature of language structure derived more from the American Structuralists and mathematics than from psychology, so early transformational grammar can be seen as a continuation of Harris's theory of syntax combined with some cognitive aims. Similarly, Cognitive Grammar developed out of generative semantics, and construction grammar out of a number of current approaches (including HPSG). Unsurprisingly, perhaps, the view of language structure which these theories offer is rather similar to the traditions out of which they grew. No doubt the same can be said about Word Grammar, but in this case the history is different, and the resulting mix of assumptions is different. Moreover, the theory has been able to change a great deal since its birth in the early 1980s, and at every point cognitive considerations have been paramount, so it is not merely a happy coincidence that Word Grammar tends to be compatible with the assumptions of general cognitive science. The following list includes the main claims about language structure discussed in the paper, all of which appear to be well supported by what we know about general cognition:

- Morphology and syntax are distinct levels, so language cannot consist of nothing but symbols or constructions.
- Syntactic structure is a network of (concepts for) words, not a tree of phrases.
- This network provides the flexibility needed for analysing all the various kinds of construction.
- Semantic structure is also a network, and allows detailed analyses of both compositional and lexical meaning.
- The order of elements in syntax or morphology involves just the same cognitive mechanisms as we use in thinking about how things or events are related in place or time, notably the 'landmark' relation and a primitive 'next' relation.

References

Aronoff, Mark 1994. Morphology by Itself: Stems and inflectional classes. Cambridge, MA: MIT Press.
Barabási, Albert L. 2009. 'Scale-Free Networks: A Decade and Beyond', Science 325: 412-413.
Bloomfield, Leonard 1933. Language. New York: Holt, Rinehart and Winston.
Bolte, J. and Coenen, E. 2002. 'Is phonological information mapped onto semantic information in a one-to-one manner?', Brain and Language 81: 384-397.
Bybee, Joan 2010. Language, Usage and Cognition. Cambridge: Cambridge University Press.
Bybee, Joan and Slobin, Dan 1982. 'Rules and schemas in the development and use of the English past tense', Language 58: 265-289.
Casad, Eugene and Langacker, Ronald 1985. '"Inside" and "outside" in Cora grammar', International Journal of American Linguistics 51: 247-281.
Chomsky, Noam 2011. 'Language and Other Cognitive Systems. What Is Special About Language?', Language Learning and Development 7: 263-278.
Cienki, Alan 2007. 'Frames, idealized cognitive models, and domains', in Dirk Geeraerts & Hubert Cuyckens (eds.) The Oxford Handbook of Cognitive Linguistics. Oxford: Oxford University Press, pp. 170-187.
Croft, William 1998. 'Linguistic evidence and mental representations', Cognitive Linguistics 9: 151-173.
Croft, William 2007. 'Construction grammar', in Dirk Geeraerts & Hubert Cuyckens (eds.) The Oxford Handbook of Cognitive Linguistics. Oxford: Oxford University Press, pp. 463-508.
Croft, William and Cruse, Alan 2004. Cognitive Linguistics. Cambridge: Cambridge University Press.
Evans, Vyvyan and Green, Melanie 2006. Cognitive Linguistics: An introduction. Edinburgh: Edinburgh University Press.
Ferreira, Fernanda 2005. 'Psycholinguistics, formal grammars, and cognitive science', The Linguistic Review 22: 365-380.
Fillmore, Charles 1982. 'Frame semantics', in Anon (ed.) Linguistics in the Morning Calm. Seoul: Hanshin; Linguistic Society of Korea, pp. 111-138.

Fillmore, Charles and Atkins, Sue 1992. 'Towards a frame-based lexicon: the semantics of RISK and its neighbours', in Adrienne Lehrer & Eva Kittay (eds.) Frames, Fields and Contrasts: New essays in semantic and lexical organisation. Hillsdale, NJ: Erlbaum, pp. 75-102.
Fillmore, Charles, Kay, Paul, and O'Connor, Mary 1988. 'Regularity and idiomaticity in grammatical constructions: the case of let alone', Language 64: 501-538.
Frost, Ram, Deutsch, Avital, Gilboa, Orna, Tannenbaum, Michael, and Marslen-Wilson, William 2000. 'Morphological priming: Dissociation of phonological, semantic, and morphological factors', Memory & Cognition 28: 1277-1288.
Frost, Ram, Ahissar, M., Gotesman, R., and Tayeb, S. 2003. 'Are phonological effects fragile? The effect of luminance and exposure duration on form priming and phonological priming', Journal of Memory and Language 48: 346-378.
Geeraerts, Dirk and Cuyckens, Hubert 2007. The Oxford Handbook of Cognitive Linguistics. Oxford: Oxford University Press.
Gisborne, Nikolas 2008. 'Dependencies are constructions', in Graeme Trousdale & Nikolas Gisborne (eds.) Constructional approaches to English grammar. New York: Mouton, pp.
Gisborne, Nikolas 2009. 'Light verb constructions', Journal of Linguistics

Gisborne, Nikolas 2010. The event structure of perception verbs. Oxford: Oxford University Press.
Gisborne, Nikolas 2011. 'Constructions, Word Grammar, and grammaticalization', Cognitive Linguistics 22: 155-182.
Goldberg, Adele 1995. Constructions: A Construction Grammar Approach to Argument Structure. Chicago: University of Chicago Press.
Halle, Morris and Marantz, Alec 1993. 'Distributed morphology and the pieces of inflection', in Ken Hale & Samuel Keyser (eds.) The View from Building 20: Essays in linguistics in honor of Sylvain Bromberger. Cambridge, MA: MIT Press, pp. 111-176.
Halliday, Michael 1961. 'Categories of the theory of grammar', Word 17: 241-292.
Halliday, Michael 1966. 'Lexis as a linguistic level', in Charles Bazell, John Catford, Michael Halliday, & Robert Robins (eds.) In Memory of J. R. Firth. London: Longman, pp. 148-162.
Harris, Zellig 1951. Structural Linguistics. Chicago: University of Chicago Press.
Hockett, Charles 1958. A Course in Modern Linguistics. New York: Macmillan.

Holmes, Jasper and Hudson, Richard 2005. 'Constructions in Word Grammar', in Jan-Ola Östman & Mirjam Fried (eds.) Construction Grammars: Cognitive grounding and theoretical extensions. Amsterdam: Benjamins, pp. 243-272.
Hudson, Richard 1984. Word Grammar. Oxford: Blackwell.
Hudson, Richard 1990. English Word Grammar. Oxford: Blackwell.
Hudson, Richard 2007. Language Networks: The new Word Grammar. Oxford: Oxford University Press.
Hudson, Richard 2008. 'Word Grammar and Construction Grammar', in Graeme Trousdale & Nikolas Gisborne (eds.) Constructional approaches to English grammar. New York: Mouton, pp. 257-302.
Hudson, Richard 2010. An Introduction to Word Grammar. Cambridge: Cambridge University Press.
Hudson, Richard and Holmes, Jasper 2000. 'Re-cycling in the Encyclopedia', in Bert Peeters (ed.) The Lexicon-Encyclopedia Interface. Amsterdam: Elsevier, pp. 259-290.
Hutchison, Keith 2003. 'Is semantic priming due to association strength or feature overlap? A microanalytic review', Psychonomic Bulletin & Review 10: 785-813.
Jackendoff, Ray 1997. The Architecture of the Language Faculty. Cambridge, MA: MIT Press.
Jackendoff, Ray 2008. 'Alternative Minimalist Visions of Language', in R. Edwards, P. Midtlying, K. Stensrud, & C. Sprague (eds.) Chicago Linguistic Society 41: The Panels. Chicago: Chicago Linguistic Society, pp. 189-226.
Karmiloff-Smith, Annette 1992. Beyond Modularity: A developmental perspective on cognitive science. Cambridge, MA: MIT Press.
Karmiloff-Smith, Annette 1994. 'Précis of Beyond Modularity: A developmental perspective on cognitive science', Behavioral and Brain Sciences 17: 693-745.
Kay, Paul and Fillmore, Charles 1999. 'Grammatical constructions and linguistic generalizations: The What's X doing Y? construction', Language 75: 1-33.
Lakoff, George 1977. 'Linguistic gestalts', Papers from the Regional Meeting of the Chicago Linguistics Society 13: 236-287.
Lamb, Sydney 1998. Pathways of the Brain: The neurocognitive basis of language. Amsterdam: Benjamins.
Langacker, Ronald 1987. Foundations of Cognitive Grammar: Theoretical prerequisites. Stanford: Stanford University Press.

Langacker, Ronald 1990. Concept, Image and Symbol: The Cognitive Basis of Grammar. Berlin: Mouton de Gruyter.
Langacker, Ronald 2007. 'Cognitive grammar', in Dirk Geeraerts & Hubert Cuyckens (eds.) The Oxford Handbook of Cognitive Linguistics. Oxford: Oxford University Press, pp. 421-462.
Lindblom, Björn, MacNeilage, Peter, and Studdert-Kennedy, Michael 1984. 'Self-organizing processes and the explanation of language universals', in Brian Butterworth, Bernard Comrie, & Östen Dahl (eds.) Explanations for Language Universals. Berlin/New York: Walter de Gruyter, pp. 181-203.
Luger, George and Stubblefield, William 1993. Artificial Intelligence: Structures and strategies for complex problem solving. New York: Benjamin Cummings.
Marslen-Wilson, William 2006. 'Morphology and Language Processing', in Keith Brown (ed.) Encyclopedia of Language & Linguistics, Second edition. Oxford: Elsevier, pp. 295-300.
Nunberg, Geoffrey, Sag, Ivan, and Wasow, Thomas 1994. 'Idioms', Language 70: 491-538.
Owens, Jonathan 1988. The Foundations of Grammar: An Introduction to Mediaeval Arabic Grammatical Theory. Amsterdam: Benjamins.
Percival, Keith 1976. 'On the historical source of immediate constituent analysis', in James McCawley (ed.) Notes from the Linguistic Underground. London: Academic Press, pp. 229-242.
Percival, Keith 1990. 'Reflections on the History of Dependency Notions in Linguistics', Historiographia Linguistica 17: 29-47.
Peterson, Robert R., Burgess, Curt, Dell, Gary, and Eberhard, Kathleen 2001. 'Dissociation between syntactic and semantic processing during idiom comprehension', Journal of Experimental Psychology: Learning, Memory, & Cognition 27: 1223-1237.
Pollard, Carl and Sag, Ivan 1994. Head-Driven Phrase Structure Grammar. Chicago: Chicago University Press.
Reay, Irene 2006. 'Sound Symbolism', in Keith Brown (ed.) Encyclopedia of Language & Linguistics, Second edition. Oxford: Elsevier, pp. 531-539.
Reisberg, Daniel 2007. Cognition: Exploring the Science of the Mind. Third media edition. New York: Norton.
Robins, Robert 2001. 'In Defence of WP (Reprinted from TPhS, 1959)', Transactions of the Philological Society 99: 114-144.
Ross, John 1970. 'On declarative sentences', in Roderick Jacobs & Peter Rosenbaum (eds.) Readings in English Transformational Grammar. Waltham, MA: Ginn, pp. 222-272.

Rosta, Andrew 1994. 'Dependency and grammatical relations', UCL Working Papers in Linguistics 6: 219-258.
Rosta, Andrew 1997. English Syntax and Word Grammar Theory. PhD dissertation, UCL, London.
Rosta, Andrew 2005. 'Structural and distributional heads', in Kensei Sugayama & Richard Hudson (eds.) Word Grammar: New Perspectives on a Theory of Language Structure. London: Continuum, pp. 171-203.
Sadock, Jerrold 1991. Autolexical Syntax: A theory of parallel grammatical representations. Chicago: University of Chicago Press.
Sprenger, Simone, Levelt, Willem, and Kempen, Gerard 2006. 'Lexical Access during the Production of Idiomatic Phrases', Journal of Memory and Language 54: 161-184.
Stump, Gregory 2001. Inflectional Morphology: A Theory of Paradigm Structure. Cambridge: Cambridge University Press.
Sugayama, Kensei 2002. 'The grammar of be to: from a Word Grammar point of view', in Kensei Sugayama (ed.) Studies in Word Grammar. Kobe: Research Institute of Foreign Studies, Kobe City University of Foreign Studies, pp. 97-111.
Tallerman, Maggie 2009. 'Phrase structure vs. dependency: the analysis of Welsh syntactic soft mutation', Journal of Linguistics 45: 167-201.
Taylor, John 2007. 'Cognitive linguistics and autonomous linguistics', in Dirk Geeraerts & Hubert Cuyckens (eds.) The Oxford Handbook of Cognitive Linguistics. Oxford: Oxford University Press, pp. 566-588.
Tesnière, Lucien 1959. Éléments de syntaxe structurale. Paris: Klincksieck.
Wierzbicka, Anna 1985. Lexicography and Conceptual Analysis. Ann Arbor: Karoma.
Wierzbicka, Anna 1996. Semantics: Primes and universals. Oxford: Oxford University Press.
Zwicky, Arnold 1985. 'The case against plain vanilla syntax', Studies in the Linguistic Sciences 15: 1-21.
