Sorry about the interruption in my discussion of Daniel Dor’s book, The Instruction of Imagination. Let me assure you that the problem was entirely due to life’s trivialities and had nothing to do with the book itself, which is a very important, fresh look at language. One of the real pleasures of this blog is that it gets me to read books like this one.
Dor’s thesis is twofold: language is socially constructed and it is used “to instruct imaginations.” This latter point means that we use language to bridge the gap between the speaker’s and listener’s experience, so that the listener can imagine what the speaker is saying. This may sound familiar, but Dor is radical in his insistence on the social construction part. Consider, for example, how Dor accounts for our knowledge of word meanings.
At the opposite end from Dor stands Chomsky, who insists that word meanings come from an innate collection of symbols. This theory is so perverse that if I were to spell it out in detail, you would suppose I was attacking the airiest of straw men. A more moderate version of innate symbols is Pinker’s notion that we are born with some symbols and then develop more in accordance with experience. Common sense, of course, tells us that we get our words from the world around us. Hence, we learned to say men, the French learned hommes, and the Zanzibaris watu.
The trouble with common sense is that you cannot program a computer to learn language just by listening. You must provide your computer with a starter dictionary of symbols. And if computers require a built-in dictionary, the argument runs, we must need one too. After all, the brain is just a form of computer.
Then comes the next problem. Consider a sentence like Peter set a trap. My American Heritage Dictionary has 14 different definitions for set when used as a transitive verb. These 14 definitions in turn have a variety of sub-definitions, giving us a minimum of 40 possible meanings in this syntactic context. Trap as a noun is simpler, but still has 15 possible meanings, yielding 600 (40 × 15) possible meanings for what Peter did.
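The arithmetic behind that explosion is easy to sketch. Here is a toy calculation, assuming (per the American Heritage tallies above) 40 transitive-verb senses of set and 15 noun senses of trap; the sense labels themselves are invented placeholders:

```python
from itertools import product

# Hypothetical sense inventories: the counts (40 and 15) come from the
# dictionary tallies cited above; the labels are made-up placeholders.
set_senses = [f"set_{i}" for i in range(1, 41)]    # 40 transitive-verb senses
trap_senses = [f"trap_{i}" for i in range(1, 16)]  # 15 noun senses

# A naive disambiguator faces the full Cartesian product of readings.
readings = list(product(set_senses, trap_senses))
print(len(readings))  # 40 * 15 = 600 candidate readings
```

With a third ambiguous word the product would multiply again, which is why this kind of blind enumeration scales so badly.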
This multiplicity drives programmers mad, but as Dor points out, language users seem quite oblivious to the problem. They go straight for the meaning appropriate to the situation. How do they master this when computers are so troubled by it? Dor denies that humans consider each word separately. Words are part of a web held together by experiences (both real and imagined) of language users. Thus, when we hear Peter set we do not wonder which of the many definitions of set applies here. Instead, we see what experiences are evoked as the speaker continues. We cannot tell from the sentence exactly what kind of trap is referred to, but we can tell that set here means Peter caused a trap (real or metaphorical) to be in proper working order. It never even occurs to us to rule out the definition of set meaning to cause to sit.
How do we manage? Although words are mostly ambiguous, phrases point to “a single experiential cluster around an experiential anchor.” [p. 84] I take that to mean the phrase draws our attention to a general range of experience that is associated with (anchored to) a more specific experience. Let’s say that for me trap evokes a mousetrap, and setting one has me imagining baiting a mousetrap with a bit of cheese and putting it down. That’s my private response, but in social usage I know that setting a trap can be a broader, less precisely detailed event. Other listeners might imagine setting a police trap in which plain-clothes detectives hang around waiting for a kidnapper to remove a briefcase from a trash bin. Even though we cannot be sure what the private experience of any particular listener will be, most of us will grasp the general idea of acting and waiting for something to be caught.
Dor proposes that for any one of us, the associations with a particular word might be so rich that we should use an encyclopedia rather than a dictionary, but we take it for granted that the speaker is not appealing to all our associations. We are trying to get the broad outlines.
This system can work if we are busy interpreting as we listen, trying to stay on track. We hear the name Peter and expect to learn something about the fellow named Peter. Next comes set so we know Peter set something and we wait to learn what. It does not even occur to us to notice that if the subject were the sun, we could hear The sun set and never wonder which sense of set applied.
All kinds of words might come after Peter set: … off for Shanghai, … a clock, … a deadline, … a trap, … the world on fire, etc. Each of these possibilities points the listener along a different set of experiences. We only pay attention to what is actually said and do not wonder if the kind of setting Peter did is the kind involved in setting the world ablaze. We are told Peter set a trap and we imagine the kinds of things we know to be part of that activity.
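This word-by-word narrowing can be mimicked with a toy sketch. The patterns and their experiential glosses below are invented for illustration; the point is only that each incoming word prunes the set of familiar usages still in play, rather than triggering a separate dictionary lookup per word:

```python
# Toy inventory of familiar usage patterns (invented for illustration),
# each paired with the rough experience it evokes.
patterns = {
    "Peter set off for Shanghai": "departing on a journey",
    "Peter set a clock": "adjusting a device",
    "Peter set a deadline": "fixing a time limit",
    "Peter set a trap": "preparing a trap and waiting",
    "Peter set the world on fire": "causing a sensation",
}

def narrow(words_so_far):
    """Return the patterns still consistent with the words heard so far."""
    prefix = " ".join(words_so_far)
    return {p: m for p, m in patterns.items() if p.startswith(prefix)}

# Hearing "Peter set a trap" one word at a time steadily prunes the field.
for n in range(1, 5):
    heard = "Peter set a trap".split()[:n]
    print(heard, "->", len(narrow(heard)), "patterns remain")
```

After “Peter set” all five patterns survive; “a” eliminates the Shanghai and world-on-fire readings; “trap” leaves exactly one, with no per-word sense enumeration ever performed.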
Dor’s account of how we choose the particular meaning of a word from so many possibilities is quite unlike the ones given by most semanticists and programmers. The standard version assumes that word meanings are in the heads of speakers and listeners and that we narrow down the choices until we find the best one. (This procedure is used by the IBM Watson application that beat champions at Jeopardy!) As English speakers, we all have a personal, mental dictionary that includes the word boy, but the meaning is still in our heads. Dor says no, we pluck the meaning from the way it is being used by the speaker, and we are forced to try to understand the speaker’s social setting as well as the words being used.
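The “narrowing down” procedure Dor rejects can be sketched in miniature. The classic form is Lesk-style overlap scoring: rank each dictionary sense by how many words its gloss shares with the sentence context. The senses and glosses below are invented stand-ins, not real dictionary entries:

```python
# Invented mini-dictionary: three candidate senses of "set" with toy glosses.
glosses = {
    "set/adjust": "put a device into a desired state or position",
    "set/place": "cause something to sit or rest somewhere",
    "set/prepare": "make a device such as a trap ready to operate",
}

def best_sense(context_words):
    """Pick the sense whose gloss shares the most words with the context."""
    context = set(context_words)
    return max(glosses, key=lambda s: len(context & set(glosses[s].split())))

print(best_sense("Peter set a trap".lower().split()))  # -> set/prepare
```

The word trap in the context overlaps with the “prepare” gloss and tips the score, which is the whole trick: meaning selection as in-the-head scoring over a stored sense list, exactly the picture Dor argues against.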
I am suddenly reminded of a long-ago conversation where confusion came up over the word boy. I was at a party where one guest was telling amusing stories about a household servant named Jason. He then referred to Jason in a newcomer’s presence and I offered as an explanation, “Jason is their man.” Immediately came the correction, “Jason is our boy.” I was confused. Surely Jason was grown... then I got it, and immediately loathed the speaker.
As Professor Henry Higgins said [sorta], it is impossible to speak without giving away the social clues that make some other chap despise you. But why should that be if our mental dictionaries are all the same?
Dor’s account of language as a social network, in which each person is struggling to understand what is coming over the web of usages, makes sense to me, and makes sense of language’s social tangle. But logic alone is metaphysics. I cast about for an armchair test of Dor’s idea and came up with a sentence: The river bank was robbed this morning.
I hope the last half of that sentence came as a bit of a surprise and made you briefly stumble until you realized the opening referred not to the bank of a river but to a bank beside a river. Most garden-path sentences I know lead the reader astray because of a confusing syntactic structure, but here we have a garden-path error caused by a misdirecting definition. The river bank sets up the reader to imagine one group of experiences, which are then contradicted by the imposition of a different kind of experience. The example offers good evidence that we are interpreting as we go, and that we stay with one interpretation until overwhelmed by confusion. Under the best of circumstances we never notice all the irrelevant dictionary meanings.
Dor does not deny that the head is involved in this effort to understand. He grants that understanding language “requires a wide array of cognitive capacities” and that some of these abilities are “probably partially” inherited traits. But all the words and their associations have been learned, and we must follow along attentively if we are to understand. There are no innate words or symbols or mentalese. I like this guy.