The Simpler Syntax discussed on the blog all this week appears to have been developed in response to
overwhelming evidence from comparative ethology [i.e., animal psychology] that the behavior of many animals must be governed by combinatorial computation [i.e., putting 2 and 2 together]. … Thought [may be] highly structured in our nonlinguistic relatives—they just cannot express it. Combinatorial thought could well have served as a crucial preadaption for the evolution of … human language. (p. 416)
Generative grammar, as founded by Noam Chomsky, is famously Cartesian in its preference for innate knowledge, its view that there is an unbridgeable divide between animal thought and linguistic thought, and its search for mathematical completeness in its formulations. These old ideas took specific form in Chomsky’s neo-Cartesianism:
- Innateness: syntactic structures are universal and built into everyone. Although it is patently obvious that spoken languages have many different forms, there is a common, underlying logic to them all.
- Difference: syntactical speech is so different from anything found in the animal world that there is no point in looking to evolution for any insights into its nature.
- Completeness: meaning emerges from the individual words and the syntactic rules that combine them.
Yesterday's post (here) noted some problems with the completeness theory. There are many kinds of sentences whose meaning can only be found by looking elsewhere. Sometimes these meanings can be imported from the immediately preceding sentence, but often they seem to require an understanding that reaches beyond both the rules and a sentence's immediate syntactical context.
The “Simpler Syntax Hypothesis” essay in the September issue of Trends in Cognitive Sciences by Peter W. Culicover and Ray Jackendoff (abstract here) is not a complete schism, but it does differ significantly from the orthodox approach.
- Innateness: most syntactic structures reflect a more general ability to combine ideas that is shared with many other animals, and many specifics of a language’s syntax are learned by breaking down “stored pieces of structure.” (See: Idiomatic Syntax)
- Difference: humans were extensively preadapted for syntactic speech. Their ancestors had already developed a rich ability to plan, interact socially, and maneuver their way successfully through a complex, novel, and changing environment.
- Completeness: it is often not necessary for syntactic rules to completely encompass a sentence’s meaning, because frequently the meaning is not fully derived from the syntax.
In the end, Simpler Syntax will probably not prove completely compatible with an attention-based syntax. I suspect the “combinatorial computation” of primates will turn out to mean some passive calculation based on symbolic inputs rather than a more active, attention-driven experience.
An example of the difference in approaches can be seen in some of the essay’s passages about “hidden elements,” material crucial to meaning but omitted from the utterance. Yesterday’s post referred to a sample sentence, “The trolley rattled around the corner,” which the essay’s authors translate as, “The trolley went around the corner, rattling.”
If you find the meaning in the first sentence through some sort of symbol analysis, rattled is a peculiar choice of verb: the sentence is about movement, yet it uses no verb of motion. If, however, you find meaning through experience, then the reference to a sound is perfectly coherent. The trolley’s noise can be loud and insistent, while the motion itself is uninteresting—“rattled around the corner” is much more vivid than “went around the corner.” There are many usages of rattle that focus on the perception—a rattletrap; she’s rattling around up in the attic; he’s just rattling his cage; death rattle—so it seems unnecessary to invent a rule (motion can be expressed by sound name + a path) for this one usage. Besides, sound is hardly the only perception that can stand for motion: the fireflies flickered fitfully on down the field; as I waited for the light to change, a garbage truck pee-yooed past my nose. Do syntacticians need special rules for each of the senses?
The authors also discuss a series of sentences that end with somebody saying, “Yeah, scotch.” The various examples show how the full meaning of that same reply can differ with context. They give one example that they say offers “no plausible interpretation.”
A: Ozzie doubts that Harriet’s been drinking.
B: Yeah, scotch.
If you think of language as the manipulation of symbols, B’s response seems bizarrely cryptic, but if you think in broader terms, the reply makes sense. Any competent actor given a script with that bit of dialog could read B’s line so that every audience member understood that, while Ozzie has doubts, B has none. Harriet has been drinking scotch and Ozzie is in denial about it.
In yesterday’s post, when discussing ellipsis, I said that opening a conversation with, “Yeah, scotch” would be puzzling. But ellipsis can begin a verbal exchange. If somebody actually came up to me and said, “Yeah, scotch,” I would probably take it to be a humorous way of asking for a drink by acting as though I had offered one.
The active effort people make to understand things often strikes me as the element most absent from the study of formal syntax. People often over-interpret things, finding messages in the entrails of birds and the movement of the stars, while formal logic bends the other way, under-interpreting words, stripping them of experience, context, and memory. So, in the end, I suspect I will not be completely happy with a simpler syntax that is computational rather than attention-based, but I feel this week has given us some real progress toward
- getting rid of hidden elements (This step is not only more compatible with an attention-based approach to language, but also fits in more with the way speech is so often not subject to literal interpretation),
- understanding a step-by-step evolution of syntax (Formal syntax is so complex that it is hard to imagine it depending on just one or two mutations, but it is so intertwined that selection of the rules on a one-by-one basis looks hopeless too), and
- identifying a link between syntax and animal intelligence (Animals are capable of combinatorial thinking as well as paying attention. The key differences between speaking humans and non-speaking apes appear to be community structures that are peer-assigned rather than dominance-based, and a curiosity about things in the world at large. These two oddities permit the existence of an attention triplet.)
The table below summarizes three approaches to syntax and what they mean for theories of speech origins.