One of the most basic quarrels about language and its origins has flared up again in a particularly fierce way.
The question at hand: do the things that all languages have in common reflect certain universals of human thought and experience, or do they reflect the workings of a universal language faculty? Fifty years ago a third answer dominated: languages are learned from scratch and have no universals. That position, however, is now so out of favor that it is scarcely proposed in the current quarrel.
The latest dispute arises from a stark denial that languages have any peculiar grammatical universals of their own. It amounts to a total rejection of Chomsky’s core idea that the syntax of any individual language reflects an instance of a universal grammar (UG). Nicholas Evans and Stephen C. Levinson have published a paper in the wonderful journal Behavioral and Brain Sciences, “The Myth of Language Universals: Language diversity and its importance for cognitive science” (uncorrected final draft available here). Published with the paper was a series of responses, including many sharp retorts from generative grammarians who still firmly believe in UG. They score their points, but the very fact that the issue has returned underlines a basic truth: after fifty years of proclaiming the existence of a UG, we still don’t know what it is. All in all the paper and responses make for a brutal slugfest.
Evans and Levinson (E&L) cite, for example, polysynthetic languages as a counterexample to the supposed universality of recursion:

polysynthetic languages – which typically have extreme levels of morphological complexity in their verb, but little in the way of syntactic organization at the clause level or beyond – show scant evidence for embedding. [p. 442]
I’ve heard Derek Bickerton argue that the absence of recursion in any particular instance is no proof that it isn’t part of UG. Languages do not have to avail themselves of every available feature. I regret that Bickerton was not one of the respondents to this paper, but although his point is valid it leaves us wondering just what the universal grammar is.
A number of generative grammarians responded to the paper with the basic argument that examples of diversity are no proof of the absence of underlying universal rules. It is an undeniable point, but it is not proof of UG’s existence.
Normally in science the burden falls on whoever proposes that something exists to demonstrate the claim. It is very difficult to prove a negative (that there is no spaghetti on any of the planets in the Andromeda galaxy, say), so the proposer must show that the thing does exist. If you claim there is pasta in outer space, prove it. Of course, many linguists are trying to demonstrate UG’s existence, and they may mean only that the effort is still worth pursuing. Finding a UG that really works would probably rank as the most impressive scientific achievement since Newton produced an equation that explained the motions of the solar system, so the project should not be abandoned lightly. But E&L make it look increasingly frayed.
Most surprising to me was the tepid response of two of the biggest guns in the generative army. Steven Pinker and Ray Jackendoff (P&J) responded with a comment titled, “The reality of a universal language faculty.” They specify their UG hypothesis:
the human brain is equipped with circuitry, partly specific to language, that makes language acquisition possible and that constrains human languages to a characteristic design. [465]
The evidence they offer is entertainingly negative. For example, they imagine a language called Abba:
Abba specifies grammatical relations not in terms of agent, patient, theme, location, goal, and so on, but in terms of evolutionarily significant relationships: predator-prey, eater-food, enemy-ally, permissible-impermissible sexual partners, and so on. All other semantic relations are metaphorical extensions of these. [465]
But in reality there is no such language. Why not? Why does it even seem absurd? Why do languages, for all their diversity, limit themselves to only some relationships while others are unlearnable? P&J say the absence supports their hypothesis that languages are limited to a characteristic design, but they do not show that the limit comes from language-specific circuitry. Their example does not contradict the E&L hypothesis that the commonalities of language come from general cognitive powers. P&J put the commonality in high-level syntactic terms – agent, patient, theme, location, goal – but these things are concrete and perceivable, so they do not seem to require syntactic thought. Evolutionarily significant relationships, on the other hand, are abstractions that are not perceived at a glance the way spatial relationships are. After reading P&J I have a strong sense that the searchers for UG will have a hard time finding a negative example that limits language in some way that perception is not limited.
Michael Tomasello’s gleefully titled response to E&L, “Universal grammar is dead,” goes for the jugular:
I am told that a number of supporters of universal grammar will be writing commentaries on this article. Though I have not seen them, here is what is certain. You will not be seeing arguments of the following type: I have systematically looked at a well-chosen sample of the world’s languages, and I have discerned the following universals… [470]
Yet he does acknowledge that there are a number of universalities among diverse languages: e.g., reference to “agents of action,” “patients of action,” “possessors,” and “locations.” Tomasello asks why not just call this universal grammar, but answers that this kind of thinking is much older than language:
… historically, universal grammar referred to specific linguistic content, not general cognitive principles, and so it would be a misuse of the term. It is not the idea of universals of language that is dead, but rather, it is the idea that there is a biological adaptation with specific linguistic content that is dead. [471]
That’s the dispute. Are there universals with “linguistic content” or, as P&J put it, “circuitry, partly specific to language”? If yes, then an account of linguistic origins must include an explanation of where that circuitry came from. If no, then we have to understand how we got to be so smart and how we discovered how to translate our knowledge into language.
At the protolanguage conference in Torun, Poland last September it seemed clear that the momentum is heavily behind the thesis that we discovered how to translate our thinking into basic language and that cultural evolution accounts for the rest. I’m not sure that the origin of language was that straightforward, but it has become obvious that many non-linguists do not recognize how shaky UG’s position is when they try to contribute to the discussion of the evolution of language.
The misconception that the differences between languages are merely superficial, and that they can be resolved by postulating a more abstract formal level at which individual language differences disappear … pervades a great deal of work done in … theories of language evolution … and just about every branch of the cognitive sciences. … A great deal of theoretical work within the cognitive sciences thus risks being vitiated, at least if it purports to be investigating a fixed human language processing capacity, rather than just the particular form this takes in some well-known languages like English and Japanese. [429-30]
I have read many papers about language origins in which I see the unquestioned assumption that UG is real, and I often wonder if their authors know just how dubious a thread they are grasping.
Two points:
1) "agent, patient, theme, location, goal" are not "high-level syntactic terms", they are semantic roles that interface with syntactic items (nouns, verbs, adjectives).
2) Universals don't necessarily have to be things "in all languages" but could potentially be things that language has the potential for. Jackendoff, for example, does believe in a degree of nativism, but considers UG a "cognitive toolbox" — whatever the brain innately brings to the table for learning language. The results of what manifests might be vastly different based on input, but the brain has to be doing something to acquire it (whether "general" or "specific"). This is very different from a Chomskyan approach.
I'm biased as Jackendoff's student, but as I've recommended several times: it is worth reading his material to discern the very large and distinct differences between his and Chomsky's beliefs. "Generative grammar" (and its proponents) is not as homogeneous as it's often made out to be.
Posted by: Neil | December 07, 2009 at 01:20 AM
People tend to misinterpret "grammar" here, too. All languages have lexical items and all languages are decomposable. That seems trivial at first, but has very far reaching implications.
Posted by: Olive | December 07, 2009 at 04:55 PM