Anyone familiar with this blog knows of my frustrations with Noam Chomsky. He seems to be so smart and logical, and yet never says anything that sounds half-way right or even usable. For example, he recently published an essay (Catalan Journal of Linguistics, “Some Puzzling Foundational Issues”) that considers the evolution of language, by which Chomsky means “the evolution of the Language Faculty.” Is that what most people mean when they consider the evolution of language? I doubt it. Any ordinary person picking up an essay on “the evolution of language” would probably expect an account of how people began to use words. The biggest conundrum would be how people reached agreement as to which words meant which things. How have I and my neighbors come to share the word shoe and its meaning? Or whatever the first word was?
This obvious question involves various difficulties and the answer depends on more factors than are first evident. Is one of those factors a “Language Faculty”? It would seem plausible, but Chomsky defines his concept in such a way that it is irrelevant to the study of language origins. Yet I found his latest piece interesting because it reveals that Chomsky is beginning to admit to the usual difficulties when he tries to make his logic match reality. He has had to rethink a key operation in his theory, something called Merge.
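For readers who have never met the term, Merge is standardly glossed as the simplest possible combinatorial operation: take two syntactic objects and bundle them into a new, unordered object, so that repeated applications build nested, binary structures. Here is a minimal sketch in Python; the set representation follows the usual textbook gloss, while the sample words and variable names are my own illustration, not Chomsky’s notation.

```python
# A minimal sketch of Merge as it is usually glossed: combine two syntactic
# objects into a single unordered object. The example words and the
# frozenset representation are illustrative assumptions, not Chomsky's own.

def merge(x, y):
    """Combine two syntactic objects into one unordered object."""
    return frozenset([x, y])

# Build a structure for "the cat chased the mouse" bottom-up by repeated Merge.
dp_subject = merge("the", "cat")          # {the, cat}
dp_object  = merge("the", "mouse")        # {the, mouse}
vp         = merge("chased", dp_object)   # {chased, {the, mouse}}
sentence   = merge(dp_subject, vp)        # {{the, cat}, {chased, {the, mouse}}}

print(sentence)
```

The austerity of the operation is the point: everything distinctive about any particular language has to come from somewhere else, which, as we will see below, is exactly where Chomsky now locates all the trouble.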
Anybody who has studied Chomsky’s work has seen the pattern many times. Chomsky proposes some simple principles about language. Then he offers a brilliant, and simple, solution to explain how the principles can be used to generate all the sentences of a language, at which point he and his disciples begin examining empirical evidence to demonstrate the simplicity and correctness of his idea. At first, the work claims success. Then difficulties appear and the simple proposal must be defended by increased claims that the difficulties can be ignored. Then the difficulties are too blatant to be waved away, and Chomsky announces a change of mind. The simple solution was wrong, a new solution has been found, and the cycle begins again.
Chomsky has won many admirers for his ability to change his mind, but he has never changed his initial principles of language—e.g., language structure reflects a computation; all languages rest on the same invariant rules; the ability to compute sentences according to the invariant rules rests on an inborn faculty of the mind. If we take another look at Chomsky’s work, we might say that instead of regularly changing his mind, he has never changed it. From the beginning of his published work (in the 1950s) until today, over 60 years later, he has stuck by the same principles even though he has never found a way to make them work. All of the practical progress made in machine composition and translation has followed other techniques.
In his paper on “Some Puzzling Foundational Issues,” Chomsky is once again changing his mind without having to change his premises. The hand waving is getting desperate.
What might an alternative set of language principles look like? We might say that language allows communities to consider shared topics. It does this by drawing attention to whatever seems relevant. Different communities may be interested in widely differing topics and may have quite distinct notions of relevance. Thus, different languages can be profoundly unlike one another. Yet they will all share something abstract, a common method of directing and keeping attention on a topic and its relevant details.
There are other descriptions of language. I have given the one I happen to believe as a result of working on this blog, but my point is not that I am right but that there are more possibilities than are found in Chomsky’s philosophy.
The aim of Chomsky’s latest paper is to present what can be rescued from the wreckage of his latest crash, and frankly I don’t care what his salvage operation looks like. I’m more interested in the elaborate hocus pocus by which he waves away rivals that might question his unchanging principles.
The biggest rival to Chomsky, because it is the most commonsensical, is the idea that the rules of language can be known by studying languages as they are used. Thus, if you want to understand French, read and parse French documents, or listen to French conversations. Sixty years ago, Chomsky got a lot of respectful attention in the intellectual press by denying this point. He said that a sentence in French, English or any language had a surface structure that could be directly examined. But, he argued, not everything can be explained by surface structure, so we have to examine a deep structure, and that deep structure follows the same rules, no matter what the language.
To the objection, what about grammatical mistakes—if we have some innate set of rules, why do we speak so badly?—Chomsky distinguished between linguistic competence (the rules we know) and linguistic performance (the errors we make as a result of a variety of frictions and impediments to perfect behavior).
Surface and deep structure, linguistic competence and performance—with these tools Chomsky set out to conquer linguistics. By the mid-1960s he had abandoned the search for a common deep structure in favor of the quest for a Universal Grammar (UG) that would generate sentences in any language. By now a complex set of interfaces has replaced the concepts of competence and performance. The UG generates the formal sentences that define an Internal (I) Language. A second language, the External (E) Language, is made public by speech, writing, hand signs and other means. The language we find around us is generated by the UG in the form of I-Language sentences, which are then translated by the body’s sensory-motor system into the more familiar E-Languages of the world.
The problem with this account has always been in the translation of universal to communal languages. It did not work when the move was from deep to surface structures, and it still isn’t working. What has changed is that Chomsky no longer even pretends to be looking for a way to get from one form to the other. For many years, Chomsky would introduce the I-Language/E-Language distinction in his lectures, say he was interested only in I-Language, and then proceed to ignore language as it is encountered everywhere. That’s why I sat up when I saw in this paper that he now attempts, at least partially, to justify ignoring E-Language.
Here is his presentation:
- First, he looks at the interfaces where the language is handed off for further processing: the core I-language (internal language) generates solely representations on one interface: C-I (Conceptual Intentional interface), essentially a kind of language of thought. And that’s probably close to, or probably we will discover totally invariant among human beings. [There it is, the restatement of the all-languages-are-alike doctrine he has preached since the middle of the 20th century.]
- Next, Chomsky endows E-Language with all the properties that most people would like to understand when they think about language: It seems that the complexity, the variety of language arise overwhelmingly if not completely from the ancillary operations which lead to externalization which we know draws upon our sensory motor system. And it’s pretty natural that that should be complex and vary because you have to match two systems that essentially have nothing to do with one another. [Tada! This is a huge concession: Chomsky now grants that E-Languages are radically unlike I-Languages.] The internal system seems to have arisen pretty suddenly along with modern humans [Note: modern humans did not arise suddenly but gradually over a couple of million years!] and the SM (Sensory-Motor) system have been around for hundreds of thousands, in some cases millions of years, and have absolutely nothing to do with language. So when we try to connect these two things, it’s necessarily going to be a complex operation, and in fact the external operations, although they certainly follow principles and rules of a restricted variety they nevertheless violate just about any principle of computational complexity one can imagine, and they do vary a lot, change a lot, generation to generation and so on. [Another tada.]
- Finally, after admitting that E-Languages bear little to no resemblance to I-Language, he throws E-Language out the window and declares for the study of I-Language: So I’ll just assume that, admittedly without any arguments – it’s been discussed elsewhere – and take a look at the generative mechanisms for the core I-Language mapping to C-I.
Usually, I am stuck between frustration and admiration when Chomsky pulls one of these maneuvers. The frustration is over the use of con-man tricks, pulling notions from thin air and saying they are what matters, not the money you are being asked to contribute to the cause. Yet I have always been impressed by the audacity of the man, the sheer gall of waving aside questions that generations have wondered about, as though they were a mere bag of shells (to quote Ralph Kramden).
This time, however, I have a happier response; I smell victory here. Chomsky seems to have finally realized he cannot forever dismiss his critics and their data. At least he now concedes that we cannot learn French without examining the French language as a thing in itself. Furthermore, he grants that linguistic performance (to use an abandoned bit of jargon) depends on mastering sensory-motor skills. True, he also says the sensory-motor system has absolutely nothing to do with language, but this assertion is empirically false and can be shown so. Starting with Italian linguistic philosophers and then spreading to points around the globe, a school has developed that argues the various grammars for individual, natural languages are based on how words and phrases focus and redirect attention (attention is part of the sensory-motor system). If that school is right and natural language structure depends on sensory-motor powers, the whole of I-Language becomes irrelevant and can be ignored as though neither a UG nor an internal language even existed. In the end, it is surface structure and the utterances of the E-Languages that matter.
Chomsky has had a long career and a much celebrated one. Along the way many people have tried to adapt one or another of his I-Language systems to their own field of interest and eventually have discovered the emptiness of the metaphor. It is worth remembering the practical question that kicked off these studies: how can you build a machine that can generate any of the sentences found in a particular language and not generate any sentences not found in the language? There is no point in going back to the drawing board to better define an I-Language if the sentences of every public language “violate just about any principle of computational complexity one can imagine, and they do vary a lot, change a lot, generation to generation.” No machine will generate all and only the sentences of any particular language if it sticks to a simple computation based on invariant syntax.
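To make that practical question concrete, here is a sketch, under the deliberately naive assumption that a language could be pinned down by a tiny context-free grammar, of a generator that enumerates every sentence such a grammar licenses and nothing else. The rules and vocabulary are invented for illustration; the catch, as the quoted passage concedes, is that real public languages refuse to stay this well-behaved.

```python
import itertools

# A toy context-free grammar. The rules and vocabulary are invented for
# illustration and stand in for the "invariant syntax" the post questions.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["cat"], ["mouse"]],
    "V":   [["chased"], ["saw"]],
}

def expand(symbol):
    """Yield every terminal word sequence derivable from a grammar symbol."""
    if symbol not in GRAMMAR:            # a terminal word: yield it as-is
        yield [symbol]
        return
    for rule in GRAMMAR[symbol]:
        # Expand each symbol in the rule, then combine the expansions.
        for parts in itertools.product(*(expand(s) for s in rule)):
            yield [word for part in parts for word in part]

# Enumerate "all and only" the sentences this grammar allows.
for words in expand("S"):
    print(" ".join(words))
```

Run it and you get the thirty-two sentences the rules allow; add the irregularities, idioms, and generational drift of an actual E-Language, and the neat “all and only” boundary dissolves.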