I was traveling last week and have not been focusing on my blog, but I do want to draw everyone’s attention to a paper published in the current Biolinguistics online journal. Titled “The Biological Nature of Human Language,” it is signed by 14 authors, many of them distinguished linguists, and begins with an acknowledgment of the contribution of three others. So I suppose this is as close as we are going to get to a consensus from generative grammarians as to where they stand today in thinking about the nature of language. The big news seems to be the shift in emphasis from speech as a symbol-processing operation to speech as a biological process.
Remember, it is a shift of emphasis, not of definition. Interest in the biological side of language traces back at least to 1967, when Eric Lenneberg published his classic Biological Foundations of Language, but the emphasis remained on the mechanical manipulation of symbols. Now the emphasis has shifted, but symbol processing is not forgotten.
Indeed, the paper’s first section after the introduction seeks to strike a balance between biology’s sensory-motor system, which perceives and produces language, and the conceptual-intentional system, which manipulates symbols. Both systems are discussed in terms of biology, but I cannot help noticing that it is very easy to get a computer to perform the tasks of the conceptual-intentional system and very hard to get one to perceive and act in accordance with its perceptions. Meanwhile, the biological world is full of perceiving actors, while logical symbol processors are rare. In this paper, both systems are subordinated to “narrow syntax,” which is the minimum that must be specified by “the human genome … as the unique core of universal grammar” [p. 8]. Without that minimum, the authors claim, it would be impossible for a biological machine with finite computational power to “produce an infinity of sound-meaning pairs.”
The contrary position is that any universal grammar reflects the workings of the sensory-motor and conceptual-intentional systems themselves, and that the infinity of sound-meaning pairs comes from the boundless number of possible interactions with the environment. The paper, however, does not acknowledge such a school.
I got a chuckle a few pages later when the paper went on to note “the novel possibility” [10] that much of both human and animal thought rests on shared intellectual and perceptual powers. Turning to the bibliography, I noticed the absence of Christine Kenneally’s book of a few years back, The First Word, whose main thesis was precisely that “novel” possibility. In that case, the authors say, humanity’s linguistic competence would rest on a “unique capacity to interface syntactic structures with semantic and phonological representations” [10]. Although I instinctively recoil from this jargon, I don’t really disagree with the proposition, at least at the level of beginning speech. I would love to see these authors start thinking about language origins and development as a dynamic process.
So I was quite pleased when I got to the part of the paper where the authors reject the “traditional… false-to-fact idealization about instantaneous development as if information [about language] was available to the child at a single point in time” [17]. This refers to the long-standing linguistic argument that the development of language is irrelevant to the nature of language because, once you get to full language, it does not matter how you got there. It was the intellectual teammate of the idea that language evolved in a single big bang or great leap forward, and it guaranteed that the generative school would have little to contribute to this blog. However, the paper reports that:
Current practice and results turn this original notion on its head, rendering the concept of child’s biological development central rather than peripheral. [17]
As an example of why development matters, the authors refer to the observation that children in early speech often use verbs without tense markers, and they do so in a consistent way. For example, they will say ‘him go,’ using the pronoun’s accusative case and the verb’s infinitive form. This pattern of case agreement + tenseless verb might be explained by saying that producing both agreement and tense at the same time is too much for the young brain. Typically, as the brain matures, the ability to use tense and case together appears. However, some people with Specific Language Impairment continue to speak without using tenses even though their general cognitive abilities appear to be normal. This finding suggests that there is a developmental problem specifically affecting language.
Now here we have evidence of some kind of change in linguistic competence, quite possibly with a genetic component, that appears between the onset of speech and its full form. That’s exactly the kind of thing a dynamic, evolutionary view of language origins expects, and it gives me hope that in the future more such useful work will be coming from the linguists themselves.
Thanks for bringing this paper to my attention. It's good to see D'Arcy Thompson and Turing mentioned, but the absence of Christine Kenneally's book does seem strange.
I wonder if our understanding of language evolution would be much different (and if so, how), had the subject not been academically anathema for several decades.
Posted by: Stan | April 06, 2010 at 04:15 PM
Not only is Kenneally overlooked, but so are Tomasello, Bates, MacWhinney, Bybee, Nick Ellis, Kirby, and basically anyone else who has a good story for how general principles of learning and transmission might explain linguistic structure. The authors pretend that their rivals don't exist, yet pose the very questions (in the manner of a wish list) that these people address.
It’s notable that they use the terms “recursion” and “merge”, which grow out of and largely assume a Minimalist conception of syntax, and then charge psychology with explaining these phenomena. I find it more appropriate to use well-established psychological terms and then charge our own field (linguistics) with detailing their role in language acquisition and use.
Many important terms are notably absent. Probably the biggest example is “chunking”, which goes back to Miller (1956) and has been used extensively for half a century in research on motor learning, short- and long-term memory, and special kinds of expertise such as chess or playing quarterback, as well as in pedagogical applications. Other key terms are “schema”, “transfer (of learning)”, “analogy”, and “automatization” or “entrenchment”.
Where in this paper are the empirically established principles about how we know and learn, which would provide a bridge from brain evolution to linguistic explanation? There is a lot of hand-waving, but few details or citations. Section 5, which focuses on Lenneberg (1967), claims that “very young children ‘learn’ basic properties of their language much faster than unguided learning models could predict,” but does not see fit to mention the vast literature on the principles known to guide learning in diverse domains. The choice between unguided learning and a scare-quoted “learn” is obviously a false dichotomy.
The authors do not seem to care much about the cognitive psychology of learning, making huge leaps from evolutionary biology to the “language phenotype”, from hormones to language acquisition, from brain lateralization to the lexical/grammatical distinction, without the appropriate intermediary concepts. Reading the paper, I imagined a biologist who wonders how subatomic physics might produce an “élan vital” without giving much thought to the details of organic chemistry.
The absence of key terms from the psychology of learning seems to be one reflection of a strange orientation in generative linguistics toward psychology in general. To me, as a linguist and proponent of a “usage-based” model of grammar, what psychology reveals about such phenomena as human attention and memory, motor planning, sensory discrimination, STM and LTM encoding and retrieval (with the roles of frequency and recency), social transmission, copying fidelity, and transfer provides a framework of proven mechanisms in terms of which my job is to devise explanations of specific linguistic phenomena. The authors, by contrast, have found something amazing about language, which they characterize in quasi-psychological terms (“recursion”, “merge”, “computational constraints”), and are now setting psychology on a grand adventure to identify and characterize it. Thus, instead of constraining the higher field (linguistics) in terms of the foundational field (psychology), the authors are dictating what foundations need to be found.
Sometimes this kind of orientation works, as when the study of inheritance and natural selection charged biochemistry with finding the replicator molecule. But does linguistics have the explanatory power of Darwin and Mendel, to issue such a challenge? Generative grammarians seem to answer “yes”, but I have yet to see anything in language that would move me from a default “no”.
-----------------------------
BLOGGER: I think this is an excellent comment and want to add only that generative grammarians are not alone in paying attention only to their own field, although they are perhaps the most arrogant about it.
Posted by: J. Goard | April 07, 2010 at 02:18 AM