Selected Books by Edmund Blair Bolles

  • Galileo's Commandment: 2500 Years of Great Science Writing
  • The Ice Finders: How a Poet, a Professor, and a Politician Discovered the Ice Age
  • Einstein Defiant: Genius vs Genius in the Quantum Revolution


It seems as if you (or Deacon) are setting up a false duality. Can't a computational system have evolving parts?

There are several non-Chomskyan approaches to the language system that use hierarchy without being centralized. That's the very basis of Jackendoff's parallel architecture: several modular structures, each with its own hierarchy.

Beware that your invocation of Chomsky and especially of "computation" as a whole is fast becoming a straw man. While I agree that Chomsky's ideas on language evolution can't possibly be right, they aren't the only thing out there.

More importantly, his views certainly aren't the only ones regarding hierarchy. Hierarchy in syntax (and other linguistic domains) has been examined in psycholinguistic research over decades of experiments. This doesn't have to mean "minimalism is right" (especially since minimalists ignore psycholinguistics), but it does support the claim that hierarchy exists in language.

John Roth

On first reading, I had the impression I'd wandered into the Bizarro Universe by mistake. My second reading hasn't done much to correct this impression.

As a professional software developer (now retired) and a long-time amateur follower of evolutionary biology, I thought I knew what the limits of computation and evolution were, and I must say that I don't recognize either in what I read. What I'm seeing is, instead, the kind of gross oversimplification that sometimes appears in the popular press.

Evolution proceeds by introducing noise into the system and then pruning the results according to some optimization criterion (selection for fitness). There is a vast array of computational methods that do exactly the same thing, ranging from optimization theory to statistical modeling. The hallmark of these methods is that they quite deliberately don't produce the same results twice from the same inputs: the final result is obtained by averaging multiple runs, or by taking the best one, depending on what's wanted.
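The noise-then-prune loop described above can be sketched as a toy evolutionary search. This is my illustration, not anything from the post; the fitness function, population size, and mutation scale are all arbitrary choices:

```python
import random

# A minimal evolutionary-search sketch: introduce noise (mutation),
# then prune according to a fitness criterion (selection).
def evolve(fitness, seed, generations=200, pop_size=30, sigma=0.1):
    population = [seed] * pop_size
    for _ in range(generations):
        # Noise: each candidate spawns a randomly mutated offspring.
        offspring = [x + random.gauss(0, sigma) for x in population]
        # Pruning: keep only the fittest of parents plus offspring.
        population = sorted(population + offspring,
                            key=fitness, reverse=True)[:pop_size]
    return population[0]

# Toy fitness with a peak at x = 2. Different runs take different
# paths -- deliberately so, as the comment notes -- but converge
# on roughly the same answer.
best = evolve(lambda x: -(x - 2.0) ** 2, seed=0.0)
print(round(best, 2))  # close to 2.0
```

Note that the "optimization" here is the fitness function itself; whether that criterion is pre-defined or supplied by an environment is exactly the point the blogger raises below.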

The notion that computation ignores the environment is so bizarre that it's not even wrong. Computers run cars, airplanes, automated manufacturing systems, and many other things. None of these things would work at all if they ignored the environment. I see reports of robot movement where the robot learns how to move in different environments by trying various strategies and, if you'll pardon the term, learns from its experience.
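The robot example above, trying strategies and learning from experience, can be sketched as a simple epsilon-greedy learner that samples movement strategies and favors whichever the environment rewards. The strategies and reward numbers here are invented purely for illustration:

```python
import random

# Epsilon-greedy trial-and-error learning: mostly exploit the best-known
# strategy, occasionally explore a random one, and update estimates from
# the environment's feedback.
def learn(rewards, trials=1000, epsilon=0.1):
    estimates = [0.0] * len(rewards)
    counts = [0] * len(rewards)
    for _ in range(trials):
        if random.random() < epsilon:
            a = random.randrange(len(rewards))   # explore a random strategy
        else:
            a = estimates.index(max(estimates))  # exploit the best so far
        r = rewards[a]()                         # the environment's feedback
        counts[a] += 1
        # Running average of observed reward for this strategy.
        estimates[a] += (r - estimates[a]) / counts[a]
    return estimates.index(max(estimates))

# A toy "environment": strategy 2 ("walk") pays off best on average.
strategies = [lambda: random.gauss(0.2, 0.1),   # crawl
              lambda: random.gauss(0.5, 0.1),   # hop
              lambda: random.gauss(0.9, 0.1)]   # walk
print(learn(strategies))  # almost always 2, the highest-reward strategy
```

Nothing in this loop "ignores the environment": every update is driven by the reward the environment returns.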

I don't think that contrasting two grossly oversimplified concepts is going to get very far in determining how language works and how it evolved.

End of rant.
BLOGGER: I'd like to know more about the "pruning the results according to some optimization." If the optimization is pre-defined, then it's a computation. In evolutionary processes, the environment does the selecting.


Thank you for a very thought-provoking post. The existence of creativity in language (and in thought) reminds me of the American philosopher Charles Peirce and his notion of "abduction" as a mode of logic. Deduction and induction have their uses, but, one might argue, there is a certain logic to creativity. Abduction, according to Peirce, is the leap beyond what is known to what is possible. Some have suggested that guessing is a synonym for abduction. This does not presume to offer evidence for a computational model; on the contrary. However, perhaps there is a certain logic to creativity/evolution.
