There's an interesting paper titled "The Latent Structure of Dictionaries" floating around the Internet. Written by a Canadian-led team, it forces clearer thinking about words.
Dictionaries rest on a well-known paradox: they use words to define words. So I might look up the word justice and read "the quality of being just; fairness." OK. So I look up fairness and find "free from favoritism, self-interest, or preference in judgment." Oh, boy. I could look up all those words too, but a black hole opens before me. The task stretches out to infinity.
Thanks to the computer, however, the endless task can be accomplished. There are, after all, a finite number of words in a dictionary. Call the set of words defined in a dictionary D. Not all of these words are used in defining other words. For example, the dictionary defines the word cockroach but does not (I'm guessing, here) use that word anywhere in its vast text of definitions. Call the set of these unused words C and remove all of them from D. That process leaves us with a shorter list, call it D1. (D1 = D − C)
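The subtraction above is easy to sketch in code. Here is a minimal illustration using a tiny made-up dictionary (the entries and the word-splitting rule are my own assumptions, not the paper's data): we collect every word that appears in some definition, then subtract to find C and D1.

```python
import re

# Hypothetical toy dictionary: each defined word maps to its definition text.
dictionary = {
    "justice": "the quality of being just; fairness",
    "fairness": "free from favoritism in judgment",
    "just": "acting with fairness",
    "favoritism": "unfair preference in judgment",
    "judgment": "the act of judging",
    "cockroach": "a flat-bodied insect",  # defined, but never used in a definition
}

def words_used_in_definitions(d):
    """Collect every word that appears in any definition text."""
    used = set()
    for definition in d.values():
        used.update(re.findall(r"[a-z]+", definition.lower()))
    return used

D = set(dictionary)                       # all defined words
C = D - words_used_in_definitions(dictionary)  # defined but never used
D1 = D - C                                # the shorter list: D1 = D - C

print(sorted(C))   # → ['cockroach', 'justice']
```

On this toy data, justice and cockroach are defined but never appear inside any definition, so they land in C; the remaining four words form D1. A real dictionary would need morphological handling (judging vs. judgment), which this sketch ignores.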