Claude Shannon outlined the principles of Information Theory in 1948, and the applications of his seminal work surround us still. According to Autonomy's astounding publication, Meaning Based Computing (2009) at 23-24, “Shannon stated that information could be treated as a quantifiable value in communication.” Information Theory provides, for example, a framework for extracting useful concepts from the distracting noise of a large amount of text. As the same publication puts it, “Autonomy’s approach to concept modeling relies on Shannon’s theory that the less frequently a unit of communication occurs, the more information it conveys.”
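To see the quantification Shannon had in mind, consider a rough sketch of my own (not Autonomy's actual method): score each word in a passage by its self-information, -log2 of its relative frequency, so that the rarer a word is, the more bits it carries.

```python
import math
from collections import Counter

def self_information(text):
    """Score each word by -log2(relative frequency): rare words score highest."""
    words = text.lower().split()
    counts = Counter(words)
    total = len(words)
    return {w: -math.log2(c / total) for w, c in counts.items()}

scores = self_information(
    "the contract the party the party signed contains an indemnification clause"
)
# Print words from most informative (rarest) to least informative.
for word, bits in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{word}: {bits:.2f} bits")
```

Run on that sample sentence, a common word like “the” scores under two bits while a rare word like “indemnification” scores nearly twice that, which is Shannon's point in miniature.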
When I compile metaposts, the ones built around unusual words are the easiest to do because when I find those words, they shout themselves out as candidates for compilation. My hours of labor over definitions, concepts, and metaposts are efforts to apply Information Theory by hand: I quantify the presence of a term and thereby judge its importance to general counsel. In a like way, concept maps quantify the informational relationships between ideas (See my posts of Jan. 8, 2009 on concept mapping; and April 27, 2005 on knowledge maps).
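That hand labor could be roughed out in code. Here is a hypothetical sketch, with a made-up frequency threshold and sample posts, of how one might flag the unusual words across a set of posts as metapost candidates:

```python
from collections import Counter

def rare_terms(posts, max_count=1):
    """Flag terms appearing at most max_count times across all posts --
    the unusual words that shout themselves out as candidates."""
    counts = Counter(word.lower() for post in posts for word in post.split())
    return sorted(w for w, c in counts.items() if c <= max_count)

posts = [
    "outside counsel fees rose again",
    "benchmarking outside counsel spend",
    "a metapost on estoppel and fees",
]
print(rare_terms(posts))  # words appearing once, e.g. 'estoppel' and 'benchmarking'
```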
Information Theory extends to legal departments. When an unusual statute shows up in a matter, it conveys more information to a lawyer than a common statute does. When litigation-support software sifts through terabytes of documents, the uncommon names of people stand out. And so on (See my post of May 23, 2008 #4: “the first law of information theory tells us that every relay doubles the noise and cuts the message in half.”).