For Claude Shannon, the inventor of modern information theory, "ordinary language" (as a euphemistic measure of communicability) "contains greater than fifty percent redundancy in the form of sounds or letters that are not strictly necessary to convey a message" ("A Mathematical Theory of Communication," Bell System Technical Journal 27 (1948): 379-423, 623-56). Redundancy in this sense can be understood as a "predictable departure from the random." Part of the redundancy of ordinary language lies in its formal structure, by which missing terms can often be deduced from the semantic, syntactic, or grammatical structures that surround them. Other kinds of redundancy, however, lend themselves more directly to numerical or statistical measures, such as the frequency of particular letters, of combinations of letters, of common grammatical and syntactical features, and so on, in any given language. Hence: "a stream of data in ordinary language is less than random; each new bit is partly constrained by the bits that went before; thus each new bit carries less than a bit's worth of real information" ("A Mathematical Theory of Communication," 387). In this way, it is not the "message" that conveys information, but rather that which remains irreducible to, or unassimilable within, the message: "destabilising" elements such as noise or feedback. As David Ruelle has pointed out, "the message can be compressed if it is redundant, but the information is not compressible" (Chance and Chaos, 132).
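The two quantitative claims in play here, that ordinary language is a "predictable departure from the random" and that a redundant message can be compressed while the information it carries cannot, can be made concrete with a short calculation. The following sketch is illustrative only and comes from none of the sources cited above: the sample passage, the single-character entropy estimate, and the use of Python's zlib compressor are all assumptions introduced for the example. It compares the observed bits per character of an English passage with the maximum for a random stream, and then compares how far the English passage and an equally long run of random bytes can each be compressed.

```python
# Illustrative sketch (assumptions noted above): estimate per-character
# redundancy from single-character frequencies, then show that redundant
# text compresses while random bytes do not.
import math
import os
import zlib
from collections import Counter

def first_order_entropy(text: str) -> float:
    """Shannon entropy in bits per character, using single-character
    frequencies only; longer-range structure adds further redundancy."""
    counts = Counter(text)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def compression_ratio(data: bytes) -> float:
    """Compressed size divided by original size: lower means more redundancy
    was found and removed by the compressor."""
    return len(zlib.compress(data, 9)) / len(data)

# An arbitrary sample of ordinary written English (illustrative text).
sample = (
    "ordinary written english is far from random: the letter q is almost "
    "always followed by u, a th at the start of a word is very likely to "
    "begin the, this, that or there, and once most of a sentence has been "
    "read the remaining words can often be guessed before they arrive. "
    "because each new character is partly constrained by the characters "
    "that came before it, each new character carries less than its full "
    "share of information, and that surplus is what a compressor removes."
)

print("bits per character, observed     :", round(first_order_entropy(sample), 2))
print("bits per character, if random    :", round(math.log2(27), 2))  # ~26 letters plus space
print("compression ratio, english text  :", round(compression_ratio(sample.encode()), 3))
print("compression ratio, random bytes  :", round(compression_ratio(os.urandom(len(sample))), 3))
```

On a passage of this length the single-character estimate captures only part of the redundancy; Shannon's figure of more than fifty percent rests on longer-range constraints between words and phrases. A general-purpose compressor exploits some of those longer-range regularities as well, which is why it reduces the English text noticeably while leaving the random bytes essentially the same size, in keeping with Ruelle's remark that the message, not the information, is what can be compressed.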