Information and Order
Information
Boltzmann developed a way of measuring the probability of a state of motion, and related it to entropy. Much later, Shannon considered the probabilities of different messages. The probability of a message is related to the information it contains in the way Boltzmann described. Shannon used Boltzmann’s formula, but he prefixed it with a negative sign and used a different multiplying factor, obtaining a formula for calculating information. We use information to organize and order things. When information is destroyed, there is disorder. The same law, the second law of thermodynamics, applies both to information and to the entropy of heat processes.
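As a small illustrative sketch (the function name is ours, not Shannon's), the formula with the negative sign prefixed can be computed directly: the less probable a message, the more information it carries, and the average over all messages gives the entropy of the source.

```python
import math

def shannon_entropy_bits(probs):
    """Shannon's measure H = -sum(p * log2(p)), in bits.

    The negative sign makes the result positive, since log2(p)
    is negative for probabilities between 0 and 1.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip (two equally likely messages) carries one bit.
print(shannon_entropy_bits([0.5, 0.5]))   # 1.0

# A loaded coin is more predictable, so it carries less information.
print(shannon_entropy_bits([0.9, 0.1]))   # less than 1.0
```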
Measuring Order
Entropy is proportional to negative information. But entropy is itself a negative concept: it refers to disorder. To speak more clearly, we should say that entropy is the negative of information; that way we avoid a double negative.
Shannon had to use a multiplying factor that differs from the one Boltzmann used because we measure entropy and information in different units. The word “bit” we use as a unit of information only coincidentally means “a small fragment.” Originally “bit” was a contraction of “binary digit.” We may measure information in binary digits (bits) or in units mathematicians find “natural,” units called nits (today more often spelled “nats”). A nit equals log₂ e, or about 1.442695 bits. To obtain entropy in watt-seconds per kelvin, multiply the number of nits of information by the Boltzmann constant, 0.000 000 000 000 000 000 000 013 806 62 watt-seconds per kelvin.
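The two conversions described above can be sketched in a few lines (the function names are ours; the Boltzmann constant is the value quoted in the text):

```python
import math

# Boltzmann constant in watt-seconds per kelvin, as given in the text.
K_BOLTZMANN = 1.380662e-23

def nits_to_bits(nits):
    """One nit (natural unit) equals log2(e), about 1.442695 bits."""
    return nits * math.log2(math.e)

def nits_to_entropy(nits):
    """Entropy in watt-seconds per kelvin: nits times the Boltzmann constant."""
    return nits * K_BOLTZMANN

print(nits_to_bits(1.0))       # about 1.442695 bits
print(nits_to_entropy(1.0))    # about 1.380662e-23 watt-seconds per kelvin
```

The tiny size of the Boltzmann constant shows why thermodynamic entropies dwarf the information content of any message measured in bits.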