Boltzmann developed a way of measuring the probability of a state of motion and related it to entropy: the more probable the state, the higher its entropy. Much later, Shannon considered the probabilities of different messages and found that a message's probability is related to the information it carries in an analogous way: the less probable the message, the more information it conveys. Shannon used the same logarithmic formula as Boltzmann, but he prefixed it with a negative sign and used a different multiplying factor, yielding a formula for calculating information. We use information to organize and order things; when information is destroyed, disorder results. The same law, the second law of thermodynamics, governs both information and the entropy of heat processes.
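As a small illustrative sketch (not from the text above), Shannon's formula measured in bits can be written as I(p) = -log2(p) for a single message of probability p, with the average over a whole distribution of messages giving the entropy H = -sum p*log2(p). The function names below are my own choices for illustration:

```python
import math

def shannon_information(p):
    """Information, in bits, carried by one message of probability p.

    The minus sign makes the result positive, since log2(p) is
    negative for probabilities below 1: rarer messages carry more bits.
    """
    return -math.log2(p)

def shannon_entropy(probs):
    """Average information of a message source: H = -sum(p * log2(p))."""
    return sum(p * shannon_information(p) for p in probs if p > 0)

# A fair coin flip has two equally likely messages, each of probability 1/2.
print(shannon_information(0.5))     # 1.0 bit for either outcome
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit on average
```

Note how the improbable message is the informative one: a message with probability 1/8 carries 3 bits, while a near-certain message carries almost none.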