Entropy, Probability, and Information
Information is now a measurable quantity, thanks to the second law of thermodynamics. But what do information and entropy have to do with probability?
Random motion is the most probable activity we would see if we could observe the motions of steam molecules, but it is not the activity an engineer would desire for greatest efficiency. Ideally, the molecules would all move in the same direction, bounce perpendicularly off the piston, give up all their energy, and then fall out of the bottom of the cylinder. If we could get steam molecules to do that, the engine would be 100 percent efficient: it would extract all the energy from the heat and turn it into useful work. Now, it is possible that all the molecules might just happen to be moving in such an orderly way, but it is exceedingly improbable. Improbable states of motion are precisely those that thermal action has not randomized and disordered.
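To make "very improbable" concrete, here is a toy calculation; the two-direction model is an assumption for illustration, not anything stated in the text. Suppose each molecule independently ends up moving either toward the piston or away from it, each with probability 1/2. The chance that all n molecules move toward the piston at once is then (1/2)^n, which shrinks astonishingly fast:

```python
from math import log10

def aligned_prob_exponent(n: int) -> float:
    """Return -log10 of the probability that all n molecules happen to
    move toward the piston, in a toy model where each molecule
    independently picks one of two directions with probability 1/2."""
    return n * log10(2)

# Even a speck of steam holds on the order of 10**19 molecules; already
# at n = 1000 the probability is below 1 in 10**301.
for n in (10, 100, 1000):
    print(f"n = {n:5d}: probability about 1 in 10^{aligned_prob_exponent(n):.0f}")
```

A real gas has vastly more molecules and far more than two possible directions per molecule, so this sketch wildly understates the improbability; the point is only that ordered motion becomes hopeless long before realistic molecule counts are reached.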