‘S = k log W’ reads the epitaph on the grave of Ludwig Boltzmann, one of the greatest physicists and a pioneer of statistical mechanics. We learnt about the Boltzmann constant during our H2 Physics tuition classes, but there is much more to Boltzmann than just the constant. Let’s discuss one of the simplest experiments that can be explained using statistical mechanics. Add a drop of colored water to a beaker of clear water. The color spreads slowly, and after a while it is uniform across the entire volume of the beaker. It is simple to do, but why does the color spread? And how is it connected to our great physicist’s formula, ‘S = k log W’? This article tells the story of Boltzmann and his definition of entropy.

Entropy is a measure of the disorder in a system: the greater the disorder, the larger the entropy. But how does one measure disorder? Let’s imagine a person who randomly chooses a chair every 2 seconds and sits in it. Start with one person and one chair. There is only one way to sit, and so even after a long time the person is still in the same chair. Now add one more chair: the person has two chairs to choose from, and since there is a possibility of switching, the person might move around. As the number of chairs increases there are more possibilities for the person, and the system is more disordered compared with the one-person-one-chair case. So the more possibilities there are for a system to exist in, the greater the disorder, and hence the greater the entropy. In Boltzmann’s formula, ‘S’ denotes the entropy of the system, ‘W’ is the number of possibilities for the system to exist in, and ‘k’ is the Boltzmann constant.
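The chair example can be turned into a tiny calculation. The sketch below (our own illustration, not from Boltzmann) treats each chair as one possibility W and evaluates S = k log W, showing that one chair gives zero entropy and more chairs give more:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def entropy(num_possibilities: int) -> float:
    """Boltzmann's formula S = k log W, where W is the number of
    equally likely ways the system can exist."""
    return K_B * math.log(num_possibilities)

# One person, one chair: W = 1, so S = k log 1 = 0 -- no disorder at all.
# Each extra chair adds possibilities, and the entropy grows with log W.
for chairs in (1, 2, 10, 100):
    print(chairs, "chair(s):", entropy(chairs), "J/K")
```

Note that the entropy grows only logarithmically: doubling the number of chairs always adds the same fixed amount, k log 2, to S.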

Let’s come back to the example of the color spreading in water. When the color-causing molecules are concentrated at one point in the beaker there are very few possibilities, but when they are spread across the volume of the beaker they can occupy many more positions, and hence the disorder and entropy are high. The second law of thermodynamics states that the entropy of a closed system cannot decrease with time, and hence the color spreads across the beaker. Since entropy keeps increasing with time, one can order two different states of a closed system in time based on their entropy values; entropy thus gives us an arrow of time. For example, given two snapshots of the above experiment, one in which the color sits in one part of the beaker and another in which it is uniform, one can decide the order of the snapshots on the timeline from their entropy levels. Having said this, if disorder is what the universe evolves into, why have complex living forms arisen, in which atoms and molecules are highly ordered? Remember that it is the whole universe that must be thought of as the closed system: though entropy decreases locally in the ordering processes that form life, this decrease is more than compensated elsewhere, and the overall entropy of the universe keeps increasing.
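The spreading of the color can be imitated with a toy simulation. The sketch below (our own illustration, with arbitrary parameters such as 500 molecules and a box of length 100) starts all “dye” molecules at one wall, lets each take small random steps, and tracks a coarse-grained entropy over spatial bins; the entropy starts at zero and rises as the molecules spread:

```python
import math
import random

def spatial_entropy(positions, num_bins, box_size):
    """Coarse-grained disorder: -sum p_i log p_i over equal spatial bins.
    More bins occupied more evenly means more possibilities, higher entropy."""
    counts = [0] * num_bins
    for x in positions:
        i = min(int(x / box_size * num_bins), num_bins - 1)
        counts[i] += 1
    n = len(positions)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

random.seed(0)
box = 100.0
molecules = [0.0] * 500   # all dye molecules start at one wall of the beaker
history = []

for step in range(2001):
    # each molecule takes a small random step; the walls at 0 and box reflect it
    molecules = [min(max(x + random.uniform(-1.0, 1.0), 0.0), box)
                 for x in molecules]
    if step % 500 == 0:
        history.append(spatial_entropy(molecules, 10, box))
        print("step", step, "entropy", round(history[-1], 3))
```

Run it and the printed entropy only goes up, never down, which is exactly the one-way behaviour the second law describes; reversing time in this simulation (un-spreading the dye) is never observed.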