The physical meaning of h-bar
TL;DR: Planck's constant sets the minimum spread imposed by the third law of thermodynamics
The reduced Planck’s constant $\hbar$ immediately conjures quantum mechanics in the minds of most people who know a bit of physics. Therefore, when I show that classical mechanics has a similar uncertainty principle, which also depends on $\hbar$, they appear perplexed and wonder why $\hbar$ appears. The issue is that Planck’s constant is really a thermodynamic constant. The fact that it appears prominently in quantum mechanics is because quantum mechanics is there to fix a major problem in classical statistical mechanics: the prediction of ensembles with negative entropy, which violate the third law of thermodynamics. Once you understand this, you are on the right track to understand what Planck’s constant represents… and why quantum mechanics is necessary.
First, note that Planck introduced his constant in his effort to derive a formula that would correctly predict the thermal radiation distribution of a black body. That is a problem in thermodynamics/statistical mechanics. In the old quantum theory, it was speculated that the action could not take arbitrary values but rather only integer multiples of $h$, which was called the “quantum of action”. The idea was that trajectories in phase space existed but they were hidden. All of this did not pan out, and old quantum theory was abandoned in favor of the new quantum theory, the one we now just call quantum theory, which talks about the wavefunction and its evolution. The connection of $h$ and $\hbar$ to the action and to thermodynamics was less stressed, and the constant became the link between energy and frequency, and between momentum and wavelength.
Now, one of the results of our project is a physical and geometric interpretation of the action principle, which allows us to say exactly what the mathematical objects represent physically. We show that the action is the line integral of the vector potential of the flow of states. What matters here is that it is a potential, much like the potential of the magnetic field. Therefore, the value of the action itself is not physically meaningful, in the same way that the value of the potential of the magnetic field at a particular point in space is not physically meaningful. Therefore, talking about a “quantum of action” does not make much sense physically. However, $h$ can still be understood as an area in phase space. That has physical meaning because areas in phase space count possible configurations for a degree of freedom (DOF). Counting states is deeply connected to entropy, though the actual connection between entropy and statistical mechanics was developed a bit later, in the late 1950s. Moreover, the third law establishes that entropy cannot be less than zero.
So we have three things:
- Areas in phase space cannot be smaller than $h$
- Entropy is a function of areas in phase space
- Entropy cannot be smaller than zero
This is not some vague analogy: we can make this connection precise.
Given a probability distribution $\rho(q,p)$ over position and momentum, the entropy is given by $S = -k_B \int \rho \ln \rho \, dq \, dp$. If the distribution is uniform, the expression reduces to $S = k_B \ln A$, where $A$ is the area of the region that supports the uniform distribution. Now, if we do some dimensional checking, something that somehow theoretical physics has stopped doing, we note that $\ln A$ does not make sense: $A$ has dimensions of phase space area. We can only take logarithms of pure numbers. How do we fix this? Let us define $h$ as the area of phase space that corresponds to a uniform distribution with zero entropy. The above formula, then, is amended to $S = k_B \ln \frac{A}{h}$, a logarithm of a pure number, and if $A = h$, then the entropy is zero.
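For concreteness, here is the uniform case spelled out, folding $h$ into the general expression so that the argument of the logarithm is always a pure number (this particular way of writing the measure is one standard choice, not mandated by the argument above):

$$
S[\rho] = -k_B \int \rho \,\ln(\rho\, h)\, dq\, dp,
\qquad
\rho = \frac{1}{A} \;\text{on a region of area}\; A
\;\Longrightarrow\;
S = -k_B \int_A \frac{1}{A} \ln\frac{h}{A}\, dq\, dp = k_B \ln\frac{A}{h}.
$$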
If the entropy of a uniform distribution depends on the area, then it is clear that we cannot have such distributions over an area smaller than $h$. Note that we are not at all saying that areas must come in integer multiples of $h$. Simply, that there is a minimum. If you shrink the area in position, it will have to increase in momentum. Yet, this is not the smallest uncertainty we can have. If we keep the entropy constant, the distribution that minimizes the uncertainty is the Gaussian. For a Gaussian distribution at zero entropy, the product of the standard deviations becomes $\sigma_q \sigma_p = \frac{h}{2\pi e} = \frac{\hbar}{e}$, where $e$ is Euler’s number. This is the minimum uncertainty possible. For a generic distribution, then, we must have $\sigma_q \sigma_p \geq \frac{\hbar}{e}$.
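Explicitly, using the same conventions and the standard expression for the differential entropy of a Gaussian with independent position and momentum:

$$
S_{\text{Gaussian}} = k_B \ln\frac{2\pi e\,\sigma_q \sigma_p}{h},
\qquad
S_{\text{Gaussian}} = 0
\;\Longrightarrow\;
\sigma_q \sigma_p = \frac{h}{2\pi e} = \frac{\hbar}{e}.
$$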
Note that there is no quantum of action, no quantization of energy, just the third law and its lower bound on entropy. Note how we go from $h$ to $\hbar$ naturally, as the uncertainty for a Gaussian adds the factor of $2\pi$. Also, note that $\hbar/2$ is slightly bigger than $\hbar/e$, so the uncertainty principle has a slightly different lower bound in classical mechanics. It’s not quite right.
To look at this further, let’s plot the entropy of Gaussian states for both classical and quantum mechanics as a function of the uncertainty $\sigma_q \sigma_p$ in units of $\hbar$.
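A minimal sketch of such a plot (not the original figure) is given below. It assumes NumPy/Matplotlib, uses the classical Gaussian entropy $1 + \ln(\sigma_q\sigma_p/\hbar)$ derived above, and assumes the standard von Neumann entropy formula for a single-mode Gaussian state, which is not derived here.

```python
import numpy as np
import matplotlib.pyplot as plt

# Uncertainty x = sigma_q * sigma_p in units of hbar.
x = np.linspace(0.05, 3.0, 600)

# Classical (differential) entropy of a Gaussian over phase space, in units of k_B:
# S_cl = ln(2*pi*e*sigma_q*sigma_p / h) = 1 + ln(sigma_q*sigma_p / hbar).
S_classical = 1.0 + np.log(x)

def S_quantum(nu):
    """Von Neumann entropy (units of k_B) of a single-mode Gaussian state with
    symplectic eigenvalue nu = 2*sigma_q*sigma_p/hbar; undefined (NaN) for nu < 1."""
    nu = np.asarray(nu, dtype=float)
    out = np.full(nu.shape, np.nan)
    mixed = nu > 1.0
    a = (nu[mixed] + 1.0) / 2.0
    b = (nu[mixed] - 1.0) / 2.0
    out[mixed] = a * np.log(a) - b * np.log(b)
    out[np.isclose(nu, 1.0)] = 0.0  # pure state: zero entropy at sigma_q*sigma_p = hbar/2
    return out

S_q = S_quantum(2.0 * x)

plt.plot(x, S_classical, label="classical Gaussian")
plt.plot(x, S_q, label="quantum Gaussian")
plt.axhline(0.0, color="gray", linewidth=0.5)
plt.xlabel(r"$\sigma_q\,\sigma_p$ in units of $\hbar$")
plt.ylabel(r"entropy in units of $k_B$")
plt.legend()
plt.show()
```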
Note that for even a couple of units of uncertainty, the two entropy curves become basically indistinguishable. At zero entropy, the classical uncertainty is at $\hbar/e$ while the quantum one is at $\hbar/2$. This is a quantitative difference. Then quantum mechanics stops, as states with lower uncertainty do not exist, while classical mechanics keeps going, violating the third law. So, why is it that quantum mechanics has a higher lower bound for uncertainty? Note that the lowest uncertainty is obtained when the distributions for position and momentum are independent. This is the correction that new quantum theory makes with respect to the old quantum theory: in pure Gaussian states, position and momentum are not independent. The ability to talk about position and momentum at the same time is exactly the ability to fix one without fixing the other. We would need ensembles with negative entropy to be able to define their values with greater precision, which is not possible.
The reason why the world is not classical is that classical mechanics does not make sense thermodynamically. The third law of thermodynamics is the only thing we need to impose to derive the existence of $\hbar$ and an uncertainty principle in terms of $\hbar$. This is the road to actually understanding quantum mechanics.