Thermodynamic Entropy; Boltzmann’s Constant; Frequency.
Physical properties are defined by equations. Clausius correctly defined thermodynamic entropy by a simple equation. Yet that equation does not explain what entropy is physically. Entropy, temperature, Boltzmann’s constant, and Planck’s constant will each be given physical explanations here.
Thermodynamic properties are properties such as pressure, temperature, and volume for which we make macroscopic measurements that relate to the internal energy of a system. These measurements must be made on a medium that is in equilibrium. Temperature is commonly explained as the property that indicates when two systems are in thermal equilibrium: when the systems touch with no barrier between them, no measurable exchange of heat occurs between them.
When external forces act on a system, or when the system exerts a force on its surroundings, the forces must act quasi-statically. This means the forces must vary so slowly that any thermodynamic imbalance is infinitesimally small; in other words, the system is always infinitesimally near a state of true equilibrium. If a property such as temperature changes, the change must occur so slowly that there is never more than an infinitesimal temperature variation between any two points within the system.
In the work that follows, all parts of a system are in states of equilibrium with one another. All changes that occur between systems, or between parts of systems, occur sufficiently slowly that each part of every system remains infinitesimally close to equilibrium.
Thermodynamic entropy is defined by an equation that establishes a direct relationship between the heat entering the engine at a constant temperature and the increase in entropy. The equation is:

$$\Delta S = \frac{\Delta Q}{T}$$
Where \(\Delta S\) is a change in entropy, \(\Delta Q\) is the corresponding change in heat, i.e., energy in transit into or out of a system, and \(T\) is the temperature of the system in degrees Kelvin. By convention, heat entering the gas is positive and heat leaving the gas is negative. The equation that defines thermodynamic entropy is based upon a Carnot engine. The engine operates in a four-part cycle.
a. There is a steady source of heat and a steady heat sink. The heat source is at temperature \(T_{high}\); the heat sink is at temperature \(T_{low}\). The Carnot engine operates cyclically between these two temperatures, absorbing heat from the source at \(T_{high}\) and rejecting heat to the sink at \(T_{low}\). For this example, the working substance is an ideal gas.
b. The cycle begins with the engine in unrestricted contact with the heat source at \(T_{high}\). The gas expands, pushing a piston against an outside resistance. This expansion occurs quasi-statically, i.e., so slowly that the temperature of the ideal gas remains constant at \(T_{high}\).
c. The engine is removed from contact with \(T_{high}\). The heated gas continues to expand adiabatically, i.e., no heat enters or leaves the engine, until its temperature falls, due to the expansion, to that of the heat sink, \(T_{low}\).
d. The engine is put in contact with the heat sink at \(T_{low}\). The gas is compressed while remaining at temperature \(T_{low}\). This is accomplished by quasi-statically rejecting heat into the heat sink, which remains at temperature \(T_{low}\).
e. The engine is separated from the heat sink while the returning piston continues to compress the ideal gas adiabatically, causing its temperature to rise to \(T_{high}\). This ends the four-part cycle, which then repeats. This type of engine is a reversible Carnot engine.
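The four strokes above can be checked numerically. The sketch below, a minimal illustration and not part of the original derivation, follows one mole of a monatomic ideal gas around the cycle; the reservoir temperatures and volume ratio are assumed values chosen only for the example.

```python
import math

R = 8.314          # universal gas constant, J/(mol*K)
gamma = 5.0 / 3.0  # heat-capacity ratio of a monatomic ideal gas

T_high, T_low = 500.0, 300.0  # assumed reservoir temperatures, degrees Kelvin
V1, V2 = 1.0, 2.0             # volumes before and after the isothermal expansion

# Stroke b: isothermal expansion at T_high absorbs Q_high = R*T_high*ln(V2/V1).
Q_high = R * T_high * math.log(V2 / V1)

# Strokes c and e: the adiabats obey T*V**(gamma-1) = constant, fixing V3 and V4.
V3 = V2 * (T_high / T_low) ** (1.0 / (gamma - 1.0))
V4 = V1 * (T_high / T_low) ** (1.0 / (gamma - 1.0))

# Stroke d: isothermal compression at T_low rejects Q_low = R*T_low*ln(V3/V4).
Q_low = R * T_low * math.log(V3 / V4)

# The heats absorbed and rejected satisfy Q_high/T_high = Q_low/T_low.
print(Q_high / T_high, Q_low / T_low)
```

Because \(V_3/V_4 = V_2/V_1\), both printed ratios reduce to \(R \ln 2\); this equality is the basis of the entropy definition and of the vanishing net entropy change over the full cycle.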
It is known for the reversible Carnot engine that:

$$\frac{Q_h}{T_h} = \frac{Q_l}{T_l}$$
The temperatures \(T_h\) and \(T_l\) may vary, but the equality of this relationship remains true. This relationship is the basis of the definition of thermodynamic entropy. The first equation defines thermodynamic entropy, but the second is used more often, because theoretical physics changed entropy into something different from what Clausius defined. The major difference between the two is that Clausius’s entropy is independent of temperature while the invented entropy is dependent upon temperature.
For Clausius’s entropy of a reversible Carnot engine, these equations are equivalent. There is no change in the rate of increase or decrease of entropy: entropy \(S\) increases at a constant rate, and heat \(Q\) increases at a constant rate. This condition is due to the temperature \(T\) remaining constant.
By convention, the entropy of the gas increases when the gas expands while in contact with \(T_{high}\) and decreases when the gas is compressed while in contact with \(T_{low}\). Therefore, the increase in entropy is given by:

$$\Delta S_{high} = \frac{Q_{high}}{T_{high}}$$
And the decrease in entropy is given by:

$$\Delta S_{low} = -\frac{Q_{low}}{T_{low}}$$
For a reversible Carnot engine, their sum is:

$$\Delta S_{high} + \Delta S_{low} = \frac{Q_{high}}{T_{high}} - \frac{Q_{low}}{T_{low}} = 0$$
There is no net change in entropy for the reversible Carnot engine, i.e., the cycle of the engine is brought
back to its initial condition with no change in entropy.
Clausius’s definition of entropy shows how thermodynamic entropy is calculated but does not make clear what entropy is. Neither Clausius nor today’s physicists can explain what thermodynamic entropy is. The units of entropy are joules per degree Kelvin.
It is temperature that masks the identity of entropy. Temperature is an indefinable property in theoretical physics; it is accepted as a fundamentally unique property along with distance, time, mass, and electric charge. If the physical action that is temperature were identified, then entropy would be explainable.
What is entropy? It is something whose nature should be easily seen, because its derivation is part of the operation of the simple, fundamental Carnot engine, and the answer can be found in that operation. The Carnot engine is, theoretically, the most efficient engine. Its efficiency is independent of the nature of the working medium, in this case a simple gas; efficiency depends only upon the values of the high and low temperatures in degrees Kelvin. Degrees Kelvin must be used because the Kelvin temperature scale is derived from the Carnot cycle.
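The claim that efficiency depends only on the two reservoir temperatures can be expressed in a few lines; the function name and the sample temperatures below are assumptions made purely for illustration.

```python
def carnot_efficiency(t_high, t_low):
    """Carnot efficiency from the reservoir temperatures in degrees Kelvin."""
    if not (t_high > t_low > 0):
        raise ValueError("require T_high > T_low > 0 in degrees Kelvin")
    return 1.0 - t_low / t_high

# The working medium never appears; only the two temperatures matter.
print(carnot_efficiency(500.0, 300.0))  # 1 - 300/500 = 0.4
print(carnot_efficiency(400.0, 300.0))  # 1 - 300/400 = 0.25
```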
The engine’s equation of efficiency and the definition of the Kelvin temperature scale are the basis for the derivation of the equation:

$$\frac{Q_h}{Q_l} = \frac{T_h}{T_l}$$
Something very important happens during this derivation that establishes a definite rate of operation of the Carnot cycle. The engine is defined as operating quasi-statically. The general requirement for this to be true is that the engine operate so slowly that the temperature of the working medium always measures the same at any point within the medium. This condition must be met for a system to be described as operating infinitesimally close to equilibrium.
There are several rates of operation that will satisfy this condition; however, there is one specific rate above which equilibrium is lost. Any slower rate will work. The question is: what is the rate of operation that separates equilibrium from disequilibrium? It is important to know this because this is the rate that becomes fixed into the derivation of the Carnot engine. That occurs because the engine is defined such that the ratio of its heat absorbed to its heat rejected equals the ratio of the temperatures of the high and low heat sources:

$$\frac{Q_h}{Q_l} = \frac{T_h}{T_l}$$
This special rate of operation could be identified if the physical meaning of temperature were made clear. In this new theory, temperature is identified and defined as the rate of exchange of energy between molecules. Temperature is not quantitatively the same as that rate, because temperature is assigned the units of degrees and its scale is arbitrarily fitted to the freezing and boiling points of water; the temperature difference between these points on the Kelvin scale is set at 100 degrees. For this reason, the quantitative measurement of temperature is not the same as the quantitative measurement of the exchange of energy between molecules. However, this discrepancy can be moderated with the introduction of a constant of proportionality:
$$dQ = k_T T \, dt$$
Dividing by the temperature gives \(dS = dQ/T = k_T \, dt\). Both \(dS\) and \(dt\) are variables. It is necessary to determine a value for the constant \(k_T\). This value may be contained in the ideal gas law:

$$E = \frac{3}{2}nkT$$
Where \(k\) is Boltzmann’s constant. If I let \(n = 1\), then the equation gives the kinetic energy of a single molecule. In this case \(E\) becomes \(\Delta E\), an incremental value of energy. Substituting:
This suggests that for an ideal gas molecule:

$$\Delta S = \frac{\Delta E}{T} = \frac{3}{2}k$$
In other words, the entropy of a single ideal gas molecule is constant. The condition under which this is true is when the gas molecules act like billiard balls and their pressure is very close to zero. Near-zero pressure at any practical temperature requires that the gas molecules be low in number and widely dispersed.
I interpret this to mean that, under these conditions, the thermodynamic measurements of temperature and kinetic energy approach single-molecule status. Normally, thermodynamic properties do not apply to small numbers of molecules. However, it is sometimes instructive to establish a link between individual molecules and thermodynamic properties, as is done in the development of the kinetic theory of gases. The case at hand is an inherent part of the kinetic theory of gases. The ideal gas law written for a single gas molecule gives reason to consider that for a single molecule:
Substituting for Boltzmann’s constant:
I have defined Entropy as:
Therefore, I write:
If I could establish a value for Δt, then I could calculate kT. Since this calculation is assumed to apply to a single gas molecule and is a constant value, I assume that in this special case, t is a fundamental increment of time. In this theory, there is one fundamental increment of time. It is:
Substituting this value and solving for kT:
Substituting the units for each quantity as determined by this new theory:
The value \(k_T\) is a unit-free constant of proportionality. It also follows that Boltzmann’s constant is defined as:
For the ideal gas equation, the entropy of each molecule is a constant:
However, thermodynamic entropy is defined as an aggregate macroscopic function. I have a value for the constant \(k_T\), but the increment of time in the macroscopic function is not a constant; it is a variable, because a great number of molecules are involved and their interactions overlap and add together. I expand the meaning of entropy into its more general form and substitute \(k_T\) into the general thermodynamic definition of entropy:
The Δt in this equation is not the same as the Δtc in the equation for a single molecule. In the macroscopic version, it is the time required for a quantity of energy, in the form of heat, to be transferred at the rate represented by the temperature in degrees Kelvin. Substituting this equation for entropy into the general energy equation:
Recognizing that the increment of energy represents an increment of heat entering or leaving the engine, and solving for ΔS:
Solving for Δt:
This function of Δt is what would have become defined as the function of entropy if temperature had been defined directly as the rate of transfer of energy between molecules. The arbitrary definition of temperature made it necessary for the definition of entropy to include the proportionality constant kT. Writing an equation to show this:
In particular:
For a Carnot engine:
Therefore:
The net change in entropy is:
In this theory, Boltzmann’s constant has acquired the definition:
There is a relationship between Planck’s constant \( h \) and Boltzmann’s constant \( k \):
Substituting for \( k \) and rearranging terms:
Using the equation:
Since:
Substituting:
Since:
Substituting:
Rearranging:
Making the same change for general cases:
Defining an analogy to entropy for frequency:
Substituting:
Now I give a detailed general definition for Planck's constant:
The potential energy of the hydrogen electron in its first energy level is:
Where:
Substituting for the speed of light:
The denominator on the right side is the period of the frequency. Therefore:
Also, the potential energy for a circular orbit can be expressed as:
Therefore:
Solving for Planck’s constant:
This result defines Planck’s constant in terms of properties of the hydrogen atom.
I have defined temperature as:
I have also derived:
Solving for \(k_T\):
Since:
Then:
I have established a relationship between Planck’s constant and Boltzmann’s constant in the form of:
Substituting for Planck’s constant:
Substituting for \(k_T\):
Or, in terms of momentum:
Where:
Defining Frequency
Substituting for \(k_B\):
Also:
And, from an earlier result:
Yielding:
The first set of parentheses contains the potential energy of the hydrogen electron in its first energy level. The second set is the period of time required for the electron to travel one radian. The third set is the angular velocity of the electron in units of radians per second.
This theory’s definition of Planck’s constant first changes frequency into radians per second. Then it converts radians per second, for the subject frequency, into a measure of the number of radians traveled during the period of time required for the hydrogen electron to travel one radian. Finally, the result of the first two steps is multiplied by the potential energy of the hydrogen electron in its first energy level. In other words, Planck’s constant uses fundamental properties of the hydrogen atom as the standard by which to convert frequencies into quantities of energy.
This theory introduces the idea that a consequence of defining thermodynamic entropy using an ideal gas is that, as the pressure approaches zero, the exchanges of energy between molecules theoretically reduce to single exchanges. A point is reached where exchanges occur at a rate that can be modeled as one at a time without delay between them. That is an equilibrium point where the temperature is close to a constant value. Clausius’s thermodynamic entropy applies to that low pressure, where the exchanges that occur can be ideally represented as each molecule taking its turn, without delay, to pass on average molecular kinetic energy. This process can be modeled by considering all the gas molecules lined up in single file: the average molecular kinetic energy of one of them is transferred down the line from molecule to molecule until the energy has reached the last molecule. The time required to complete this process is ‘internal’ thermodynamic entropy.
Temperature is proportional to the rate of transfer of average molecular kinetic energy between molecules.
The modified temperature is the rate at which energy is transferred between molecules. The numerator of
the modified temperature is average molecular kinetic energy. The average kinetic energy of an ideal gas
depends upon temperature only. It was shown that the average kinetic energy divided by modified
temperature equals:
In the equation below, Boltzmann’s constant is defined as the first equal term and by thermodynamics as the second equal term:
N is Avogadro’s number, the number of molecules in a mole of gas, and R is the universal gas constant. Solving for R:

$$R = Nk$$
Substituting the appropriate values:
For one mole of gas:
The universal gas constant R is directly proportional to the total time required for a mole of ideal gas to transfer average molecular kinetic energy from molecule to molecule without delay between exchanges until the number of molecules in a mole of gas is reached.
Boltzmann’s constant is the time represented by the universal gas constant R reduced to single-molecule status:

$$k = \frac{R}{N}$$
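The standard numerical relationship behind the last two statements can be checked directly; the figures below are the accepted SI values, not numbers taken from this text.

```python
R = 8.314462618    # universal gas constant, J/(mol*K)
N = 6.02214076e23  # Avogadro's number, molecules per mole

# Boltzmann's constant is the gas constant reduced to single-molecule status.
k = R / N
print(k)  # ~1.380649e-23 J/K
```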
Strictly speaking, the units of degrees should have been included in the two equations above. I took the liberty of omitting them for the sake of readability.
The number of possible arrangements for a mole of ideal gas is infinite. Boltzmann’s entropy requires there to be a limited number of possible arrangements.
Boltzmann’s entropy is defined as:

$$S = k \ln W$$

where \(W\) is the number of possible microstates.
Therefore, Boltzmann’s entropy is proportional to the time of a single transfer of ideal gas molecule energy times the logarithm of the number of microstates. Boltzmann’s entropy is not an expression of simulated internal thermodynamic entropy. Boltzmann’s entropy is no longer a direct measure of time. The units of seconds carried along by Boltzmann’s constant have become irrelevant. Boltzmann’s constant can be set to unity without units. Its connection to thermodynamic entropy is already lost.
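For contrast with the time interpretation argued here, Boltzmann’s \(S = k \ln W\) can be evaluated for a toy system with a countable number of microstates; the four-particle example below is an assumption made purely for illustration.

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(microstates):
    """S = k ln W for a system with W equally likely microstates."""
    if microstates < 1:
        raise ValueError("W must be a positive count of microstates")
    return k * math.log(microstates)

# Four two-state particles have W = 2**4 = 16 possible arrangements.
print(boltzmann_entropy(16))
# A single microstate (W = 1) gives zero entropy.
print(boltzmann_entropy(1))
```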
Clausius’s thermodynamic entropy is the time it takes for heat Q1 to be absorbed into an ideal gas at the rate of temperature Thigh or for Q2 to be released out of an ideal gas at the rate of temperature Tlow. The time it takes for a single ideal gas molecule to pass its kinetic energy off to another ideal gas molecule at a distance equal to the radius of the hydrogen atom is the unit of absolute time derived in the article A Unit of Absolute Universal Time.