Independent Researcher


Keywords

Thermodynamic Entropy; Boltzmann’s Constant; Frequency.


Introduction

Physical properties are defined by equations. Clausius defined thermodynamic entropy with a simple equation, yet that equation does not explain what entropy is physically. In this article, entropy, temperature, Boltzmann’s constant, and Planck’s constant will each be given physical explanations.

Thermodynamic properties, such as pressure, temperature, and volume, are macroscopic measurements that pertain to a system’s internal energy. These properties must be measured on a medium that is in equilibrium. Temperature is commonly explained as the property that indicates when two systems are in thermal equilibrium: when the systems touch with no barrier between them, no measurable exchange of heat occurs.

When external forces act on a system, or when the system exerts a force that acts on its surroundings, then the forces must act quasi-statically. This means forces must vary so slowly that any thermodynamic imbalance is infinitesimally small. In other words, the system is always infinitesimally near a state of true equilibrium. If a property such as temperature changes, it must occur so slowly that there is no more than an infinitesimal temperature variation between any two points within the system.

In the work that follows, all parts of a system are in states of equilibrium with one another. All changes that occur between systems, or between parts of systems, occur sufficiently slowly that each part of every system remains infinitesimally close to equilibrium.

Definition of Thermodynamic Entropy

Thermodynamic Entropy is defined by an equation that establishes a direct relationship between the heat entering the engine at a constant temperature and an increase in entropy. The equation is:


$$ \Delta S = \frac{\Delta Q}{T} $$

Where ΔS is a change in entropy, ΔQ is the corresponding change in heat, i.e., energy in transit into or out of a system, and T is the temperature of the system in degrees Kelvin. By convention, heat entering the gas is positive and heat leaving the gas is negative. The equation that defines thermodynamic entropy is based upon a Carnot engine. The engine operates in a four-part cycle.

a. There is a steady source of heat and a steady heat sink. The heat source is at temperature \(T_{high}\); the heat sink is at temperature \(T_{low}\). The Carnot engine operates cyclically between these two temperatures, absorbing heat from the source at \(T_{high}\) and rejecting heat to the sink at \(T_{low}\). For this example, the working substance is an ideal gas.

b. The cycle begins with the engine in unrestricted contact with the heat source at \(T_{high}\). The gas expands, pushing a piston against an outside resistance. This action occurs quasi-statically, i.e., so slowly that the temperature of the ideal gas remains constant at \(T_{high}\).

c. The engine is removed from contact with \(T_{high}\). The heated gas continues to expand adiabatically, i.e., no heat enters or leaves the engine, until its temperature falls, due to expansion, to that of the heat sink \(T_{low}\).

d. The engine is put in contact with the heat sink at \(T_{low}\). The gas is compressed while remaining at temperature \(T_{low}\). This is accomplished by quasi-statically rejecting heat to the heat sink, which remains at temperature \(T_{low}\).

e. The engine is separated from the heat sink while the returning piston adiabatically continues to compress the ideal gas, causing its temperature to rise to \(T_{high}\). This ends the four-part cycle, which then repeats. This type of engine is a reversible Carnot engine.

It is known for the reversible Carnot engine that:


$$ \frac{\Delta Q_h}{T_h} = \frac{\Delta Q_l}{T_l} $$

The temperatures \(T_h\) and \(T_l\) may vary, but the equality of this relationship remains true. This relationship is the basis of the definition of thermodynamic entropy. The equation defining thermodynamic entropy is the first equation, but the second equation is used more often because theoretical physics changed entropy into something different from what Clausius defined. The major difference between the two is that Clausius’s entropy is independent of temperature and the invented entropy is dependent upon temperature.


$$ S = \frac{Q}{T} \quad \text{or} \quad \Delta S = \frac{\Delta Q}{T} $$

For Clausius’ entropy of a reversible Carnot engine, these equations are equivalent. There is no change in the rate of increase or decrease of entropy. Entropy \(S\) increases at a constant rate. Heat \(Q\) increases at a constant rate. This condition is due to the temperature \(T\) remaining constant.
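As a minimal numeric illustration of Clausius’s definition, the sketch below applies \(\Delta S = \Delta Q / T\) at constant temperature. The heat and temperature values are illustrative only, not taken from the text.

```python
# Sketch: Clausius's definition, Delta_S = Delta_Q / T, applied at constant
# temperature. By convention, heat entering the gas is positive and heat
# leaving is negative. The numeric values are illustrative only.

def delta_S(delta_Q, T):
    """Entropy change (J/K) for heat delta_Q (J) exchanged at temperature T (K)."""
    return delta_Q / T

print(delta_S(500.0, 400.0))   # 1.25 J/K: entropy increases as heat enters
print(delta_S(-500.0, 250.0))  # -2.0 J/K: entropy decreases as heat leaves
```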

By convention, the entropy of the gas will increase when expanding while in contact with \(T_{high}\) and will decrease when compressing while in contact with \(T_{low}\). Therefore, the increase in entropy is given by:


$$ \Delta S_i = \frac{\Delta Q_h}{T_h} $$

And the decrease in entropy is given by:

$$ -\Delta S_d = \frac{\Delta Q_l}{T_l} $$

For a reversible Carnot engine, their sum is:

$$ \Delta S_i - \Delta S_d = 0 $$

There is no net change in entropy for the reversible Carnot engine, i.e., the cycle of the engine is brought back to its initial condition with no change in entropy.
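This bookkeeping can be checked numerically for an ideal-gas Carnot cycle. The sketch below uses illustrative reservoir temperatures and volumes (not values from the text), the standard gas constant R = 8.314 J/(mol K), and the standard isothermal and adiabatic ideal-gas relations.

```python
import math

# Sketch: verify for one mole of monatomic ideal gas that a reversible Carnot
# cycle satisfies Q_h / T_h = Q_l / T_l, so the net entropy change over a full
# cycle is zero. All state values below are illustrative.
R = 8.314          # J/(mol K), universal gas constant
gamma = 5.0 / 3.0  # monatomic ideal gas
T_h, T_l = 500.0, 300.0   # reservoir temperatures (K)
V1, V2 = 1.0e-3, 3.0e-3   # volumes bounding the hot isothermal expansion (m^3)

# The adiabatic legs obey T * V**(gamma-1) = const, fixing the cold-leg volumes:
r = (T_h / T_l) ** (1.0 / (gamma - 1.0))
V3, V4 = V2 * r, V1 * r

Q_h = R * T_h * math.log(V2 / V1)   # heat absorbed at T_h (isothermal leg)
Q_l = R * T_l * math.log(V3 / V4)   # heat rejected at T_l (isothermal leg)

dS_net = Q_h / T_h - Q_l / T_l
print(Q_h / T_h, Q_l / T_l, dS_net)  # the two ratios agree; dS_net is ~0
```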

Clausius’s definition of entropy shows how thermodynamic entropy is calculated but does not make clear what entropy is. Neither Clausius nor today’s physicists have explained what thermodynamic entropy is physically. The units of entropy are joules per degree Kelvin.

It is temperature that masks the identity of entropy. Temperature is an indefinable property in theoretical physics. It is accepted as a fundamentally unique property along with distance, time, mass, and electric charge. If the physical action that is temperature were identified, then entropy would be explainable.

What is entropy? It is something whose nature should be easily seen because its derivation is part of the operation of the simple, fundamental Carnot engine. The answer can be found in the operation of the Carnot engine. The Carnot engine is the most efficient engine, theoretically speaking. Its efficiency is independent of the nature of the working medium, in this case, a simple gas. Efficiency depends only upon the values of the high and low temperatures in degrees Kelvin. Degrees Kelvin must be used because the Kelvin temperature scale is derived based upon the Carnot cycle.

The engine’s equation of efficiency and the definition of the Kelvin temperature scale are the basis for the derivation of the equation:


$$ \frac{Q_h}{Q_l} = \frac{T_h}{T_l} $$

Something very important happens during this derivation that establishes a definite rate of operation of the Carnot cycle. The engine is defined as operating quasi-statically. The general requirement for this to be true is that the engine should operate so slowly that the temperature of the working medium should always measure the same at any point within the medium. This is a condition that must be met for a system to be described as operating infinitesimally close to equilibrium.
There are several rates of operation that will satisfy this condition; however, there is one specific rate above which equilibrium is lost. Any slower rate will work. The question is: What is the rate of operation that separates equilibrium from disequilibrium? It is important to know this because this is the rate that becomes fixed into the derivation of the Carnot engine. This occurs because the engine is defined such that the ratio of its heat absorbed to its heat rejected equals the ratio of the temperatures of the high and low heat sources:

$$ \frac{Q_h}{Q_l} = \frac{T_h}{T_l} $$

This special rate of operation could be identified if the physical meaning of temperature was made clear. In this new theory, temperature is identified and defined as the rate of exchange of energy between molecules. Temperature is not quantitatively the same as that rate because temperature is assigned the units of degrees and its scale is arbitrarily fitted to the freezing and boiling points of water. The temperature difference between these points on the Kelvin scale is set at 100 degrees. For this reason, the quantitative measurement of temperature is not the same as the quantitative measurement of exchange of energy between molecules. However, this discrepancy can be moderated with the introduction of a constant of proportionality:

$$ \frac{dQ}{dt} = k_T T $$

$$ dQ = k_T T \, dt $$

This equation indicates that the differential of entropy is:
$$ dS = \frac{dQ}{T} = k_T \, dt $$

Both dS and dt are variables. It is necessary to determine a value for the constant \(k_T\). This value may be contained in the ideal gas law:

$$ E = n \frac{3}{2} k T $$

Where k is Boltzmann’s constant. If I let n = 1, the equation gives the kinetic energy of a single molecule. In this case E becomes ΔE, an incremental value of energy. Substituting:

$$ \Delta E = \frac{3}{2} k T $$

This suggests that for an ideal gas molecule:

$$ \Delta S = \frac{\Delta E}{T} = \frac{3}{2} k $$

In other words, the entropy of a single ideal gas molecule is constant. The condition under which this is true is when the gas molecules act like billiard balls and their pressure is very close to zero. Near zero pressure for any practical temperature requires that the gas molecules be low in number and widely dispersed.

I interpret this to mean, under these conditions, that the thermodynamic measurement of temperature and kinetic energy approach single molecule status. Normally, thermodynamic properties do not apply to small numbers of molecules. However, sometimes it is instructive to establish a link between individual molecules and thermodynamic properties, as is done in the development of the kinetic theory of gases. The case at hand is an inherent part of the kinetic theory of gases. The ideal gas law written for a single gas molecule gives reason to consider that for a single molecule:

$$ \Delta S = \frac{3}{2} k $$

Substituting for Boltzmann’s constant:

$$ \Delta S = \frac{3}{2} \left(1.38 \times 10^{-23} \ \frac{J}{K}\right) = 2.07 \times 10^{-23} \ \frac{J}{K} $$

I have defined entropy as:

$$ \Delta S = k_T \Delta t $$

Therefore, I write:

$$ k_T \Delta t = 2.07 \times 10^{-23} \ \frac{J}{K} $$

If I could establish a value for Δt, then I could calculate \(k_T\). Since this calculation is assumed to apply to a single gas molecule and yields a constant value, I assume that in this special case Δt is a fundamental increment of time. In this theory, there is one fundamental increment of time. It is:

$$ \Delta t_c = 1.602 \times 10^{-19} \ s $$

Substituting this value and solving for \(k_T\):

$$ k_T = \frac{2.07 \times 10^{-23} \ \frac{J}{K}}{1.602 \times 10^{-19} \ s} = 1.292 \times 10^{-4} \ \frac{J}{K \cdot s} $$

Substituting the units for each quantity as determined by this new theory:

$$ kT = 1.292 \times 10^{-4} $$

The value \(k_T\) is a unit-free constant of proportionality. It also follows that Boltzmann’s constant is defined as:

$$ k = \frac{2}{3} k_T \Delta t_c $$
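The arithmetic of the last few steps can be reproduced directly, using the values given in the text:

```python
# Reproduce the text's arithmetic: the entropy of one ideal-gas molecule,
# Delta_S = (3/2) k, divided by the fundamental time increment Delta_t_c,
# gives the proportionality constant k_T.
k = 1.38e-23        # Boltzmann's constant, J/K
dt_c = 1.602e-19    # the theory's fundamental increment of time, s

dS = 1.5 * k        # entropy of a single molecule, 2.07e-23 J/K
k_T = dS / dt_c     # about 1.292e-4
print(dS, k_T)
```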

For the ideal gas equation, the entropy of each molecule is a constant:

$$ \Delta S = k_T \Delta t_c $$

However, thermodynamic entropy is defined as an aggregate macroscopic function. I have a value for the constant \(k_T\), but the increment of time in the macroscopic function is not a constant: a great number of molecules are involved, and their interactions overlap and add together, so the increment of time is a variable. I expand the meaning of entropy into its more general form and substitute \(k_T\) into the general thermodynamic definition of entropy:

$$ \Delta S = k_T \Delta t $$

The Δt in this equation is not the same as the \(\Delta t_c\) in the equation for a single molecule. In the macroscopic version, it is the time required for a quantity of energy, in the form of heat, to be transferred at the rate represented by the temperature in degrees Kelvin. Substituting this equation for entropy into the general energy equation:

$$ \Delta E = \Delta S \, T = k_T T \Delta t $$

Recognizing that the increment of energy represents an increment of heat entering or leaving the engine, and solving for ΔS:

$$ \Delta S = \frac{\Delta E}{T} = \frac{\Delta Q}{T} = k_T \Delta t $$

Solving for Δt:

$$ \Delta t = \frac{\Delta S}{k_T} = \frac{\Delta Q}{k_T T} $$

This function of Δt is what would have become the definition of entropy if temperature had been defined directly as the rate of transfer of energy between molecules. The arbitrary definition of temperature made it necessary for the definition of entropy to include the proportionality constant \(k_T\). Writing an equation to show this:

\( \frac{\Delta Q}{k_T T} = \frac{\Delta Q}{\Delta Q / \Delta t} = \Delta t \)

In particular:

\( \frac{\Delta Q_h}{k_T T_h} = \frac{\Delta Q_h}{\Delta Q_h / \Delta t_h} = \Delta t_h \)

For a Carnot engine:

\( \frac{\Delta Q_h}{k_T T_h} = \frac{\Delta Q_l}{k_T T_l} \)

Therefore:

\( \frac{\Delta Q_h}{\Delta Q_h / \Delta t_h} = \frac{\Delta Q_l}{\Delta Q_l / \Delta t_l} \)
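A short numeric sketch of this reading of entropy as time: for a reversible engine, \( \Delta t = \Delta Q / (k_T T) \) comes out the same at both reservoirs, because \( \Delta Q_h / T_h = \Delta Q_l / T_l \). The heat and temperature values below are illustrative only.

```python
# Sketch: for a reversible Carnot engine, Q_h / T_h = Q_l / T_l, so the
# "entropy time" Delta_t = Delta_Q / (k_T * T) is equal at both reservoirs.
k_T = 1.292e-4            # proportionality constant from the text
T_h, T_l = 500.0, 300.0   # illustrative reservoir temperatures (K)
Q_h = 1000.0              # illustrative heat absorbed at T_h (J)
Q_l = Q_h * T_l / T_h     # heat rejected at T_l, fixed by reversibility

dt_h = Q_h / (k_T * T_h)
dt_l = Q_l / (k_T * T_l)
print(dt_h, dt_l)         # the two times agree
```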

The net change in entropy is:

\( \Delta S = \frac{Q_2}{T_{low}} - \frac{Q_2}{T_{high}} \)

In this theory, Boltzmann’s constant has acquired the definition:

\( k = \frac{2}{3} k_T \Delta t_c \)

There is a relationship between Planck’s constant \( h \) and Boltzmann’s constant \( k \):

\( h = k \Delta x_c \)

Substituting for \( k \) and rearranging terms:

\( h = k_T \left( \frac{2}{3} \Delta x_c \right) \Delta t_c \)

Using the equation:

\( \Delta E = \Delta S \, T \)

Since:

\( T = \frac{2}{3} \Delta x_c \, \omega \)

Substituting:

\( \Delta E = \Delta S \, \frac{2}{3} \Delta x_c \, \omega \)

Since:

\( \Delta S = k_T \Delta t_c \)

Substituting:

\( \Delta E = k_T \Delta t_c \, \frac{2}{3} \Delta x_c \, \omega \)

Rearranging:

\( \Delta E = \left( k_T \frac{2}{3} \Delta x_c \Delta t_c \right) \omega = h \omega \)

Making the same change for general cases:

\( \Delta E = \left( k_T \frac{2}{3} \Delta x_c \Delta t \right) \omega \)

Defining an analogy to entropy for frequency:

\( \Delta S_p = k_T \frac{2}{3} \Delta x_c \Delta t \)

Substituting:

\( \Delta E = \Delta S_p \, \omega \)

Now I give a detailed general definition for Planck’s constant. The potential energy of the hydrogen electron in its first energy level satisfies:

\( \Delta E_{eH1} = h \omega_{eH1} \)

Where:

\( \omega_{eH1} = \frac{v_{eH1}}{\lambda_{eH1}} = \frac{v_c \alpha}{2 \pi \Delta x_c} \)

Substituting for the speed of light:

\( \omega_{eH1} = \left( \frac{\Delta x_c}{\Delta t_c} \right) \alpha \frac{1}{2 \pi \Delta x_c} = \frac{\alpha}{2 \pi \Delta t_c} = \frac{1}{2 \pi (137) \Delta t_c} \)

The denominator on the right side is the period of the frequency. Therefore:

\(\Delta E_{eH1} = \frac{h}{2\pi\alpha^{-1} \Delta t_c}\)

Also, the potential energy for a circular orbit can be expressed as:

\(\Delta E_{eH1} = f_{eH1} \Delta x_c\)

Therefore:

\(f_{eH1} \Delta x_c = \frac{h}{2\pi\alpha^{-1} \Delta t_c}\)

Solving for Planck’s constant:

\(h = f_{eH1} \Delta x_c \Delta t_c 2\pi\alpha^{-1}\)

This result defines Planck’s constant in terms of properties of the hydrogen atom.

Defining Temperature

I have defined temperature as:

\(T = \frac{\Delta E}{k_T \Delta t}\)

I have also derived:

\(h = k_T \left( \frac{2}{3} \Delta x_c \right) \Delta t_c\)

Solving for \(k_T\):

\(k_T = \frac{h}{\left( \frac{2}{3} \Delta x_c \right) \Delta t_c}\)

Since:

\(h = f_{eH1} \Delta x_c \Delta t_c 2\pi\alpha^{-1}\)

Then:

\(k_T = \frac{f_{eH1} \Delta x_c \Delta t_c 2\pi\alpha^{-1}}{\left( \frac{2}{3} \Delta x_c \right) \Delta t_c} = 3 f_{eH1} \pi \alpha^{-1}\)
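The algebra of this step can be spot-checked numerically with arbitrary placeholder values for \(f_{eH1}\), \(\Delta x_c\), and \(\Delta t_c\) (the numbers below are placeholders, not physical values):

```python
import math

# Spot-check: h = f * dx_c * dt_c * 2*pi*alpha_inv, divided by
# (2/3) * dx_c * dt_c, should reduce to 3 * pi * alpha_inv * f
# for any values of f, dx_c, dt_c. All values below are arbitrary.
f, dx_c, dt_c = 2.0, 3.0, 5.0
alpha_inv = 137.0

h = f * dx_c * dt_c * 2.0 * math.pi * alpha_inv
k_T = h / ((2.0 / 3.0) * dx_c * dt_c)
print(k_T, 3.0 * math.pi * alpha_inv * f)   # the two agree
```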

Defining Boltzmann’s Constant

I have established a relationship between Planck’s constant and Boltzmann’s constant in the form of:

\(k_B = \frac{h}{\Delta x_c}\)

Substituting for Planck’s constant:

\(k_B = \frac{k_T \left( \frac{2}{3} \Delta x_c \right) \Delta t_c}{\Delta x_c} = k_T \frac{2}{3} \Delta t_c\)

Substituting for \(k_T\):

\(k_B = \frac{2}{3} \left( 3 f_{eH1} \pi \alpha^{-1} \right) \Delta t_c = f_{eH1} 2\pi\alpha^{-1} \Delta t_c\)

Or, in terms of momentum:

\(k_B = \Delta P_{eH1} 2\pi\)

Where:

\(\Delta P_{eH1} = f_{eH1} \alpha^{-1} \Delta t_c = f_{eH1} \Delta t_{eH1}\)
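The chain from \(k_T\) to \(k_B\) to \(\Delta P_{eH1}\) can be checked the same way, again with arbitrary placeholder values:

```python
import math

# Spot-check: k_B = (2/3) * k_T * dt_c with k_T = 3*pi*alpha_inv*f reduces
# to 2*pi * (f * alpha_inv * dt_c) = 2*pi * Delta_P. Values are placeholders.
f, dt_c, alpha_inv = 2.0, 5.0, 137.0

k_T = 3.0 * math.pi * alpha_inv * f
k_B = (2.0 / 3.0) * k_T * dt_c
dP = f * alpha_inv * dt_c
print(k_B, 2.0 * math.pi * dP)   # the two agree
```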

Defining Frequency


\(h = k_B \Delta x_c\)

Substituting for \(k_B\):

\(h = \Delta P_{eH1} 2\pi \Delta x_c = \Delta P_{eH1} \lambda_{eH1}\)

Also:

\(\Delta E_{eH1} = h\omega_{eH1} = \Delta P_{eH1} \lambda_{eH1} \omega_{eH1} = \Delta P_{eH1} v_{eH1}\)

And, from an earlier result:

\(h = f_{eH1} \Delta x_c \Delta t_c 2\pi\alpha^{-1}\)

Yielding:

\(\Delta E = h\omega = f_{eH1} \Delta x_c \Delta t_c \alpha^{-1} 2\pi\omega = (f_{eH1} \Delta x_c)(\alpha^{-1} \Delta t_c)(2\pi\omega)\)

The first set of parentheses contains the potential energy of the hydrogen electron in its first energy level. The second set is the period of time required for the electron to travel one radian. The third set is the angular velocity of the electron in units of radians per second.

This theory’s definition of Planck’s constant first changes frequency into radians per second. Then, it converts radians per second, for the subject frequency, into a measure of the number of radians traveled during the period of time required for the hydrogen electron to travel one radian. Finally, the result of the first two steps is multiplied by the potential energy of the hydrogen electron in its first energy level. In other words, Planck’s constant uses fundamental properties of the hydrogen atom as the standard by which to convert frequencies into quantities of energy.

Boltzmann’s Entropy

This theory introduces the idea that a consequence of defining thermodynamic entropy using an ideal gas is that, as the pressure approaches zero, the exchanges of energy between molecules theoretically reduce to single exchanges. A point is reached where exchanges occur at a rate that can be modeled as one at a time without delay between them. That is an equilibrium point where the temperature is close to a constant value. Clausius’ thermodynamic entropy applies to that low pressure where the exchanges that occur can be ideally represented as each molecule taking its turn, without delay, to pass on average molecular kinetic energy. This process can be modeled by considering all the gas molecules lined up in a single file and the average molecular kinetic energy of one of them is transferred down the line from molecule to molecule until the energy has been transferred to the last molecule. The time required to complete this process is ‘internal’ thermodynamic entropy.

Temperature is proportional to the rate of transfer of average molecular kinetic energy between molecules. The modified temperature is the rate at which energy is transferred between molecules. The numerator of the modified temperature is average molecular kinetic energy. The average kinetic energy of an ideal gas depends upon temperature only. It was shown that the average kinetic energy equals:

\[ \frac{1}{2} m v^2 = k_T T \Delta t_c \]

In the equation below, Boltzmann’s constant is defined by this theory as the first equal term and by thermodynamics as the second equal term:

\[ k = \frac{2}{3} k_T \Delta t_c = \frac{R}{N} \]

N is Avogadro’s number, the number of molecules in a mole of gas. R is the universal gas constant. Solving for R:

\[ R = \frac{2}{3} N k_T \Delta t_c \]

Substituting the appropriate values:

\[ R = \frac{2}{3} (6.02 \times 10^{23} \ \text{mole}^{-1}) (1.292 \times 10^{-4}) (1.602 \times 10^{-19} \ s) = 8.31 \ s \cdot \text{mole}^{-1} \]

For one mole of gas:

\[ R = \frac{2}{3} (6.02 \times 10^{23}) (1.292 \times 10^{-4}) (1.602 \times 10^{-19} \ s) = 8.31 \ s \]

The universal gas constant R is directly proportional to the total time required for a mole of ideal gas to transfer average molecular kinetic energy from molecule to molecule without delay between exchanges until the number of molecules in a mole of gas is reached.

\[ \frac{3}{2} \left( \frac{R}{k_T} \right) = N \Delta t_c = (6.02 \times 10^{23} \ \text{mole}^{-1}) (1.602 \times 10^{-19} \ s) = 96{,}440 \ s \cdot \text{mole}^{-1} = 26.8 \ \text{hrs} \cdot \text{mole}^{-1} \]
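These values can be reproduced from the constants given earlier in the text:

```python
# Reproduce the text's arithmetic for the universal gas constant R and the
# total molecule-to-molecule transfer time for one mole.
N = 6.02e23        # Avogadro's number, mole^-1
k_T = 1.292e-4     # unit-free proportionality constant from the text
dt_c = 1.602e-19   # fundamental time increment, s

R = (2.0 / 3.0) * N * k_T * dt_c   # about 8.31
total_time = N * dt_c              # = (3/2) * R / k_T, about 96,440 s
print(R, total_time, total_time / 3600.0)   # roughly 26.8 hours
```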

Boltzmann’s constant is the time represented by the universal gas constant R reduced to single molecule status:

\[ k = \frac{R}{N} = \frac{8.31}{6.02 \times 10^{23} \ \text{mole}^{-1}} = 1.38 \times 10^{-23} \]

Strictly speaking, the units of degrees should have been included in the two equations above. I took the liberty of omitting them for readability.

\[ \Delta t_c = \frac{3}{2} \left( \frac{k}{k_T} \right) = 1.602 \times 10^{-19} \ s \]
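The same numbers recover Boltzmann’s constant and the time increment:

```python
# Recover Boltzmann's constant from R and N, then the text's fundamental
# time increment from k and k_T.
N = 6.02e23
R = 8.31
k_T = 1.292e-4

k = R / N              # about 1.38e-23
dt_c = 1.5 * k / k_T   # about 1.602e-19 s
print(k, dt_c)
```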

The number of possible arrangements for a mole of ideal gas is infinite. Boltzmann’s entropy requires there to be a limited number of possible arrangements.


Boltzmann’s entropy is defined as:

\[ S = k \log \Omega \]

Therefore, Boltzmann’s entropy is proportional to the time of a single transfer of ideal gas molecule energy times the logarithm of the number of microstates. Boltzmann’s entropy is not an expression of simulated internal thermodynamic entropy. Boltzmann’s entropy is no longer a direct measure of time. The units of seconds carried along by Boltzmann’s constant have become irrelevant. Boltzmann’s constant can be set to unity without units. Its connection to thermodynamic entropy is already lost.
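For completeness, a minimal sketch of Boltzmann’s formula \(S = k \log \Omega\) (natural logarithm), applied to a hypothetical toy system:

```python
import math

# Boltzmann's entropy for a toy system of 100 independent two-state
# particles, which has Omega = 2**100 microstates. Illustrative only.
k = 1.38e-23               # Boltzmann's constant, J/K
Omega = 2 ** 100
S = k * math.log(Omega)    # = k * 100 * ln(2), about 9.57e-22 J/K
print(S)
```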


Conclusion


Clausius’s thermodynamic entropy is the time it takes for heat \(Q_1\) to be absorbed into an ideal gas at the rate of temperature \(T_{high}\), or for heat \(Q_2\) to be released out of an ideal gas at the rate of temperature \(T_{low}\). The time it takes for a single ideal gas molecule to pass its kinetic energy to another ideal gas molecule at a distance equal to the radius of the hydrogen atom is the unit of absolute time derived in the article A Unit of Absolute Universal Time.

