

Gibbs Paradox Solution
James A. Putnam
James@newphysicstheory.com

FQXi.org Essay Contest


It from Bit or Bit from It
April 29, 2013

The Gibbs paradox results from analyzing mixing entropy as if it were a type of thermodynamic entropy. It begins with an adiabatic box divided in half by a removable adiabatic partition. Two ideal gases at equal temperatures and pressures, distinguishable as gas A and gas B, are separately contained in the two halves of the box. The partition is removed and the two gases mix. Mixing entropy theory predicts a significant change in temperature for the gases due to mixing. However, experimental results show that the mixing process produces no detectable change in temperature. The solution presented in this essay introduces new explanations for both thermodynamic entropy and mixing entropy. It is shown that the paradox is not real. The prediction of mixing entropy is illusory due to an incorrect assumption: mixing entropy is not like Clausius' thermodynamic entropy. The subject of this essay was chosen to demonstrate the negative consequences of theorists bypassing an understanding of what Clausius' thermodynamic entropy is. There is preliminary work to this essay that should be read first in Reference (4); please read up to "What is Thermodynamic Entropy?" In lieu of that, please refer to Appendix (B) for a list of important changes. For example: mass is made a defined property with defined units, and a new system of units is introduced. The solution uses new physics theory that is justified by its results.

What is Thermodynamic Entropy?


Entropy is a theoretical pathway for moving from It to Bit. Clausius discovered entropy and defined it precisely as thermodynamic entropy. What is thermodynamic entropy? It is something whose nature should be easily established, because its derivation is part of the operation of the simple Carnot engine. The answer can be found in that operation. The Carnot engine is theoretically the most efficient engine. Its efficiency is independent of the nature of the working medium, in this case a gas. The efficiency depends only upon the values of the high and low temperatures in degrees Kelvin. Degrees Kelvin are used because the Kelvin temperature scale is derived from the Carnot cycle.

The engine's equation of efficiency and the definition of the Kelvin temperature scale are the basis for the derivation of the equation:

$$dS = \frac{dQ}{T}$$
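As a reminder of the conventional steps being referred to, the standard textbook route can be summarized as follows; this is conventional thermodynamics rather than part of the new theory presented here:

$$\eta = \frac{W}{Q_{high}} = 1 - \frac{Q_{low}}{Q_{high}}, \qquad \frac{Q_{low}}{Q_{high}} = \frac{T_{low}}{T_{high}}\ \ \text{(Kelvin scale defined from the Carnot cycle)}$$

$$\Rightarrow\quad \frac{Q_{high}}{T_{high}} = \frac{Q_{low}}{T_{low}} \quad\Rightarrow\quad \oint \frac{dQ}{T} = 0 \quad\Rightarrow\quad dS = \frac{dQ}{T}$$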

Something very important happens during this derivation that establishes a definite rate of operation for
the Carnot cycle. The engine is defined as operating quasi-statically. The general requirement for this to
be true is that the engine should operate so slowly that the temperature of the working medium should
always measure the same at any point within the medium. This is a condition that must be met for a
system to be described as operating infinitesimally close to equilibrium.

There are a number of rates of operation that will satisfy this condition; however, there is one specific rate above which the equilibrium will be lost. Any slower rate will work fine. The question is: What is this rate of operation that separates equilibrium from disequilibrium? It is important to know this because it is the rate that becomes fixed into the derivation of the Carnot engine. This occurs because the engine is defined such that the ratio of its heat absorbed to its heat rejected equals the ratio of the temperatures of the high and low heat sources:

$$\frac{Q_{high}}{Q_{low}} = \frac{T_{high}}{T_{low}}$$

Temperature is proportional to the rate of exchange of energy between molecules. It is not quantitatively the same, because temperature is assigned arbitrary units of measurement. Temperature is assigned units of degrees Kelvin, and its scale is arbitrarily fitted to the freezing and boiling points of water. The temperature difference between these points is set at 100 degrees. For this reason, the quantitative measurement of temperature is not the same as the quantitative measurement of the exchange of energy between molecules. However, this discrepancy can be moderated with the introduction of a constant of proportionality $k_T$:

$$k_T\,T = \frac{dE}{dt}$$

The ratio $dE/dt$ is the definition of the modified temperature. Multiplying by $dt$:

$$k_T\,T\,dt = dE$$

This equation shows that the differential of entropy appears in the above equation as:

$$dS = \frac{dE}{T} = k_T\,dt$$

Both $E$ and $t$ are variables. It is necessary to determine a value for the constant $k_T$. This value might be contained in the ideal gas law:

$$E = \frac{3}{2}\,n\,k\,T$$

Where $k$ is Boltzmann's constant. When $n = 1$ the equation gives the kinetic energy of a single molecule. In the case of a single molecule, $E$ becomes $\Delta E$, an incremental value of energy:

$$\Delta E = \frac{3}{2}\,k\,T$$

This suggests that for an ideal gas molecule:

$$\Delta S = \frac{\Delta E}{T} = \frac{3}{2}\,k$$

In other words, the thermodynamic entropy of a single ideal gas molecule is a constant. Substituting for Boltzmann's constant:

$$\Delta S = \frac{3}{2}\left(1.38\times10^{-23}\right) = 2.07\times10^{-23}\ \text{joules per degree Kelvin}$$

Entropy, from five steps above as differentials, now in incremental form is:

$$\Delta S = k_T\,\Delta t$$

Therefore, I write:

$$k_T\,\Delta t = \frac{3}{2}\,k = 2.07\times10^{-23}$$

If I could establish a value for $\Delta t$, then I could calculate $k_T$. Since this calculation applies to a single ideal gas molecule and is a constant value, I assume that $\Delta t$ is a fundamental increment of time or is directly proportional to a fundamental increment of time. There is one immensely useful fundamental increment of time, introduced in Reference (4). It is:

$$t_c = 1.602\times10^{-19}\ \text{seconds}$$

(See Appendix A for an example of the fundamental unity $t_c$ achieves.) Substituting and solving for $k_T$:

$$k_T = \frac{3\,k}{2\,t_c} = \frac{2.07\times10^{-23}\ \text{joules per degree Kelvin}}{1.602\times10^{-19}\ \text{seconds}} = 1.292\times10^{-4}$$
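As a check on the arithmetic above, a minimal Python sketch; the numerical values are the ones used in this essay and the variable names are mine:

# Check of the k_T arithmetic above; numerical values are those used in the essay.
k_boltzmann = 1.38e-23      # joules per degree Kelvin
t_c = 1.602e-19             # seconds, the fundamental increment of time (Reference 4)

delta_S = 1.5 * k_boltzmann          # entropy of a single ideal gas molecule
k_T = delta_S / t_c                  # constant of proportionality

print(f"delta_S = {delta_S:.3e}")    # ~2.07e-23
print(f"k_T     = {k_T:.4e}")        # ~1.292e-4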

The body of work, Reference (4), supporting this essay includes theoretical changes beginning right from the start of physics theory. The units change (Appendix B). Substituting the derived empirical units and dropping the molecule indicator:

$$k_T = 1.292\times10^{-4}$$

The value $k_T$ is a unit-free constant of proportionality. It follows, from seven steps above, that Boltzmann's constant is:

$$k = \frac{2}{3}\,k_T\,t_c$$

For the ideal gas, the thermodynamic entropy of each molecule is a constant:

$$\Delta S = k_T\,t_c$$
Substituting this expression for entropy into the defining equation for thermodynamic entropy:

$$\Delta E = \Delta S\,T = k_T\,t_c\,T$$

Recognizing that the increment of energy represents an increment of heat and solving for $\Delta S$:

$$\Delta S = \frac{\Delta E}{T} = \frac{\Delta Q}{T} = k_T\,t_c$$

Heat is energy in transit. Solving for tc:

$$t_c = \frac{\Delta S}{k_T} = \frac{\Delta Q}{k_T\,T}$$

This period of time $t_c$ would have been definable as thermodynamic entropy if temperature had been defined as the rate of transit of energy between molecules. The arbitrary temperature units make it necessary to include the constant $k_T$ in the definition of a modified thermodynamic entropy $\Delta S_m$. The equation showing this is:

$$\Delta S_m = \frac{\Delta Q}{k_T\,T} = \frac{\Delta Q}{\Delta Q / t_c}$$

For an ideal gas receiving energy from a high temperature reservoir:

$$\Delta S_m = \frac{Q_{high}}{k_T\,T_{high}} = \frac{Q_{high}}{Q_{high} / t_c}$$

For a Carnot engine, the modified entropy gives equal expressions for the heat received and the heat released:

$$\Delta S_m = \frac{Q_{high}}{k_T\,T_{high}} = \frac{Q_{low}}{k_T\,T_{low}}$$

Substituting from two steps above:

$$\Delta S_m = \frac{Q_{high}}{Q_{high}/t_c} = \frac{Q_{low}}{Q_{low}/t_c}$$
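A brief Python illustration with hypothetical reservoir values; it shows only that the two expressions for the modified entropy above are numerically equal when the Carnot condition holds:

# For a Carnot engine, Q_high / T_high = Q_low / T_low, so the two modified
# entropy expressions give the same number. All values here are hypothetical.
k_T = 1.292e-4

T_high, T_low = 500.0, 300.0          # degrees Kelvin (hypothetical)
Q_high = 1000.0                       # heat absorbed (hypothetical units)
Q_low = Q_high * T_low / T_high       # heat rejected under the Carnot condition

S_m_high = Q_high / (k_T * T_high)
S_m_low = Q_low / (k_T * T_low)
print(S_m_high, S_m_low)              # equal, as the equation above states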

And the increments of time for the rates of transit of energy are equivalent. The time periods for average
molecular kinetic energy entering the engine and leaving the engine are the same. The rates of exchange
of kinetic energy are exactly what they need to be for the modified entropy to remain constant. This is why
the increase in entropy is exactly the opposite of the decrease in entropy for the Carnot engine. Energy
entering the engine carries the positive sign, and energy leaving the engine carries the negative sign.

Temperature is proportional to average kinetic energy, because, it is proportional to the rate at which
average kinetic energy is transferred between individual molecules. The numerator of the modified
temperature is the average kinetic energy. The denominator is the constant tc. The modified
temperature establishes the point where equilibrium exists. Equilibrium exists when kinetic energy is
exchanged at the rate set by the modified temperature.

Next, I consider an engine that has heat loss that does not result in work. The heat successfully
converted into work can be represented by a series of Carnot engines. For the series of Carnot engines,
the change in entropy per cycle is zero. The lost heat just passes through the engines unnoticed. The
series of engines is an unaffected pathway for the lost heat to travel to the low temperature sink.

The lost heat becomes energy no longer available for producing work by the series of Carnot engines.
The entropies that are affected are those of the high heat source and the low heat source. Their entropies
are measures of time required for the lost heat to be released by the high heat source and later absorbed
by the low heat source. The net change in thermodynamic entropy is:

$$\Delta S = \frac{Q_{lost}}{T_{low}} - \frac{Q_{lost}}{T_{high}} > 0$$

The quantity of heat is the same in both cases. The rates at which energy is transferred are different. The
low temperature represents a slower rate of exchange of heat than for the high temperature. This means
it takes longer for the low temperature sink to absorb the quantity of lost heat than it does for the high
temperature source to supply it. This time difference is the cause of the greater than sign in Clausius
definition of thermodynamic entropy. The high heat source loses entropy because it requires extra time
for the lost heat to leave the source. The low heat source gains entropy because it requires extra time to
absorb the heat that is simply passing through the engine without being converted into work. This time
difference is the origin of thermodynamic entropy. Thermodynamic entropy, referred to as an arrow of
time, is an arrow of time.
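A minimal numeric sketch of this sign argument in Python, with hypothetical values for the lost heat and the two reservoir temperatures:

# Net change in thermodynamic entropy for heat that bypasses the work output.
# Values are hypothetical, chosen only to illustrate the sign of the result.
Q_lost = 100.0            # heat passing straight to the low temperature sink
T_high = 600.0            # degrees Kelvin, heat source
T_low = 300.0             # degrees Kelvin, heat sink

delta_S = Q_lost / T_low - Q_lost / T_high
print(f"net entropy change = {delta_S:+.3f}")   # positive: the sink needs more time to absorb Q_lost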

Boltzmann's Entropy
This essay introduces the idea that a consequence of defining thermodynamic entropy using an ideal gas
is that, as the pressure approaches zero, the exchanges of energy between molecules theoretically
reduce down to single exchanges. A point is reached where exchanges occur at a rate that can be
modeled as one at a time without delay between them. That is an equilibrium point where the temperature
is infinitesimally close to a constant value. Clausius' thermodynamic entropy applies to that low pressure, where the exchanges that occur can be ideally represented as each molecule taking its turn, without delay, to pass on average molecular kinetic energy. This process can be modeled by considering all of the gas molecules lined up single file, with the average molecular kinetic energy of one of them transferred down the line from molecule to molecule until the energy has been transferred to the last molecule. The time required to complete this process is internal thermodynamic entropy.

Temperature is proportional to the rate of transfer of average molecular kinetic energy between molecules. The modified temperature is the rate at which energy is transferred between molecules. The numerator of the modified temperature is average molecular kinetic energy. The average kinetic energy of an ideal gas depends upon temperature only. It was shown above that the average kinetic energy divided by the modified temperature equals $t_c$ (Reference 4):

$$\frac{\Delta E}{k_T\,T} = t_c$$

In the equation below, Boltzmann's constant is defined in this paper as the first equal term and by thermodynamics as the second equal term:

$$k = \frac{2}{3}\,k_T\,t_c = \frac{R}{N_0}$$

$N_0$ is Avogadro's number, the number of molecules in a mole of gas. $R$ is the universal gas constant. Solving for $R$:

$$R = \frac{2}{3}\,N_0\,k_T\,t_c$$

Substituting the appropriate values, for one mole of gas and dropping the molecule indicator:

$$R = \frac{2}{3}\left(6.02\times10^{23}\right)\left(1.292\times10^{-4}\right)\left(1.602\times10^{-19}\ \text{seconds}\right) = 8.31\ \text{seconds}$$

The universal gas constant is directly proportional to the total time required for a mole of ideal gas to
transfer average molecular kinetic energy from molecule to molecule without delay between exchanges
until the number of molecules in a mole of gas is reached. The solution above is not that time. The actual
time requires the use of modified temperature.

$R$ is defined using degrees Kelvin. $R$ must be divided by $k_T$ so that it becomes defined using the modified temperature. Another adjustment that is needed is to multiply by 3/2 to remove the 2/3 that resulted from the kinetic theory of an ideal gas:

$$\frac{3}{2}\,\frac{R}{k_T} = N_0\,t_c = \left(6.02\times10^{23}\ \text{molecules/mole}\right)\left(1.602\times10^{-19}\ \text{seconds/molecule}\right)$$

$$= 96{,}440\ \text{seconds/mole} = 1{,}607\ \text{minutes/mole} = 26.8\ \text{hours/mole}$$

Boltzmann's constant is the time period represented by the universal gas constant reduced to single molecule status:

$$k = \frac{R}{N_0} = \frac{8.31\ \text{seconds/mole}}{6.02\times10^{23}\ \text{molecules/mole}} = 1.38\times10^{-23}\ \text{seconds/molecule}$$

Strictly speaking, the units of degrees Kelvin should have been included in the two equations above. I took the liberty of not showing them for readability.
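The three numerical results above can be reproduced together with a short Python sketch; the values are those used in this essay, and the variable names are mine:

# Reproduction of the three numerical results above, using the essay's relations.
N_0 = 6.02e23             # Avogadro's number, molecules per mole
k_T = 1.292e-4            # unit-free constant of proportionality
t_c = 1.602e-19           # seconds per molecule

R = (2.0 / 3.0) * N_0 * k_T * t_c          # universal gas constant, seconds
total_time = 1.5 * R / k_T                 # = N_0 * t_c, seconds per mole
k = R / N_0                                # Boltzmann's constant, seconds per molecule

print(f"R         = {R:.3f} seconds")                    # ~8.31
print(f"N_0 * t_c = {total_time:.0f} seconds per mole")  # ~96,440 (~26.8 hours)
print(f"k         = {k:.3e} seconds per molecule")       # ~1.38e-23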

Boltzmann's constant is directly proportional to the period of time necessary for a single exchange of average kinetic energy to occur between two molecules of an ideal gas, independent of temperature. The actual time period, devoid of the molecule indicator, is given by:

$$t_c = \frac{3}{2}\,\frac{k}{k_T} = 1.602\times10^{-19}\ \text{seconds}$$

The reason for eliminating the kinetic theory of an ideal gas fraction of 2/3 is that it pertains to
macroscopic properties while the time of exchange of kinetic energy between individual molecules is a
microscopic property.

The number of possible arrangements for a mole of ideal gas is infinite. Boltzmann's entropy requires there to be a limited number of possible arrangements. His entropy assumed that the volume of the mole of gas could be divided into a limited number of cells available to be occupied. In quantum theory, there are a naturally limited number of available arrangements. Instead of arbitrary cells, there are microstates which particles might occupy. If the concept of microstates is idealized so that all microstates are equally likely to be occupied, then I can write:

$$S = k\,\Omega$$

This is not the definition of Boltzmann's entropy even though $\Omega$ is the number of microstates. The inclusion of Boltzmann's constant causes this calculation to be analogous to that of thermodynamic entropy. The number of microstates simulates a number of ideal gas molecules. The entropy calculation simulates the calculation of internal entropy of an ideal gas. The solution is proportional to the time period required for the simulated ideal gas molecules to transfer their simulated individual average kinetic energies from one molecule to the next, without delay, until the number of simulated molecules equals $\Omega$.

The calculation of the entropy for any number of microstates will yield a solution identical to an analogous calculation for an equal number of ideal gas molecules. However, Boltzmann's entropy is defined as:

$$S = k\,\ln\Omega$$

Therefore, Boltzmann's entropy is proportional to the time period of a single transfer of ideal gas molecule energy times the logarithm of the number of microstates. Boltzmann's entropy is not an expression of simulated internal thermodynamic entropy. The entropy is no longer a direct measure of time. The units of seconds carried along by Boltzmann's constant have become irrelevant. Boltzmann's constant can be set to unity without units. Its connection to thermodynamic entropy is already lost.
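To make the distinction concrete, here is a small Python comparison of the two expressions for a hypothetical microstate count; the contrast in magnitude is the point being made above:

import math

# Comparison of the two expressions discussed above, for a hypothetical
# microstate count. S_linear treats microstates like a count of ideal gas
# molecules; S_boltzmann is Boltzmann's definition.
k = 1.38e-23              # Boltzmann's constant (seconds per molecule in this essay's units)
omega = 1.0e6             # hypothetical number of equally likely microstates

S_linear = k * omega                  # analogous to internal entropy of omega molecules
S_boltzmann = k * math.log(omega)

print(f"k * omega     = {S_linear:.3e}")
print(f"k * ln(omega) = {S_boltzmann:.3e}")   # vastly smaller; no longer a direct measure of time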

Gibbs Paradox Solution


The Gibbs paradox problem will be analyzed here in its original form, including its discontinuity problem
pertaining to its interpretation of distinct versus indistinct or distinguishable versus indistinguishable.
There is a rectangular adiabatic container divided in half by a removable adiabatic partition. In one half of the container there is the ideal gas A, and in the other half there is the ideal gas B. The molecules of both gases are idealized as inelastic spheres. The gas A molecules have mass m and size s. The gas B molecules have mass 2m and size 2s. Both gases are at the same temperature and pressure. The partition prevents either gas from being affected by the other. There are two moles of gas A and, because the molecules of gas B are twice as large, there is one mole of gas B.

In Reference (4), temperature is identified as being directly proportional to the rate at which energy is transferred from molecule to molecule. The gases' temperatures are the same, so both gases are transferring energy between molecules at the same constant rate. The initial thermodynamic entropy of each of the gases is internal thermodynamic entropy. The internal thermodynamic entropy of gas A is the time required for one molecule to transfer energy to another multiplied by twice Avogadro's number. Avogadro's number is the number of molecules in a mole of gas.

Internal thermodynamic entropy is calculated as if the molecules were lined up single file with energy
being passed from one to the other down the line until the last molecule receives the energy. This
treatment is dictated by the definition of an ideal gas. Specifically, the requirement that pressure
approaches zero. There is an initial point of equilibrium reached as pressure approaches zero. That
equilibrium establishes the internal thermodynamic entropy of an ideal gas. Therefore, the internal
thermodynamic entropy of gas A is twice the internal thermodynamic entropy of gas B even though they
are at the same temperature. That is because there are twice as many molecules of gas A as for gas B.
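As a small Python sketch of this bookkeeping (the definition of internal thermodynamic entropy as exchange time multiplied by molecule count is taken from the text above; the variable names are mine):

# Internal thermodynamic entropy as defined in this essay: the time for one
# molecule-to-molecule exchange multiplied by the number of molecules present.
N_0 = 6.02e23             # Avogadro's number
t_c = 1.602e-19           # seconds per exchange

S_internal_A = (2 * N_0) * t_c    # two moles of gas A
S_internal_B = (1 * N_0) * t_c    # one mole of gas B

print(S_internal_A / S_internal_B)    # 2.0: gas A holds twice the internal entropy
print(S_internal_A + S_internal_B)    # total after the partition is removed (unchanged by mixing)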

In contrast to the Gibbs description, the following is what the author would expect to be the case. The
dividing partition is removed allowing the molecules of both gases to mix. The temperature remains
constant for both gases. The rate of exchange of energy between any two molecules is always the same
value. There is no heat either gained or lost. The temperature continues to remain the same initial
constant value for the duration of the mixing process. The internal thermodynamic entropy of the two
gases, immediately after the partition is removed, is equal to the sum of the two separate internal
thermodynamic entropies. It remains this same value throughout the mixing process.

The Gibbs interpretation predicts that the mixing process increases the internal thermodynamic entropy of
the combined gases. It is assumed that thermodynamic properties such as heat would change. If this
were true, then there would be a significant change in temperature due to the mixing of the gases. The
experimental results show that there is no detectable change in temperature. This next example will
demonstrate that there is no change in internal thermodynamic entropy as a result of mixing two
distinguishable gases having equal initial temperatures and pressures. It is proposed that the choice to
treat mixing entropy like internal thermodynamic entropy caused the apparent paradox.

In the Gibbs treatment the property of being distinguishable was interpreted as absolute. It was concluded that there are no grades of distinguishability. The gases were either distinguishable or they weren't. Distinguishable was as opposite from indistinguishable as binary 1 is from 0. Before the gases are mixed, the mixing entropy is zero. As the gases mixed, the mixing entropy was expected to add to the internal thermodynamic entropy of the two gases. Since a large increase in mixing entropy was expected, based upon its statistical analysis, a large change in internal thermodynamic entropy was predicted. A corresponding change of temperature was also predicted.
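For comparison, the size of the increase that the statistical treatment predicts can be illustrated with the standard ideal-mixing formula; this is the conventional textbook expression, not the treatment advocated in this essay, and the Python sketch below uses the two-moles-A, one-mole-B setup from the previous section:

import math

# Standard statistical prediction that the Gibbs treatment relies on:
# ideal mixing entropy  dS_mix = -R * sum(n_i * ln(x_i)).
# This is the textbook formula, not the treatment advocated in this essay.
R = 8.314                         # J/(mol K), conventional units
n_A, n_B = 2.0, 1.0               # moles of gas A and gas B from the setup above
n_total = n_A + n_B

dS_mix = -R * (n_A * math.log(n_A / n_total) + n_B * math.log(n_B / n_total))
print(f"predicted mixing entropy = {dS_mix:.2f} J/K")   # ~15.9 J/K, regardless of how different A and B are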

Because a lowering of temperature is associated with an increase in internal thermodynamic entropy, mixing entropy will be treated here in the same manner. If mixing entropy is predicted to increase internal thermodynamic entropy, then a corresponding drop in temperature should be observed. Because of the assumption of absolute opposites for unmixed versus mixed, the fully mixed gases should have reached their maximum possible mixing entropy. That maximum would occur at the lowest possible temperature.

This absolute opposites treatment of mixed versus unmixed is modeled here as being analogous to the
mixed gases having an initial temperature T1 and a final temperature of approximately zero degrees
Kelvin. The thermodynamic entropy at a temperature of approximately zero degrees Kelvin is very large.
The thermodynamic entropy of the two gases together is equal to the time required for energy to be
transferred from one molecule to another multiplied by three times Avogadro's number. At near absolute zero
temperature, the rate of exchange of energy between molecules is extremely slow. Therefore the value of
mixing entropy, when treated as being like internal thermodynamic entropy, should be extremely large.
Such a large change in internal thermodynamic entropy would be detectable as a large drop in
temperature. The experimental results show that there is no change in temperature.
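A minimal Python sketch of the scaling behind this argument, reading entropy as transfer time in the manner of this essay; the quantity of heat is hypothetical:

# Time-based reading of entropy from this essay: the time needed to transfer a
# fixed quantity of heat is Q / (k_T * T), which grows without bound as T -> 0.
k_T = 1.292e-4
Q = 100.0                                  # hypothetical quantity of heat

for T in (300.0, 30.0, 3.0, 0.03):         # degrees Kelvin
    transfer_time = Q / (k_T * T)
    print(f"T = {T:7.2f} K  ->  transfer time ~ {transfer_time:.3e}")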

It was shown that internal thermodynamic entropy for an ideal gas is independent of temperature. Even if the mixing process changed the gases' temperature, the internal thermodynamic entropy would not change. However, there is no change in temperature due to mixing. Even a single gas undergoes constant mixing of its molecules. Although mixing entropy's mathematical expression borrows Boltzmann's constant and the name entropy, the mixing entropy is not internal thermodynamic entropy. The theoretical thermodynamic pathway from It to Bit, as in mixing entropy and the Bits of microstates, does not exist. The incremental time $t_c$ is a universal constant that can provide the pathway from It to Bit.

Appendix (A): Fundamental Increment of Time and the Fine Structure Constant

This is an excerpt, with some changes, from "The Absoluteness of Time," submitted to the first contest, "The Nature of Time." It employs the same $t_c$ as is used in the present essay and is presented as theoretical support for its use. The following expression for the fine structure constant contains constants that come from electromagnetic theory, relativity theory, and quantum theory. These constants will receive new definitions of their properties:

$$\alpha = \frac{2\pi\,k\,e^2}{h\,C} = \frac{2\pi\,k\,e^2}{h\,v_c}$$

I will use Planck's constant as it would normally be used. For the rest I substitute the expressions from
this new work for the constants in the equation. The subscript c denotes a measurement that involves
light. The expression derived for the proportionality constant k from Coulomb's Law is:

$$k = \frac{E_{Kc}}{x_c}\cdot\frac{x_c^2}{t_c^2} = \frac{E_{Kc}\,x_c}{t_c^2}$$

Electric charge will be replaced with $t_c$, the fundamental increment of time used in both essays:

$$e = t_c$$

Therefore:

$$k\,e^2 = \frac{E_{Kc}\,x_c}{t_c^2}\,t_c^2 = E_{Kc}\,x_c$$

My expression for the speed of light is:

$$v_c = \frac{x_c}{t_c}$$

Where $x_c$ is the radius of the hydrogen atom. The normal use of Planck's constant $h$ is:

$$h = \frac{E_{Kc}}{\nu}$$

Substituting these identities into the equation for the fine structure constant:

$$\alpha = \frac{2\pi\,k\,e^2}{h\,C} = \frac{2\pi\,k\,e^2}{h\,v_c} = \frac{2\pi\,E_{Kc}\,x_c}{\dfrac{E_{Kc}}{\nu}\cdot\dfrac{x_c}{t_c}} = 2\pi\,\nu\,t_c$$

The fine structure constant has a magnitude equivalent to the ratio of the speed of the electron in the first
energy level of the hydrogen atom to the speed of light:

$$\alpha = \frac{v_p}{C} = \frac{x_p / t_c}{x_c / t_c} = \frac{x_p}{x_c}$$

The incremental distance in the denominator is the radius of the hydrogen atom. The incremental
distance in the numerator is that traveled by the hydrogen electron during the time light travels the radius:

$$\alpha = \frac{x_p}{x_c}$$

The time required for light to travel the radius of the hydrogen atom is:

$$t_c = \frac{x_c}{C} = \frac{4.8\times10^{-11}\ \text{meters}}{2.998\times10^{8}\ \text{meters/second}} = 1.602\times10^{-19}\ \text{seconds}$$
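A one-line check of this arithmetic in Python, using the radius value quoted above:

# Check of the arithmetic above, using the radius value quoted in this essay.
x_c = 4.8e-11             # meters, radius of the hydrogen atom as used here
C = 2.998e8               # meters per second, speed of light

t_c = x_c / C
print(f"t_c = {t_c:.4e} seconds")   # ~1.601e-19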

The time it takes for the electron to travel one radian is:

$$t_p = \frac{t_c}{\alpha} = 137\,t_c = \frac{1}{2\pi\,\nu}$$

Where $\nu$ is the electron's orbital frequency. Solving for alpha:

$$\alpha = 2\pi\,\nu\,t_c$$

It is shown here that the two expressions of the fine structure constant are derivable one from the other.
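As a closing numeric check of these relations, a short Python sketch using the values from this appendix; the frequency is the one implied by $t_p = 1/(2\pi\nu)$:

import math

# Closing numeric check of the relations above, using the appendix's values.
t_c = 1.602e-19                   # seconds
alpha = 1.0 / 137.0               # fine structure constant (approximate)

t_p = t_c / alpha                 # time for the electron to travel one radian
nu = 1.0 / (2.0 * math.pi * t_p)  # orbital frequency implied by t_p = 1/(2*pi*nu)

print(f"t_p   = {t_p:.3e} seconds")             # ~2.19e-17
print(f"nu    = {nu:.3e} per second")           # ~7.25e15
print(f"alpha = {2 * math.pi * nu * t_c:.4e}")  # recovers ~1/137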

Appendix (B): Proposed Changes to Physics Theory

(1) Mass is made a definable property with units formed from the units of its empirical evidence, meters
and seconds. Its new units are those of inverse acceleration.

(2) Force, as the ratio of two accelerations, becomes unit free.

(3) There is a universally constant increment of time. It is the time for light to traverse the radius of the
hydrogen atom. Its magnitude is identical to that of fundamental electric charge.

(4) The ratio of Planck's constant to Boltzmann's constant has the magnitude and units of the radius of the hydrogen atom.

(5) The units of energy, as the product of force and distance, become meters.

(6) Temperature has units of meters per second. It is interpreted as energy divided by time. It is directly
proportional to the rate of exchange of energy between two molecules.

References:
(1) Zemansky, M. W., 1943, Heat and Thermodynamics, McGraw-Hill.

(2) Fermi, E., 1937, Thermodynamics, Prentice-Hall.

(3) Sears, F. W., Zemansky, M. W., 1960, College Physics, Addison-Wesley.

(4) Putnam, J. A., 2011, The Nature of Thermodynamic Entropy, http://newphysicstheory.com/THE%20NATURE%20OF%20THERMODYNAMIC%20ENTROPY%20Essay.pdf
