Entropy: A Parameter Lacking Clarity
- Kent W. Mayhew
- Mar 25, 2015
- 6 min read
Entropy is often referred to as the thermal parameter which, when multiplied by temperature, gives energy. It sounds so simple; yet of all the thermodynamic parameters, entropy remains the one that lacks real clarity.
By the middle of the 20th century, entropy was accepted as being the “randomness of matter in incessant motion” (1). Again, it sounds so simple; upon hearing this in high school science, I simply accepted it, without pondering what randomness has to do with energy anyhow. It was blind acceptance resulting in a strong foundation, making me ready for my university indoctrination.
Consider a barrel of hot water. Certainly we can envision that this barrel required, and then stored, energy as it was heated. But heating the water resulted in no real observable change in randomness at the macroscopic level. Even at a microscopic level, although the water molecules vibrate with more energy than they did before being heated, are the molecules really any more random? I can certainly understand the argument that the molecular vibrational energies have increased, but to say that equates to randomness just seems so subjective.
Interestingly, Arieh Ben-Naim (2) points out in his book that the interpretation of the randomness of gaseous molecules really lies in the eye of the beholder. Arieh gives several examples, and he is right. Obviously “randomness” is not a particularly scientific term.
Let us now consider Clausius’s 19th-century choice of the term “entropy” (2): “I prefer going to the ancient languages for the names of important scientific quantities, so that they mean the same in all living tongues. I propose, accordingly, to call S the entropy of a body, after the Greek word “transformation”. I have designedly coined the word entropy to be similar to energy, for these two quantities are so analogous in their physical significance, that an analogy of denominations seems to be helpful.”
In his book’s preface, Ben-Naim (2) then goes on to write: “Right after quoting Clausius’ explanation on his reasons for the choice of the word ‘Entropy’,” Leon Cooper (1968) “commented: ‘By doing this, rather than extracting a name from the body of current language (say: lost heat), he succeeded in coining a word that meant the same thing to everybody: nothing.’”
Like me, Ben-Naim (2) agrees with Cooper’s assessment of entropy. Ben-Naim further rightfully states that energy and entropy are not analogous, and that “lost work” might not be the best choice either. In the context of my book, one can plainly see that a reason for entropy’s initial conceptualization was to explain the lost work of expanding systems, i.e. the Carnot engine. Of course, my simple explanation of the displacement of Earth’s atmosphere was not envisioned by Ben-Naim. Rather, Ben-Naim’s choice was to tweak the sciences into a bizarre yet interesting marriage of Shannon’s information theory with traditional thermodynamics.
A more recent, 21st-century definition is that entropy is “the dispersal of a system’s molecular energy” (3). This is Frank Lambert’s beloved statement. It can certainly be argued that Frank’s interpretation is an improvement; even so, I am sorry to say that, to me, Frank’s definition still feels lacking.
Certainly, I do prefer the term dispersal over randomness, because it invokes the visualization that molecules and/or energy will tend to disperse. Their dispersion is limited only by the constraints imposed upon a system, so long as the system is given sufficient time to attain equilibrium. Constraints can take many forms. Walls tend to contain gaseous molecules, preventing them from simply traveling outward until their next collision with another gaseous molecule. Gravity is the mother of all constraints, as it holds galaxies and all their various components together. And of course pressure can be considered a force constraint upon a system, though often it really is just a subset of gravity as a force of constraint.
Note: I apologize for the plain-text symbols used for parameters below. In the book I use a proper math editor, but here in HTML I must do the best I can. In any case, what matters is the points being made, not the mathematical symbols.
Returning to entropy. As previously stated, entropy lacks clarity. This is not for lack of effort by Frank Lambert, and countless others, to give it real meaning. The reason entropy lacks clarity is due, in part, to the numerous applications in which it is blindly applied. Consider the enthalpy relation:
TS = E + PV   1)
Certainly based upon 1) we can see that entropy (S) is something that when multiplied by temperature (T), gives something whose units are energy (SI units: Joules). Specifically, TS equates to the internal energy (E) plus pressure (P) multiplied by volume (V).
I was also taught the ideal gas law. For an N molecule ideal gas we can write:
PV = NkT   2)
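As a quick numeric sketch of 2) — with values that are my own illustrative assumptions, not taken from the text — one mole of an ideal gas at room temperature, in roughly its molar volume, recovers about one atmosphere of pressure:

```python
# Numeric sketch of the ideal gas law PV = NkT, i.e. equation 2).
# All values below are illustrative assumptions, not from the article.
k = 1.380649e-23   # Boltzmann constant, J/K

N = 6.022e23       # number of molecules (one mole)
T = 300.0          # temperature, K
V = 0.0248         # volume, m^3 (roughly the molar volume at 300 K, 1 atm)

P = N * k * T / V  # pressure solved from PV = NkT
print(P)           # ~1.0e5 Pa, about one atmosphere
```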
Again it all seems so simple; so why does entropy lack clarity? Well, ask what 1) represents. Be it right or wrong, in high school I envisioned 1) as representing the energy of a system and happily went to bed at night, pondering my teenage life. Then in university I learned kinetic theory, and that the total energy (Et) of an ideal gas is:
Et = 3NkT/2   3)
I was also taught that the heat capacity (Cy) times the temperature change (dT) gives a system’s energy change (dQ):
Cy dT = dQ   4)
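As a minimal sketch of 4), take Cy to be the constant-volume heat capacity of a monatomic ideal gas, Cy = 3Nk/2 — an illustrative assumption of mine, chosen to be consistent with 3); the values of N and dT are likewise invented:

```python
# Sketch of Cy*dT = dQ, i.e. equation 4), assuming Cy = 3Nk/2 (the
# constant-volume heat capacity of a monatomic ideal gas).
# Illustrative values only, not from the article.
k = 1.380649e-23   # Boltzmann constant, J/K
N = 1.0e23         # number of molecules (assumed)
Cy = 1.5 * N * k   # heat capacity, J/K

dT = 10.0          # temperature change, K (assumed)
dQ = Cy * dT       # resulting energy change, J
print(dQ)          # ~20.7 J
```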
And finally that 4) can be written in terms of entropy (S):
S dT = dQ   5)
Comparing 5) to 4), one not well versed in thermodynamics might actually conclude that:
S = Cy   6)
Heaven forbid such simple constructive logic be utilized. If we embrace 6), does this mean that in going from absolute zero to the current temperature of a gas, dT = T - 0 = T? And might we then boldly write, for that ideal gas:
ST = CyT = 3NkT/2   7)
Although we cannot fully embrace 6), hence 7), let us now compare them to 2). Constructive logic might invite one to write:
ST = 3PV/2   8)
Surely 6), 7) and 8) all embrace constructive logic, but if they were applied to thermodynamics then we would end up with the following conundrum: why does 8) not equate to 2) for an ideal gas? Moreover, perhaps we can now write that the internal energy (E) equals:
E = PV/2   9)
Certainly not! 9) is NOT acceptable at any level.
Even so, now ask: what is entropy (S)? Well, it is:
1) Something that when multiplied by temperature gives a system’s enthalpy
and/or
2) Something that when multiplied by temperature gives a system’s energy.
BUT a system’s enthalpy is NOT necessarily equal to a system’s energy. Consider an ideal gas: its enthalpy is NkT, but its energy is 3NkT/2. So entropy has to be 1) or 2), but it cannot be both. Does this not imply that, from a mathematical perspective, entropy lacks clarity in traditional thermodynamics? Certainly it does.
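The numerical gap between the two candidate quantities is easy to exhibit. A minimal sketch (with illustrative N and T of my own choosing), comparing the PV term NkT from 2) against the kinetic-theory energy 3NkT/2 from 3):

```python
# Sketch of the mismatch discussed above: for an ideal gas the PV term (NkT,
# equation 2) and the kinetic-theory energy (3NkT/2, equation 3) differ by a
# constant factor of 3/2, so TS cannot equal both. Illustrative values only.
k = 1.380649e-23   # Boltzmann constant, J/K
N = 1.0e22         # number of molecules (assumed)
T = 400.0          # temperature, K (assumed)

pv_term = N * k * T        # PV for an ideal gas, equation 2)
energy = 1.5 * N * k * T   # Et = 3NkT/2, equation 3)

print(energy / pv_term)    # 1.5, independent of the chosen N and T
```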
Okay, we can prevent the use of the above constructive logic by limiting entropy to being something that, when multiplied by a temperature change, gives a system’s energy change. But we are just fooling ourselves, are we not? If entropy is something that when multiplied by a temperature change gives a system’s energy change, then entropy must also be something that when multiplied by temperature gives a system’s energy. Or at least that is the case if one employs constructive logic.
Certainly we can bury entropy further by saying that entropy is defined in terms of the number of microstates (N). I.e., we now have a third (or is it a fourth?) mathematical definition for entropy, that being:
S = k log N   10)
I am sorry for using N to signify the number of molecules earlier and now using N to define the number of microstates, but that is the way it is traditionally done. Still, it is less confusing than trying to ascertain entropy’s true guise, mathematically speaking.
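For completeness, 10) is easy to evaluate numerically. A minimal sketch, writing W for the number of microstates to sidestep the N clash, and using the natural logarithm as is conventional in statistical mechanics; the example system is an illustrative assumption of mine:

```python
import math

# Sketch of Boltzmann's relation S = k log W, i.e. equation 10), with W in
# place of N for the number of microstates. Illustrative example system:
# 100 two-state particles, every configuration equally likely.
k = 1.380649e-23      # Boltzmann constant, J/K
W = 2 ** 100          # number of microstates (assumed example)

S = k * math.log(W)   # natural logarithm, per the usual convention
print(S)              # ~9.6e-22 J/K
```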
No wonder entropy lacks clarity when we try to express it in a language of mere words like English. Our reality is that until we decide upon what entropy is mathematically, there can be no hope. Entropy can no longer be permitted an array of interpretations, because in possessing them it becomes nothing but a mathematical contrivance, and a very poorly used one at that.
In my book I go into the above in more detail. Even so, choices must be made by the powers of the science, all in the name of clarity.
References:
1) Donald E. Tilley, “Contemporary College Physics”, Benjamin Cummings, Ontario, 1979
2) Arieh Ben-Naim, “A Farewell to Entropy: Statistical Thermodynamics Based on Information”, World Scientific Publishing Co., London/New York, 2011
3) F.L. Lambert, J. Chem. Educ. 79, 187–192 (2002)