The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:
The randomness or disorder of the components of a chemical system is expressed as entropy,
Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.
Gag!
And from the textbook written by our very own Larry Moran:
enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.
Principles of Biochemistry, 5th Edition, Glossary
Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto
Choke!
Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:
On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.
…
“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.
Is Entropy Disorder?
Dan Styer criticizing Granville Sewell
Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder, just as shoe size is correlated with reading ability (a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader) without being the same thing as reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.
Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:
Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.
Mike Elzinga
2lot trouble
Hear, hear.
But to single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?
“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”
Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.
Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:
That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?
Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”
There is no basis in physical science for interpreting entropy change as involving order and disorder.
So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!
Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu
April 2014
The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..
Lehninger and Moran’s books aren’t on that list. Their books, however, did make the list of biochemistry books that some Australians judged to be decreasing in fitness:
The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham
…
Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002
…
A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate
Here is chemistry professor Frank Lambert’s informal definition of entropy:
Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.
This is the correct Ludwig Boltzmann entropy as written by Planck:
S = k log W
where
S = entropy
k = Boltzmann’s constant
W = number of microstates
There is also the Clausius definition:
delta-S = Integral (dq/T)
where
delta-S = change in entropy
dq = inexact differential of q (heat)
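To make the point concrete, here is a minimal Python sketch (my own illustration, not from any of the authors quoted above) that evaluates both expressions for the textbook case of one mole of ideal gas doubling its volume isothermally. There is no order/disorder term anywhere in it:

```python
import math

kB = 1.380649e-23      # Boltzmann's constant, J/K
R  = 8.314462618       # gas constant, J/(K*mol)
N  = 6.02214076e23     # Avogadro's number (one mole of molecules)
T  = 300.0             # K
n  = 1.0               # moles

# Boltzmann/Planck route: when the volume doubles, each molecule has twice as
# many places to be, so W grows by a factor of 2^N and
# delta-S = k ln(2^N) = k N ln 2.  (2^N itself is far too large to evaluate.)
delta_S_boltzmann = kB * N * math.log(2)

# Clausius route: for a reversible isothermal expansion, q_rev = nRT ln(V2/V1),
# so delta-S = q_rev / T = nR ln 2.
q_rev = n * R * T * math.log(2)
delta_S_clausius = q_rev / T

print(delta_S_boltzmann, delta_S_clausius)   # both ~5.76 J/K
```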
As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”
My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.
Hahaha. I’m not disputing observer-dependence given YOUR definition! I dispute the utility of switching to your apparently useless definition.
walto,
Entropy is already observer-dependent. I’m not changing anything.
No it isn’t. Only MEASUREMENTS of entropy are observer-dependent given accepted definitions of the term.
Just like weight or wave-length.
This is comical.
Yes indeed.
You have no reason for suggesting a change in a widely used scientific definition, but you’re sure everybody should like it anyhow. That is funny.
Let me ask you this, keiths. What is your opposition to calling your ignorance measurement “k-entropy”? It would then be around for anybody who wanted to use it. What, exactly, is your problem with allowing Lambert and all the chemists to use whatever definition for “entropy” that they like and just having your additional concept handy?
ETA: You’ve got to admit that it’s quite weird (or, if you insist, funny) to insist that all these scientists mean something different by the term than they say they mean by it, no?
That’s hand waving; show the formal mathematical derivation that W changes by a factor of 2^N when the volume increases from V to 2V.
Not so easy, eh? Show the formal derivation; don’t offer hand waves.
Go ahead and try to do the same, instead of hand waving, to show W increases by a factor of 2^N; let’s see if you’ll be as confident in saying “easy” after you show the derivation. It can be done, but don’t pretend it’s easy.
In contrast, my derivation using energy dispersal (above) can be compactly described as follows (just substitute nR with NkB).
Heavens, Sal, I fail to see how you interpreted my comment as implying that you don’t know what kB is. I left kB in my final equation in order to demonstrate that the “ignorance” calculation yields precisely the same result as your calculation, which I quoted:
Your challenge, which I also quoted, was:
Now I only took undergraduate statistical thermodynamics, but I still learnt how statistical thermodynamics underpins and explains classical thermodynamics, which is otherwise mere phenomenology (think Carnot), so your rambling about dq/T is off-point.
Finally, when you quoted me as writing “Remember Sal” you omitted the rest of my sentence:
The word “given” totally pre-empts your strange opening gambit:
That’s naughty, Sal.
walto,
You’ve said that Xavier and Yolanda calculate the wrong entropy values because their knowledge of the system is incomplete.
Damon has complete knowledge of the system — he knows the exact microstate. He calculates an entropy of zero.
Yet you say that he is wrong, too.
If he has complete information but gets the wrong answer, he must be doing something wrong. What is his mistake?
Be specific.
walto:
It already has a name, walto. It’s called “entropy”.
Sal and walto,
Don’t forget to address the units problem.
Good luck.
His mistake is your mistake, that of thinking that by “entropy” we were looking for a measurement of ignorance rather than of dispersal.
Jock:
Sal replies:
No, it isn’t hand-waving, Sal. I will grant you that calculating W is not easy, but given your isothermal expansion example, W has increased by a factor of precisely 2^N, whatever the starting W was. This result is completely independent of the amount of energy that is being “dispersed”, which should give you pause for thought.
I can’t believe I just suggested that Sal pause for thought.
Heh.
That’s not the definition of entropy I find on the net, other than in your posts. See Lambert’s paper linked above by Joe F. for a nice discussion of your equivocation.
walto:
He used Boltzmann’s equation, walto.
DNA_Jock,
Let me ask you, Jock. Do you have some problem with having two different terms, say “entropy” and “k-entropy,” with one being absolute and the other observer-dependent? Why would you insist that the observer-dependent term is what chemists mean if they deny that?
That’s hand waving; show the formal mathematical derivation that W changes by a factor of 2^N when the volume increases from V to 2V.
C’mon, DNA_jock or Keiths, or (gasp) Mung. Show the derivation of this claim mathematically. I’ll go easy on you: try it for monatomic ideal gases in reasonable domains of temperature and conditions. I started trying to do the formal derivation myself; it’s not so tidy.
That’s the price of arguing from ignorance rather than from energy.
walto,
What is the “absolute” entropy of the system Xavier, Yolanda, and Damon are studying? Show your work.
Hahaha! Told you so. That falsifies your claim:
You have to justify your claim mathematically, not hand wave, if you want to say it’s easy, since we are talking about specific numbers now. Hahaha.
I’ll go easy on you: use a monatomic ideal gas and incorporate quantum considerations. This means your starting W will include Planck’s constant somewhere. Ouch!
This will give you an approximation of absolute W. Now that you have absolute W for initial conditions stated, you can then compute the factor of change in W when V increases.
Kind of hard to state that W changes by a factor of 2^N when volume changes by a factor of 2 if you don’t know what W is to begin with. I suppose you might find a way to do it. Personally, I wouldn’t know how to do the derivation if I didn’t have an approximation of W_initial to begin with (W = the absolute number of microstates to compute some sort of absolute entropy).
I think I might (with some work) be able to do the derivation, but since you boast that it’s easy, why don’t you do it for the sake of our readers instead of hand waving?
Go ahead. Here’s your chance to show off how easy it is. Your method of ignorance over energy isn’t looking so attractive now to students of thermodynamics, is it?
For the reader’s benefit: for monatomic gases, to do the derivation I suggest starting with the count of microstates.
To count the microstates DNA_jock might start by recasting the equation below.
Big ouch: there is a term U in that equation, which is internal ENERGY. But you can’t even use that equation, I guess, because Keiths wants to do all this based on ignorance, not energy. Talk about taking a rock and hitting yourself on the head.
But the equation below is only for the monatomic case; try generalizing your results to the N-atomic case. Maybe I won’t go easy on you, because your claim was that this was easy.
This equation is just the start of the derivation. If DNA_jock wants to use another, he can; I’m just trying to help the poor guy along. Have at it, DNA_jock: show the students how easy it is to teach thermodynamics with Keiths’ method of ignorance.
Remind me of the system. In the meantime, let me ask you this. Do you think the probability of a fair coin coming up 50% heads over the long haul is a function of what we know, or, rather, whether our calculation of the probability at 50% is a function of the way the world is? Which is the dependent variable: our estimate or the probability?
Keiths,
Show the change of entropy when 500 copper pennies go from 298K to 373K using your methods of ignorance, without reference to the energy dispersed from the external world into the copper pennies.
Not so easy, eh?
Sal, my sense is that this dispute is largely a function of the equivocal meaning of a bunch of terms. In particular, “uncertainty”, “probability”, “information” can be taken as observer-relative or not.
The solution is to disambiguate, not to fight.
ETA: I also believe that, once disambiguated, the non-relative meanings will generally be more useful, just as you have urged throughout this thread. But if some people like the observer-relative version…..may as well let ’em have it around.
Sal,
It’s amusing to see you assigning homework when you can’t even answer a simple conceptual question about entropy:
If entropy is a measure of energy dispersal, and not of missing information, then why can it be expressed in bits, which are the units of information, but not in joules per cubic meter, which are the units of energy dispersal?
Why is that hopeless? Surely mathematics (and statistical mechanics) has ways of dealing with such numbers.
walto:
keiths:
walto:
Here’s the original comment:
What sort of thermodynamics system is Salvador talking about here?
keiths,
Ah. You’ve asked me that before. I likened it to a request for me to calculate the cube root of a large number. I wondered what was supposed to hinge on whether I know the answer to this. And I still wonder that.
I haven’t heard Yolanda’s estimate. All you’ve said is that it’s much higher than Xavier’s. And I haven’t heard yours either, come to think of it. But it’s irrelevant. From the absolutist point of view, what matters is that her additional information (that there are two isotopes rather than one) is relevant to the question of dispersal. It does not suggest (on that view) that complete knowledge would yield an estimate of zero. On your take, Damon’s estimate must be zero. That difference seems to me to illustrate dispositively that the two concepts are different. I honestly don’t see how you can deny that….or why you should want to.
walto,
Damon uses the Boltzmann equation. He plugs in the number of possible microstates, which is one. (Remember, he knows the microstate exactly.) He gets an entropy of zero.
You claim he’s wrong. Why?
LoL. No, because the system is transitioning to the most probable state. “Dispersed” seems to lack any definition in thermodynamics. It’s a nonsense term.
ETA: Show the formal derivation of “DISPERSED”; don’t offer hand waves.
Actually, I just caught an error in my post from a year ago.
2lot calculations, editorial corrections welcome
Why didn’t anyone else catch it for a year?
I stated the molecular weight of copper as 65.546. Nope, it’s 63.546 grams/mole.
http://www.convertunits.com/molarmass/Copper
Also a typo: I was missing a “/” sign in the equation below:
With the corrections.
I was off by 1.7%.
Any complaints? See, I can admit a mistake. In fact it will bother my conscience if I don’t. Keiths on the other hand…..
To simplify the question posed to Keiths, instead of 500 copper pennies, I’ll use a block of copper of the same mass.
500 pure copper pennies have a mass of
3.11 grams/penny x 500 pennies = 1555 grams
So I’ll request Keiths do his calculations for a block of pure copper of 1555 grams.
The heat capacity calculation still looks correct.
The specific heat of copper is 0.39 J/gram/K; thus the heat capacity of 1555 grams of copper is
C = 0.39 J/gram/K * 1555 grams = 606 J/K
If I’m doing this right, the amount of heat dispersed from the surroundings into the copper pennies is
0.39 J/gram/K * (373K - 298K) * 1555 grams = 45,483.75 Joules (or Watt-seconds)
So if 45,484 (rounded) Joules of energy are dispersed from the environment into a 1555-gram block of copper, consequently raising its temperature from 298K to 373K, the change in entropy of the copper is:
1555 grams x (0.39 J/gram/K) * ln (373/298) = 136.1 J/K
which is materially the same answer as last year.
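For anyone who wants to check the arithmetic, here is a short Python sketch (my own, using the figures above) of the heat dispersed and the entropy change for the copper block:

```python
import math

c_p  = 0.39            # specific heat of copper, J/(gram*K)
mass = 1555.0          # grams (500 pennies x 3.11 g/penny)
T1, T2 = 298.0, 373.0  # K

q = c_p * mass * (T2 - T1)                 # heat dispersed into the copper block
delta_S = c_p * mass * math.log(T2 / T1)   # C ln(T2/T1), from the Clausius integral

print(q)        # 45483.75 J
print(delta_S)  # ~136.1 J/K
```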
Now Keiths should try to calculate the change of entropy using his ignorance methods, but he’ll have to bypass using the energy dispersed from the environment into the 1555 gram copper block, since he says the dispersal of such energy from the environment into the copper block is a misconception.
So he dug his own hole, and he’ll have to dig out now.
Well Sal, if you take your equation and use it to calculate the change in entropy, you might notice that the terms in U (and h and m) all cancel, and the delta-S is reduced to
delta-S = kB N ln(2)
And the reason that those other terms all cancel is because they are entirely irrelevant to the change in entropy.
So, while calculating the absolute entropy might be a pain in the neck, calculating the difference is simple. W has increased by a factor of 2^N.
So it’s not “Kind of hard to state that W changes by a factor of 2^N when volume changes by a factor of 2 if you don’t know what W is to begin with,” because it is always true, irrespective of W. It’s not hand-waving; it’s fundamental.
Perhaps this point is relevant to walto’s questions to me:
Yes.
What the absolute entropy is may be a rather problematic value to calculate – it may depend critically on exactly how the macrostate has been defined by any particular observer. But practically speaking, chemists don’t care about the absolute entropy, they only care about the delta-S. In a similar way absolute enthalpies are completely made up, based on a goofy convention for enthalpies of formation. You could make up your own convention if you wanted to: it won’t matter, because it won’t change any delta-H calculations. So, like enthalpies, maybe the intuitively appealing idea that there is a “real” “correct” “observer-independent” absolute entropy is not helpful. So long as we can agree on delta-S. Even Damon would agree that the gas expansion increases the entropy by N kB ln(2), until he acquires that extra N bits of information about which half everyone is in….then it’s back to zero.
But I could be wrong (re Walto’s question).
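As a numerical check of Jock’s point (my own sketch, assuming the standard Sackur-Tetrode expression for a monatomic ideal gas, which appears later in this thread): the entropy change on doubling the volume comes out to N·kB·ln(2) no matter what internal energy U is plugged in, because the U, m, and h terms cancel in the difference.

```python
import math

kB = 1.380649e-23   # J/K
h  = 6.626e-34      # J*s
N  = 6.022e23       # number of particles (about a mole)
m  = 6.64e-27       # kg, roughly one helium atom

def sackur_tetrode(U, V):
    """Absolute entropy (J/K) of a monatomic ideal gas with energy U (J) in volume V (m^3)."""
    return kB * N * (math.log((V / N) * (4 * math.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

for U in (1000.0, 3741.5, 50000.0):   # three arbitrary internal energies
    dS = sackur_tetrode(U, 2.0) - sackur_tetrode(U, 1.0)
    print(U, dS)   # dS is ~5.763 J/K = N*kB*ln(2) every time
```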
Stop dodging, Sal.
Calculations are irrelevant when you can’t even explain why entropy has the wrong units for your definition.
Bits are not the units of energy dispersal, nor are joules per kelvin.
Your definition cannot be correct. The units are wrong.
It’s over, Sal.
What’s a Jolue/Kelvin?
Safe to say that no-one actually reads your high school chemistry homework.
But that wasn’t the point; the formula had U in it. You’ll have to find a formula without U in it, since you are arguing from first principles of ignorance rather than energy.
You have to justify mathematically why you can compute absolute entropy without direct reference to internal energy U, using instead your ignorance of the system. I gave you the formula to give you a start, but now you’re on your own to find an expression of entropy that doesn’t include U.
You are cheating because you are starting with an equation of entropy that is stated in terms of ENERGY and Volume (uh, ENERGY and volume, as in dispersal of energy, get it?).
Then show the algebra for the students for the N-atomic case. Not so tidy, eh? More DNA_jock hand waving rather than actual demonstrations.
contrast with
Oh Sal, have you already forgotten that W is the number of possible microstates that correspond to the given macrostate?
W may be hard to calculate, but in many cases the ratio of W1 : W2 is quite trivial to calculate. As I showed.
As for the rest, go whine to Boltzmann.
By definition, suggested by the nature of macroscopic observations, thermodynamics describes only static states of macroscopic systems.
– Herbert B. Callen, Thermodynamics
https://en.wikipedia.org/wiki/Herbert_Callen
ETA: “Spreading” and “dispersing” don’t sound like static states to me.
You suggest units of Keiths’ ignorance then?
Gee Keiths, I got my units right in the case of 45,484 Joules of energy dispersing from the environment into a 1555 gram block of copper.
One could imagine an ideal electric heater operating at 1000 watts running for 45.484 seconds to disperse 45,484 Joules of energy from that distant electric power plant somewhere in West Virginia to that 1555 gram block of copper starting at 298K.
We probably will have to waste some electricity in the real world, but we know from the specific heat of copper that when the temperature is raised to 373K, 45,484 Joules of energy have been dispersed from the environment into the copper block.
When a small amount of heat is put into a system like the copper block, the temperature rises so little that the change in entropy can be approximated as
delta-S ~= delta-q / T
Say we disperse 1 joule of energy at 298K
delta-S ~= 1 Joule / 298K ~= 0.00336 J/K
There Keiths, I got an answer in the right units with energy dispersal.
In the limiting case of delta-q approaching zero, the exact expression of the differential is
dS = dq/T where T is actually T(q) since it depends on the amount of heat input.
But, this shows how the dispersal of energy at a temperature gives a figure for entropy in the right units. If we continuously integrate this over all temperatures in the process, we get the Clausius Integral
delta-S = Integral (dq/T)
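A small numerical sketch (mine, not part of the original exchange) of what that integral means for the copper block: add the heat in many tiny steps dq = C dT, accumulate dq/T, and the sum converges to C ln(T2/T1):

```python
import math

C  = 606.45            # heat capacity of the 1555 g copper block, J/K (0.39 x 1555)
T1, T2 = 298.0, 373.0  # K

steps = 100_000
dT = (T2 - T1) / steps
delta_S = 0.0
T = T1
for _ in range(steps):
    dq = C * dT                    # small parcel of heat dispersed into the block
    delta_S += dq / (T + dT / 2)   # dq/T, evaluated at the midpoint of the step
    T += dT

print(delta_S)                 # ~136.1 J/K
print(C * math.log(T2 / T1))   # closed form of the Clausius integral; same value
```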
And you end up with the right units. Dispersal of energy at various temperatures, just as Lambert said:
But in any case, you can vindicate yourself if you use your ignorance methods to calculate the change in entropy of the 1555 gram copper block from 298K to 373K without reference to the dispersal of energy from the environment into the copper block.
Calculate the entropy for our readers, Keiths, but do so without reference to the 45,484 Joules of energy dispersed into that copper block. I look forward to you showcasing your methods based on ignorance.
Exactly, but that is only part of the dispute. Keiths hates the energy dispersal metaphor for entropy which is adopted by Dr. Lambert, Dr. Mike, and probably even Lord Kelvin himself.
But I’m enjoying fighting with Keiths. Man, I guess the fun has to stop eventually.
I had a good time. I hope you did too.
I spar with Keiths partly as an extension of my own thought process, to sharpen my understanding and get some corrective feedback. I quoted a few numbers incorrectly and had a typo in a formula.
I plan to use the comments in these exchanges when I pass on reports to some in the ID community. The ID community has some bad misconceptions about entropy, and they should stop making arguments for ID that are easily shot down. Maybe some of them can learn what entropy really is. Ironically, this discussion is indirectly a sharp criticism of the work of many of my colleagues in ID.
I hope I’ve defended Dr. Lambert and Dr. Mike’s qualitative description of entropy adequately in contrast to Keiths’ observer ignorance approach.
You did not show; you hand waved, and you argued your case from a formula which I provided that implicitly contains the energy dispersal concept, since Volume and ENERGY are in that formula. You can’t use the formula I provided as a hint. You have to come up with your own formula that erases reference to Energy and Volume (you know, energy dispersal).
You have to work starting from Keiths’ ignorance and avoid equations that express entropy by describing the energy dispersal as this equation does:
Because the number of “possible microstates” is not a function of what Damon knows. As Lambert writes in the above-cited paper,
He also writes:
That is not a function of what anybody, even a Laplacian genius, knows. Thus the claim that the number I should plug in for Boltzmann’s W is 1 is nothing but a restatement of your ignorance-theory.
I’ll close with these two quotes from Lambert:
Bits. B I T S. Binary digits. As in how many yes/no questions. As in the game of 20 questions.
LoL. No, it’s VOLUME DISPERSAL that is being measured by entropy. You have to get rid of the V before you can use your U.
Freaking hell. Lambert admits we’re right. The key term is “probability distribution.”
They don’t shy away from it in Physical Biology of the Cell. They even have an entire chapter with the title Entropy Rules!
Here’s a quote from p. 271:
Well hell, in a modern textbook at that. What is the (physical) world coming to.
Molecular Driving Forces: Statistical Thermodynamics in Biology, Chemistry, Physics and Nanoscience
I guess it takes an entire textbook.
ETA: …we follow the axiomatic approach to thermodynamics developed by HB Callen, rather than the more traditional inductive approach; and the Maximum Entropy approach of Jaynes, Skilling, and Livesay, in preference to the Gibbs ensemble method.
Anyone know what they are referring to?
Keiths must now suffer some final defeats in this discussion. Bwahaha.
So how do we define “energy dispersal” quantitatively?
Well, we need the amount of energy E, the volume V in which it is dispersed, and the number of particles N.
The specific dispersal then would be Joules per cubic meter per particle, or
dispersal = E/V/N
But it’s nicer to work with the actual system, so let’s scale up the dispersal to the entire system, and we get the macrostate variables of the system that define the exact and total dispersal:
Energy E(or U)
Volume V
Number of particles N
Unfortunately for Keiths, he can’t calculate his ignorance of the system without these 3 macrostate variables. Hahaha! And these variables are the exact description of energy dispersal.
So unfortunately for Keiths, he needs the energy dispersal variables to figure out his amount of ignorance or Mung’s ignorance (some astronomical number).
Now I cite the infamous Sackur-Tetrode equation, which approximates the absolute entropy of an ideal monatomic gas (below, in the next comment).
https://en.wikipedia.org/wiki/Sackur%E2%80%93Tetrode_equation
The units of E (or U) are Joules, i.e., kg m^2/s^2.
The units of V are in cubic meters.
The units of N are dimensionless (just a count of particles)
Let’s plug the energy dispersal variables into the Sackur-Tetrode equation.
I will use the essential example I’ve been harping on several times, but with helium as the approximation of the ideal monatomic gas.
The basic problem:
The quick answer is R ln (2) = 5.7631 J/K
But I want to drive the stake into Keiths’ arguments, so I will be tedious and prolong the agony. As a bonus, I’ll even calculate absolute entropy!
R = gas constant = 8.3144598 J/K/mol
h = Planck’s constant = 6.62607004 x 10^-34 kg m^2/s
kB = Boltzmann’s constant = 1.38064825 x 10^-23 J/K
Avogadro’s number = 6.022140857 x 10^23
molecular weight of helium = 4.00 x 10^-3 kg / mol
m = mass of 1 helium atom = 6.6422 x 10^-27 kg
T = 300 K
V = 1 m^3
n = moles gas = 1
N = number molecules of helium = 6.022140857 x 10^23
U = E = internal energy of the helium molecules = (3/2) n R T =
(3/2) (1) (8.3144598) (300) = 3741.5069 Joules
Now let’s plug this into the Sackur-Tetrode equation below:
S_initial = (1.38064825 x 10^-23)(6.022140857 x 10^23)(ln((1/6.022140857 x 10^23)( (4 * pi * 6.6422 x 10^-27 * 3741.5069 )/(3 * (6.62607004 x 10^-34)^2 * 6.022140857 x 10^23) )^(3/2)) + 5/2) =
156.963 J/K
Tada!
Now for the final state, after the gas has expanded to V = 2 m^3:
S_final = (1.38064825 x 10^-23)(6.022140857 x 10^23)(ln((2/6.022140857 x 10^23)( (4 * pi * 6.6422 x 10^-27 * 3741.5069 )/(3 * (6.62607004 x 10^-34)^2 * 6.022140857 x 10^23) )^(3/2)) + 5/2) =
162.726 J/K
Tada!
Thus, under the Boltzmann definition:
delta-S_expansion_Boltzmann_method = S_final – S_initial = 162.726 – 156.963 = 5.763 J/K
This is in good agreement with entropy calculated via Clausius:
delta-S_expansion_Clausius_method = n R ln( V_final / V_initial ) = 5.763 J/K
Hence I showed that delta-S is the same whether one uses the agonizing Boltzmann/Sackur-Tetrode approach or the Clausius approach.
delta-S_expansion_Boltzmann_method = delta-S_expansion_Clausius_method =
5.763 J/K
But unfortunately
delta-S_expansion_Keiths_method = ignorance
It should be noted that we can state the number of microstates in the initial and final situations. We know this from
S = kB ln W, thus
W = exp (S/kB)
W_initial = number of initial microstates
= exp (156.963 / (1.38064825 x 10^-23)) = exp(1.1369 x 10^25), which is an astronomically large number
W_final = number of final microstates
= exp (162.726 / (1.38064825 x 10^-23)) = exp(1.1786 x 10^25), which is an astronomically large number
Thus I can describe the energy dispersal by the thermodynamic coordinates of energy E (or U), volume V, and number of particles N.
When I use the appropriate equations, these coordinates of energy dispersal will yield the change in the number of microstates and the change in entropy expressed in J/K (or nats or bits if preferred).
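Anyone who wants to verify these figures can run the short Python sketch below (my own check, using the same constants quoted above); it reproduces S_initial, S_final, the delta-S of 5.763 J/K, and the exponents of the astronomically large W values:

```python
import math

kB = 1.38064825e-23    # J/K (value used above)
N  = 6.022140857e23    # helium atoms (1 mole)
h  = 6.62607004e-34    # J*s
m  = 6.6422e-27        # kg per helium atom
U  = 3741.5069         # J, internal energy (3/2) n R T at 300 K

def S(V):
    """Sackur-Tetrode absolute entropy (J/K) for volume V in m^3."""
    return kB * N * (math.log((V / N) * (4 * math.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

S1, S2 = S(1.0), S(2.0)
print(S1, S2, S2 - S1)   # ~156.96 J/K, ~162.73 J/K, ~5.763 J/K
print(S1 / kB, S2 / kB)  # ln(W_initial) ~1.137e25, ln(W_final) ~1.179e25
```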
Ergo, Keiths, Mung, and DNA_jock have been pummeled in this exchange, but they still keep showing up to get more punishment.
I cite Keiths comments about 331 comments ago:
That claim, I feel, has been brutally refuted.
And so is this claim by DNA_jock:
You want some more punishment, Keiths? Show up here again, because I’m enjoying dishing it out.
[EDIT: the equation immediately following is the one using the Clausius method for the change in entropy of an ideal gas isothermally expanding.
I showed how the figure for the change in entropy from the Clausius method agrees with the change in entropy using the Boltzmann/Sackur-Tetrode method, whose formula is in the next comment.]
The Sackur-Tetrode equation for absolute entropy of an ideal monatomic gas:
You see keiths, if you even use the word equilibrium you lose! Because that just is energy dispersion! Therefore entropy is a measure of energy dispersion.
In this section, we make a huge conceptual leap from SMI of games to a fundamental concept of thermodynamics. As we shall soon see, this leap is rendered possible by recognizing that the SMI associated with the equilibrium distribution of locations and momenta of a large number of indistinguishable particles is identical (up to a multiplicative constant) to the statistical mechanical entropy of an ideal gas. Since the statistical mechanical entropy of an ideal gas has the same properties as the thermodynamic entropy as defined by Clausius, we can declare that this special SMI is identical to the entropy of an ideal gas. This is a very remarkable achievement. Recall that von Neumann suggested calling the SMI, entropy. This was a mistake. In general, the SMI has nothing to do with entropy. Only when you apply the SMI to a special distribution does it become identical to entropy.
– Ben-Naim (2012)
Chapter 3 is titled, The Astounding Emergence of the Entropy of a Classical Ideal Gas out of Shannon’s Measure of Information.