Talk:Entropy (order and disorder)
This level-5 vital article is rated C-class on Wikipedia's content assessment scale.
Archives
- Archive: Talk:Entropy/Disorder ['04-(Nov)'05]
- Archive: Talk:Entropy/Archive1#disorder
- Archive: Talk:Entropy/Archive2#Entropy, order, and disorder
Comments from Talk:Entropy page
Hi, in continuation from kenosis' comments (above), first I'm going to add a short referenced sentence, as per kenosis' suggestion, about the recent incorporation of the "energy dispersal" perspective. Second, to help defuse this issue – noting, for example, that "order and disorder" are hot topics in the talk page archives (e.g. Talk:Entropy/Archive3#Disorder) and in current discussions – I'm going to start an "order and disorder" header section (with a link to its own article: entropy (order and disorder)). Third, I collected the following stats to help clarify the prevalence of terms via Google search results:
- entropy energy time – 6,490,000 results
- entropy energy order – 5,310,000 results
- entropy energy information – 5,070,000 results
- entropy energy life – 1,670,000 results
- entropy energy chaos – 1,050,000 results
- entropy energy disorder – 694,000 results
- entropy energy dispersion – 639,000 results
- entropy energy dissipation – 503,000 results
- entropy energy irreversibility – 164,000 results
- entropy energy dispersal – 63,000 results
- entropy energy disgregation – 88 results
I hope this clarifies things. I suggest that we work together to build the entropy (order and disorder) article (where we put most of the chaos discussion) over the next few weeks and put a short intro paragraph with a "see main" link on the entropy page. I'll take a chunk out of the statistical perspective section to give it a quick start, and we can build on this. Is this fine with all? --Sadi Carnot 10:32, 31 October 2006 (UTC)
- Hi Sadi, all!
- How's your German? (I'll translate later.) I've found this in a scriptum (set of lecture notes) by Norbert Straumann [1]:
In thermodynamics it is important to distinguish strictly between heat and work, even though the first law establishes their equivalence. This distinction is possible only if one appeals to the existence of adiabatic walls. These are supposed to have no thermodynamic properties of their own, yet are nevertheless to prevent the equalization of temperature differences. We will need similar fictions later (e.g. ideal anti-catalysts). Their existence cannot really be justified by appeal to experience, which is why Pauli rightly called them "magic devices" ("Zaubermittel"). Only in statistical mechanics are these magic devices no longer needed. There, heat is that part of the energy which must be attributed to the macroscopically unobserved degrees of freedom. But what is observed macroscopically depends essentially on the available means of observation. For this reason heat, and with it entropy (see below), is in statistical mechanics, strictly speaking, always defined only relative to a macroscopic observer.
- Short essence: Whereas in statistical mechanics heat, and therefore entropy, are only defined relative to a choice of macro variables, they are unique in classical thermodynamics, but only because of the use of "adiabatic walls", which are themselves a fiction.
- And from the preface:
Die phänomenologische Thermodynamik wird von Dozenten und Studenten mit Recht immer wieder als eine Disziplin empfunden, die zwar mathematisch anspruchslos, aber begrifflich doch recht subtil ist. Hinzu kommt, dass diese Theorie anderen Gebieten der theoretischen Physik in ihrem ganzen Aufbau fremdartig gegenüber steht. Für die meisten Studierenden ist es vernünftig, nicht zuviel Zeit in das Studium der Thermodynamik zu investieren und sich möglichst rasch der Statistischen Mechanik zuzuwenden.
- My translation: Teachers and students alike see phenomenological thermodynamics as a discipline that is mathematically undemanding but conceptually quite subtle. In addition, the theory in its entire structure is rather alien to the other areas of theoretical physics. For most students it is reasonable not to invest too much time in studying thermodynamics and to proceed swiftly to statistical mechanics.
- Of course, after having stated that thermodynamics is mathematically rather trivial, he proceeds to the modern treatment via differential forms and manifolds in all its glory ;-)
- Pjacobi 11:00, 31 October 2006 (UTC)
Hi Pjacobi, my German is in need of improvement (I hope to be semi-fluent before I go to Oktoberfest some day). Here's one of my favorite quotes, from the chemical thermodynamics page, which exemplifies my perspective. In the preface to the popular book Basic Chemical Thermodynamics by physical chemist Brian Smith, originally published in 1973 and now in its 5th edition, we find the following overview of the subject as it is perceived in college:[1]
“The first time I heard about chemical thermodynamics was when a second-year undergraduate brought me the news early in my freshman year. He told me a spine-chilling story of endless lectures with almost three-hundred numbered equations, all of which, it appeared, had to be committed to memory and reproduced in exactly the same form in subsequent examinations. Not only did these equations contain all the normal algebraic symbols but in addition they were liberally sprinkled with stars, daggers, and circles so as to stretch even the most powerful of minds.”
Also, I gave the new "order and disorder" header section a good start (and added seven new sources and uploaded two images); we can all now build on it – e.g. I'm pretty sure that Hermann von Helmholtz and possibly others had some ideas on order and disorder – and then later, when it gets too big, paste most of it to its own page. Out of time for today. Happy Halloween! --Sadi Carnot 13:57, 31 October 2006 (UTC)
- A good start, but you still don't seem to have provided a clear explanation of what "disorder" means in relation to modern science as an introduction for beginners, and indeed you seem to be using the Encarta reference, which you consider to be not exactly correct. No doubt these points can be improved. The section leads naturally on to the "dispersal" perspective, so I've revised that section to bring it closer to the modern approach, and have incorporated the Atkins reference you provided in its context. The Scott reference appears to have no obvious relevance and is clearly not the first to use the concept, so I've moved that to the main dispersal article. Let's hope you can clarify the disorder approach in a way that no longer stretches the most powerful of minds! Please remember that while you're very familiar with the jargon, this page should be aimed at helping newcomers to the subject. ... dave souza, talk 15:58, 1 November 2006 (UTC)
- I just added the adiabatic demagnetization example (with a new image); I hope this helps. Also, I moved most of the "order/disorder" stuff to its own page: Entropy (order and disorder). I likewise trimmed down the dispersal section a bit, so that it is in proportion to the rest of the sub-articles (with a new source), and moved some of it to its own page. --Sadi Carnot 17:13, 1 November 2006 (UTC)
References
- ^ Smith, Brian E. (2004). Basic Chemical Thermodynamics. Oxford University Press. ISBN 1-86094-446-9.
Adiabatic demagnetization
I've revised this section a bit to clarify the process. However, the bit about the second law seems a bit dodgy, as magnetic energy clearly passes through the insulated chamber – perhaps you could reconsider that part? .. dave souza, talk 11:16, 17 November 2006 (UTC)
Although it wouldn't have the practical application of ultra-cooling, the relevance to entropy might be clearer if magneto-caloric effects were discussed in an isothermal context (temperature held constant) rather than an adiabatic context (insulated – no heat flow). In an isothermal context, heat will flow into or out of the sample to maintain the temperature as the magnetic field is changed. This heat flow can be related directly to the definition of entropy (heat flow divided by temperature). Compbiowes 22:35, 10 January 2007 (UTC)
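As a rough sketch of that last relation (assuming, purely for illustration, a Curie-law paramagnet with magnetization M = CH/T per unit volume in SI units; the comment above does not specify a material), the Maxwell relation \((\partial S/\partial H)_T = \mu_0 (\partial M/\partial T)_H\) gives the isothermal entropy change when the field is ramped from 0 to H:
\[
\Delta S_T \;=\; \mu_0 \int_0^{H} \left(\frac{\partial M}{\partial T}\right)_{H'} \mathrm{d}H'
\;=\; -\,\mu_0 \frac{C H^{2}}{2 T^{2}} \;<\; 0,
\]
so heat \(Q = T\,\Delta S_T\) flows out of the sample into the bath during isothermal magnetization, tying the measurement directly to \(\mathrm{d}S = \delta Q_{\mathrm{rev}}/T\) as described above.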
- I think the discussion is incorrect here. If the magnetic field is reduced adiabatically, the amount of order in the spins has to remain the same (ΔS = 0). The temperature does indeed decrease because the entropy is roughly a function of μB/T, and in order to keep the entropy fixed as B decreases, T must also drop. —Preceding unsigned comment added by 135.245.8.36 (talk) 19:57, 27 April 2008 (UTC)
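A minimal sketch of that argument (assuming an ideal paramagnet of non-interacting spins, so that the entropy depends on field and temperature only through the ratio \(\mu B / k_B T\), as the comment states):
\[
S = S\!\left(\frac{\mu B}{k_B T}\right), \qquad
\Delta S = 0 \;\Rightarrow\; \frac{B_i}{T_i} = \frac{B_f}{T_f}
\;\Rightarrow\; T_f = T_i\,\frac{B_f}{B_i},
\]
so lowering B at constant entropy lowers T in proportion. In practice the final temperature is limited by residual internal fields and spin-spin interactions, which break the strict B/T scaling at very low fields.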
- Have a look at the cited NASA page on the effect – it's to do with the energy involved in aligning the magnetic field, rather than any appearance of "order" or "disorder". .. dave souza, talk 20:07, 27 April 2008 (UTC)
The "difficulties" section, or, the dispersal section.
This section is giving too much prominence to the hobbyhorse of a few chemistry teachers. This Frank L. Lambert, in particular, is just someone with a bee in his bonnet. The entire section should just be removed. 98.109.238.95 (talk) 19:32, 24 June 2013 (UTC)
- Dear anon, you appear to disagree with a number of very good textbooks. Perhaps you prefer a disordered approach, but this method is useful for many people. . dave souza, talk 20:48, 24 June 2013 (UTC)
Cf. https://en.wikipedia.org/wiki/Talk:Entropy/Archive4 98.109.238.95 (talk) 20:59, 24 June 2013 (UTC)
I would just like to know what proportion of standard textbooks published by reputable publishers (not alphascript, the only one that shows up on page one of google) use this definition. For various reasons, I was recently looking at Fermi, Pauli, Fowler, Jeans, Planck, von Plato, and Ford, and not one uses it. Neither does Thermodynamics Demystified, written by a chemist. NPOV requires that a point of view be given its due weight. If the proportion I was asking for is very small, this hobbyhorse should not be included at all. 98.109.238.95 (talk) 22:34, 24 June 2013 (UTC)
- Taking the more recent and concrete example you cite, Thermodynamics Demystified doesn't use the terms "order" or "disorder" to explain entropy, so it seems to have taken on board the difficulties introduced by these terms. When were the others published? . . dave souza, talk 05:45, 25 June 2013 (UTC)
- The "Difficulties" section is at a low level. It is not scientific argumentation or even good pedagogy; it's a series of unfocused metaphors to the point of ranting. 84.227.244.192 (talk) 01:18, 16 September 2014 (UTC)
Disorder
This entire Wikipedia page seems to be devoted to a controversy rather than scientific exposition. Perhaps it should be retitled to reflect that? Or -- since it is pretty muddy -- just thrown away?
In the introductory paragraphs:
1) ...entropy is commonly associated with the amount of order, disorder, or chaos...
Not what it is, but what people think about it.
2) This stems from Rudolf Clausius' 1862 assertion...
For the next paragraph, we fly through a historical development rather than giving a definition.
3) In the 2002 encyclopedia Encarta, for example, entropy is defined as a thermodynamic property which serves as a measure of how close a system is to equilibrium, as well as a measure of the disorder in the system.
This is not a definition either.
4) In the context of entropy, "perfect internal disorder" is synonymous with "equilibrium", but since that definition is so far different from the usual definition implied in normal speech, the use of the term in science has caused a great deal of confusion and misunderstanding.
What definition? Do you mean the "synonymy" of perfect internal disorder with equilibrium? That is not a definition in a scientific sense; I cannot even tell which term is intended here to be the definiendum.
5) Locally, the entropy can be lowered by external action. This applies to machines, such as a refrigerator, where the entropy in the cold chamber is being reduced, and to living organisms. This local decrease in entropy is, however, only possible at the expense of an entropy increase in the surroundings.
This instructive commentary seems entirely out of place in the introduction. You have not even told the reader that entropy increases, and in any case, what does this have to do with entropy as disorder? Something, I'm sure, but it hasn't been said here.
The whole article continues in this way. Spongy.
The article may have one positive function: to serve as a dump site for these kinds of controversies while other articles get freed of them. 84.227.245.245 (talk) 10:51, 15 September 2014 (UTC)
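Regarding point 5: the quoted claim can at least be made quantitative with a standard entropy balance (a sketch, assuming an idealized refrigerator that extracts heat \(Q_c\) from the cold chamber at temperature \(T_c\) and rejects \(Q_c + W\) to surroundings at \(T_h\), with work input \(W\)):
\[
\Delta S_{\mathrm{chamber}} = -\frac{Q_c}{T_c}, \qquad
\Delta S_{\mathrm{surroundings}} = \frac{Q_c + W}{T_h}, \qquad
\Delta S_{\mathrm{total}} = \Delta S_{\mathrm{chamber}} + \Delta S_{\mathrm{surroundings}} \;\ge\; 0,
\]
with equality only in the reversible (Carnot) limit, so the local decrease is always paid for by at least as large an increase in the surroundings.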
- As an example of a thermodynamics article that exhibits all the historical talk and verbal wordage that I criticized in this article, yet manages to be crisp and informative, see diathermal wall. 84.227.255.16 (talk) 08:08, 16 September 2014 (UTC)
Overview error?
The overview states: "If you have more than one particle, or define states as being further locational subdivisions of the box, the entropy is lower because the number of states is greater." Surely that "lower" should be "higher" (and maybe it should say microstates)? Terrycojones (talk) 13:16, 14 February 2016 (UTC)
- I agree with you.
- 177.138.210.163 (talk) 02:36, 25 March 2017 (UTC)
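A minimal counting sketch supports that reading (assuming the usual Boltzmann form \(S = k_B \ln W\) with N distinguishable, non-interacting particles distributed over M equally likely locational cells; the overview does not spell out these details):
\[
W = M^{N}, \qquad S = k_B \ln W = N\,k_B \ln M,
\]
so increasing either the number of particles N or the number of subdivisions M increases the number of microstates W and hence the entropy: "lower" should indeed read "higher", and W counts microstates.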
References follow
edit"Entropy depends on the total disorder and not just on structural disorder."
Part of the confusion regarding entropy and disorder is that we frequently consider only structural or spatial ordering (phase space) when analyzing the entropy of a system. A good example of where this can lead to mistakes is the entropy of helium-3 in its liquid and solid phases. If one only considers the spatial degrees of freedom of a quantity of helium-3, one would be led to believe that the liquid phase was more disordered (greater entropy) than the solid phase, since the size of the spatial phase space of the liquid phase is greater than that of the solid phase.
But that is not always the case. Solid helium-3 can have greater entropy than liquid helium-3 due to the greater spin disorder (greater size of the spin phase space) of solid helium-3 than liquid helium-3. In other words, when you consider both the degree of disorder of the spins and the degree of disorder of the positions, the solid phase has more overall disorder than the liquid phase, even though the former has less positional/spatial disorder than the latter.
For a good discussion of this phenomenon see https://physics.stackexchange.com/questions/410550/why-is-the-latent-heat-of-fusion-of-helium-3-a-negative-quantity . For the source of the quote (made in a discussion of the same phenomenon), see https://books.google.com/books/about/First_Order_Phase_Transitions_of_Magneti.html?id=rgZADwAAQBAJ . Here is the context of the quote:
Entropy has contributions from all phase space, and disorder is not solely about structural disorder. All liquids are more [structurally NLG] disordered than their corresponding solid phases in terms of [the greater volume of their positional/spatial phase space determined by NLG] their positions in real space, and this governs the common observation that solids melt on heating. Nevertheless, liquid helium-3 is a quantum Fermi liquid and is highly ordered in momentum [phase NLG] space [which is independent of positional/spatial phase space NLG]. The entropy of the liquid [due to momentum phase space NLG] varies approximately linearly with T as [(T/Tf*)ln2], where Tf* is the effective Fermi temperature. On the other hand, the spins in solid helium-3 are not aligned until it is cooled to about 1 mK, and its [momentum space NLG] entropy is approximately constant at Rln2 [where R is the gas constant == Boltzmann's constant per mole NLG] above about 10 mK.
...
The statement "increasing the temperature causes a transition to the phase with higher entropy" has no known exception.
...
Entropy depends on the total disorder and not just on structural disorder.
I think this example would help illuminate the critical point that there are different kinds of disorder/degrees of freedom/phase spaces, and all of them must be analyzed when measuring entropy. -- Nick (talk) 17:44, 4 September 2019 (UTC)
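A compact way to see the point, using only the approximate forms quoted above (the crossover temperature depends on the effective Fermi temperature \(T_f^{*}\), whose value is not given here):
\[
S_{\mathrm{solid}} \approx R\ln 2 \quad (T \gtrsim 10\ \mathrm{mK}), \qquad
S_{\mathrm{liquid}} \approx R\ln 2\,\frac{T}{T_f^{*}},
\]
so below roughly \(T_f^{*}\) the solid has the larger total entropy despite its greater positional order. Combined with the Clausius–Clapeyron relation \(\mathrm{d}P/\mathrm{d}T = (S_{\mathrm{liq}} - S_{\mathrm{sol}})/(V_{\mathrm{liq}} - V_{\mathrm{sol}})\) and the fact that \(V_{\mathrm{liq}} > V_{\mathrm{sol}}\), this is what gives the helium-3 melting curve its negative slope at low temperature, the effect discussed in the Stack Exchange thread linked above.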