Talk:Laws of thermodynamics


Entropy is not always a measure of disorder

The recently added words in the Second Law which imply entropy is (always) a measure of disorder should be removed. This is an outdated concept. Entropy increases if and only if there are unbalanced energy potentials which are dissipating, where "energy" is not restricted to just kinetic energy which relates to temperature. Thus the Clausius (hot to cold) statement is merely a corollary of the Second Law that applies in a horizontal plane because we can only claim that heat transfer is always from hot to cold if there is no other change in any other form of energy such as gravitational potential energy. The Second Law is not just about temperatures, and entropy is not just about disorder. For more on this read http://entropylaw.com and the cited papers. — Preceding unsigned comment added by 101.175.24.168 (talk) 22:09, 19 August 2018 (UTC)

I removed that addition. We have a whole entropy article, as linked there, where readers can learn what that concept is. DMacks (talk) 22:22, 19 August 2018 (UTC)
The statement "Another form of the statement is that heat does not spontaneously pass from a colder body to a warmer body." (for the Second Law) should be removed because it is merely a corollary which does not always apply in a force field like gravity or centrifugal force. See my paper "Planetary Core and Surface Temperatures" at https://ssrn.com/author=2627605 for reasons why we physicists only ever define the Second Law in terms of entropy which is, in effect, a measure of progress in the dissipation of unbalanced energy potentials, such energy potentials including any form of internal energy (including gravitational potential energy) not just kinetic energy that is associated with heat and temperatures. 2001:8003:26E2:3300:9565:F799:71E6:6727 (talk) 21:56, 22 September 2022 (UTC)
You still need to add to the words in the Second Law (after corollary) "which does not always apply in a force field like gravity or centrifugal force." For example, in an operating vortex cooling tube as the speed of the air accelerates there is radial heat transfer from the cooler (and cooling) central regions to the warmer (and warming) outer regions. This is an extremely important phenomenon to understand as it also happens due to gravity in planetary tropospheres. 101.178.34.44 (talk) 04:23, 6 October 2022 (UTC)
A good account of how to apply thermodynamics including the second law in the presence of a gravitational field strong enough to create gradients within the overall system is given in Guggenheim.[1]
  1. ^ Guggenheim, E.A. (1949/1985). Thermodynamics. An Advanced Treatment for Chemists and Physicists, fifth edition, North-Holland Publishing Company, Amsterdam, Chapter 9, pp. 327–332.
Chjoaygame (talk) 05:58, 6 October 2022 (UTC)
There was no need to alter the previous statement of the Second Law which read "In a natural thermodynamic process the sum of the entropies of the interacting thermodynamic systems never decreases." Nor should we add to this statement corollaries about heat being from hot to cold because that need not necessarily be the case in a force field. (Read my relevant explanation as to how a Vortex Cooling Tube works in the Talk Page for that article. Also read what I wrote in the Talk page for Entropy.)
When people understand that the Second Law enables us to explain (and quantify) the observed gradients of the graphs of both density and temperature v. altitude in the tropospheres of six planets, then they will understand that it is gravity which raises those graphs at the surface end and causes them to tend to repair themselves because the gradients in both density and temperature graphs (against altitude) are formed simultaneously: such gradients are the one and only state of maximum entropy, namely thermodynamic equilibrium. Both gradients then lead to the pressure gradient which is thus a result of the Second Law, not a cause of the temperature and density gradients - by the Ideal Gas Law. The gradients are the result of the Second Law process of maximum entropy production: they enable us to understand the necessary, non-radiative input of thermal energy into the surfaces of Earth, Venus and similar planets anywhere in the Universe that I explained in detail in my 2013 paper "Planetary Core and Surface Temperatures" which is on Researchgate, LinkedIn and SSRN. Douglas Cotton 2001:8003:2601:7900:35ED:7727:F375:A4AD (talk) 23:04, 17 April 2023 (UTC)

First law: Consistent sign convention needed

The First law section at the moment switches sign convention without warning.

The two sign conventions are well explained in the article on the First law of thermodynamics, which mostly defines W as per Clausius as work done BY the system, but also has a section on Sign conventions which explains the alternate IUPAC convention that W is work done ON the system.
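In symbols, writing $W$ for thermodynamic work, the two conventions read

$\Delta U = Q - W$ (Clausius: $W$ is the work done by the system on its surroundings),
$\Delta U = Q + W$ (IUPAC: $W$ is the work done on the system by its surroundings),

and the two $W$'s differ only by a sign, so either bookkeeping gives the same physics; trouble arises only when a derivation mixes them.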

In this article, however, the section now uses the Clausius convention for the first equation and the paragraph containing it, and then switches without warning to the IUPAC convention. This is highly confusing.

Possible fixes: 1. Use Clausius throughout this article. 2. Use IUPAC throughout. 3. Explain both conventions in this article as well as in the First Law article.

Can we agree which way is best before the editing? Dirac66 (talk) 02:17, 22 April 2020 (UTC)

Thank you for your valuable discussion.
I would be reluctant to try to contradict the IUPAC. So I suppose it may be safer to follow your suggestion 3.
I find the Clausius convention more natural, for the following reason.
Thermodynamic work is not just any old work. It is a specialized kind of work. For a thermodynamic process, it is defined by the changes in the values of the mechanical, electrical, or magnetic thermodynamic state variables of the working body. There is a reason for this. Thermodynamic work is done "spontaneously" in what Planck calls a "natural process". The classic example is when a process passes energy into the body as heat. This may be either as in Joule's classic experiment, in which the paddle agitates the liquid body so as to cause internal viscous friction, or as in a heat engine, when heat passes into the body by conduction and radiation. The liquid can in principle be contained in a cylinder with rigid impermeable walls, with a movable piston that exerts force on the surroundings. Provided the temperature is not in an anomalous range (as for example in water around 4°C), the liquid will expand as its temperature rises, doing P–V work on its surroundings. (Joule did not do his experiment this way.) The P–V work is thermodynamic work. But if the work is done by forces exerted by devices in the surroundings, then the second law dictates that at least some part of it (or, as in the shaft work of Joule's experiment, all of it) will be accepted by the working body as heat, and not as thermodynamic work. In a sense, the expansion of the heated body is "spontaneous", or "natural". On the other hand, the body of liquid will on no occasion spontaneously or naturally shrink so as to make the paddles go round and drive something in the surroundings. In a nutshell, thermodynamic work is always "natural" when it is done by the working body. One might feel that "natural" work should be counted as positive. I suppose that Clausius felt that way, because he was considering steam engines.
Counter to this, IUPAC may say that work is an energy transfer and should be counted as positive when energy is added to the system as work. Chjoaygame (talk) 16:36, 22 April 2020 (UTC)
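To put the steam-engine intuition above into symbols: for the slow expansion described, the thermodynamic work done by the body on its surroundings is

$W = \int_{V_1}^{V_2} P \, \mathrm{d}V > 0 \quad \text{for } V_2 > V_1,$

so under the Clausius convention the "natural" expansion of the heated working body is counted as positive work.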
Thanks for your reply which I have been considering. I agree that the Clausius convention is preferable here. The two conventions are compared in the First Law article, so here I think we can just use Clausius except for one mention of IUPAC with a link to that article.
Also I now realize that this article does now use Clausius throughout, contrary to my statement above. I was confused by trying to understand the incorrect equation $\Delta U_{\mathrm{system}} = Q - W + w_{\mathrm{net}}$, which should read $\Delta U_{\mathrm{system}} = Q - W$. I will try to fix these problems in the next few days. Dirac66 (talk) 00:07, 27 April 2020 (UTC)
Yes, I felt a bit confused by the various 'work' terms. Good to see you working on it. Chjoaygame (talk) 00:43, 27 April 2020 (UTC)
OK, I have made a few improvements as detailed in my edit summary. They are along the lines discussed above, including removing one work term $w_{\mathrm{net}}$. Probably more could be done, but I will leave this article for now. Thanks for your encouragement. Dirac66 (talk) 19:13, 29 April 2020 (UTC)

You what?

Wow, talk about an article that's inebriated with the exuberance of its own verbosity. I mean ...

"Thermodynamics has traditionally recognized three fundamental laws, simply named by an ordinal identification, the first law, the second law, and the third law.[1][2][3][4][5]. In addition, after the first three laws were established, it was recognized that another law, more fundamental to all three, could be stated, which was named the zeroth law."

... I need a PhD in English linguistics to figure out that there were originally three laws, called one, two and three, and that a more fundamental law, called the zeroth law, was added later.

OK, so that one feels like the use of complex language for the sake of it, but then we see...

"The second law of thermodynamics: In a natural thermodynamic process, the sum of the entropies of the interacting thermodynamic systems increases."

... which is precise, and true, but only understandable if you already know what it means. Just say, or add, something like "the second law states that over time things will inevitably decay unless an external energy source is available to prevent it." I know it's not precise, but it's probably close enough and it's understandable if you come to the article cold.

84.92.32.221 (talk) 07:38, 19 August 2020 (UTC)

In my opinion, the Clausius statement is the most understandable for nontechnical readers, i.e. "Heat can never pass from a colder to a warmer body without some other change, connected therewith, occurring at the same time." Or, even more to the point in modern terminology, you could say "If heat passes from a colder to a warmer body, then it must be the result of external work applied to the system" (like a cross with the Kelvin Statement). Describing entropy as decay or disorder is for popular science articles/books. This is an encyclopedia and that terminology should be qualified if and when used to remain scientifically accurate. So I think you are right that some parts can be written better, but there is no need to resort to inaccurate statements. Footlessmouse (talk) 08:32, 19 August 2020 (UTC)

I have edited the second law statement described to

The second law of thermodynamics can be formulated in a variety of different ways. One way of stating it is that, in an isolated system, heat cannot spontaneously flow from cold regions to hot regions. Another is that the total entropy of an isolated system can never decrease over time, and is constant if and only if all processes are reversible.

I hope this is more understandable. It is easier to talk about it in terms of heat and energy than by entropy, but the link to entropy is why it's famous, so it should be mentioned. I pulled the material from the second law article and it included a reference for the entropy statement, while the heat flow statement is a reformulation of the Clausius statement. Footlessmouse (talk) 09:20, 19 August 2020 (UTC)
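A single line of algebra ties the two formulations just quoted together. For a quantity of heat $Q > 0$ passing from a body at temperature $T_{\mathrm{hot}}$ to one at $T_{\mathrm{cold}}$, the bodies taken large enough that their temperatures stay effectively constant,

$\Delta S_{\mathrm{total}} = \frac{Q}{T_{\mathrm{cold}}} - \frac{Q}{T_{\mathrm{hot}}} > 0 \quad \text{whenever } T_{\mathrm{hot}} > T_{\mathrm{cold}},$

so a spontaneous cold-to-hot flow would mean $\Delta S_{\mathrm{total}} < 0$, exactly what the entropy formulation forbids.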

Hi Footlessmouse, Yep a definite improvement. Thank you. I do take your point about not resorting to inaccurate statements and I concede I did that. The trick is to come up with something that is accessible yet still accurate. My test is always to think what would happen if I stood up in front of a class of reasonably smart final year school kids from a different discipline and said it. If the blank-face count is 100% I've missed the target, 50% and I'm sort of OK. 10% and either I'm on a roll or they are lying!

I wonder if we sometimes forget that an encyclopedia should explain as well as record. (No criticism of the authors intended. Ask me to write about APIs and I'd put you to sleep in a second)

84.92.32.221 (talk) 14:10, 19 August 2020 (UTC)

Richard Feynman had a philosophy that if you cannot explain something in simple terms, then you do not truly understand it. And he has a point (I read his book on QED prior to beginning as an undergrad and it was amazing: he explained complex numbers and modular arithmetic without ever naming them - he just used a clock). Some concepts are very difficult for me to explain to nontechnical readers, and this implies I do not fully understand all the concepts. We can all do better by understanding our limits and thinking about how it reads to persons who do not have a college education in physics. I believe the lede should both provide an accurate summary of the material in the article, while also being easily understood by virtually all readers. Technical points are sometimes inevitable, but the introduction should not be confusing or filled with unnecessary detail. Footlessmouse (talk) 00:55, 20 August 2020 (UTC)
I read in a new edit that "Gradually, this resolved itself and a zeroth law was first formulated in the 1930s, so that temperature could be defined in a self-consistent manner." The zeroth law was stated as a fundamental principle about the nature of thermodynamic equilibrium by Maxwell, by my fallible memory in 1873, though not labeled as a law. The label 'zeroth law' was invented, perhaps jocularly, by Fowler, in a book review in the 1930s. Need I say more? Chjoaygame (talk) 14:45, 21 August 2020 (UTC)
Also, I read the following edit cover note: "Pulled information from zeroth law, third law, history, and timeline of thermodynamics to clean up this section." Wikipedia is not a reliable source. Chjoaygame (talk) 18:59, 21 August 2020 (UTC)
Sometimes, though, the other articles you are consulting contain references which are suitable sources for the article you are working on, in which case you can just copy the source from one article to the other. One or both of you may wish to check if the sources of the other articles are suitable here. (I haven't checked.) Dirac66 (talk) 19:34, 21 August 2020 (UTC)
I understand, I also should have done a better job attributing those, I apologize for the inconvenience. For the most part, I pulled only referenced statements and what I otherwise considered as "common knowledge" statements from the other articles. I apologize for the confusion on the zeroth law, the statement is in reference to Fowler. I was meaning that it was realized that a zeroth law was needed in the 1930s and was adopted sometime after, but it could be written better. I also didn't realize Maxwell thought of it first, I had only heard of the half-joking introduction by Fowler. Maybe we should go back to the old statement there, which says something to the effect of "Gradually, this resolved itself and a zeroth law, more fundamental to all 3, was added later". Footlessmouse (talk) 22:58, 21 August 2020 (UTC)
I was confusing articles, sorry. In this case, all previous statements were reworded. No citations were included as they are all "common knowledge". The dates are the main thing I pulled from the other articles. I also rewrote the first sentence borrowing from the style used in the introduction in history of thermodynamics. I added a reference to Carnot's book in lieu of a citation. And I mention Nernst, his theorem, and the date range he worked on it. Finally, the final sentence was rewritten to incorporate 1930s and self-consistent definition of temperature. So statements weren't directly copied, though the dates could possibly use references. Footlessmouse (talk) 23:09, 21 August 2020 (UTC)
@Chjoaygame, feel free to revert that edit, it was the last I made. I believe the section could use a little cleaning up and possibly a little expansion, but perhaps my first attempt did not do such a great job at this. Footlessmouse (talk) 23:20, 21 August 2020 (UTC)
As a quantity defined in thermodynamics, entropy describes a system in its own internal state of thermodynamic equilibrium. It is the physical import of the zeroth law that, when isolated, such a system undergoes no change. Slick statements hide that fact. Chjoaygame (talk) 04:22, 22 August 2020 (UTC); revised 06:23, 22 August 2020 (UTC)
Hi @Chjoaygame, are you referring to a statement I added or to the original conversation above? In my own dealings, I refer to entropy as either entropy or Shannon entropy, both defined mathematically and not necessarily the same thing. Entropy is the conjugate variable of temperature in thermodynamic equations of state and is necessary for calculations. The rest is philosophy. Though its link to the number of accessible micro-states of a system is a beautiful fact. I sort of like how Boltzmann originally referred to it in his principle, he simply realized the entropy of a system in some state was somehow related to the probability of the system acquiring that state. Footlessmouse (talk) 04:57, 22 August 2020 (UTC)
For the second law, even Sommerfeld says that "During real (i.e. non-ideal) processes the entropy of an isolated system increases". (p. 26, Vol. 5, Lectures on Theoretical Physics) The statement gets its own paragraph immediately following his first definition of the second law. Isolated systems with reversible or non-reversible processes can include the combination of two initially independent isolated systems into a single isolated system (e.g. by removal of an impenetrable wall). We can ideally imagine that, during this process, the entropy remains unchanged, but in real life it must increase. This is where the philosophy comes in. Footlessmouse (talk) 05:15, 22 August 2020 (UTC)
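Sommerfeld's wall-removal picture can be checked with a toy calculation. A minimal sketch, assuming two bodies of equal, temperature-independent heat capacity C (the function name and the numbers are invented for illustration):

    import math

    def entropy_change_on_contact(C, T1, T2):
        """Total entropy change when two bodies of equal, constant heat
        capacity C, at temperatures T1 and T2, are joined and allowed to
        equilibrate with no work done (energy conservation fixes Tf)."""
        Tf = 0.5 * (T1 + T2)            # final common temperature
        dS1 = C * math.log(Tf / T1)     # body 1 warms or cools to Tf
        dS2 = C * math.log(Tf / T2)     # body 2 does the opposite
        return dS1 + dS2                # = C*ln(Tf^2/(T1*T2)) >= 0

    print(entropy_change_on_contact(1.0, 300.0, 400.0))  # ~0.0206: increase
    print(entropy_change_on_contact(1.0, 350.0, 350.0))  # 0.0: already equilibrated

By the AM–GM inequality $T_f^2 \ge T_1 T_2$, so the logarithm of the ratio is never negative, with equality only at $T_1 = T_2$: a genuine increase whenever the wall separated different temperatures.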
On that point, though fortunately it uses the words 'real processes', that statement of Sommerfeld is slick and is not a reliable source. Chjoaygame (talk) 06:23, 22 August 2020 (UTC)
I can see the need for more updated sources, but I was not aware that Sommerfeld was "not a reliable source", when did this happen? I am pretty sure Sommerfeld was a supplementary book for my last course, but it also has historical significance. Unfortunately, most of the definitions of the laws used in my textbooks are extremely technical and not particularly appropriate for an encyclopedia (Sommerfeld included). Thus I have not had an easy time looking for references (except the third law was pretty straightforward, it's the easiest to state in an unambiguous manner). If you have an idea that you think will improve it, please feel free to edit the article. Though I think some mention of entropy should happen in the lead, I am not particularly opposed to just deleting the entropy part of the second law statement there; the easiest way to state it, IMO, is with the Clausius statement. Footlessmouse (talk) 06:48, 22 August 2020 (UTC)
In spite of the sarcasm of our IP address leader, it is no easy task to write this article. Chjoaygame (talk) 07:43, 22 August 2020 (UTC)
@Chjoaygame I agree, it is fairly technical and the concepts are unintuitive. A few highly qualified statements and a lot of ramifications. I assume you are talking about the original user, as I was in no way being sarcastic, but if I have come off as somehow controlling or dismissive, I apologize, that was definitely never my intention. Any edits I made, I made just trying to help. I will not be offended if they are, in part or whole, reverted or altered by others trying to help (I may state an objection on the talk page if I deem it necessary, like anyone else). Footlessmouse (talk) 08:29, 22 August 2020 (UTC)
Yes, I was referring just to the IP address user's remarks. Chjoaygame (talk) 09:31, 22 August 2020 (UTC)

Yes, I would not endorse using hyperbole in the article and I know first-hand that it is difficult, even with formal training on the subject. Changing topics slightly without feeling the need for opening a new section: the statement "Additional laws have been suggested, but have not achieved the generality of the four accepted laws, and are not discussed in standard textbooks." is only referenced with standard textbooks written through the 1980s. The statement is objective and what I would call "common knowledge", but the references are not valid, I believe. The only one written as recently as 1980 was Kittel's and it indeed does not mention any other laws. Lack of evidence for something is not equal to evidence for a lack of something, though. It seems like it may be better to not mention that fact, or put it in the history section, as it is fairly obvious to anyone acquainted with physics or chemistry or the history of either, it therefore may not count as notably descriptive. Alternative theories to literally every subject in science have been proposed, one could go so far as to call that a requirement of science. Footlessmouse (talk) 13:11, 22 August 2020 (UTC)

I wouldn't object to removing the statement, on the grounds that you propose. I seem to remember that the statement was put in in order to stop urgings that other unsuitable propositions were laws of thermodynamics. Chjoaygame (talk) 13:57, 22 August 2020 (UTC)
An argument for precision and accuracy of statement is that Wikipedia is not only for cold comers, but is also widely quoted, even by experienced scientists, as if it were a reliable source. Chjoaygame (talk) 14:06, 22 August 2020 (UTC)
In spite of the objection of our IP aesthete, I still favour "In a natural thermodynamic process, the sum of the entropies of the interacting thermodynamic systems increases." The lead is a summary of the article and doesn't need direct reliable sourcing such as is desirable or mandatory for the body of the article. Chjoaygame (talk) 14:21, 22 August 2020 (UTC)
@Chjoaygame Compromise? First sentence is intuitive and easy to swallow, the second is more informative but harder to understand. Footlessmouse (talk) 15:07, 22 August 2020 (UTC)
By 'the first sentence', you mean "... heat cannot spontaneously flow from cold regions to hot regions"? What worries me about that is that, besides being stationary in time, as defined in thermodynamics, particularly by Planck, a thermodynamic system is homogeneous, and does not differ from region to region. Because Wikipedia is widely quoted, I do not favour slippery language. Also, I would rather say 'increases' than 'will increase'.
On a different topic, my fallible memory does not recall Guggenheim as participating in the invention or early propagation of the label 'zeroth law'.
Enough for one post. More later. Chjoaygame (talk) 20:19, 22 August 2020 (UTC)
Um, a thermodynamic system is homogeneous, and does not differ from region to region? But our article on Thermodynamic system says that "A thermodynamic system may comprise several phases, such as ice, liquid water, and water vapour, in mutual thermodynamic equilibrium, mutually unseparated by any wall." There seems to be a contradiction between these two statements. Dirac66 (talk) 21:58, 22 August 2020 (UTC)
As always, a good point! In defence, Planck does say he deals "chiefly with homogeneous bodies". It remains the case that it is undesirably slippery language to express the second law in terms of regions, without explicit recognition of that point. Chjoaygame (talk) 22:27, 22 August 2020 (UTC)
A further defence. Though the phase may vary from region to region in the system, its temperature will still be homogeneous when the system is in its own internal thermodynamic equilibrium. Chjoaygame (talk) 07:56, 24 August 2020 (UTC)
Is this better? The statement is more qualified and slightly more general now. I was tempted to add "(i.e. a refrigerator requires an external power source)" to the Clausius statement, but I refrained, though I think it is a nice summary of the spirit of the second law. Also, I know nothing of Guggenheim's contributions to the zeroth law - that information predates my involvement with the page. Footlessmouse (talk) 00:04, 23 August 2020 (UTC)
Nice idea, thanks @Chjoaygame, it now seems obvious to split the statements up like that. It retains the benefits of both approaches. Unless other editors wish to comment, I believe the issue(s) is resolved. Footlessmouse (talk) 01:09, 23 August 2020 (UTC)

The lede definitely reads better, and I have to thank you for taking my sarcastic OP seriously. Many would not have done so, especially from a “drive by” anonymous IP.

At the risk of coming across as a complete ingrate I wonder if there is more that can be done on the second law. Quite reasonably you don’t like the terms decay or disorder, but there is something about those words that describes the 2nd law cogently. It’s why ice melts in a warm room, a glass breaks when you drop it and yes, why things decay. Read, and think, on, and then comes the lightbulb moment. You understand it’s also why time is directional, engines are never 100% efficient, and why heat death is a perfectly plausible fate for the universe. Suddenly it becomes wondrous how something so simple can also be so profound. The beauty of science revealed!

I wish I had the words, I don’t. All I can do is articulate the magic and wonder of stumbling across something accessible enough to understand, draw you in, and which subsequently encourages you to put the hard miles into a deeper understanding of the science. That’s why I value simplicity, as well as clarity and accuracy.

Maybe I’m wrong, maybe that isn’t what encyclopaedias are for. Leave that to Sci American and the like! 84.92.32.221 (talk) 13:15, 24 August 2020 (UTC)

You are a little wrong, but I understand EXACTLY what you mean. Popular physics books and articles written for those without a degree in physics are written elegantly and discuss a lot of fundamental philosophy that is highly enlightening. Before I ever went to school for physics, I became obsessed with those such as Richard Feynman and Stephen Hawking. Especially Feynman, he introduces you to worlds you did not think exist. The problems come when we start to treat the philosophy like it's physics. Remember the old mantra, be it wrong or right: "shut up and calculate". That's what physics is. Unfortunately, as Wikipedia is an encyclopedia which purports to disclose facts as agreed upon by scientific consensus, it too is stuck giving only the technically correct physical meaning of the laws. This isn't to say that I disagree with you or that I think it is a bad thing for people to learn of "decay" or "disorder" before actually learning what entropy is, but it is saying that the place to learn of these philosophical arguments is either on the philosophy of thermodynamics article or a source otherwise outside of Wikipedia. There is always more that can be done, and perhaps I will try to go back through sometime relatively soon, but there are also many other pages with more significant issues. Footlessmouse (talk) 19:35, 24 August 2020 (UTC)
Perhaps I am making a mistake by writing the following.
The word 'disorder' is used in a sort of tradition. We are so brainwashed by that tradition that we blithely accept that the tradition makes sense. It is recited without thought or criticism in very many otherwise reliable sources. Nevertheless, a number of critical thinkers who in my view qualify as generally reliable sources, and who are not brainwashed, agree that the word 'disorder', in itself, is inappropriate and even baffling or nonsensical for the ordinary language understanding of the physical meaning of entropy in thermodynamics. Amongst widely used textbook writers, Peter Atkins comes to mind. I won't right now try to give a list of other sources.
Von Neumann suggested that Shannon use 'entropy' for his quantity because no one knew what it meant, and Shannon also called his quantity 'information'. This is hardly an advertisement for the proposition that 'information' is a term that ordinary people will be helped by to understand the physical meaning of the thermodynamic quantity entropy. Still, Ben-Naim thinks that 'information' is a suitable concept to describe thermodynamic entropy, presumably thinking that 'information' is a good word for Shannon's quantity. Our IP user says that disorder is why ice melts in a warm room, and thinks that such is a cogent description. Tradition is indeed cogent. We can acknowledge it without enslaving ourselves to it.
Next, one may ask 'what is a more suitable ordinary word to convey to cold comers the physical meaning of thermodynamic entropy?' I hesitate to state the obvious at this point. Guggenheim was well familiar with the tradition.
Again, it is partly traditional to say that the second law gives a reason why time is directional. Many have been brainwashed by that tradition too. Here is probably not the place to analyse that tradition. But I will say that I think that tradition sheds no light on the question of why time is directional, and that I think it misleading to say that it does.
Again, it is traditional, but nonsensical, to say that the laws of thermodynamics predict the "heat death of the universe". A little thought is enough to make that obvious, if the thought has been preceded by a study of thermodynamics.
I hesitate also to say that 'shut up and calculate' does not cover all of physics. Also, I don't like giving cold comers the whip hand. It is not good for Wikipedia to make slick but inaccurate statements that are likely to be widely quoted, no matter how much such statements may beguile cold comers and traditionalists. Chjoaygame (talk) 23:29, 24 August 2020 (UTC)
As soon as I wrote that, I kind of regretted it. (shut up and calculate, I mean) Physics is all about tying it into the real world, that's what distinguishes it from mathematics. The point I was trying to make is that physics is not philosophy. Philosophical ventures without any potential impact on observation are not part of physics. The entropy arrow of time, heat death, decay, disorder, all that is just philosophy and doesn't belong here. They are philosophical interpretations of physical phenomena. Information content for Shannon has a real basis in physics, though, so it's borderline. The equations and what they describe are real: if we want temperature, we must have entropy. I, personally, loathe decay and disorder and find them to be highly lacking descriptions from a statistical mechanics perspective. Footlessmouse (talk) 00:15, 25 August 2020 (UTC)
No worries about a passing comment. Yes, physics is not just mathematics. I am not denying the usefulness and physical significance of Shannon's quantity. I am just saying that I think it doesn't help the cold comer much to use the words 'entropy' or 'information' for it; I acknowledge that they are pretty widely used. I am sad to say that there seem to be physicists who like to engage in talk about such grandiosities as the "heat death of the universe". Chjoaygame (talk) 00:48, 25 August 2020 (UTC)

new edit

I don't like to move here without using the talk page. I have a worry about this edit.

I think the edit gives undue emphasis to a triviality. If there is no transfer and no entropy change, one is referring to a process only in a trivial sense. Every actual process exhibits transfer and increase of entropy. Yes, mathematical calculations about the equilibrium surface are spoken of as referring to "reversible" or "quasi-static" or "quasi-equilibrium" "processes". But that locution is only virtual, fictive, or imagined; it does not refer to an actual physical process; it refers only to a mathematical exercise.

I think the point of the law is that every actual process exhibits increase of entropy. Guggenheim writes "Natural processes are all such as actually do occur; they proceed in a direction towards equilibrium." He devotes a separate paragraph to the limiting case. I don't like to see the main statement of the law fuzzed by making it include a reference to what doesn't actually happen. I think the law is too important to let it be fuzzed by slipping in a grammatically neat though opaque reference to non-events. If it is felt that "reversible processes" are important enough to need a mention, then I feel that they deserve a sentence of their own, that makes their non-actual character explicit. I would like to undo the edit, but I don't like to do so without talk. Chjoaygame (talk) 13:50, 8 November 2020 (UTC)

I agree that the words "or remains the same" should not be included in the introductory summary. However further down in the section Second Law, we should retain the existing mention of reversible processes which is more nuanced. Here the usual definition $\mathrm{d}S = \delta Q_{\mathrm{rev}}/T$ is followed by the clarification that "While reversible processes are a useful and convenient theoretical limiting case, all natural processes are irreversible." This sentence should be retained as it explains how to interpret the many calculations based on the reversible limiting case. Dirac66 (talk) 15:11, 8 November 2020 (UTC)
Basically, the second law says that in the limit of infinite time, an isolated system will undergo no entropy change and will then be in thermodynamic equilibrium. As I understand the objection, it is that, in practice, infinite time means "never". This implies that equilibrium is never seen in nature. In a strict, theoretical sense, I suppose that is true, but in practice, I don't agree. The concept of a system in "practical" equilibrium is vital to experimental thermodynamics, and so the idea of "practically" no entropy change is equally vital.
The use of the word "infinite" is "code" for an explicit mathematical concept which does not deal with infinity. It roughly amounts to "the greater the time, the less the entropy increases". This means that in practice, "infinite time" means that if you wait for a long enough (finite) time the increase in entropy will be PRACTICALLY zero. Here "practically" has a precise mathematical meaning. It means if you assume zero increase, your experiment won't contradict you to within your experimental error. This means "practical" equilibrium is attainable, and practical constancy of entropy is likewise attainable. I won't revert the edit, I just ask that this point be considered. PAR (talk) 11:54, 9 November 2020 (UTC)
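Spelling out the mathematical concept that "infinite" is code for here, writing $S_{\mathrm{eq}}$ for the limiting entropy and setting fluctuations aside as the comment above does:

$\lim_{t \to \infty} S(t) = S_{\mathrm{eq}}$ means that for every $\varepsilon > 0$ there is a finite waiting time $t_\varepsilon$ with $S_{\mathrm{eq}} - S(t) < \varepsilon$ for all $t > t_\varepsilon$.

Choosing $\varepsilon$ to be the experimental resolution is the "practically zero" reading: after a long enough finite wait, the remaining entropy increase is undetectable.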
Basically, the second law says that in the limit of infinite time, an isolated system will undergo no entropy change and will then be in thermodynamic equilibrium. It is not entirely clear to me what this sentence intends to mean.
Callen has an axiom scheme that differs from the more usual orthodoxy. He writes
Postulate 1. There exist particular states (called equilibrium states) of simple systems that, macroscopically, are characterized completely by the internal energy $U$, the volume $V$, and the mole numbers $N_1, N_2, \ldots, N_r$ of the chemical components.
Planck expresses what to me is the same idea thus
§6. In the following we shall deal chiefly with homogeneous, isotropic bodies of any form, possessing throughout their substance the same temperature and density, and subject to a uniform pressure acting everywhere perpendicular to the surface. They therefore exert the same pressure outwards ... [more details].
Bailyn writes
THE LAW OF EQUILIBRIUM: A macroscopic, bounded, non-gravitating system that is otherwise isolated or in a uniform environment attempts to reach an asymptotic state called equilibrium characterized by constant and piece-wise uniform values of its intensive state variables, unless it is already in equilibrium, in which case it will remain indefinitely in this state unless acted on by systems with different intensive state variables, or systems in relative motion.
As I read these, they are attempts to state what is sometimes called 'the minus oneth law of thermodynamics'. I find Bailyn's attempt full of grave flaws, but still we get the idea. It defines a thermodynamic state of a simple system, and says that such states are amongst the basic existents of the subject.
In my view, the second law has a very different content. It is about natural processes. Chjoaygame (talk) 13:08, 9 November 2020 (UTC)

@PAR:, the statement is not a matter of practicality, entropy increases in all real processes, period. You are discussing an isolated system when the statement is about interacting systems (you cannot view it simultaneously as one isolated system and two interacting systems, it is one or the other, obviously no entropy will change if it is viewed as an isolated system, but the entropies of the two individual systems will not be constant). Besides that, everyone here knows how quantities decay as their dependent variable goes to infinity, it approaches an asymptote and we estimate the long-term value as the asymptote, but saying we reach it in finite time is fallacious, infinity most certainly means infinite, without bound, goes on forever, etc. I agree with Dirac66, please remove the misleading statement from the lead, but keep that information in the section below where the explanation is correct and informative. Footlessmouse (talk) 21:31, 9 November 2020 (UTC)

I think it a serious mistake to regard thermodynamic equilibrium as an asymptotic destination. To do so is to prejudice or make nonsense of the possibilities of fluctuations. Thermodynamic equilibrium is well regarded as defining averages over a long, even infinite, time, of a single macroscopically specified system. Or it can well be regarded as defining an ensemble of generally defined microscopically specified states. Chjoaygame (talk) 23:22, 9 November 2020 (UTC)
I don't really think of it as an asymptotic destination, I was making a point of how variables decay because the previous editor took great pains to explain how we estimate values that approach an asymptote. Also, a damped sine-wave has an asymptote, fluctuations don't really affect the status of the asymptote as the destination that is always just out of reach; I would therefore regard the tendency of a system to attain thermodynamic equilibrium as an asymptotic process. Footlessmouse (talk) 23:44, 9 November 2020 (UTC)
That line of thinking would be fine, but for Poincaré's recurrence theorem. Along that line of thinking, one might propose that Aristotle discovered the second law of thermodynamics. He observed that in the ordinary terrestrial world, moving bodies come to rest unless they are driven by something; it is because friction is a generator of entropy. Chjoaygame (talk) 06:10, 10 November 2020 (UTC)
Entropy is the conjugate variable to temperature, Aristotle certainly did not have the necessary information to formulate any significant piece of it... Not the least of which, a theory of differential equations... You realize I agreed with you, right, and the whole asymptotic decay was a side note? My argument about that is purely semantic and in no way was a reference to Poincaré's recurrence theorem, which I had to look up because I'm pretty sure they did not teach us that. My semantic argument says that if you have something that is approaching a constant finite value but will never reach it after infinite time, regardless of fluctuations so long as there is a clear trend, it is asymptotic and that finite value is the asymptote. I'm not saying anything at all about the physics involved. Frankly, the point does not matter and I am not sure why I am responding, but I am super confused by this whole thing. Footlessmouse (talk) 07:08, 10 November 2020 (UTC)
Good comment. Glad that we agree. Yes, the asymptotic thing is a sidenote. Chjoaygame (talk) 07:16, 10 November 2020 (UTC)
@Footlessmouse: "the statement is not a matter of practicality, entropy increases in all real processes, period."
Ok, then, there is no such thing as equilibrium in the real world, period.
Thermodynamics does not deal with fluctuations. Thermodynamics deals with systems "in the thermodynamic limit" which means an "infinite" amount of mass and energy, or, in the atomic view, an infinite number of particles. For these systems, fluctuations in an intensive state variable are ZERO; otherwise there are fluctuations in intensive state variables, and the second law is violated by the fluctuations in entropy.
Ok, then, there are no systems in the real world at the thermodynamic limit, period. Therefore, the second law is false, period.
You see where this kind of logic leads? Are these helpful statements? The funny thing is that I agree that all of the above statements are true. Yet thermodynamics DOES deal with the real world, in a practical sense. You can go on and on about how in the real world, entropy is never constant, there is no such thing as equilibrium, fluctuations prove the second law to be false and that, essentially, theoretical thermodynamics, with all its "infinities", does not deal with the real world. But you misunderstand "infinity". Infinity is not a number. It is a code word for a more complicated concept. If some number is approached in the limit of infinite time, it means, roughly, that the longer you wait, the closer you get to that number. If some number is approached in the limit of infinite number of particles, it means that the more particles you have, the closer you get to that number. There is a point in experimentation, the experimental error, which defines "close enough". And this means that now, and only now, is the theory of thermodynamics capable of dealing with the real world. And that's the whole point, isn't it? PAR (talk) 15:16, 10 November 2020 (UTC)
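For reference, the standard statistical-mechanical scaling behind the thermodynamic-limit claim above, taking the energy of an $N$-particle system in the canonical ensemble as the example:

$\langle (\Delta E)^2 \rangle = k_B T^2 C_V \propto N, \qquad \frac{\sqrt{\langle (\Delta E)^2 \rangle}}{\langle E \rangle} \propto \frac{1}{\sqrt{N}} \to 0 \text{ as } N \to \infty,$

so relative fluctuations vanish only in the limit; for any finite system they are small but nonzero, which is the crux of the disagreement here.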
I don't think many physicists talk about a system in equilibrium undergoing non-trivial processes. Beyond all the philosophy, the second law is stated equivalently as the impossibility of perpetual motion machines of the second kind, a statement that no serious physicist has argued with in over a hundred years. Also, no one here misunderstands infinity, we've all taken calculus... There is a chasm of difference between what experimentalists accept as close enough and what theorists accept as real. Footlessmouse (talk) 18:08, 10 November 2020 (UTC)
Also, I don't think I ever took a class explicitly on thermodynamics, they teach a lot of statistical physics these days. I think stat phys is a little more generous with concepts like thermodynamic equilibrium. Everything is a statistical system and you can show it using your favorite type of statistics: Maxwell–Boltzmann, Bose–Einstein, or Fermi–Dirac. Regardless of which regime chosen, the entropy of two individual systems interacting can be modeled with statistics and shown to always lead to an increase in entropy. Footlessmouse (talk) 18:24, 10 November 2020 (UTC)
But you have not addressed the argument: You say "the statement is not a matter of practicality, entropy increases in all real processes, period."
No, in all real processes there are fluctuations and entropy does not always increase, period.
You are refusing to invoke the concept of infinite time, yet you invoke the concept of infinite mass when you make your statement.
  Infinite time   Infinite mass   Statement
  NO              NO              Entropy may increase or decrease (THE REAL WORLD)
  NO              YES             Entropy always increases (YOUR STATEMENT)
  YES             NO              Entropy may increase or decrease
  YES             YES             Entropy increases or stays the same (THIS IS THEORETICAL THERMODYNAMICS!)
You want to say that entropy always increases, and that thermodynamics' statement that it becomes constant in the limit of infinite time is a quibble to be resolved later, yet the assumption that a system be at the thermodynamic limit of infinite mass is not a quibble, nor is it mentioned.
All I am proposing is to say that entropy increases, or remains the same in the case of a system in equilibrium, and then treat the fact that no real system can wait an infinite time, nor be of infinite mass, as a quibble to be resolved later.
Please explain to me why this is a bad idea. PAR (talk) 00:43, 11 November 2020 (UTC)
I think I understand better, but the main reason for not including anything else there is that WP is not a textbook, and we should aim to make the article understandable to anyone interested in it, which for this article means many high school students. There is no reason to confuse them with too much detail. It's been quite a while since my quals and I haven't done much in thermodynamics or stat mech since, so I am a little rusty, but what you called theoretical thermodynamics is a lot of what I was trained in. (With the caveat mentioned earlier that theoretical processes can be reversible and therefore have constant entropy but that cannot happen in real life) I'll spend some time reading and get back to you. I accept that there are fluctuations in entropy, but absolutely under no circumstance accept that some real-life processes allow for overall constant or decreasing entropy (when everything is taken into account). Footlessmouse (talk) 00:58, 11 November 2020 (UTC)
Yes, fluctuations in entropy cause momentary actual decreases in entropy in real systems, but if you had a large number of identical systems, and averaged those fluctuations, those averages would grow smaller as the number of systems increased, and would approach zero as the number of systems "approached infinity". Put another way, if you have a system with a large number of particles, those fluctuations divided by the number of particles would grow smaller as the number of particles increased, and would approach zero as the number of particles "approached infinity". PAR (talk) 21:03, 11 November 2020 (UTC)
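That averaging claim is easy to illustrate numerically. A toy sketch, using pure white noise as a stand-in for the zero-mean fluctuations (no real dynamics is modelled, and the function name is invented for illustration):

    import random, statistics

    def spread_of_ensemble_average(n_systems, n_trials=200, seed=0):
        """Average a zero-mean fluctuation over n_systems identical toy
        systems, repeat n_trials times, and return the spread of that
        ensemble average; it shrinks roughly like 1/sqrt(n_systems)."""
        rng = random.Random(seed)
        averages = [
            statistics.fmean(rng.gauss(0.0, 1.0) for _ in range(n_systems))
            for _ in range(n_trials)
        ]
        return statistics.pstdev(averages)

    for n in (1, 100, 10000):
        print(n, round(spread_of_ensemble_average(n), 4))  # ~1.0, ~0.1, ~0.01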
Can you just provide us with some references so we can understand what you are talking about, please? I'm confused about you bringing up fluctuations again when no one was arguing against them. The fluctuations cannot help keep a system at constant or decreasing entropy over the course of a non-trivial interaction. If you have references that say otherwise, I am sure everyone would love to read them. Footlessmouse (talk) 01:34, 12 November 2020 (UTC)
Wikipedia has some good articles, including references, which I have linked here. I bring up fluctuations because they cause temporary decreases in entropy in real-life systems. They cause temporary increases too, beyond what the second law predicts. The second law only holds for "infinitely large" systems, where "infinitely" is a code word for a more complicated mathematical concept. There is also a concept called ergodicity which says, roughly, that, for a system in equilibrium, if you average the fluctuations over an "infinite" time period, you get the same answer as the second law prediction. That's what I was trying to say in that above paragraph, trying to be careful about the relationship of "infinity" to the real world (it's not just some big number that never happens, so we don't have to worry about it), but not getting totally opaque with a lot of jargon. PAR (talk) 21:24, 14 November 2020 (UTC)
Just so you know, WP is not a reliable source and no one is likely to take the time to dig through the pages to find a reliable source to make your point for you. Here is what this conversation is discussing: "the sum of the entropies of interacting thermodynamic systems either increases or remains the same" is incorrect for real life physical processes and the statement was changed back to "the sum of the entropies of interacting thermodynamic systems increases". Fluctuations cannot ensure that the entropy of an interacting thermodynamic system remains constant throughout the process. If you are discussing something new, please start a new topic so it will be less confusing. If you are trying to make another point related to this wording, please provide references to reliable sources. Thanks! Footlessmouse (talk) 21:36, 14 November 2020 (UTC)
Just to emphasize this, I spent several days purging the mass-energy equivalence article of numerous statements about how invariant mass is always conserved if you view the system just right. There are uncountably many inaccuracies in Wikipedia and that is why it is never used to justify new edits to other pages. Also, your insistence on explaining the concept of infinity in each new post is bordering on offensive at this point. Footlessmouse (talk) 21:46, 14 November 2020 (UTC)
I agree, wikipedia is not a reliable source, but the references are, or may be verified to be. You ask for references, I gave you some, now you complain that they don't directly address this question and this question only. I'm sorry there is no reference entitled "Why editor PAR is right". I have had to do a lot of digging and thinking about this, falling into the same traps, then digging and thinking my way out of them, and now you can't be bothered to do the same?
You say the entropy of real systems always increases. The presence of fluctuations proves this statement to be false. Dead wrong. In real systems entropy jumps up and down, mostly up when it's heading for equilibrium, but sometimes down. If you want to talk about the unreal case of "infinitely" large systems, then yes, it always increases during a "real" (i.e. finite) time interval. If you want to talk about the doubly unreal case which includes "infinite" time, then it increases or stays the same. Thermodynamics only addresses this doubly unreal case and for large but finite systems it's extremely accurate, but not perfect. I was simply trying to "fix" a patently false statement. PAR (talk) 22:42, 14 November 2020 (UTC)
Asking you to find explicit references has the dual purpose of (1) if you are right, the article can be modified accordingly and (2) if you are wrong, you will realize such. Please look at the wording again. There is nothing wrong with it, fluctuations are not part of the conversation there. If you wish to say we need to speak more about fluctuations in the article, go ahead and start editing or begin a new topic below on it. You will not find reliable references to justify changing the wording here because they don't exist, the wording in the lead which avoids going into too much detail is correct. Footlessmouse (talk) 22:49, 14 November 2020 (UTC)
If fluctuations are not part of the conversation, then real systems are not part of the conversation. The statement does not "avoid too much detail", it is demonstrably false.
Also, the reason I keep trying to explain infinity is because you keep offering arguments which illustrate that you don't understand what I am saying and keep defending a false statement. I might find that offensive if I didn't respect that you are trying to improve this article, just like I am. PAR (talk) 23:02, 14 November 2020 (UTC)
I keep discarding fluctuations because nearly everything of physical significance fluctuates and that has never stopped us from drawing trend lines. This is the language adopted by textbooks and taught in physics courses, so it is the language that will be used here. I appreciate you are trying to improve the encyclopedia and I would certainly spend more time helping if I weren't so busy in other areas. However, given that you are the only one of four editors in the discussion who proposes to change the already existing wording, the burden of proof is on you for this one. Also, I have an undergraduate degree in mathematics and a PhD in physics, I am not ignorant of basic concepts, there may be a miscommunication problem but I am just generally confused as to what you want. All I see is you trying to explain basic concepts like infinity and fluctuations without ever providing justification for the entropy of a non-trivial process as a whole remaining constant or decreasing in interacting physical systems. Momentary fluctuations do not change anything with regards to the trend of overall change in entropy for the process as a whole. If there is something we overlooked or there is a miscommunication problem, the very best way to fix it is to provide an explicit reliable reference that no one can argue with. Footlessmouse (talk) 23:25, 14 November 2020 (UTC)

This is absolutely the last post I write on the subject, if anyone else wishes to reach me, either ping me or visit my talk page. I apologize for not making this post days ago:

  • Kittel and Kroemer Thermal Physics (one of the books I used in grad school) page 45, section "Law of Increase of Entropy" begins: "We can show that the total entropy always increases when two systems are brought into thermal contact."
  • Kardar Statistical Physics of Particles (my other main grad school book - other than Kardar's Statistical Physics of Fields) on page 15, caption for figure 1.15, states "The initially isolated subsystems are allowed to interact, resulting in an increase of entropy." It then elaborates in the text and explains mathematically why it increases for interacting systems and states later on page 16: "The statement about the increase of entropy is thus no more mysterious than the observation that objects tend to fall down under gravity!"
I could go on with other books I have as references and with undergrad books, but I very much do not believe it necessary. As stated before, this is the language adopted by the textbooks. Footlessmouse (talk) 23:57, 14 November 2020 (UTC)

Ok, this will be my last word as well, unless Chjoaygame wishes to change things. First of all, the statement by Kittel and Kroemer is simply false, unless there are qualifications that you did not mention. The total entropy remains the same if the two systems were at the same temperature, or in other words, were in equilibrium with each other. Same goes for Kardar. Systems tend to equilibrate with a net increase in entropy unless they are already equilibrated. Objects tend to fall down under gravity unless they are sitting on the ground already.

Maybe you are right, we are talking past each other. You keep using terms like entropy "as a whole" and "overall" and "trend lines", even though these terms are not used in the present statement:

"In terms of entropy, in a natural thermodynamic process, the sum of the entropies of interacting thermodynamic systems increases."

I think you agree that in real systems, entropy may momentarily decrease due to thermal fluctuations, but not "as a whole" or "overall"; there is a "trend line" that goes through these fluctuations that always increases. These are vague, undefined concepts, and are not even mentioned in the present statement.

What I am saying is that the "trend line" that you speak of is just the behavior in the limit of an infinitely large system, where there are no fluctuations. In a real system, entropy fluctuates evenly about this precisely defined trend line. So we are talking about the same thing here, I'm just saying it more precisely, I hope. So to be more precise, we could say:

"In terms of entropy, in a natural thermodynamic process, neglecting fluctuations, the sum of the entropies of interacting thermodynamic systems increases.

But then I think you would say that "neglecting fluctuations" is an unnecessary complicating nit-pick that clouds the issue. Another nit-pick - thermodynamic systems are, by definition, infinitely large, so there is no such thing as a real-life thermodynamic system, only ones that are so large that it's ok to treat it as one.

Ok, how about:

"In terms of entropy, the sum of the entropies of interacting thermodynamic systems increases or remains the same."

and we regard "real systems" as an unnecessary complicating nit-pick, and later go on to explain that real systems never actually reach perfect equilibrium where the entropy is constant, and fluctuations can cause momentary decreases in entropy? That's what I want. As I said above, I will not revert the reversion, I don't have time for that nonsense. If my arguments are not convincing to you then so be it. PAR (talk) 07:00, 15 November 2020 (UTC)

Perhaps I may join the chat here.

In my view, the thermodynamic limit is a concept of statistical mechanics. It does not belong to thermodynamics proper.

In my view, a thermodynamic system needs purposeful definition. The purpose is to account for a process. Generally, one should try for what might be regarded as 'canonical' (my own word for the present discussion) processes. An example might be a body enclosed in walls that prevent matter transfer, and that impose a rigidly determined volume on the body, and are perfectly reflective of light, and do not conduct heat.

The example of a 'canonical' process that I have chosen for discussion here is to let the body expand while an arbitrarily dictated volume change is imposed on it by the surroundings, slowly enough to let us neglect the kinetic energy given to the surroundings by the body during the expansion, and to let the pressure in the body, and its volume, remain measurable. The volume change is controlled by a piston-and-cylinder arrangement. The process is imposed by the thermodynamic operation of slowly moving the piston by means of a device, in the surroundings, that is so strong that the position of the piston is not affected by the pressures on it. This may be described as an adiabatic transfer of volume from surroundings to body, with a transfer of energy as thermodynamic work. Given its volume, the body is allowed to choose its own pressure as it pleases. This canonical kind of process is conveniently described in terms of the internal energy of the body, as a function of its entropy and volume.

This may be expressed in mathematical formulas. U(S, V) denotes the internal energy at entropy S and volume V. We suppose that we can measure the pressure in the body, P. The initial state of the body is {U₁, S₁, V₁, P₁}, the final state {U₂, S₂, V₂, P₂}, the volume increase ΔV = V₂ − V₁, and the work of expansion W = U₁ − U₂.
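As a concrete instance (an ideal-gas illustration of my own, not essential to the argument): for a slow, reversible adiabatic expansion of an ideal gas, PV^γ is constant, and, since U = PV/(γ − 1) for such a gas, the work of expansion is

\[ W = U_1 - U_2 = \frac{P_1 V_1 - P_2 V_2}{\gamma - 1}, \]

with the entropy S the same in the initial and final states, as this canonical process requires.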

For this purpose, we admit fluctuation neither in the volume V, nor in the entropy S. The initial and final internal energies belong to equilibrium states, which persist without fluctuation respectively for long times, one might even say from the beginning of time till the start of the thermodynamic operation, and from the end of the thermodynamic operation till the end of time. During those respective long times, the pressures in the body are permitted to fluctuate. For example, for a moment all the molecules may cluster in the middle of the volume; or for another moment they may all rush at the pressure-measuring device. Such moments will be rare, but, as thermodynamicists, we are very patient, our clock being calibrated in multiples of the Poincaré recurrence time. In practice, perhaps the only occasions when we will be able to detect fluctuations are when we set the volume to be, for example, at a critical point, and we see critical slowing down.
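The rarity is easy to quantify under the simplest assumption, that the N molecules move independently: the probability of momentarily finding all of them in, say, the left half of the vessel is

\[ p = 2^{-N}, \]

already about 10^{-30} for N = 100, and beyond astronomical for a macroscopic body; hence the calibration of our clock in Poincaré recurrence times.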

In short, since we may be interested in fluctuations, we will need to consider long times. I prefer that to considering infinitely large bodies, and ergodic theory. Since, for our process of interest, we do not allow heat transfer, I think we have settled on unfluctuating entropy S, volume V, and internal energy U. We can ask statistical mechanics to work from those unfluctuating parameters to predict the statistics of the pressure fluctuations, P(t), as a function of time t.
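For the record, the standard statistical-mechanical answer (as given, for example, in the chapter on fluctuations in Landau & Lifshitz's Statistical Physics) is

\[ \langle (\Delta P)^2 \rangle = -k_B T \left( \frac{\partial P}{\partial V} \right)_S, \]

which is positive because (∂P/∂V)_S < 0 for any mechanically stable body.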

If we were to allow heat transfer in our process, we would do it differently. For example, we would allow changes in entropy, or we would consider temperature. Chjoaygame (talk) 00:53, 15 November 2020 (UTC)Reply

I don't fully agree that the "thermodynamic limit" is a purely statmech concept. Brownian motion is not explained by classical thermodynamics because the "system" is too small (a tiny particle being buffeted by thousands of molecules). The same is true for photon noise from a black body. Quoting Einstein (emphasis mine):
"(Thermodynamics) is the only physical theory of universal content concerning which I am convinced that WITHIN THE FRAMEWORK OF THE APPLICABILITY OF ITS BASIC CONCEPTS, it will never be overthrown."
In other words, Brownian motion, etc. lie outside thermodynamics (a phenomenological theory) because the systems involved are too small, but they are not outside statistical mechanics. Statistical mechanics explains thermodynamics, and more. PAR (talk) 07:00, 15 November 2020 (UTC)Reply
  • Replying to the above double post of Editor PAR. Forgive me. I think it worth looking at fine questions here. Fluctuation in thermodynamics is a fine question.
Editor PAR writes
I think you agree that in real systems, entropy may momentarily decrease due to thermal fluctuations, but not "as a whole" or "overall": there is a "trend line" that goes through these fluctuations and always increases. These are vague, undefined concepts, and are not even mentioned in the present statement.
I think that is indeed vague or undefined. Classical thermodynamics does not consider fluctuations. I don't like the idea of second guessing it by vaguely talking about "real systems" in that way.
As a matter of deep principle, for thermodynamics proper, I reject such talk about a "trend line". Entropy is defined for a thermodynamic system in its own state of internal thermodynamic equilibrium. It does not drift or trend towards equilibrium. I guess that perhaps Editor PAR may demur, but I will hold my line. My above treatment of an adiabatic process intends to clarify why.
To have fluctuations of entropy in a "thermodynamic system", one must properly go slightly beyond the universe of classical thermodynamics, to an extended universe, explicitly admitting fluctuations. Let me call it 'extended thermodynamics'. (Perhaps this may deal with Editor PAR's "real systems".) For example, in such extended thermodynamics, fluctuations of entropy are admitted for a system such as is described by an energy function, F, the Helmholtz function. It has rigid walls that determine its volume, with a part that is diathermal, through which it is in thermal equilibrium with a temperature reservoir in the surroundings. Its temperature T and its volume V are unchanging, unfluctuating. But heat can pass back and forth across the diathermal wall, creating fluctuations F(t) in its Helmholtz function, and S(t) in its Helmholtz entropy, as functions of time t.
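The magnitude of those heat-driven fluctuations is fixed by the heat capacity; the standard canonical-ensemble result, in my notation, is

\[ \langle (\Delta U)^2 \rangle = k_B T^2 C_V, \]

so the fluctuations in the energy (and hence in the Helmholtz entropy) grow only as the square root of the size of the system, while the means grow linearly with it.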
But in my view, fluctuations of entropy and of volume are not admitted for the extended system that is described by the unfluctuating energy function, known as the internal energy, U. This extended system is admitted as suffering fluctuations T(t) in temperature, and P(t) in pressure.
So I am ruling out, by decree, or postulate, simultaneous fluctuations in both members of a conjugate pair of state variables, extensive and intensive.
Einstein talks about fluctuations in entropy, so I reason that he is willing to consider my extended thermodynamic systems. In a sense, they lie on the margins of the framework of the applicability of the basic concepts of the eternal thermodynamics.
As for the thermodynamic limit, my position is that it does not belong to thermodynamics proper. Editor PAR says that it is not restricted to statistical mechanics. As he pleases. Chjoaygame (talk) 09:36, 15 November 2020 (UTC)Reply
  • Okay, so apparently I lied and I am making one last post. I told you that in Kittel the section opens that way, and that it is regularly called "the law of increase of entropy". Why do you think you know more about the topic than the people who wrote the standard textbooks on it? Clearly you are missing something. You must know that physicists use shorthand regularly and do not enunciate every caveat. For the record, I would have had no problem with a subclause of "excluding fluctuations", if that is the proposition, but you might want to start a new topic on that, as this one has run down. Also for the record, I kept using remedial phrases like "trend lines" to make a point that wasn't getting through, similar to your repeated explanations of fluctuations and infinity. Footlessmouse (talk) 10:02, 15 November 2020 (UTC)Reply
Dear Editor Footlessmouse, I don't like your locution "I lied". I would rather you had written "Now I have changed my mind." Also, I think that Editor PAR doesn't think he knows more about the topic than those people.
I think it is not a matter of saying that Kittel & Kroemer are wrong. I think it is a matter of not necessarily quoting them on this topic, because they are not writing in a context of thermodynamics proper. They belong to the class of the super clever, for whom mere thermodynamics proper is infra dig. As indicated by the title of their book, they live in a superior universe, that of 'thermal physics'. In my opinion, they do not intend to be, and are often enough not, good sources on mere thermodynamics proper. Chjoaygame (talk) 11:18, 15 November 2020 (UTC)Reply
Comment: Wikipedia is not a place to turn well-established concepts and terms into some philosophical excursion with obscure motives by editors with obscure qualifications. There is no 'thermodynamics proper', no 'super clever', or other such B.S. But indeed, the field of thermodynamics has evolved over the centuries; what was once a very narrow field is today a multifaceted array of endeavors. But the basics have not changed and are properly represented in most textbooks in common use at institutions. Wikipedia should reflect topics, especially science, with current, common understanding, not outdated ideas or hyperbolic interpretations; such might belong in a history section, if relevant to a solid basic representation. Many of the thermodynamics articles have become increasingly unreadable, and reading talk pages is not only painful, but probably harmful. kbrose (talk) 17:38, 15 November 2020 (UTC)Reply

Onsager relations


Some new edits, such as this one, have put in information about the Onsager relations. I think this will cloud the situation, or confuse readers. I favour deleting the freshly posted information. It has been inserted and deleted in the past, and I think the past deletions should hold. It may make some editors happy to show their knowledge, but I think it detracts from the article. Chjoaygame (talk) 02:51, 11 March 2021 (UTC)Reply

I think the Onsager relations should be here; they are famous enough to be called that, and even Fourth law of thermodynamics redirects there. Math can be avoided if necessary.--ReyHahn (talk) 11:08, 11 March 2021 (UTC)Reply
It seems that only Editor ReyHahn has offered reasons. I find them inadequate. I may respond. The comment "Math can be avoided if necessary" does not call for a reply.
The laws of thermodynamics are of monumental importance and generality. The Onsager relations have, by specialists, been called 'the fourth law', but that is not enough to earn them a title on the level of the celebrated three. The zeroth law just scrapes in as a fourth. The 'minus-oneth' does not make the grade, though it is fundamental, as recognised by Callen, asserting the existence of states of thermodynamic equilibrium, and essential to make sense of the subject. Fame does not establish relevance or notability for this article.
The Onsager relations are about microscopic partial fluxes, approximately valid when a non-equilibrium system is near enough to local thermodynamic equilibrium. They are supposed to yield quantities that are called 'entropy production', but they are not adequate to account for the fullness of macroscopic entropy change. Their range of applicability is not well defined. Interesting and important though they are, in the ordinary sense of the words, they do not amount to laws of thermodynamics.
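For concreteness, the usual near-equilibrium statement of the relations, in my notation (the subscripts i and j label the coupled fluxes and forces), is

\[ J_i = \sum_j L_{ij} X_j, \qquad L_{ij} = L_{ji}, \]

where the J_i are fluxes (of heat, matter, charge), the X_j are the conjugate thermodynamic forces, and the associated entropy production is σ = Σ_i J_i X_i ≥ 0; the reciprocity L_ij = L_ji is claimed only in the linear regime, near local thermodynamic equilibrium.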
Their presence in the article detracts from its clarity and weight, especially for non-specialist readers, who might be misled or confused by it. This is summarized in the history section of the article by the sentence "Additional laws have been suggested, but have not achieved the generality of the four accepted laws, and are generally not discussed in standard textbooks." Chjoaygame (talk) 06:54, 13 March 2021 (UTC)Reply
I am having a hard time understanding why the Onsager relations were not found anywhere in this article; we should aim for completeness as an encyclopedia as long as there are appropriate sources (of course, with the right proportional weight). If needed, we could add more details about their limitations for clarity. As said before, "fourth law of thermodynamics" redirects there, and it is consistent with other articles; see Thermodynamic equations, Scientific law, Timeline of scientific discoveries. The notability of the Onsager equations is also not contested, but if we are not basing this on notability, what is the actual guideline? Hide advanced topics from undergraduates? I want to also note that in my original edit I added it under "Additional laws"; it was another user (Kbrose) who promoted it to its own section.--ReyHahn (talk) 09:39, 13 March 2021 (UTC)Reply
Thank you for your further comments. They do not directly or adequately address the points in my post just above.
In the past it has been proposed to include the Onsager relations as "laws of thermodynamics", but the proposals have not stood up.
Why? Clues are in the headings 'Additional laws' and 'Onsager relations'. As ordinarily understood, they are not numbered amongst the laws of thermodynamics. That is not to deny that they are interesting, notable, or important. For an encyclopaedia, completeness does not mean that a particular topic appear in some chosen article; it requires that it appear in an appropriate article. This article is not about non-equilibrium thermodynamics; it is about the ordinarily recognised laws of macroscopic thermodynamics. With perhaps some exceptions, established texts that discuss the Onsager relations do not refer to them as expressing a "law of thermodynamics". Examples are Attard, Blundell & Blundell, Byung Chan Eu, Callen, De Groot & Mazur, Grandy, Landau & Lifshitz, Lebon & Jou & Casas-Vázquez, Truesdell. Gyarmati refers to "Onsager's linear laws", to "the Onsager principle", and to "the Onsager theory". I grew weary of searching standard texts before I found an exception. Sad to say, I do not have access to your cited book by Deffner. I think the reference to the Encyclopaedia Britannica lacks cogency for the present article. A reference to a research article in a journal is also inadequate. Wikipedia is not a reliable source. Chjoaygame (talk) 13:14, 13 March 2021 (UTC)Reply
Why have those proposals not stood up? I only see a previous version of that section [1], and I can see why it was removed: it was poorly written. If what we need are textbooks, there are many others; just a Google search gives you [2][3][4][5], and it was even referred to as such during the Nobel award ceremony speech [6]. Note that I am not arguing for the Onsager laws to be addressed indisputably as the fourth law. I am just saying that somewhere in this page there should be minimal discussion of the Onsager reciprocity relations as a candidate for the fourth law. One proposal could be to open a section for relevant candidates.--ReyHahn (talk) 15:36, 13 March 2021 (UTC)Reply
The section is short and to the point. Onsager's work was instrumental to the field and he was recognized for this. I see no reason this should not be included. The article is about laws of thermodynamics, not necessarily the three or four laws of thermodynamics. There is nothing confusing about this section. On the contrary, it is clear and less confusing than other recent additions about non-equilibrium aspects in WP's thermodynamics articles. kbrose (talk) 20:28, 14 March 2021 (UTC)Reply

Shouldn't there be something on the popular (humorous) mnemonic summary of the 0-plus-3 laws?
(
(0 - everyone is playing the game)
1 - you can't win
2 - you can't break even
3 - you can't get out of the game
)

I mean, there *is* a "see also" link to Ginsberg's theorem at the very end,
but that's all.

What I mean is:
shouldn't there be something in the text of *this* article too,
likely even its own little section? [Contribution by editor Kayseychow 11 May 2021]

This summary has been attributed to C. P. Snow; see for example this website. I agree that it might be interesting or amusing to add it as a concluding section. Dirac66 (talk) 20:47, 11 May 2021 (UTC)Reply
There is a separate article for that: Ginsberg's theorem.--ReyHahn (talk) 22:18, 11 May 2021 (UTC)Reply
Thanks. I see that it has been in the See also list of this article for 11 years, but I would never have thought to look there. So to help others, I have added a brief parenthetical note to the See also item. Now if someone searches for the words "can't win" or "can't break even", they will be directed to the right place. Dirac66 (talk) 23:12, 11 May 2021 (UTC)Reply

Onsager section


The subscripts should be defined; I don't know if it's right. 192.184.204.28 (talk) 05:33, 4 February 2023 (UTC)Reply

History


Maybe the history section should mention Mayer and Joule? DrKN1 (talk) 12:00, 21 January 2024 (UTC)Reply

Biology


Ways in which energy is lost in an ecosystem. 105.113.106.24 (talk) 21:59, 29 April 2024 (UTC)Reply