Talk:Laws of thermodynamics/Archive 1
This is an archive of past discussions about Laws of thermodynamics. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Are the 'laws' of thermodynamics a subset of the principles of energetics?
I made the assertion that the 'laws' of thermodynamics can be considered a subset of the principles of energetics. Karol edited this out on the grounds that most people do not see this to be the case.
1. Do most people believe that the 'laws' of thermodynamics are not a subset of the principles of energetics? 2. Does it matter how many people believe in something for it to be true or useful? 3. What is a 'law' or 'principle'? 4. Is the fourth principle, which is based on electronics, a fourth law of thermodynamics?
In the text it says: They are often associated with concepts far beyond what is directly stated in the wording. I think that this might be a good point to provide an example, so I have added the following: For example, some scientists believe they can be considered a subset of the Principles of energetics; however, currently this is not a widely held opinion. I hope that this is appropriate. Sholto Maud 21:38, 29 November 2005 (UTC)
- I think extensions and less widely held opinions may need a separate section, but if it's not a widely held opinion, I don't think we should put it in the introduction. I moved it to a separate section, along with Heat death. PAR 03:09, 30 November 2005 (UTC)
- Nice one PAR :). I'd still like it confirmed whether the Fourth principle is included in the "laws" of thermodynamics Sholto Maud 03:20, 30 November 2005 (UTC)
- I don't know about that - but I do know that the Onsager reciprocal relations are sometimes called the fourth law of thermodynamics. I'm not too familiar with them, though. PAR 04:12, 30 November 2005 (UTC)
Sholto, I'm sorry for my rapid edit, I meant to comment on it. I don't have the time right now, but I'll get back to it soon. Karol 08:41, 30 November 2005 (UTC)
One of the most crippling aspects of the laws of science is how researchers and theorists take them for granted as inviolable prescriptions of how nature must work. It is important to remember that they are merely descriptions of how nature universally and invariably appears to work, and may be disproved if new facts or evidence arise to contradict them. Many speculative theories in other Wikipedia articles link to this article with the suggestion that the theory is impossible because it would violate one of these laws, when in fact the theory may be an exception to these laws. —Preceding unsigned comment added by Gcsnelgar (talk • contribs) 05:37, 29 December 2007 (UTC)
Some background, history?
Since now there is a separate article about these laws, it would be nice to put in some background info, a little bit of history, maybe some anecdote? Karol 20:50, 28 October 2005 (UTC)
- I noticed the history section makes unclear commentary about the transition from using the term principle to using the term law to name the laws of thermodynamics. This isn't really a strong or reliable point to make. It's nomenclature only. It doesn't mean much beyond that.--68.122.193.47 (talk) 03:19, 24 December 2007 (UTC)
- But how are these related? —Preceding unsigned comment added by 98.232.108.182 (talk • contribs) 16:10, September 4, 2010
Maxwell's demon
Maxwell's demon might exist, but it must use energy to detect the particles it is selecting, and the resultant increase in entropy offsets the amount lost by the actions of the demon. PAR 03:18, 14 November 2005 (UTC)
We don't have an article on the combined law of thermodynamics. Should it go here? -- Kjkolb 12:45, 1 December 2005 (UTC)
- This is presently part of the Thermodynamic potentials article under "fundamental equations", but yes, it is in a way the most fundamental of the fundamental equations and should be mentioned here. PAR 04:02, 2 December 2005 (UTC)
- Added to page per request (I've seen this version used as well).--Sadi Carnot 02:21, 5 April 2006 (UTC)
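For reference, the combined law discussed in this thread folds the first and second laws into a single relation. A sketch in standard notation, for a simple closed system with only pV work (sign conventions vary between texts):

```latex
% Combined law of thermodynamics: the first law (dU = \delta Q - \delta W)
% together with the second law (\delta Q \le T\,dS) gives, for pV work only,
dU \le T\,dS - p\,dV
% with equality holding for reversible (quasi-static) processes.
```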
Other Laws
In my General Chem Lecture the other day, the professor mentioned a -1st law which defines the number of variables needed to completely define the state of a system given the number of components in the system. Is there a reason why that is not included here? Gershwinrb 08:48, 13 January 2006 (UTC)
- What you speak of is not the 1st Law, but rather the state postulate developed by Willard Gibbs, i.e. the number of properties required to fix the state of a system is given by the state postulate, which dictates that the state of a simple compressible system is completely specified by two independent, intensive properties. See: thermodynamic state.--Sadi Carnot 00:33, 5 April 2006 (UTC)
Energy...work law of t/ds
What ever happened to "energy can neither be created nor destroyed, only converted to a different form, or used to do work"? I know this isn't possibly a valid source, but my chemistry teacher (yes, I am fully aware it is physics, but they cross over a lot in the AS Mod: CHM2) states this as the first law, and he's an Oxford graduate; discuss if you don't mind. Have you missed it, or put it in overcomplicated terminology so that I've missed it?
Infinity
I'm a high school student. The way I understood the article, it said that everything is going to end eventually simply because it began. Am I right or wrong? Please correct me. But if I am right, doesn't that imply that if something doesn't stop, it never began, and therefore goes on into infinity? ch 17:25, 24 March 2006 (UTC)
- Close; but, essentially, from what we currently know of the universe, there are 12 fundamental particles and 4 fundamental forces, and this collection of particles interacts via exchanges such that localized associations of these particles will tend to evolve into equilibrium configurations or "attractors" for set durations of time, which we can loosely describe as twice the half-life of the configuration; the 4 Laws of thermodynamics simply set limits to the types of interactions possible.
- Once an observable interaction begins, it will spontaneously continue to react, interact, or evolve towards a more stable energy configuration, and through this process some energy will be lost to dissipation (2nd Law). Next, the energy-mass content of the initial "state" of each interaction process must be equal to that of the "final" state (1st Law). Each such observable system can be assigned an arbitrary temperature, i.e. the ability of the system to give up energy; if two adjoined systems have the same temperature, they are said to be in thermal equilibrium (0th Law). To "force" any such system to absolute zero (0 K) would require an infinite amount of energy; hence, absolute zero is unattainable (3rd Law). Regarding whether or not these interactive processes will continue on into infinity, that question currently remains open. This is what the article should essentially say.--Sadi Carnot 00:59, 5 April 2006 (UTC)
"Chaos"
The term "chaos" is used in the following sentences: "the Second Law states that energy systems have a tendency to increase their entropy (chaos) rather than decrease it" and "if one imagines atoms flying around in a box, hitting each other randomly, all the time, one can imagine a lot of chaos. Then, imagine what would happen if the temperature begins to decrease. The atoms slow down, hit each other less frequently, begin to settle as gravity has more effect on them; the chaos decreases" in the second and third law sections, respectively.
Does anyone else agree that using the term "chaos" to describe entropy is a misleading, if not outright incorrect, description of entropy? For one thing, "chaos", in this context, isn't really given a firm definition, and I assume we're left to picture a box full of gas particles flying about at high speeds and in random directions and colliding with each other all the time, or some such scenario. This picture, though it might describe what's going on in the box, doesn't seem to me to provide an accurate conceptual picture of what entropy is.
I would suggest that the meaning to be conveyed by using the word "chaos" is presented more clearly, and that the word "chaos" be removed completely, so that people do not make erroneous connections between entropy and what they think chaos means (or even Chaos Theory, I suppose).
-Unregistered User, 07/31/06
- This seems like a reasonable idea (although many do argue that entropy is a measure of chaos); I'll make the change to "transformation content"; see: Mechanical Theory of Heat – by Rudolf Clausius [1850-1865]; Thanks:--Sadi Carnot 02:57, 1 August 2006 (UTC)
The 19th century notion of describing entropy as "chaos" or "randomness" is silly and circular. The statement above about a system with "increasing chaos" attempts to quantify a chaos which is quite ill-defined in a quantitative sense. No! Please read a book on modern statistical physics. Entropy is directly proportional to the logarithm of the number of available states in a system; that is a quantitative and theoretically measurable definition. See Kittel, "Solid State Physics", J. Wiley, circa 1970, for a good development.
And at any rate, if you're going to use that probably inaccurate qualitative explanation, you should say "disorder" rather than "chaos," as chaos has a very specific mathematical meaning nowadays which is far removed from what it would mean in this context.--129.2.109.84 17:43, 27 August 2007 (UTC)
"Biology"
I've changed
- The majority of these tentative fourth law statements are attempts to reconcile the thermodynamics with evolution, predominantly.
to
- The majority of these tentative fourth law statements are attempts to apply thermodynamics to biological evolution.
since the former seemed to suggest a contradiction between thermodynamics and the theory of evolution (as well as some observed phenomena). Even if that were the case, however, adding an additional law cannot possibly rectify a contradiction. From the material I have read (albeit, on Wikipedia) it seems that the latter statement more correctly summarizes the state of theory.
"Question"
RE: The first law of thermodynamics, that energy can neither be created nor destroyed: if this were true, how did energy come into being in the first place? For if it cannot be created, then how is it that it even exists? At some point, didn't it have to be created to exist today? Makes you think, doesn't it! Paul-11/28/06
yes, this implies that the 1st law is either wrong, is not a law (but an overwhelming probability), or energy wasn't even created during the big bang. If the latter is true, energy existed before the big bang. —Preceding unsigned comment added by 77.234.145.84 (talk) 12:09, 7 February 2010 (UTC)
Andrej 7.2.2010
Recent vandalism
On November 9, 2006 user 59.95.110.209 removed six main sections of this article in their entirety and completely blanked two related articles. Administrators: please revert to October 31, 2006 version by 68.186.185.49. --Neptunius 07:50, 4 December 2006 (UTC)
Edit from: 18:14, 11 November 2007 JzG (Talk | contribs) (14,319 bytes) (rm. blacklisted link) makes no sense; vandalised? —Preceding unsigned comment added by Snakeater (talk • contribs)
Third Law
I have included a very brief but correct statement of the core of the third law. The previous statement was very fuzzy, and included surprisingly irrelevant concepts that are much more complicated than the simple ingredients of the third law. I would welcome further editing to improve the appearance and expand upon the discussion and the historical references. For the mean time, however, it seemed important to have something correct and to the point. --Mbweissman 14:50, 23 December 2006 (CST)
- Any idea what the statement at the end of this section means? "This law is in fact proven false by all evolutionists and creationists." Should we remove this or does it have some significance? --Acjohns 03:53, 15 April 2007 (UTC)
- Christian vandalism Hamsterlopithecus (talk) 18:14, 23 April 2010 (UTC)
Second Law
I've heard from many sources that the law of conservation of matter and energy is only broken in the conversion of matter into energy as opposed to merely the release of energy. I am speaking of course of nuclear reactions. If we have consensus I'll edit the article. —The preceding unsigned comment was added by 63.229.221.138 (talk) 01:25, 4 January 2007 (UTC).
Could you please edit the double-quoted definition too? The double negative and the reference to "cycles" make the wording confusing and almost useless. This isn't the 19th century and we don't need the stilted language constructs.
Equations
Hey, the equations in Overview do not explain what each variable is, and they're definitely not self-explanatory to an outsider to physics. Would someone care to note which variable is which? Thanks. -Matt 15:37, 4 March 2007 (UTC)
what about in nuclear processes?
first law says that "In any process, the total energy of the universe remains constant."
what about in nuclear reactions where energy is actually created via conversion from matter? btw, is there proof of this occurring?
- In fact all conversions of energy to/from some potential energy (such as nuclear potential energy) alter the mass of the particles... so two hydrogen atoms weigh more than one hydrogen molecule, because some energy was released in the reaction. This is discussed briefly in mass-energy equivalence Acjohns 03:46, 15 April 2007 (UTC)
I would say that "the total energy (including the E=mc^2 energy) of a closed system is constant". "Doing work" is just another kind of conversion. AFAIK, the entire universe is such a closed system. —Preceding unsigned comment added by 70.88.177.137 (talk) 09:45, 30 December 2008 (UTC)
- Nuclear reactions convert one form of energy (the energy stored in a nucleus) into another form of energy (heat). It has been proven a million times. --Steve (talk) 01:15, 31 December 2008 (UTC)
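The point made above, that conversions between potential energy and mass conserve total energy once the E=mc² term is counted, can be illustrated with a back-of-envelope calculation. This is only a sketch: the H–H bond energy of roughly 436 kJ/mol is a standard textbook figure, not a number from this page.

```python
# Rough illustration of mass-energy equivalence (E = m c^2):
# the mass "lost" when two hydrogen atoms bind into an H2 molecule
# equals the bond energy released, divided by c^2.

C = 2.998e8             # speed of light, m/s
AVOGADRO = 6.022e23     # molecules per mole
BOND_ENERGY_H2 = 436e3  # J/mol released forming the H-H bond (textbook value)

energy_per_molecule = BOND_ENERGY_H2 / AVOGADRO  # joules per molecule
mass_defect = energy_per_molecule / C**2         # kilograms per molecule

print(f"energy released per H2 molecule: {energy_per_molecule:.2e} J")
print(f"mass defect per H2 molecule:     {mass_defect:.2e} kg")
# The defect (~8e-36 kg) is only a few parts per billion of the H2 mass
# (~3.3e-27 kg), which is why chemistry treats mass as conserved to
# excellent accuracy, while nuclear reactions make the defect measurable.
```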
Clausius Inequality - Typo?
I am afraid there may be a mistake in the Clausius inequality formula on this page (in Overview / Second law of thermodynamics, and also on the page Second law of thermodynamics), as the integral must be less than or equal to zero (not the other way round, as on the page):
<math>\oint \frac{\delta Q}{T} \le 0</math>
I'd rather stick to the differential form
<math>\mathrm{d}S \ge \frac{\delta Q}{T}</math>
Let's run a cyclic process now and integrate:
<math>\oint \mathrm{d}S \ge \oint \frac{\delta Q}{T}</math>
We get <math>\oint \frac{\delta Q}{T} \le 0</math>, which I stated above. (The zero on the left-hand side, <math>\oint \mathrm{d}S = 0</math>, is just because entropy is a state function.)
You can see this result also on page Clausius theorem. Crocusino 17:43, 12 May 2007 (UTC)
- There's no typo, that page has the math all backwards. I put a clean-up tag there. --Sadi Carnot 15:37, 30 September 2007 (UTC)
Creationist PoV pushing
User 80.195.246.104 keeps adding false statements claiming that the laws of thermodynamics contradict the Big Bang and Evolution. These are recognisable as common creationist arguments, and whilst I know I'm supposed to assume good faith it's hard to do so when such comments are stealthily added with the edit summary claiming that it's merely "(Small additions, corrections, hyperlink-insertions etc)". In keeping with wikipedia policy the bold claims this user keeps inserting require reputable scientific sources before they can be allowed to stay. Ifitmovesnukeit 23:24, 24 June 2007 (UTC)
- I reverted back some, and cleaned. It should be good now. --Sadi Carnot 15:26, 30 September 2007 (UTC)
The whole article is a mess
For a subject of such fundamental importance as thermodynamics this page is a complete mess. The "overview" is useless, the essential physical content of the theory being obfuscated by undefined symbols. The claim that the second law is not a law but a theorem is interesting, if true, but certainly shouldn't be made without citing the article where this is proven. —Preceding unsigned comment added by 65.19.15.68 (talk) 04:10, 7 April 2008 (UTC)
Five laws?
Moved from History section: "Presently, there are a total of five laws."
Seems to be contradicted by the rest of the article and doesn't have a cite. -- 201.37.229.117 (talk) 20:50, 5 May 2008 (UTC)
Second Law
The section on Second Law makes the statement:
“A way of looking at the second law for non-scientists is to look at entropy as a measure of chaos. So, for example, a broken cup has less order and more chaos than an intact one.”
The use of the broken cup analogy is very misleading to any non-scientist reading this article, because it suggests to the reader the conversational usage and understanding of the terms order and disorder. Reading the Wikipedia article on Chaos suggests a completely different understanding of what is meant by chaos. Many people will assume they know what chaos is and may not bother to read it. And even if they do read the article on chaos, they will be left confused by what the broken cup analogy was supposed to suggest to them. I certainly think a better analogy can be found, and if it can't, then this analogy should simply be replaced by the following sentence, quoted from the article on Chaos.
“Chaotic systems are systems that look random but aren't. They are actually deterministic systems (predictable if you have enough information) governed by physical laws, that are very difficult to predict accurately (a commonly used example is weather forecasting).”
I've heard a better analogy as well: ice cubes in a glass of water will eventually melt, leading to maximum entropy of that closed system. —Preceding unsigned comment added by 24.63.28.22 (talk) 02:19, 16 October 2008 (UTC)
Rewrite "overview" section
As has been pointed out on this page, the "overview" section as written is completely useless. I propose the following replacement:
- Zeroth law of thermodynamics - This law <s>defines temperature</s> underlies the definition of temperature.
- First law of thermodynamics - This law mandates conservation of energy, and states in particular that heat is a form of energy.
- Second law of thermodynamics - This law states that perpetual motion machines are impossible, or (equivalently) that the entropy of the universe always increases.
- Third law of thermodynamics - This law defines the absolute zero of temperature and the zero-point of entropy.
- Onsager reciprocal relations - sometimes called the Fourth Law of Thermodynamics. This law gives a quantitative relation between the parameters of a system in which heat and matter are simultaneously flowing.
Sound OK? :-) --Steve (talk) 06:16, 5 November 2008 (UTC)
- Anything is surely better than the terse mathematical summary there is at the moment! A few comments though: The third law doesn't define absolute zero, but merely talks about the impossibility of attaining it. And I think it's a bit strong to say the zeroth law defines temperature. LeBofSportif (talk) 15:47, 5 November 2008 (UTC)
- I changed the zeroth law. I'm not actually sure the third law "talks about the impossibility of attaining" absolute zero. For example, my thermo textbook ("Thermal Physics" by Kittel/Kroemer) doesn't mention that as part of the third law. The third law of thermodynamics page mentions this only in a brief, uncited, incongruous way. Is there a citation for the claim that this is part of the "third law of thermodynamics"? I'm also posting this on Talk:third law of thermodynamics. --Steve (talk) 16:06, 5 November 2008 (UTC)
- Ok, I was a bit loose in phrasing and not really thinking. The absolute zero of temperature is established by the other laws of thermodynamics, not by the third law. The third law does have implications for its attainability. If there are experts loitering around on the third law page then their input would be helpful to decide a rewording. LeBofSportif (talk) 01:59, 6 November 2008 (UTC)
Zeroth law as opposed to fourth
The article says:
- While this is a fundamental concept of thermodynamics, the need to state it explicitly as a law was not perceived until the first third of the 20th century, long after the first three laws were already widely in use, hence the zero numbering.
This does not explain why it is called the zeroth law and not the fourth law. I would guess that the reason for calling it the zeroth law is that it is more fundamental than the other three, but this is not stated. — 217.46.147.13 (talk) 14:25, 15 May 2009 (UTC)
I don't have time at the moment to make the appropriate changes, but the "zeroth law" is not in fact the most fundamental. Really the 2nd law is. From it one derives that any two large systems must, in equilibrium, have the same derivative of entropy with respect to any exchangeable conserved quantity. Since energy is such a quantity (that's the first law), and 1/T is defined to be that derivative, any two systems large enough to have a defined T must have the same T in equilibrium if they can exchange energy. The part about a third system follows trivially from transitivity of equality. So the zeroth law is really superfluous. The third law does have some content: up to the Kramers degeneracy, the ground states of systems seem to be unique. The current discussion on that and other issues is fuzzy enough to be worth clearing up. —Preceding unsigned comment added by Mbweissman (talk • contribs) 21:30, 10 May 2010 (UTC)
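The derivative argument sketched in the comment above can be written out in symbols; this is the standard statistical-mechanics version, not text from the article itself:

```latex
% Two systems exchanging energy, with total entropy
% S = S_1(U_1) + S_2(U - U_1), U fixed by the first law.
% Maximizing S at equilibrium gives
\frac{\partial S_1}{\partial U_1} = \frac{\partial S_2}{\partial U_2},
\qquad \text{and defining } \frac{1}{T} \equiv \frac{\partial S}{\partial U},
\quad T_1 = T_2 .
% Transitivity of equality then yields the zeroth law:
% if T_1 = T_2 and T_2 = T_3, then T_1 = T_3.
```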
It may be fortunate that the previous talker doesn't have time at the moment to make the changes that he tells us would be appropriate. He has a rather esoteric and rationalistic bent that I think is at odds with the more empirical approach to thermodynamics indicated by the usual numbering of its laws. The special approach to the subject that its proponent Truesdell calls "Rational Thermodynamics" does indeed take entropy as definable before temperature is defined, but this is likely to mystify most readers, for they are mostly not advanced experts, I think. The more customary approach is to use the concept of temperature to define entropy. The numbering of the laws in the article as it now stands is standard amongst standard textbooks. The previous talker is apparently telling us that he would like to see his own research, into how the fundamentals of the subject are most rationally constructed, used to edit into the page his own invention of a new numbering system; this kind of own-research editing I think is against Wikipedia principles.Chjoaygame (talk) 07:45, 11 May 2010 (UTC)
"the entropy of the universe"
There are two usages of the word universe. One usage is the ordinary language one, which refers to all that actually is. I think it likely that people reading the article will take this meaning. The other usage is more technical for thermodynamics. It refers to a finite part of the ordinary language universe. It is quite likely that a naive reader may confuse these two usages, and so be seriously misled.
The notion of entropy requires a system and its surrounds. Entropy concerns transactions of heat and work between the system and its surrounds. In the technical usage of "universe", it means the finite surrounds of the system, and thus has a definite physical meaning. The key concept of thermodynamics is temperature, as defined by the zeroth law. To define the temperature of a system, one must have a reference system in thermodynamic equilibrium to constitute one's thermometer, which is the surrounds in suitable cases. It is then called a heat bath or thermostat.
The ordinary language universe includes all that actually is, and thus has no actual surrounds. It can exchange heat and work with nothing. Accordingly, entropy is not defined for it. The technically defined universe is no more than a loose wording for the definite surrounds of a definite system; for this usage, the content of the second law does not need the rather colourful but potentially seriously misleading use of the word 'universe'.
Thinking a little more empirically about the entropy of the universe. It is not really known for sure whether the universe in the ordinary language sense is finite or infinite. In the light of this kind of uncertainty, one can hardly reasonably talk of measurements of the entropy of the universe.
Thinking a little more about the physics of the known universe. There are those who think that there are such things needed to explain it as "dark energy" and "dark matter". These terms are euphemisms for statements that, on present empirical understanding, the first law of thermodynamics is wildly in error. The terms "dark energy" and "dark matter" are just polite ways of saying that our present empirical knowledge does not let us verify the first law for the known universe. The terms "dark energy" and "dark matter" do not have definite physical meanings. One must then ask: why would we expect the second law to be verified for the known universe if the first one cannot be?
Thinking a little more about the meaning of entropy. In strict classical thermodynamics, entropy is defined only for systems in thermodynamic equilibrium. It is believed by some, and one cannot necessarily prove them wrong, that the universe began at some finite time, long ago; people talk of the "Big Bang". Such a universe is very far from being in thermodynamic equilibrium, and thus entropy is not defined for it.
Yes, many perhaps otherwise reliable sources talk of the entropy of the universe, but I think their talk in every case falls under the above criticism and is essentially wrong or seriously misleading.
It is not wise for scientists to claim to know things that they do not really know. Science is under enough attack these days without it adding to its problems by making claims about the universe which it is not really sure about.Chjoaygame (talk) 07:50, 16 August 2009 (UTC)
- It's not true that entropy is defined only for systems in thermodynamic equilibrium. For example, if I have a box with a partition, and half the box is a gas with temperature 50°, and the other half is a gas with temperature 60°, then it's very easy to compute the total entropy of the box (it's the sum of the two halves), but the box is obviously not a system in thermodynamic equilibrium. If entropy was only defined for a system in thermodynamic equilibrium, then there would never be a physical situation where entropy increases, because a system in thermodynamic equilibrium is stable and won't change.
- But whatever...I think the way you rephrased it is fine, even though I think it was also fine before. So, no point in arguing further :-) --Steve (talk) 19:31, 16 August 2009 (UTC)
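The two-compartment box described above can be made quantitative. A minimal numerical sketch, assuming two equal, constant heat capacities C (so each side's entropy change is C ln(T_final/T_initial)); the numbers are invented purely for illustration:

```python
import math

# Two gas compartments with equal, constant heat capacity C, initially
# at different temperatures (absolute temperatures, in kelvin).
C = 1.0                  # heat capacity of each side, J/K (arbitrary)
T1, T2 = 323.0, 333.0    # initial temperatures of the two sides, K

# The entropy of the partitioned (non-equilibrium) box is well defined:
# it is the sum of the two equilibrium sub-system entropies.
# If the partition conducts heat, energy conservation fixes the final
# common temperature:
Tf = (T1 + T2) / 2

# Entropy change of each side at constant C: dS = C * ln(Tf / Ti)
dS_total = C * math.log(Tf / T1) + C * math.log(Tf / T2)

print(f"final temperature:    {Tf} K")
print(f"total entropy change: {dS_total:.6f} J/K")
# dS_total = C * ln(Tf^2 / (T1*T2)) > 0 whenever T1 != T2, by the
# AM-GM inequality -- the second law in miniature: the well-defined
# entropy of the out-of-equilibrium box increases as it equilibrates.
```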
Thank you for your reply. It is valuable to discuss these questions. You propose the box with the partition, that is then removed; this is an excellent example to consider. It depends how one defines 'system'. If one defines the system to be 'box with partition in place', one can say that it is in equilibrium while the partition is in place; and its entropy is the sum of the entropies of the two partitioned components; this is called extensivity; extensivity is not quite as simple as it might seem; you might like to read Walter T. Grandy, Jr, "Entropy and the Time Evolution of Macroscopic Systems", Oxford Science Publications, Oxford, 2008, ISBN 9780199546176, Chapter 5, The Presumed Extensivity of Entropy, pages 59-68, on this point. This may seem an artificial way of considering thermodynamic equilibrium, and perhaps it is, but it is the classical thermodynamics way to approach the thing. When one considers a new system, 'box that was in equilibrium with partition in place, but is now out of equilibrium because the partition has recently been removed', then one can wait till it reaches equilibrium and calculate its entropy again. It will, as you say, have increased. One can try to define entropy for systems really not in thermodynamic equilibrium, but that is not within the scope of classical thermodynamics, and requires great care and thought. The key to classical thermodynamics is the existence of a unique temperature. In non-equilibrium thermodynamics it is not necessarily so that a temperature exists, not even a local temperature. This is why classical thermodynamics does not extend to non-equilibrium systems.Chjoaygame (talk) 01:54, 17 August 2009 (UTC)
- I think you misunderstand me. The partition is never removed. A box, with a partition, with different temperatures on the two sides, is not in thermodynamic equilibrium. Something in thermodynamic equilibrium has one and only one temperature. Therefore the box with a partition is not a system in thermodynamic equilibrium. But its entropy is well defined and easily calculated in classical thermodynamics. Therefore, classical thermodynamics is capable of calculating the entropies of at least some out-of-equilibrium systems.
- I agree that you can't figure out the entropy of an arbitrary out-of-equilibrium system with classical thermodynamics. But you can with modern physics, and the second law will still hold. The second law was originally about heat engines that used ideal gases as working fluids, but we now know that it's a very broad and general thing. I think it would be misleading to say that the second law is a statement about equilibrium sub-systems undergoing quasi-static transformations, or whatever. It's general, and we should say it's general...it's not the 19th century anymore. --Steve (talk) 02:25, 17 August 2009 (UTC)
Thank you for this comment. For classical thermodynamics, I think one needs to admit systems with several compartments. For example, one considers a system with vapour and liquid phases in equilibrium with each other. One can imagine putting an isolating partition between them and calculating their separate entropies. One can imagine the two separate compartments transacting with separate respective environments and then being isolated from them, and then being allowed to communicate with each other. If you like to say that the 'box with separated isolated compartments each separately in thermodynamic equilibrium' is not in thermodynamic equilibrium then you are free to do so, of course. But you calculate their entropies separately. An isolated compartment in thermodynamic equilibrium has one and only one temperature, as you say.
Perhaps you may wish to read more of Grandy's book mentioned above. It covers the questions you raise.Chjoaygame (talk) 09:49, 17 August 2009 (UTC)
References
An article titled "A FOURTH LAW OF THERMODYNAMICS" is cited as a reference. In the pdf referenced, the name of the publication is "Chemistry", and in the reference that name is copied. "Chemistry" lends an authority to the reference, but after opening the pdf it didn't seem to be from a top-level publication, even though I'm not from the chemistry area.
Actually, after some research, I understood that the paper exists, but it is actually from Khimiya/Chemistry, a Bulgarian journal for chemical education. Okay, it's published. But the name "Chemistry" should be changed.
I'm not used to wikipedia and I don't know how to do that, as only {References} appears in the edit page. —Preceding unsigned comment added by Japaa (talk • contribs) 13:03, 10 September 2009 (UTC)
time to delete the section "Tentative fourth laws or principles"
The section "Tentative fourth laws or principles" is half-baked waffle and should be deleted without replacement. There is nothing in the present section that is worth keeping in the Wikipedia. Occasional proposals of laws beyond the classical ones are noted already in the introduction of the page: "During the last 80 years writers have suggested additional laws, but none of them have become well accepted." This is a wise remark and it shows that comment about further proposed laws is not appropriate for the Wikipedia, which is not in general a place for vague and disorganized speculation. The prolonged life of the present section "Tentative fourth laws or principles" stands against this wise remark, and is to be regretted. Time to end it.Chjoaygame (talk) 10:00, 23 January 2010 (UTC)
- I agree, but we should find a citation for that sentence in the introduction. If it were up to me, being lazy, I would just copy the citations from the section-to-be-deleted into that sentence in the introduction. --Steve (talk) 22:47, 23 January 2010 (UTC)
Thank you, Steve. But I would not like just to copy the citations as you suggest, because they are not representative enough, I think.Chjoaygame (talk) 00:52, 24 January 2010 (UTC)
a repetition of the view that it is time to delete
The section "Tentative fourth laws or principles" is half-baked waffle and should be deleted without replacement. Only one comment, by Steve, has been made in response to this proposal to delete the section. The comment agreed with the proposal to delete, but wanted a reference to support a relevant dismissive sentence in the main introduction to the page. Go ahead and supply such a reference. But still, now is the time to delete this section without replacement.Chjoaygame (talk) 21:26, 19 February 2010 (UTC)
section deleted as proposed, references preserved
The whilom section "Tentative fourth laws or principles" was half-baked waffle and, as proposed above, I have deleted it. The references are preserved beside a comment in the introduction to the page.Chjoaygame (talk) 00:42, 20 March 2010 (UTC)
- Nice job, thanks! :-) --Steve (talk) 07:31, 20 March 2010 (UTC)
Deleted sources in introductory statement
I have deleted the sources for the statement "There have been suggestions of additional laws". I do not doubt that sources for this statement exist, but the sources previously cited were from blogs and such. Hamsterlopithecus (talk) 18:28, 23 April 2010 (UTC)
time to delete the section "Summaries"
The section "Summaries" is trying to be clever and funny. But it doesn't deserve a place in the Wikipedia. It should be deleted without replacement.Chjoaygame (talk) 05:01, 31 January 2010 (UTC)
definition of work
It would be good if some expert and diligent editor were to put in more of the definition of thermodynamic work. The present page shows only the pV work, but there are many other components of thermodynamic work, such as gravitational, electric, magnetic, and I suppose others, that are not mentioned here.Chjoaygame (talk) 02:38, 19 May 2010 (UTC)
wording in section on second law
What about "Energy can be transferred as heat ..."? It would be a pity, I think, to try to expunge the long- and well-established word 'heat' just to replace it with a more technical sounding and polysyllabic term, such as 'thermal energy', when 'heat' has done such good service over the years, and is well accepted in the literature. We agree to restrict heat to refer to a kind of transfer of energy, but it is not necessary to say that such energy is 'thermal'.Chjoaygame (talk) 23:26, 4 June 2010 (UTC)
English grammar
Energy can be neither created nor destroyed. Energy can neither be created nor be destroyed. "Energy can neither be created or destroyed" is not English.Chjoaygame (talk) 02:29, 19 June 2010 (UTC)
Aristotle
It seems to me that the phrase "Although it is customary for physicists to ridicule Aristotle" at the beginning of the history section is unnecessary and has a pro-Aristotelian tone that is irrelevant to the history of thermodynamics. I'm not an experienced editor, so I think some discussion about the possibility of removing this phrase might be in order. —Preceding unsigned comment added by 65.190.37.19 (talk) 02:01, 17 July 2010 (UTC)
- What you say is reasonable -- I hope you rewrite the sentence. :-) --Steve (talk) 02:52, 17 July 2010 (UTC)
- It is true that Aristotle is often, when his name is mentioned, ridiculed by physicists. I originally put in the comment to warn readers that they should be aware that Aristotle might seem to some physicists to be irrelevant, but that should not remove him from consideration in the history of ideas of this kind. It is true that after Aristotle there was a serious stagnation of science for a long time.Chjoaygame (talk) 08:00, 18 July 2010 (UTC)
I agree that much of Aristotle's work has been criticized in modern times, but I believe mention of that fact is unnecessary. His work on thermodynamics should stand on its own, without any mention of criticism of his overall body of work. Save that for Aristotle's biographical article; it shouldn't be included in a short history of thermodynamics. —Preceding unsigned comment added by 65.190.37.19 (talk) 17:31, 21 July 2010 (UTC)
Also the history section is poorly written in general. I think an experienced editor should rewrite it. —Preceding unsigned comment added by 65.190.37.19 (talk) 17:39, 21 July 2010 (UTC)
- Thank you for your comments. May I point out that I did, in response to your comment, actually immediately remove the phrase that offended you, and my comment here was only retrospectively stating my reason for putting it in originally, before I removed it in response to your concern. What is needed for a really useful rewrite of the history section is not just experience as an editor, but more importantly some serious knowledge of the history.Chjoaygame (talk) 23:36, 21 July 2010 (UTC)
Reasons for undoing edit of the zeroth law
The now undone edit differs from the previous state of the page (1) by using a notation A, B, C; (2) by referring to thermodynamic as distinct from thermal equilibrium; and (3) by being very nearly a changed repeat of the leading sentence that was already in the section that was edited.
The distinction between thermodynamic and thermal equilibrium is important. The zeroth law is stated in standard textbooks as being about thermal equilibrium, not about thermodynamic equilibrium. This is because the law is concerned only with the basic idea of temperature. The editor (who does not give a name) who made the now undone edit did not attempt to tell why he/she made the change, nor to justify it, nor to give a literature reference for it, nor to deal with the fact that the now undone edit introduced a clear inconsistency with the lead sentence of the section.
The now undone edit is not clear in that it does not make clear that the mentioned equilibria are intended in the law to be pairwise. This unclarity is also present in the leading sentence of the section. Perhaps this unclarity in the leading sentence should be remedied, but in any case it should not be repeated by a further edit. —Preceding unsigned comment added by Chjoaygame (talk • contribs) 21:14, 3 August 2010 (UTC)
Sorry I forgot to sign.Chjoaygame (talk) 21:28, 3 August 2010 (UTC)
- I restored the transitivity relationship, making sure that it described thermal equilibrium. Transitivity is a "pairwise" (binary) relationship: two systems are compared at a time. Please see equivalence relation. An equivalence relation has three parts:
- Reflexivity a->a
- Symmetry a->b implies b->a
- Transitivity a->b and b->c implies a->c
- You cannot derive any one of the above relationships from the other two; they are all independent. The transitivity statement is not "nearly a changed repeat of the leading sentence". PAR (talk) 00:37, 4 August 2010 (UTC)
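The equivalence-class reading of the zeroth law sketched above can be made concrete in code. The following is only an illustration (the class name, the union-find bookkeeping, and the system labels are mine, not from the discussion): pairwise observations of mutual thermal equilibrium, closed under the zeroth law, partition systems into classes, and membership in one class is what an empirical temperature label records.

```python
# Illustration: if "is in thermal equilibrium with" is an equivalence
# relation (the reading of the zeroth law discussed above), then pairwise
# equilibrium observations partition systems into classes, each of which
# can carry a single empirical temperature label.

class EquilibriumClasses:
    """Union-find over systems; one class = one empirical temperature."""

    def __init__(self):
        self.parent = {}

    def _find(self, x):
        # Register unseen systems as their own class representative.
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def observe_equilibrium(self, a, b):
        """Record that systems a and b were found in mutual thermal equilibrium."""
        self.parent[self._find(a)] = self._find(b)

    def in_equilibrium(self, a, b):
        """Inferred via the zeroth law: same class => mutual equilibrium."""
        return self._find(a) == self._find(b)


eq = EquilibriumClasses()
# A gas thermometer T is checked against systems B and C separately:
eq.observe_equilibrium("T", "B")
eq.observe_equilibrium("T", "C")
# The zeroth law then asserts B and C are in mutual equilibrium,
# without their ever having been brought into diathermic contact:
print(eq.in_equilibrium("B", "C"))  # True
print(eq.in_equilibrium("B", "D"))  # False: no observation involves D
```

This also shows why transitivity is what makes a thermometer useful, as the later proposed rewrite of the section puts it: one reference system (T) mediates comparisons between systems that never touch.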
Mathematics and physics
The present page contains:
"Zeroth law
Main article: Zeroth law of thermodynamics
If two thermodynamic systems are each in thermal equilibrium with a third, then they are in thermal equilibrium with each other.
When two systems, each in its own thermodynamic equilibrium, are put in purely thermal connection, radiative or material, with each other, there will be a net exchange of heat between them unless or until they are in thermal equilibrium. That is the state of having equal temperature. Although this concept of thermodynamics is fundamental, the need to state it explicitly was not widely perceived until the first third of the 20th century, long after the first three principles were already widely in use. Hence it was numbered zero -- before the subsequent three.
The Zeroth Law implies that thermal equilibrium, viewed as a binary relation, is a transitive relation. Since a system in thermodynamic equilibrium is defined to be in thermal equilibrium with itself, and, if a system is in thermal equilibrium with another, the latter is in thermal equilibrium with the former. Thermal equilibrium is furthermore an equivalence relation: If a system A is in thermal equilibrium with both systems B and C, then systems B and C are in thermal equilibrium with each other."
The mathematical statement is not consistent with the physical statement. The physical statement of the zeroth law, in full, requires that the systems be separately in thermodynamic equilibrium and then put into thermal communication with each other. This is not necessarily compatible with a mathematical binary relation, which does not, without special explicit statements, account for such operations as putting things into communication with each other and then separating them and putting them into other communications. The mathematical statement "If a system A is in thermal equilibrium with both systems B and C, then systems B and C are in thermal equilibrium with each other." does not seem to tell whether the systems B and C are thermally connected with each other at the same time as, or at a different time from, their respective thermal connections with A. The wording of the account of the binary relation question is thus apparently fine for a mathematician, but has forgotten the physics. The first version of it forgot the difference between thermodynamic equilibrium and thermal equilibrium.Chjoaygame (talk) 07:44, 18 August 2010 (UTC)
Mathematics and physics again
Starting from scratch, one would have trouble learning from this article that thermodynamics is about the physical distinction between heat and work. For a mathematician the distinction is just a trivial matter of arbitrary axiomatics, but for a physicist it has physical meaning that appears to be of no interest to mathematicians, and is hardly to be found in this article. For the mathematician, slick verbalistic axiomatics is what matters; for a physicist, what matters is an understanding of the physics.Chjoaygame (talk) 04:34, 5 December 2010 (UTC)
- I think that's part of a bigger problem: The total lack of statistical mechanics in this article. The 0th, 2nd, and 3rd laws were historically axioms, but now we understand that they are consequences of the more fundamental laws of statistical mechanics. --Steve (talk) 17:45, 5 December 2010 (UTC)
- Physics without mathematics is dumb (as in "unable to speak", not as in "stupid"). If you don't understand the mathematics, you don't fully understand the physics, although if you do understand the mathematics, that does not mean you understand the physics. It's not helpful to declare that there are two camps, the mathematicians who are not interested in physics and physicists who are not interested in mathematics.
- Thermodynamics without statistical mechanics is a complete theory, based on macroscopic measurements, but does not give the huge insights into the physics of the situation that statistical mechanics provides. The laws of thermodynamics are concise summaries of countless experiments, which may be taken as "axioms" in the development of the mathematical theory of thermodynamics, without recourse to statistical mechanics. Statistical mechanics explains these laws on the basis of more fundamental "laws" which are postulates, not directly verifiable by experiment, but whose validity is affirmed by their ability to explain the laws of thermodynamics. I agree with Sbyrnes321, the insights into the laws of thermodynamics provided by statistical mechanics should definitely be included in the article, but it should be made clear that the validity of the laws of thermodynamics does not depend on their ability to verify the postulates of statistical mechanics, but rather that the validity of statistical mechanics rests on its ability to explain the laws. PAR (talk) 05:55, 6 December 2010 (UTC)
For one, I think all this article needs is a very simple reference to the explanation of macroscopic thermodynamics in terms of statistical mechanics. I would favour no more about it in this article. One can get a long way to understanding the nature of entropy without going into the intricacies of statistical mechanics, though it is not often done. I have put in a simple reference.Chjoaygame (talk) 07:36, 6 December 2010 (UTC)
- Yes, obviously it is possible to use the laws of thermodynamics without understanding their basis in statistical mechanics, but why would you want to? For historical interest? The statistical mechanics picture is more true to reality. It is also more intuitive, especially for entropy but also for temperature etc. (at least in my own experience teaching and learning the topic...maybe there's a good macro-only intuitive picture, but I haven't seen it?). Therefore, it seems to me that explaining these things in macro-only terms is only a historical exercise, not a useful thing to do today. Obviously I'm not proposing writing out detailed derivations of each law here -- just outlining the physical picture in statistical-mechanics terms, and why the laws make sense if you think about molecules moving around.
- PAR, you say "the validity of laws of thermodynamics does not depend on their ability to verify the postulates of statistical mechanics, but rather that the validity of statistical mechanics rest on its ability to explain the laws". Everyone thinks that statistical mechanics and thermodynamics are both correct. But it seems to me, if anything, that there is even more empirical evidence for statistical mechanics than empirical evidence for thermodynamics. (Statistical mechanics correctly predicts everything in classical thermodynamics, and statistical mechanics also correctly predicts many things that are not part of classical thermodynamics.) :-) --Steve (talk) 21:48, 3 January 2011 (UTC)
Defeating the purpose of the article
The current formulation of the section on the zeroth law defeats the purpose of the article if the article is to be thought of as being about physics. The point of the zeroth law is that it does not presuppose the concept of temperature, while this formulation does presuppose it. There is in the current formulation of the section on the zeroth law no indication of the physical distinction between equality of temperature of two thermally separate systems and thermodynamic equilibrium between two systems in thermal connection.Chjoaygame (talk) 07:53, 6 December 2010 (UTC)
The present article includes the words "it defines temperature in a non-circular logistics without reference to entropy". But I think the present article is itself circular in logic (logic is the right word here, not 'logistics', which means something different). It seems from the main statement that the three systems must from the outset be in thermal connection with each other: "If two thermodynamic systems are each in thermal equilibrium with a third system, then they are in thermal equilibrium with each other." There is no notion in this statement of separate equilibria of the three systems, that they might or might not be in diathermal connection with each other, and subject to possible connection or disconnection. If all three are really in thermal connection with each other, then surely, properly speaking we have only one system. I think that separability and connectibility are essential to the logical use of the zeroth law as proposed in the comment about non-circularity. This is the physics. It is true that some admirable and authoritative texts do not make this clear when they prefer the present slick wording; but we are interested in the basic ideas here, not just slick wording. Moreover, a discussion of the notion of equivalence is not appropriate here; it is implicit in the term equal, and arises when one deduces from the zeroth law that two systems in thermal equilibrium have equal temperatures; a discussion of the notion of an equivalence relation belongs to an entire article on the subject. This is the mathematics.Chjoaygame (talk) 22:20, 22 February 2011 (UTC)
- Regarding the interpretation of the zeroth law, two systems are in equilibrium if they are in diathermic contact and neither one experiences a change of thermodynamic state. They are also in equilibrium when they are not in diathermic contact, but their thermodynamic states are such that IF they were put into diathermic contact, neither one would experience a change of thermodynamic state.
- Even slicker! Would you very kindly be willing to give a textbook reference for this?Chjoaygame (talk) 07:11, 23 February 2011 (UTC)
- Buchdahl "Concepts of Classical Thermodynamics" on page 29 says it this way - (I am simplifying) - If Pi is the set of thermodynamic parameters of system i which specify its thermodynamic state, then equilibrium between system a and b can be expressed as f(Pa,Pb)=0, between b and c as g(Pb,Pc)=0 and between a and c as h(Pa,Pc)=0, where f,g, and h are some functions of the two sets of parameters. The zeroth law says that if f(Pa,Pb)=0 and g(Pb,Pc)=0 then h(Pa,Pc)=0. I don't like this statement because it unnecessarily uses the idea of thermodynamic parameters and functions of thermodynamic parameters, but it clearly implies that two systems do not have to be in contact in order for their mutual equilibrium to be defined.
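For readability, Buchdahl's formulation as paraphrased in the comment above can be set out in display form (a transcription of the paraphrase only, with the same symbols; the functions f, g, h are left unspecified, as in the comment):

```latex
% Buchdahl's form of the zeroth law, as paraphrased above.
% P_i denotes the set of thermodynamic parameters fixing the state of
% system i; f, g, h are unspecified functions expressing mutual equilibrium.
\[
  \text{equilibrium of } a,b:\; f(P_a, P_b) = 0, \qquad
  b,c:\; g(P_b, P_c) = 0, \qquad
  a,c:\; h(P_a, P_c) = 0 .
\]
\[
  \text{Zeroth law:}\qquad
  f(P_a, P_b) = 0 \;\wedge\; g(P_b, P_c) = 0
  \;\Longrightarrow\; h(P_a, P_c) = 0 .
\]
```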
- I think what you quote from Buchdahl is a sophisticated approach, indeed, if I read you aright, too complicated to be usefully stated here; and nearly all mathematics, hardly any physics. And still it is not explicit, but has to be read in as an implication, that two systems do not have to be in contact in order for their mutual equilibrium to be defined.Chjoaygame (talk) 17:28, 23 February 2011 (UTC)
- Mathematics is the language of physics. Buchdahl's statement is not mathematics, it is not provable mathematically, it is physics expressed in the language of physics. I agree, though, that it's not the most explicit statement of what we are looking for. Perrot (A to Z of thermodynamics) basically repeats Buchdahl. Dugdale (Entropy and its physical meaning) says (paraphrasing) that if A and B are brought into diathermic contact and achieve thermal equilibrium, and then A is brought into diathermic contact with C and experiences no change, then B and C are in equilibrium.
- Regarding equivalence, I think it is appropriate here. The equality of temperature follows from the equivalence relationship, it does not imply the equivalence relationship. Temperature is not defined until the equivalence relationship is established. The equivalence relationship is more primitive than temperature equality.PAR (talk) 03:46, 23 February 2011 (UTC)
- I am not disputing that it is appropriate to mention that the binary relationship of mutual thermal equilibrium is an equivalence relation; but I am unhappy with how it is done here; I think the way it is done here puts unduly more emphasis on mathematical argument than on the physics. I think that the present text "The zeroth law implies that thermal equilibrium, viewed as a binary relation, is a transitive relation. Thermal equilibrium is furthermore an equivalence relation between any number of system. The law is also a statement about measurability. To this effect the law allows the establishment of an empirical parameter, the temperature, as a property of a system such that systems in equilibrium with each other have the same temperature. The notion of transitivity permits a system, for example a gas thermometer, to be used as a device to measure the temperature of another system." is not well structured logically and has a grammatical error and an inconsistency of terminology, and consequently should be re-written. The law as written does not seem to consider the possibility of systems not being in thermal equilibrium with each other, and therefore to be expected to be able to pass heat from one to the other, but relies for this on the subsequent sentence "When two systems, each internally in thermodynamic equilibrium at a different temperature, are brought in diathermic contact with each other they exchange heat to establish a thermal equilibrium between each other." So I think that sentence, improved by being re-written, should precede the statement of the law, as a preamble. I think it unlikely that a process of multi-editing will be a good way to solve this problem. Better that someone do it as a whole, and offer a proposal on the talk page before posting it in the article.Chjoaygame (talk) 07:11, 23 February 2011 (UTC)
- I agree, the idea of equilibrium should be introduced before stating the zeroth law, and I agree the paragraph could be written more clearly, but I think it is not in error. What exactly is the grammatical error and inconsistency in terminology you speak of? I think multi-editing will work as long as we do not view our edits as "turf". PAR (talk) 14:11, 23 February 2011 (UTC)
- I think it more important to re-construct the paragraph than to focus on the grammatical error and the terminological inconsistency.Chjoaygame (talk) 17:28, 23 February 2011 (UTC)
a fresh start
Thank you PAR. I am taking the liberty of making your proposal the start of a new section in this talk page, I trust by your leave.
Ok, how about this:
When two systems are brought into thermal ("diathermic") contact with each other, after a sufficiently long period of time, their thermodynamic states will not change in time. When this occurs, the two systems are said to be in thermal equilibrium. Additionally, if it is known that two systems will not change their respective states should they be brought into diathermic contact, those two systems are also said to be in thermal equilibrium.
The zeroth law of thermodynamics may be stated as follows:
If two thermodynamic systems are each in thermal equilibrium with a third system, then they are in thermal equilibrium with each other.
The zeroth law implies that thermal equilibrium, viewed as a binary relation, is an equivalence relation between any number of systems. The zeroth law allows the establishment of an empirical parameter, the temperature, as a property of a system such that systems in equilibrium with each other have the same temperature. The transitivity of the equivalence relationship implies that a thermodynamic system, for example a gas thermometer, may be used as a device to measure the temperature of another system.
[end of proposed re-write of the section on the zeroth law]
- Yes, this looks like a good start for a process.Chjoaygame (talk) 23:11, 23 February 2011 (UTC)
comment
The article on Thermodynamics states the four laws. The zeroth is there stated as follows.
*Zeroth law of thermodynamics: If two systems are in thermal equilibrium with a third, they are also in thermal equilibrium with each other.
This statement implies that thermal equilibrium is an equivalence relation on the set of thermodynamic systems under consideration. Systems are said to be in equilibrium if the small, random exchanges between them (e.g. Brownian motion) do not lead to a net change in energy. This law is tacitly assumed in every measurement of temperature. Thus, if one seeks to decide if two bodies are at the same temperature, it is not necessary to bring them into contact and measure any changes of their observable properties in time.[1] The law provides a fundamental definition of temperature and justification for the construction of practical thermometers.
It is interesting to note that the zeroth law was not initially recognized as a law. The need for the zeroth law was not initially realized, so the first, second, and third laws were explicitly stated and found common acceptance in the physics community first. Once the importance of the zeroth law was realized, it was impracticable to renumber the other laws; hence the zeroth.
How will we be influenced by the above?Chjoaygame (talk) 02:57, 24 February 2011 (UTC)
- Classical thermodynamics is a self-contained theory, it does not need statistical mechanics to prove its validity. It is essentially the four laws along with a bunch of measurements of the properties of a system (e.g. specific heat) with no inquiry into the microscopic basis of these measurements, and none is needed. Statistical mechanics needs to duplicate the results of classical thermodynamics in order to prove ITS applicability to the problems of classical thermodynamics, which it does extremely well. It also provides a microscopic theory of matter which predicts the properties of a system, rather than simply taking them as a raw measurement. Thermal equilibrium is a concept of classical thermodynamics, and therefore it needs to be defined in terms of classical thermodynamics, followed by the microscopic explanation provided by statistical mechanics. The classical thermodynamic definition is (roughly) that an isolated system is in equilibrium when the thermodynamic parameters specifying its state cease to change in time. Therefore, I object to the "small, random exchanges" definition, since it is a statistical definition. I would say that the thermodynamic definition implies, via statistical mechanics, that "small random changes ...". Furthermore, I object to the idea that the zeroth law "provides a fundamental definition of temperature". It provides an infinite number of possible definitions of temperature. The correct way to say it is that it allows the definition of any number of possible temperature scales. PAR (talk) 03:29, 24 February 2011 (UTC)
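PAR's closing point, that the zeroth law licenses any number of temperature scales rather than one fundamental temperature, can be put symbolically (a sketch only, not drawn from any cited text): any strictly monotonic relabelling of an empirical temperature labels the same equivalence classes, so it is an equally admissible scale.

```latex
% If \theta labels the equivalence classes of mutual thermal equilibrium,
% then any strictly monotonic \phi yields another admissible scale
% \theta' = \phi(\theta), since it labels the same classes; the zeroth law
% alone therefore fixes no unique temperature scale.
\[
  \theta' = \phi(\theta), \quad \phi \ \text{strictly monotonic}
  \;\Longrightarrow\;
  \theta'(A) = \theta'(B) \iff \theta(A) = \theta(B).
\]
```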
- Agreed.Chjoaygame (talk) 02:24, 25 February 2011 (UTC)
comment
There are two main kinds, and perhaps exactly two kinds, of thermal connection: radiative and conductive. To separate two bodies with respect to radiative transfer it is necessary to surround one or the other by a complete opaque shield. If there is no complete opaque shield, they can be in radiative exchange equilibrium no matter by what finite distance they are separated, and will be so if each is in its own separate state of local thermodynamic equilibrium, and each is at its own uniform temperature, and those two temperatures are equal; they can be so under other conditions, but this will in general need rather complicated statements which are perhaps of not much interest to us here. It can be that two bodies connected by conductive but opaque and fully covering material are in conductive but not radiative thermal connection. It is easy to have two bodies in radiative but not conductive connection. Are there other kinds of thermal connection?Chjoaygame (talk) 02:57, 24 February 2011 (UTC)
- "Diathermic" means "able to conduct heat" and so radiative connection is a form of diathermic connection. I guess convective transfer of heat would be another form, but at equilibrium it would be absent.
- Regarding radiation, this is like saying each system is really two systems, the massive system and the photon system. So rather than two systems, you have four systems, and then you can talk about the diathermic connections between all four.
- You can have a system whose massive particles are equilibrated, but whose photonic system is not. Sometimes you can ignore the coupling between the two and forget about radiation, sometimes not. Even the massive system can sometimes be separated. A fluorescent lamp, for example, is often analysed as two systems - electrons and atoms. Each has its own temperature, and they are treated as coupled systems, each nearly in equilibrium, with the exchange between them being so small as not to knock either out of equilibrium. PAR (talk) 03:46, 24 February 2011 (UTC)
- I like to think of convection as an aspect of bulk flow transfer, that moves internal energy about, and not as a form of heat transfer, which I like to limit to conduction and radiation, and perhaps something else that I haven't thought of.
- For our present purposes, I am not keen on this idea of talking about several systems when thinking of radiation. I think it enough to consider thermal radiation merely as a form of heat transfer. As noted by Christopher Essex, often textbooks on laboratory scale thermodynamics ignore radiation. Perhaps we should do so too? I now think so. At this stage, having two systems in the same place seems to me to be beyond the range of our work here.Chjoaygame (talk) 02:57, 25 February 2011 (UTC)
- I think that's right. Maybe a note and a link, but a detailed description belongs elsewhere. PAR (talk) 03:31, 25 February 2011 (UTC)
comment
I would say that the fundamental requirement for a system to be in its own separate thermodynamic equilibrium is that, after it is prepared, it should be left for a long time to settle, or to age. I think that there are few or no natural systems in thermodynamic equilibrium. Mostly it is an artificial state created by laboratory preparation.
The article on Thermodynamic equilibrium does not mention this till about halfway through, but hits the reader in the introduction with a requirement that there be no unbalanced potentials. Eventually, at the end of the section on System models, one finds:
"As time passes in an isolated system, internal differences in the system tend to even out and pressures and temperatures tend to equalize, as do density differences. A system in which all equalizing processes have gone to completion is considered to be in a state of thermodynamic equilibrium.
In thermodynamic equilibrium, a system's properties are, by definition, unchanging in time."
There is also an article on Equilibrium thermodynamics. One would get little idea of the aging requirement from it: it says "An equilibrium state is obtained by seeking the extrema of a thermodynamic potential function, whose nature depends on the constraints imposed on the system."
In the present article, the first statements about equilibrium are: "Classical thermodynamics describes the thermal interaction of systems that are individually in a state of thermodynamic equilibrium. Thermal equilibrium is a statistical condition of macroscopic systems, while microscopically all systems undergo random fluctuations."Chjoaygame (talk) 02:57, 24 February 2011 (UTC)
- I agree with these definitions, they are thermodynamic definitions. I disagree with the last sentence, of course. "Thermal equilibrium is a statistical condition..." refers inappropriately to statistical mechanics. PAR (talk) 03:46, 24 February 2011 (UTC)
- Agreed, "is a statistical condition" is inappropriate.
- I think it best to have just one definition for a term, and then to prove it equivalent to various interesting sets of necessary and sufficient conditions (or even in principle perhaps that it might not be able to exist as a physical reality). I like to think that equilibrium is the unique stationary state, reached after a long period of settling after preparation of the system, that is stable against all perturbations. One then investigates and finds that it has various properties, such as having its entropy maximum.Chjoaygame (talk) 03:10, 25 February 2011 (UTC)
comment
Perhaps we should add that we are considering classical thermodynamics as a theory about systems in separate equilibria that can be allowed to interact to generate new mutual equilibria and then separated to have new separate equilibria, but not 'local thermodynamic equilibrium thermodynamics' that follows the time courses of the interactions if they are sufficiently slow though not so slow as to be quasi-static (e.g. Prigogine and Defay 1964 and Kondepudi 2008 and Lebon Jou and Casas-Vázquez 2008)?
- I think LTE is a logical extension of classical thermodynamics.
I think it important to distinguish between thermal equilibrium and thermodynamic equilibrium. As I read thermal equilibrium it means time-invariant temperature. As I read thermodynamic equilibrium it requires a stationary state of zero net rates for all reactions and flows. Guggenheim 1967 uses the term 'mutual equilibrium' in his section on thermal equilibrium (page 8).
- I agree
For the zeroth law to work I think one needs that the systems, as to temperature, are each fully characterized by just one temperature; for example the dependence of temperature on altitude in an equilibrium column of gravitationally loaded gas is not allowed to be considered here; each system must be so small as to have just one temperature.Chjoaygame (talk) 02:39, 25 February 2011 (UTC)
- We have to be careful of circular logic. The zeroth law is used to justify the concept of temperature, so it cannot be described or constrained using the concept of temperature. PAR (talk) 03:37, 25 February 2011 (UTC)
- Yes. But somehow we have to ensure that the accessible surface of the system is homogeneous with respect to temperature. My proposal above "equilibrium is the unique stationary state, reached after a long period of settling after preparation of the system, that is stable against all perturbations" may not be detailed enough. It seems perhaps to lack a homogeneity requirement? Would it be enough to require also the effective absence of externally set field potentials such as gravitic and electrostatic and magnetic? Should one explicitly stipulate the absence of internal flows? Presumably we will require that the system be closed apart from its permitted interaction with the other system of interest.Chjoaygame (talk) 03:58, 25 February 2011 (UTC)
- I don't think a homogeneity requirement is good. You could have a system composed of two subsystems that are isolated from each other. That forms a perfectly good, inhomogeneous system at equilibrium. I think isolation requires isolation from any external fields, or else a good reason why they may be ignored. PAR (talk) 04:21, 26 February 2011 (UTC)
comment
I find in Planck's Thermodynamics the following:
2. If two bodies, one of which feels warmer than the other, be brought together (for example, a piece of heated metal and cold water), it is invariably found that the hotter body is cooled, and the colder one is heated up to a certain point, and then all change ceases. The two bodies are then said to be in thermal equilibrium. Experience shows that such a state of equilibrium finally sets in, not only when two, but also when any number of differently heated bodies are brought into mutual contact. From this follows the important proposition: If a body, A, be in thermal equilibrium with two other bodies, B and C, then B and C are in thermal equilibrium with one another. * For, if we bring A, B, and C together so that each touches the other two, then, according to our supposition, there will be equilibrium at the points of contact AB and AC, and, therefore, also at the contact BC. If it were not so, no general thermal equilibrium would be possible, which is contrary to experience.
* As is well known, there exists no corresponding proposition for electrical equilibrium. For if we join together the substances Cu | CuSO4 aq. | ZnSO4 aq. | Zn to form a conducting ring, no electrical equilibrium is reached.
3. These facts enable us to compare the degree of heat of two bodies, B and C, without bringing them into contact with one another; namely, by bringing each body into contact with an arbitrarily selected standard body, A (for example, a mass of mercury enclosed in a vessel terminating in a fine capillary tube). By observing the volume of A in each case, it is possible to tell whether B and C are in thermal equilibrium or not. If they are not in thermal equilibrium, we can tell which of the two is the hotter. The degree of heat of A, or of any body in thermal equilibrium with A, can thus be very simply defined by the volume of A, or, as is usual, by the difference between the volume of A and its volume when in thermal equilibrium with melting ice under atmospheric pressure. This volumetric difference, which, by an appropriate choice of unit, is made to read 100 when A is in contact with steam under atmospheric pressure, is called the temperature in degrees Centigrade with regard to A as thermometric substance. Two bodies of equal temperature are, therefore, in thermal equilibrium, and vice versa.
Some venerable texts, such as Guggenheim, seem, I think, to have taken Planck's sentence "If a body, A, be in thermal equilibrium with two other bodies, B and C, then B and C are in thermal equilibrium with one another" out of this context and treated it as a statement of the zeroth law of thermodynamics, ignoring Planck's care about how the connections are made and unmade. Guggenheim does not take the care that you have taken, to make the explicit statement "Additionally, if it is known that two systems will not change their respective states should they be brought into diathermic contact, those two systems are also said to be in thermal equilibrium." Indeed, if one wanted to, one could make the definition as follows: If two systems are such that they will not change their respective states should they be brought into thermal contact, they are said to be in thermal equilibrium. It would then all be hypothetical and indirect.
Planck felt the need to be explicit each time he actually made or broke a thermal connection. I feel the same way. I think that we are looking at Chinese whispers when we move from Planck's naive position to the sophisticated position that is expressed, perhaps for the first time explicitly, in your statement "Additionally, if it is known that two systems will not change their respective states should they be brought into diathermic contact, those two systems are also said to be in thermal equilibrium." Guggenheim actually writes: "If any two separate systems each in complete internal equilibrium are brought together so as to be in contact through a thermally conducting wall then in general the two systems will be found not to be in mutual equilibrium, but will gradually adjust themselves until eventually they do reach mutual equilibrium after which there will of course be no further change. The two systems are then said to have reached a state of thermal equilibrium. Systems separated by a conducting boundary are said to be in thermal contact. ... Consider now a reference system in a well-defined state. Then all systems in thermodynamic equilibrium with it have a property in common, namely the property of being in thermal equilibrium with one another. This property is called temperature. In other words systems in thermal equilibrium are said to have the same temperature. Systems not in thermal equilibrium are said to have different temperatures." Without the habit of actually testing for mutual thermodynamic equilibrium, when one will be comparing some systems that are not in mutual thermal equilibrium, but actually transfer heat in a definite direction during the test, one would not have any way of concluding that temperatures can be ordered as real numbers; they might be members of a finite ring, for all we would know.
I think that the actual physical process of comparison by connection and disconnection is important and should not be elided from expression for the sake of getting a neat statement of the zeroth law. If it was good enough for Planck, it is good enough for me. For the Wikipedia I prefer the naivety of Planck to the sophistication of Buchdahl.Chjoaygame (talk) 13:27, 25 February 2011 (UTC)
- I don't agree - there are a number of implicit assumptions about single and double systems before we get to the zeroth law. One of these is that if we bring two systems into diathermic contact and let them equilibrate, and we then separate them, their states will not change. If we bring them again into diathermic contact, their states will again not change. After the first separation, we are therefore in a position to say that the two systems are in equilibrium even though they are separated, and we do not need to test this in order to know it is true. This line of reasoning can be applied to the zeroth law as well.
- I am not in favor of expressing things in the mathematical language of Buchdahl to begin with, but it is wrong to say that two separated systems being in equilibrium is a hypothetical situation only. PAR (talk) 06:40, 26 February 2011 (UTC)
- As I think about this I see that a more important point is that warmth can be put into correspondence with the real numbers. This is far more than merely an equivalence relation between systems in potential or actual thermal equilibrium.
- Planck starts by saying
1. The conception of "heat" arises from that particular sensation of warmth or coldness which is immediately experienced on touching a body. This direct sensation, however, furnishes no quantitative scientific measure of a body's state with regard to heat; it yields only qualitative results, which vary according to external circumstances. For quantitative purposes we utilize the change of volume which takes place in all bodies when heated under constant pressure, for this admits of exact measurement. Heating produces in most substances an increase of volume, and thus we can tell whether a body gets hotter or colder, not merely by the sense of touch, but also by a purely mechanical observation affording a much greater degree of accuracy. We can also tell accurately when a body assumes a former state of heat.
- Reading the introduction and first chapter of Kittel and Kroemer second edition 1980 one would get no hint that temperature had anything to do with warmth, coolness, or heat. Planck's introductory comments are about hotter and colder bodies, and this is the main burden of the notion of temperature, but is not addressed in the wording "If two thermodynamic systems are each in thermal equilibrium with a third system, then they are in thermal equilibrium with each other." Guggenheim uses the wording "If two systems are both in thermal equilibrium with a third system then they are in thermal equilibrium with each other." He does not mention anything to do with warmth, and thus his formulation gives no hint that warmth can be expressed by a real number; he is therefore an unreliable source. These wordings seriously misrepresent Planck's thinking, and are seriously deficient for the purpose of establishing a physical basis for the measurement of temperature, and therefore cannot serve well as a statement of the zeroth law of thermodynamics for the Wikipedia. I would like to do some more reading to check this further.Chjoaygame (talk) 09:28, 26 February 2011 (UTC)
- Exactly right! But Planck was not describing the zeroth law, he was describing a more fully developed concept of temperature. The zeroth law allows the definition of temperature, but does not define it. In particular, it does not provide for the idea of hotter or colder. The zeroth law separates the set of all conceivable equilibrated thermodynamic systems into a bunch of separate ("disjoint") subsets. Each member of a subset is in equilibrium with every other member of that subset, and NOT in equilibrium with any member outside that subset. That's what an equivalence relationship does - that's ALL it does. If there are N such subsets, then we can take any other N-element set and assign one member to each of the disjoint subsets. We can call the members of that second set "temperature". Any two systems which have the same temperature will then be in equilibrium with each other. That's what the zeroth law allows you to do. As a particular case, we can assign some real number to each of the subsets and call that real number temperature. The assignment can be completely random. If a thermodynamic system has a temperature of one and another a temperature of two and you bring them in contact, they may both equilibrate to a temperature of one billion. Still, two systems with the same temperature will be in equilibrium with each other. That's all the zeroth law tells you, or allows you to do. Planck's idea of hotter or colder is not an expression of the zeroth law. The zeroth law allows you to assign a real number using, for example, a gas thermometer. But when you do this, it is this use of a gas thermometer which orders the temperature into what we perceive as hotter and colder, not the zeroth law. A gas thermometer will not provide exactly what we commonly call temperature unless it happens to be an ideal gas, but it WILL provide you with the ordering of what we call hotter and colder.
There can be many such temperature scales, and they are called "empirical" temperature scales. The second law defines a temperature scale called "thermodynamic temperature" which is what we commonly call temperature. (It is the same temperature provided by an ideal gas thermometer, but it's not explicitly defined that way.) PAR (talk) 17:29, 26 February 2011 (UTC)
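The partition-and-label picture described in the post above can be sketched as a toy program (the system names, class assignments, and numeric labels here are all invented purely for illustration):

```python
# Toy illustration: the zeroth law makes "is in thermal equilibrium
# with" an equivalence relation, so it partitions systems into disjoint
# classes. Attaching arbitrary labels to the classes then gives a
# "proto-temperature": equal label if and only if mutual equilibrium.

# Hypothetical data: each system assigned to an equilibrium class.
eq_class = {"A": 0, "B": 0, "C": 1, "D": 1, "E": 2}

# Assign an arbitrary real number to each class; the assignment carries
# no ordering information (no notion of "hotter" or "colder").
label = {0: 3.7, 1: -12.0, 2: 1e9}

def proto_temperature(system):
    return label[eq_class[system]]

def in_equilibrium(x, y):
    # Equal proto-temperature if and only if same equivalence class.
    return proto_temperature(x) == proto_temperature(y)

assert in_equilibrium("A", "B")        # same class, hence equilibrium
assert not in_equilibrium("A", "C")    # different classes
```

Note that swapping the labels around changes nothing: only equality of labels is meaningful at this stage, which is exactly the point that ordering (hot versus cold) is not supplied by the zeroth law.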
- It begins to seem relevant that empirical temperature is not really on the main line of the development of thinking about thermodynamics: the zeroth law is really setting up a concept of thermal equilibrium that allows the notion of transfer of energy as heat, there being in a sense only one kind of heat. Heat is not a state variable and doesn't make perfect differentials and so it is with increments of empirical temperature. If one sticks with one empirical thermometer, one can make path-dependent measurements of heat transfer with path-dependent heat capacities for that thermometer. One can find a work equivalent of heat for that thermometer. It is heat transfer that is really at the root of the importance of the zeroth law.Chjoaygame (talk) 01:23, 27 February 2011 (UTC)
- One can find systems which act purely as heat conductors, by means of which one can connect other systems. So connected pairwise, such other systems can either be unaffected by the connection, or else can be affected by the connection. Heat conductors are more or less indifferent to what systems they are used to connect. If the pairwise connected systems are affected by the connection, it is said that heat has been conducted by the connection. According to Planck's idea, heat is said to be conducted into the system that increases its volume when held at constant pressure, and out of the one that decreases its volume. We might also allow that the system that is heated by conduction shall increase its pressure at constant volume? The physical meaning of empirical temperature as introduced by Planck is about change of volume at constant pressure. Sometimes I suppose heat transfer can take place at constant pressure and constant volume, if things just so happen that other changes can fit the bill; perhaps there is a suitable chemical reaction or phase change.
- The characteristic feature of heat conducting systems is their indifference to the characters of the sources and sinks of heat flow through them. This is the reason for saying that, in contrast to the existence of many kinds of work, there is only one kind of heat.Chjoaygame (talk) 03:24, 27 February 2011 (UTC)
- Yes, but do you understand what I am trying to say in my previous post? Does it make sense? PAR (talk) 03:35, 27 February 2011 (UTC)
- Yes, I understand what you are saying in your previous post. It makes sense. But sense for a mathematician. The characteristic indifference of heat conducting systems to the characters of the sources and sinks of heat is an important physical fact, and I am saying that it is the real physical content for which one needs a zeroth law. The zeroth law, stated as for example by Guggenheim, who seems to have got his statement by lifting a sentence out of context from Planck, does not tell about the well-ordering relation of empirical temperature; I would like to know more about the sources of the statement of the zeroth law, and will have to read a little for that. A temperature that does not have a well-ordering relation is of less interest to a physicist, because it does not do much to support his idea of heat. With a well-ordering of empirical temperature, the differential of heat, considered by the first law, may be imperfect, but at least it exists. Without a well-ordering of empirical temperature, I think the very notion of an imperfect differential of heat will fail? Will the physicist have to wait for absolute temperature from the second law before he can define heat for the first?Chjoaygame (talk) 05:48, 27 February 2011 (UTC)
- But we would not expect Guggenheim to mention the ordering of temperature when discussing the zeroth law because the zeroth law does not provide or imply any ordering. With regard to the first law, temperature should not be introduced at this point, strictly speaking. Whenever it is introduced as part of the dU of the first law, it comes in as T dS, where S is entropy, and entropy is certainly a second law concept. So is thermodynamic temperature. In other words, yes, the physicist will have to wait for the second law to introduce thermodynamic temperature before he can use T dS in the first law. Until then it is just δQ - the increment of heat - which is well defined without recourse to thermodynamic temperature and will not fail. Also - how do you define empirical temperature? I have read it defined as any temperature that is consistent with the zeroth law - it need not have ordering or continuity or anything. Other places it is defined in terms of a real thermodynamic system, like a gas thermometer, in which case it is ordered and continuous. Also, regarding Guggenheim - can you tell me, does he at any point discuss the ordering or continuity of temperature and if so, how does he introduce or explain its origin? PAR (talk) 06:43, 27 February 2011 (UTC)
- As I noted, I would like a little time to read about the origin of the zeroth law, about which I am entirely ignorant. The present Wikipedia article attributes it to Planck and to Fowler. Guggenheim is co-author of a text with Fowler. On page 6 of his text (fifth edition 1967) Guggenheim writes that there is an independent degree of thermodynamic freedom that "fixes the degree of hotness or coldness". He soon talks about thermally insulating and thermally conducting walls. Then he says that "systems in thermal equilibrium are said to have the same temperature", without actually talking about the flow of heat. By this apparent definition, there is no difference between being in thermal equilibrium and having the same empirical temperature. He admits a wide variety of [empirical] thermometers and postpones an account of absolute temperature till the discussion of the second law. He then leaves the discussion of temperature and goes on to discuss the first law.
- I think that in order to define the magnitude and sign of an increment of heat conducted, one needs a well-ordered temperature. Guggenheim defines heat as the discrepancy between a first law that is stated only for work transfer and one that is stated for total energy transfer. He defines the sign of the increment of heat transferred on the basis of the difference in temperatures, with the convention that heat is transferred from the higher temperature to the lower temperature. He does not say that the hotter body has the higher temperature, but perhaps one might guess it to be so? He is assuming that temperature is ordered, a thing not provided for in his statement of the zeroth law, as you note. Unlike Planck, he does not link empirical temperature with any other property, such as volume at constant pressure, which fixes the sign of increments. So, to answer your question, I think surely he is effectively assuming the second law here, without saying so? After a while he moves on to postulate the existence of a function of state called entropy, and from this he derives absolute temperature.
- I note that Prigogine and Defay 1954 do not appeal to any zeroth law; they do not enter into the kind of discussion that we are holding here. As I read it, the first edition 1903 of Planck's Thermodynamics also manages without a zeroth law. I will need to get a later edition.Chjoaygame (talk) 12:47, 27 February 2011 (UTC)
If you take a system and perform only work on it (i.e. prevent any diathermic connection = adiabatically) then you can come up with a formula that relates the internal energy to its state parameters (excluding temperature). This is an experimental fact - no matter what form the work takes, no matter what the sequence of intermediate states, the amount of work done is always the same when a system goes from one particular state to another adiabatically. It is THE experimental fact expressed by the first law. If you now take a system and perform work on it while it is diathermally connected, then you can use that formula to calculate its internal energy and the discrepancy between that and the energy added by work is equal to the heat energy transferred, by definition. With this procedure, there is no reference to temperature, and no problem with direction, and no need for the zeroth or second law.
Neither the zeroth, first, or second laws use the concept of temperature for their expression. The zeroth law lays the groundwork for the concept of temperature by establishing certain, but not all, of its properties, allowing for an infinite number of, let's say, "proto-temperature scales" (or maybe "empirical temperature scales") which may or may not embody an ordering relationship (hot and cold), and then the second law defines what we commonly call temperature: the thermodynamic temperature, which does embody the ordering relationship. PAR (talk) 16:13, 27 February 2011 (UTC)
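The operational procedure described two posts above (calibrate internal energy by adiabatic work, then define heat as the discrepancy) can be sketched numerically; the state parameter, the linear calibration formula, and all the numbers here are invented for illustration:

```python
# Sketch of the operational definition of heat described above.
# Step 1: adiabatic experiments give internal energy U as a function of
# state (a made-up linear calibration in a single state parameter x).
def internal_energy(x):
    return 5.0 * x  # hypothetical result of adiabatic-work calibration

# Step 2: in a diathermal process from x1 to x2 with measured work W
# done ON the system, heat supplied is defined as the discrepancy
# between the change in internal energy and the work performed.
def heat_supplied(x1, x2, work_on_system):
    return internal_energy(x2) - internal_energy(x1) - work_on_system

# Example: the state goes from x=1 to x=3 (so dU = 10 J) while 4 J of
# work is done on the system; the remaining 6 J is, by definition, heat.
q = heat_supplied(1.0, 3.0, 4.0)
assert q == 6.0
```

Nothing in this procedure refers to temperature or to a direction of flow fixed by temperature, which is the point being made: heat is defined from the first law alone, without the zeroth or second law.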
comment
We seem to have uncovered a slight problem with the statement of the first law. We are interested in thermodynamics because we want to talk about the interconversion of heat and work, and simply to say that energy is conserved:
Energy can be neither created nor destroyed. It can only change forms.
In any process in an isolated system, the total energy remains the same.
is not very explicit about that. And
For a thermodynamic cycle the net heat supplied to the system equals the net work done by the system.
is unnecessarily narrow: the later stated form
Increase in internal energy of a system = heat supplied to the system + work done on the system
seems more explicitly thermodynamic than the above-quoted statements, I think. But I think that is outside the scope of our present exercise.Chjoaygame (talk) 05:48, 27 February 2011 (UTC)
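The relation between the quoted forms can be written out explicitly; with the convention (as in the last quoted statement) that Q is heat supplied to the system and W is work done on the system, the cycle statement follows as a special case:

```latex
% First law for a closed system (W = work done ON the system):
\Delta U = Q + W
% For a complete thermodynamic cycle the system returns to its initial
% state, so \Delta U = 0, giving
Q = -W = W_{\mathrm{by}}
% i.e. the net heat supplied equals the net work done BY the system:
% the cycle statement is a special case of the general statement.
```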
Is this maybe too emphatic?
The section on the third law starts:
- It is fair to say that Aristotle seems perhaps to have helped...
Could this possibly be seen as slightly too emphatic? I don't want to go out on a limb here but isn't it possible that a few more weaselly "maybe", "could be" qualifiers could be used to make this less readable? 82.224.234.55 (talk) 10:07, 23 January 2011 (UTC)
- Yes, it should be toned down a bit. I would recommend "Maybe it could be said that it is fair to say that the argument could be made that Aristotle possibly seems to perhaps have helped...". On second thought, maybe it is fair to say the whole section should put up or shut up. Either have it describe exactly the way in which Aristotle contributed, or delete it. PAR (talk) 11:25, 23 January 2011 (UTC)
"More accurate statement about fluctuations" but also unnecessarily complicated
The new version reads "The entropy of an isolated system will be non-decreasing only in the limit of an infinite number of microstates. For finite systems, the entropy may exhibit fluctuations that increase entropy. For a macroscopic system these fluctuations are negligibly small, but for smaller and smaller systems, these fluctuations will grow in importance."
Dear PAR, I think this is more complicated than necessary without enough benefit to justify it. Since the fluctuations are negligibly small for a macroscopic system, it is not too harmful to say that the entropy of an isolated macroscopic system never decreases. The idea of an infinite number of microstates is not well defined here. You should be able to fill in the details of what I am pointing out, without me needing to spell them out. It is very unusual to try to deal with a system that is not, in some ordinary sense of the word, finite. I think the new edit should be undone.Chjoaygame (talk) 11:59, 20 February 2011 (UTC)
I see that PAR has not replied to my comment and suggestion. I should make my comment more explicit. PAR's new edit practically abolishes the very useful, even essential, distinction made in thermodynamics, between systems that are simply macroscopic and systems that are simply microscopic. To be sure, there are mesoscopic systems, with particle numbers in between. But the main interest of thermodynamics is in macroscopic systems, and PAR's edit practically abolishes them. PAR's edit seems to say that the distinction is so much one of mere degree that a pure thinker will hardly be interested in it. This would throw a novice to thermodynamics right off the scent of the main thread of the subject; therefore PAR's new edit, for a Wikipedia article, is simply misleading. I think PAR's new edit should be undone. Perhaps PAR might like to make his point for pure thinkers in another edit that indicates that it is not focused on the main line of thermodynamic thinking, which is macroscopic.Chjoaygame (talk) 12:01, 21 February 2011 (UTC)
- The sentence is not misleading, because it's true. To say that the entropy can decrease for microscopic systems, but not for macroscopic ones is misleading. If we want to introduce the concept of fluctuations in the first place, then let's do it accurately or ignore it altogether. There is no clear line between a macroscopic and a microscopic system, and it follows that there is no clear line between a system that exhibits fluctuations and one that does not, and to imply that there is such a line is misleading. I have no problem with rewording the sentence to make it clearer to the new reader, emphasizing that fluctuations are negligible in macroscopic cases, but less and less so as the systems become more microscopic, but I am opposed to the false implication that fluctuations suddenly disappear when systems reach a certain size. Either that or we avoid the subject of fluctuations altogether, for the sake of the new reader. PAR (talk) 18:55, 21 February 2011 (UTC)
- Dear PAR, Thank you for your careful reply. You write that "The sentence is not misleading, because it's true." This is not a valid use of "because". This usage of "because" does not comprehend the distinction between "misleading" and "not true". There is a distinction between rhetoric and logic, and, in addition to logic, some art of rhetoric is also required for the present kind of work. You object to "the false implication that fluctuations suddenly disappear when systems reach a certain size", but this implication is, as you write, an implication: it is not explicit in the previous text, but has to be teased out by the efforts of a pure thinker like yourself; the newcomer will not draw it immediately, and can be led to it gently, without having the initial simplicity of the classical theory abruptly and prematurely deleted.Chjoaygame (talk) 19:53, 21 February 2011 (UTC)
- Well, let's agree to disagree. I am fine with whatever consensus we can reach with other editors. PAR (talk) 02:28, 22 February 2011 (UTC)
- Dear PAR, thank you for this reply. Yes, it is likely that things will be decided by a process of trial and error.Chjoaygame (talk) 10:45, 22 February 2011 (UTC)
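The size-dependence argued over in this thread can be made quantitative with the standard statistical-mechanics estimate that relative fluctuations scale as 1/√N with particle number N (a textbook scaling argument, not tied to any particular system):

```python
import math

# Relative size of fluctuations in an extensive quantity scales as
# 1/sqrt(N) with particle number N (standard estimate). This shows why
# fluctuations are utterly negligible for macroscopic systems yet grow
# in importance for small ones, with no sharp dividing line anywhere.
def relative_fluctuation(n_particles):
    return 1.0 / math.sqrt(n_particles)

for n in (1e2, 1e6, 1e23):
    print(f"N = {n:.0e}: relative fluctuation ~ {relative_fluctuation(n):.1e}")
```

For a mole-sized system (N near 10^23) the relative fluctuation is of order 10^-12, which is the quantitative content of "negligibly small for a macroscopic system"; for N of order 100 it is of order 10 percent.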
"an extension to classical theory with the concepts from statistical thermodynamics"
Dear Kbrose, while it is true that "Non-equilibrium thermodynamics may be considered separately as an extension to classical theory with the concepts from statistical thermodynamics", that statement does not quite cover all the main lines of non-equilibrium thermodynamics. For example, the classic text of De Groot and Mazur (Non-equilibrium Thermodynamics, North Holland, Amsterdam, 1962) has only three chapters out of fifteen on the statistical mechanical foundations, the introduction being mostly in macroscopic terms, and the first six and the last six chapters being in purely macroscopic terms. With the concept of local thermodynamic equilibrium, classical non-equilibrium thermodynamics stands on a purely macroscopic basis, as exemplified by Glansdorff and Prigogine's classic text (Thermodynamic Theory of Structure, Stability and Fluctuations, Wiley-Interscience, 1971). The text of Lebon, Jou, and Casas-Vázquez (Understanding Non-equilibrium Thermodynamics, Springer, 2008) is also devoid of discussion of statistical mechanics. Therefore, while your edit has restored the "original meaning", I think the original meaning was misleading.Chjoaygame (talk) 20:29, 21 February 2011 (UTC)
"deficiency of information, or uncertainty"
Dear Kbrose, while in ordinary language, 'deficiency of information' and 'uncertainty' have closely related meanings, in physics there is a significant difference. 'Deficiency of information' is likely in physics to be interpreted in terms of information theory, as for example in Shannon's definition, while 'uncertainty' is likely to bring up ideas of quantum mechanical uncertainty; the concepts are different. Entropy depends on the information theoretical concept, not on the quantum mechanical one. Therefore I think it better to omit the alternative "or uncertainty" here.Chjoaygame (talk) 20:38, 21 February 2011 (UTC)
No reply?Chjoaygame (talk) 17:48, 23 February 2011 (UTC)
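The information-theoretic sense of 'deficiency of information' referred to above is Shannon's: for a probability distribution p over microstates, H = -Σ p log p. A minimal sketch (in bits; statistical mechanics uses the natural logarithm times Boltzmann's constant):

```python
import math

# Shannon entropy of a discrete probability distribution: the
# "deficiency of information" sense relevant to statistical entropy.
def shannon_entropy(probs):
    # Terms with p = 0 contribute nothing, by the convention 0 log 0 = 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 4 microstates: maximal uncertainty, 2 bits.
assert shannon_entropy([0.25] * 4) == 2.0
# A certain outcome: no missing information.
assert shannon_entropy([1.0]) == 0.0
```

Nothing here involves quantum-mechanical uncertainty, which illustrates the distinction being drawn between the two concepts.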
"the concepts of energy and its transformations "
Dear Kbrose, while it is true that Aristotle made much use of his Greek word ἑνέργεια, he did not come near inventing the modern physical concept of energy; that had to wait till the nineteenth century. Aristotle's word for the underlying preserved and unchanging substrate was ὕλη. The key here is Aristotle's probable invention, and his extensive discussion, of the concept of the preservation of an underlying unchanging substrate in a process of change, the substrate not necessarily being matter or energy, though able to appear in diverse forms. Therefore I think your introduction of the word "energy" here is misleading.Chjoaygame (talk) 20:53, 21 February 2011 (UTC)
Dear Kbrose, Aristotle's physics conspicuously lacked any concept corresponding to momentum or kinetic energy, and did not have a concept directly corresponding to energy of any kind. Even in the nineteenth century Clausius was talking about vis vivae, not kinetic energy. Aristotle's categories included 'position', but lacked anything corresponding to 'velocity'. The nearest he had was 'action' in the sense of making or doing. It took thousands of years to develop the idea of 'velocity' as a fundamental notion or category, as it appears for example in Newton, who nevertheless also lacked the concept of energy. I think it better to undo your edit on this.Chjoaygame (talk) 21:58, 22 February 2011 (UTC)
No reply?Chjoaygame (talk) 17:48, 23 February 2011 (UTC)
distinction between thermal equilibrium and thermodynamic equilibrium
Dear Kbrose, your edit writes "Thermal equilibrium is a statistical condition". Thermal equilibrium means just that the temperatures are steady, and does not cover the full scope of thermodynamic equilibrium, which demands far more than thermal equilibrium. The distinction is very important. I think your edit needs revision or preferably just undoing.
No response on the term 'thermal equilibrium'? I would like to delete the sentence "Thermal equilibrium is a statistical condition of macroscopic systems, while microscopically all systems undergo random fluctuations." I think it is not logically well structured, and does not clearly convey much meaning.Chjoaygame (talk) 17:48, 23 February 2011 (UTC)
Your edit writes: "The laws of thermodynamics ... to satisfy the laws of statistics". The laws of thermodynamics are laws of nature, while the laws of statistics are merely mathematical truths, not dependent on empirical fact. Nature does not consult the laws of statistics; she does it all by herself; the laws of statistics are merely useful artifacts for describing how she does it. I think it would be better to omit the phrase "to satisfy the laws of statistics".Chjoaygame (talk) 21:13, 21 February 2011 (UTC)
new edit by PAR
Dear PAR, your edit, dated 07:42, 10 March 2011, seems to tend toward removal of physical content from the article, leaving only a mathematical skeleton. Your edit seems designed to give us a lesson in algebra, and to have us forget that thermodynamics is rooted in empirical physics. I am particularly brought up to date by the introduction of the erudite notion of a Euclidean relation here. I have here five relevant textbooks (I suppose all out of date); none of them mentions this notion, and some use the term 'Euclidean equivalence' in an entirely different sense. The Wikipedia article on Euclidean relation does not cite any references or sources, which were requested in June 2007. The Wikipedia article on Equivalence relation has an unreferenced section on Euclidean relations that comments "Euclid probably would have deemed the reflexivity of equality too obvious to warrant explicit mention." Chjoaygame (talk) 11:04, 10 March 2011 (UTC)
- Well, you know that I object to the idea that there is a dividing line between math and physics. Mathematics is the language of physics. The whole idea of equivalence relationships, transitivity, etc. can be expressed in abstract mathematical set theory, but when those sets take on a physical meaning, then you are talking physics. To say that an equivalence relationship generates a set of disjoint subsets, that is mathematics. When you say that those subsets are thermodynamic systems in mutual equilibrium, that's physics. Physical content has not been removed, it has been more precisely expressed.
- I have not checked, but if the Wikipedia article does not reference any sources for a Euclidean relationship, then we need to find some. The zeroth law as it is expressed in 90 percent of the books I have checked is certainly not a transitive relationship. When expressed as "If A and B are in equilibrium with C, then A is in equilibrium with B", it is absolutely a Euclidean relationship. Add reflexivity, and you have an equivalence relationship. When expressed as "If two systems are in equilibrium with a third, they are in equilibrium with each other", that requires two extra assumptions, reflexivity (A->A) and symmetry (A->B implies B->A). The fewer extraneous assumptions, the better, so that's why the first is preferable.
- The fact is, all of this simple set theory stuff serves to very precisely define what the zeroth law means in a physical sense. I am open to the idea that all this precision need not be thrown at a new reader up front, but to ignore it or to gloss over errors in its expression is not helpful.PAR (talk) 18:55, 10 March 2011 (UTC)
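- As an aside for later readers of this archive: the set-theoretic claim above (a Euclidean relation together with reflexivity yields an equivalence relation, hence disjoint classes of mutually equilibrated systems) can be checked mechanically on a toy example. The sketch below is illustrative only; the particular systems and relation are made-up data, not anything from the article under discussion.

```python
# Toy check that a Euclidean relation plus reflexivity behaves as an
# equivalence relation. Relations are sets of ordered pairs (x, y),
# read as "x is in thermal equilibrium with y".

def is_reflexive(rel, universe):
    return all((a, a) in rel for a in universe)

def is_symmetric(rel):
    return all((b, a) in rel for (a, b) in rel)

def is_transitive(rel):
    return all((a, c) in rel
               for (a, b) in rel for (b2, c) in rel if b == b2)

def is_euclidean(rel):
    # "If A is in equilibrium with C and B is in equilibrium with C,
    #  then A is in equilibrium with B."
    return all((a, b) in rel
               for (a, c) in rel for (b, c2) in rel if c == c2)

# Hypothetical universe: systems A, B, C in mutual equilibrium,
# system D isolated (in equilibrium only with itself).
universe = {"A", "B", "C", "D"}
rel = {(x, y) for x in "ABC" for y in "ABC"} | {("D", "D")}

assert is_reflexive(rel, universe) and is_euclidean(rel)
# Euclidean + reflexive implies symmetric and transitive:
assert is_symmetric(rel) and is_transitive(rel)
```

The general argument is short: from A~B and the reflexive B~B, the Euclidean property gives B~A (symmetry); then from A~B and C~B it gives A~C (transitivity).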
- Dear PAR, you write: "I object to the idea that there is a dividing line between math and physics." I am not saying that there is a dividing line between mathematics and physics. You write "Mathematics is the language of physics." This is your personal point of view.Chjoaygame (talk) 00:20, 11 March 2011 (UTC)
- What is your concept of the role of mathematics in physics? How would you prefer that the zeroth law explanation be written? PAR (talk) 04:51, 11 March 2011 (UTC)
- Dear PAR, you write: "What is your concept of the role of mathematics in physics?" My views on that very general question are not called for here. You write: "How would you prefer that the zeroth law explanation be written?" I have written above that I need to read some more and that it will take me time to get the relevant literature.Chjoaygame (talk) 11:45, 13 March 2011 (UTC)
- No problem, I have been doing that too. Buchdahl "The Concepts of Classical Thermodynamics" is the best I have found. Also, google "Buchdahl zeroth law" for some interesting articles on whether the zeroth law is fundamental or not. If you find some other good ones, please let me know. PAR (talk) 18:14, 13 March 2011 (UTC)
History of zeroth law
This is good stuff, but I think it is too much detail for this overview article, and should go in the zeroth law of thermodynamics article instead. PAR (talk) 16:08, 24 March 2011 (UTC)
Minor grammatical question
Thank you Afedor for your kind and necessary corrections. But the last one is a matter of taste. An old tradition, discussed in Modern English Usage (H.W. Fowler, Oxford University Press, London, 1926/1954) under the heading Fused Participle, would put in the 's as I have done. The old story is that the word basing is here used as a gerund. Perhaps some will regard the old tradition as obsolete.Chjoaygame (talk) 22:01, 12 June 2011 (UTC)
perpetual motion machines
Perpetual motion machines of the first kind endlessly produce work without otherwise altering their surroundings. They are impossible. This is not related to their temperature. The edit was mistaken, and its maker should not try to restore it.Chjoaygame (talk) 19:31, 31 October 2011 (UTC)