Talk:Entropy/Archive 8


Nice 1872 Boltzmann article page

I found this nice page about Boltzmann's contribution to science:

This rare article costs $3,600; I'll pitch in the first ten dollars. --Sadi Carnot 04:56, 22 November 2006 (UTC)

Coming back to look at the Entropy article after a long hiatus, I am refreshed by Sadi's light comment (although still depressed by the size of the article and the discussion, prior to Bduke!) As Sadi undoubtedly is well aware, an English translation of the 1872 article is available on pp. 88-175 of Volume 2 of "The Kinetic Theory of Gases: An Anthology of Classic Papers with Historical Commentary" by Stephen G. Brush; edited by Nancy S. Hall. London: Imperial College Press; River Edge, NJ: distributed by World Scientific Pub., c2003. This is evidently a reprint of the 1965 Brush publication by Oxford. (The 'Boltzmann equation' to which the ad for the rare article refers is Boltzmann's H equation, of course, not Planck's 1900 or 1906 versions of the equation on Boltzmann's tombstone.) FrankLambert 06:02, 23 November 2006 (UTC)

A review of the whole article

After being away from this for a while, I have been trying to think about the whole article. There has been a lot of good and exacting work, but I am afraid the article has many faults. It is very unclear, difficult to understand and confusing. Where can we begin? Well, an obvious practical place is the warning we get when we try to edit the article:-

Note: This page is 58 kilobytes long. It may be appropriate to split this article into smaller, more specific articles. See Wikipedia:Article size.

Yes, it is far too large, yet we already have many more specific articles and we refer to them at the top of many sections with "See article: XXX". Yet these headings are followed by far too much material that is often duplicated in the specific article. There should, in these sections, be no more than two sentences that just allow the readers to decide whether they should move on to read the specific article. The "Arrow of time" section is how it should be. All the others are way, way too long. We should also avoid any equations in these sections and perhaps everywhere. These sections should be trimmed down to one or two sentences with possible changes to the specific articles:-

  • History
  • Macroscopic viewpoint
  • Microscopic viewpoint
  • Entropy in chemical thermodynamics
  • Second Law
  • Entropy and information theory
  • Entropy and Life
  • Entropy and cosmology

Approaches to understanding entropy should perhaps stay, but it needs rewriting. The section on open systems needs a specific article for this material, with just a couple of sentences to point to it, as for the others. The section at the end on other relations needs careful thought. It could even be deleted.

So the idea is to spawn off more-specific articles, not less-specific ones. If we do what I suggest above we will save a great deal of space and can then merge in the introduction to entropy article and work on getting a clear, easily understood introduction to the article. I have thoughts on the introduction article as it now stands, but I will leave them for now except to say that I do not think it is clearly understood.

If people agree, let us work on each section in turn, discussing here the wording of the one or two sentences along with what needs to go into the specific article. Then we can move to the introduction.

We need rigour but we need clarity. The emphasis in this article should be on clarity with no lack of rigour but the main rigour moved to more specific articles.

Let's have your views. --Bduke 21:09, 22 November 2006 (UTC)

A good idea might be to change this "Entropy" page into a gateway to all kinds of entropy, and keep thermodynamic entropy in the Entropy (thermodynamic views) article. The thermodynamic views article would still be rather large, but could give more space to an explanation before handing off to another article. I agree that the article size needs to be reduced. By the way, I very much disagree with splitting the macro and micro into separate articles. The macro article would be blind, offering no intuitive understanding of entropy, and the micro article would be lame, having little connection to hands-on reality. It would be like having two articles about the penny, one about the heads side, the other about the tails.
Although it is a small point, I agree with you about not splitting the macro and micro, but I note that the two sections have different main articles. I will add one of them that you missed to your very useful table. More general stuff to follow below. --Bduke 03:13, 23 November 2006 (UTC)
Let me offer an expanded list of thermodynamic entropy-related articles, existing and proposed. Please add to it and remove it from here, if anyone wants to change it. PAR 22:40, 22 November 2006 (UTC)
  • Entropy (this article)
  • Entropy (thermodynamic views)
  • Introduction to entropy
  • Second law of thermodynamics
  • Entropy (arrow of time)
  • Entropy and life
  • Entropy and cosmology
  • History of entropy
  • Entropy of mixing
  • Entropy (energy dispersal)
  • Entropy (order and disorder)
  • Principle of maximum entropy
  • Maximum entropy thermodynamics
  • Entropy in thermodynamics and information theory
  • Entropy unit
  • Gibbs entropy
  • Boltzmann entropy
  • Tsallis entropy
  • Black hole entropy
  • Residual entropy
  • Loop entropy
  • Conformational entropy
  • Entropic force
  • Free entropy
  • Entropic explosion
  • Entropy change (redirects to Standard entropy change of vaporization)
  • Sackur-Tetrode entropy
  • Entropy in chemical thermodynamics
  • Entropy (statistical views)

I think the majority of minds [including mine] are too small to summarize the article in its current state. The immediate wrong I see in the article is the phrase "In thermodynamics" at the beginning of the article, which implies to readers that the topic is basically about thermodynamics. However, in fact, not all of the article talks specifically about thermodynamics. The topic, I believe, should be subdivided according to general fields of science such as astronomy, biology, chemistry, computers, physics, sociology, etc. Kmarinas86 22:42, 22 November 2006 (UTC)

I disagree. Entropy in all natural sciences is about thermodynamics. It is not different. --Bduke 03:13, 23 November 2006 (UTC)
From what I've seen, most of these fields adopt variations on either thermodynamic entropy or information entropy: I'd like to see this article concisely outline the main fields in a way that ordinary readers can grasp, and include sentences briefly relating the applications such as sociology to the main field concerned. Detail should then go in the relevant articles which will be linked. Entropy (thermodynamic views) at present is pretty impenetrable, there may be a case for a separate Thermodynamic entropy article .. dave souza, talk 23:17, 22 November 2006 (UTC)
Of course, entropy started out as thermo -- then, because of the intellectual incapacity to understand it as an abstract and the human desire for order, the definition of entropy relying on chaos and disorder and functionally related gibberish was born and appropriated by a number of other disciplines. Sigh. •Jim62sch• 01:22, 23 November 2006 (UTC)

Solution is simple

The solution is to keep this page very simple, and to use it as a WP:summary page such that all the details are contained on other main pages--Light current 22:47, 22 November 2006 (UTC)

The disambiguation page refers to the entropy article as "thermodynamic entropy". Yet the article goes beyond thermodynamics. I believe the article should be inclusive of the fields of science which talk about entropy and reject the image of being just an article on thermodynamic entropy. Kmarinas86 22:55, 22 November 2006 (UTC)
Yes hence the summary page idea! 8-)--Light current 23:02, 22 November 2006 (UTC)
The whole thing needs to be redone.....from intro to outro. However, that's virtually impossible at the moment. •Jim62sch• 23:09, 22 November 2006 (UTC)
BTW, it shouldn't be a "summary page", it should be an introduction to entropy -- it needs meat on the bones (a summary is bones). At the moment though, the meat is fracid, and the bones protruding through the rent flesh. •Jim62sch• 23:12, 22 November 2006 (UTC)
Wikipedia:Summary style#Levels of desired details sets out the aim pretty well:
  • many readers need just a quick summary of the topic's most important points (lead section),
  • others need a moderate amount of info on the topic's more important points (a set of multi-paragraph sections), and
  • some readers need a lot of detail on one or more aspects of the topic (links to full-sized separate articles).
Thanks for that. I have been looking for it. It is exactly what we should be considering. --Bduke 03:13, 23 November 2006 (UTC)
This article should include a moderate amount of info aimed at giving an ordinary reader a basic understanding, with the more specialist and maths-heavy stuff in the linked articles. .. dave souza, talk 23:28, 22 November 2006 (UTC)
Couldn't agree more!--Light current 23:36, 22 November 2006 (UTC)
OK, that works. As it is now, this article is worse than it was 2 months ago. •Jim62sch• 23:45, 22 November 2006 (UTC)
Light current, how about a middle ground?  ;) I've asked three friends who are unbiasedly honest (one of my prerequisites for friendship), and who are not wikipedians to read this and give me their opinions. Bottom line being that I might be too involved to be objective. •Jim62sch• 00:05, 23 November 2006 (UTC)

I agree with Dave Souza that the Entropy (thermodynamic views) article can use improvement. Would anyone mind if I edited it as if it were THE thermodynamic entropy article? I would like to take the present page, pare it down, and incorporate it into the Entropy (thermodynamic views) page. People will then feel freer to pare down the "Entropy" page, without worrying that some good thermodynamic-entropy stuff will be lost. PAR 00:39, 23 November 2006 (UTC)

I think that begs the question somewhat. We do not want to lose material that is good, but we ought to get the main article right first. Let us see if we can get some consensus first. We have already written too much. We need to reflect on what to change, what to delete and what to expand. Just rushing off and writing even more could lead to even more confusion. In physical and natural sciences entropy is thermodynamic entropy. Everything in the article is about thermodynamic entropy except for the material on informational entropy. Informational entropy is not core to the idea of entropy. Indeed its links are via statistical mechanics and the Boltzmann Equation, not bulk thermodynamics. I think it should be left out of the entropy article, which should have something like this explanation at the top:-

This article is about the thermodynamic concept of entropy in the physical and natural sciences. For details of informational entropy see informational entropy.

Indeed, I feel a lot of the problems with the article arise from the inordinate fondness of some editors of this article for informational entropy. It leads to a lack of focus and a resulting lack of clarity. Let us bear in mind that for almost all students who meet entropy in a science class, except perhaps for a few mathematically inclined physics students, it is the most difficult concept they ever come across. We cannot afford to make it more difficult for them. If we stuck with thermodynamic entropy, we could write a good clear article. Even then it needs to be a summary article with a lot of clarity, leaving the more in-depth stuff to the other articles. --Bduke 03:35, 23 November 2006 (UTC)

I reverted the recent destruction of the order of the article. I restored Kenosis's edits and Jim62sch's additions of fact templates, but not Jim62sch's laundry templates, since he did not revert their removal in later edits. As for information entropy, I agree this is not the place to develop the concept of information entropy, but it IS the place to describe the application of information entropy to enhance the understanding of thermodynamic entropy. I will work to tighten that up, if certain editors who think I inserted that stuff out of an "inordinate fondness" for info entropy will please try to understand the link between the two. Please read the section "similarities between the two" - really - does this add or detract from the understanding of entropy? PAR 05:15, 23 November 2006 (UTC)
We are clearly not on the same wavelength. I think it does add to the understanding of entropy, so it should be on WP, but not here. It is just too far into the learning process to be here. This article should just say that a concept of information entropy has been introduced and it does have links to thermodynamic entropy and then point to an article that develops what you say. Ask yourself this question - "Who will benefit from this knowledge and how much other knowledge do they need before they can appreciate it?". --Bduke 05:59, 23 November 2006 (UTC)
Well, if you have read it and still disagree, let's not get all twisted up in that argument. What do you suggest for the Entropy page? I'm still not clear on that. Should it be a gateway to all types of entropy, where the main discussion takes place on other pages, or should it be an "introduction to thermodynamic entropy" with the harder stuff on another page, or what? Could you outline a proposal? Anybody who expects your proposal to be taken seriously will be reluctant to edit until there is a clear consensus. Let's get on with it. PAR 06:27, 23 November 2006 (UTC)
I do not think we are in that much of a hurry. I made a number of proposals above and, yes, they have a few differences between them. I would like to see whether other people respond. As you say, let us try to get consensus. I will expand my ideas in more detail tomorrow. What I have said, however, and I hope it is fairly clear, is that we should cut back the sections that already point to a main article to no more than two sentences, leaving those main articles to take the weight. I have also said that I would prefer information entropy not to be mentioned other than as a disambiguation at the top. Thus I believe this article under this simple name of "Entropy" should be about core entropy - that is, thermodynamic entropy in the physical and natural sciences. I have also said that the "Introduction to Entropy" article should disappear, as this is the place to do the introduction. We should then try to write the introduction in a clear non-technical style that hopes to help readers to understand this extremely difficult concept. As a final point, I am curious. How many of the editors who regularly work on this article have actually taught entropy to students who are beginning the journey to grasp the concept? --Bduke 06:56, 23 November 2006 (UTC)
I would be fine with such a revision to the article: as I stated earlier, thermo gave birth to entropy. We need this to be helpful to our core audience, (primarily) high school students who are learning and trying to understand the concept of entropy; overloading them with conflicting definitions and the myriad disciplines that have appropriated entropy to serve their own ends can only serve to confuse the issue for our audience. Also, as Dave had pointed out -- much of the math can be deleted from this article -- it is not helpful to one trying to grasp what entropy is. •Jim62sch• 13:48, 23 November 2006 (UTC)
To Bduke - There are a number of concepts that have been discussed here that are contentious. These include energy dispersal, order/disorder, and now information theory applications. We have more or less settled on the idea that contentious issues should be presented as such - the arguments pro and con should be honestly presented without confusing them with widely accepted viewpoints. Any other viewpoint means war. The fact that you think information entropy should be removed from the article conflicts with my idea that it is a vital part of understanding thermodynamic entropy. This disagreement is reflected in the literature as well. Therefore, the topic should be included, not as received wisdom, but as a point of contention, with the argument briefly and honestly outlined, before linking to a more detailed page.
Until we gain a consensus on this page, I will combine the "Entropy (thermodynamic views)" and "Entropy (statistical views)" into "Entropy (thermodynamics/statistical mechanics)" and put higher-level material there, but without subtracting it from the "Entropy" article. I hope that is not a problem. I will maintain the distinction between contentious and non-contentious viewpoints. PAR 15:25, 23 November 2006 (UTC)
PAR, I strongly suggest you do not carry out such a merger. The two pages "Entropy (thermodynamic views)" and "Entropy (statistical views)" are already long enough and complicated enough. Having them as separate pages is a useful factoring. Putting the two together would be IMO overkill for one page - more content than any article should be expected to stand. Jheald 20:43, 30 November 2006 (UTC)
Yes, I've come to the same conclusion. PAR 22:29, 30 November 2006 (UTC)
PAR, you appear to be missing the point. Info theory entropy is not necessary to an understanding of entropy at a basic level. I would suggest you do nothing until we decide how we are going to proceed. The combination you propose is not particularly appealing to me as it seems less than helpful, and will likely only serve to detract from our mission of explaining entropy to our audience. You need to remember, PAR, we are not here to write for ourselves, or to show off how much we know -- we are here to provide a service. Key to that provision is clarity in the communication of seminal ideas. •Jim62sch• 15:49, 23 November 2006 (UTC)
I agree completely with your last two sentences. PAR 15:52, 23 November 2006 (UTC)

The time delay means I miss a lot of debate while sleeping! I agree almost entirely with Jim62sch. I am not sure about teaching entropy to high school students. I do not think that happens in this country. I think we are talking about early undergraduates and particularly those who do not have a strong mathematics background. PAR, I do not disagree about treating informational entropy as you suggest. I disagree about doing it in this article rather than a sub article. However if it eases consensus, I agree to a section on information entropy which points to the longer article. It can mention the controversy but it must be very brief, and the thrust of this article should be thermodynamic entropy.

Now for something new, that I hope illustrates what I believe should be the approach to this article. The very first sentence is "In thermodynamics, entropy is an extensive state function that accounts for the effects of irreversibility in thermodynamic systems, particularly in heat engines during an engine cycle.". This was added in front of a couple of sentences that I added discussing entropy in relation to the 1st and 2nd laws to give context. What can one say about that first sentence? Well, I say that it is completely correct, but completely off-putting to the students mentioned above and completely unclear. It is normal for the first para to have some links to put material in context. Biographies have links to the subject's profession and to dates and places, for example. The links to the two laws and to energy do put entropy in context. The first sentence has 7 links. Only the first puts entropy in context. It is about thermodynamics. The other 6 links are all to more advanced concepts that the student will not understand. Do they really have to go off to find out about 'state function', 'extensive', 'irreversibility' and so on before they get past the first sentence? Of course it has to be mentioned somewhere, but it should be later. I suggest we move that sentence down and, since people objected earlier to starting the article with 'While', we should use one of the alternative wordings that were discussed here at that time.

I will try to suggest much briefer wording of some of the sections that link to other 'main' articles, but I am busy. We have an election here tomorrow and I'm involved with that. --Bduke 21:19, 23 November 2006 (UTC)

Thanks for taking time out from the election to discuss this, your comments basically look right to me. I've no objection to specialist terms or mathematical equations being used where they aid understanding, subject to their intended meaning first being spelled out in plain English for the uninitiated. Readers can't be expected to know mathematical symbols or have any grasp of calculus. Regarding the earlier search for a basic description, this article on Undergraduate students' understandings of entropy and Gibbs Free energy includes "Table 1: Scientifically accepted statements about the concepts of entropy and Gibbs free energy" which is aimed at chemistry students, but may be of some use. ... dave souza, talk 22:13, 23 November 2006 (UTC)
Thanks to both of you for your comments. (I did have entropy in 10th grade Chem class, but I was in an AP class so I guess that's why). In any case, I guess the age of the student isn't really relevant, as Brian alluded to, but the level of understanding is. To be really odd and use a cooking analogy, this article starts off as a recipe for Beef Wellington, without explaining at all what filet mignon, pâté de foie gras or duxelles are. •Jim62sch• 23:12, 23 November 2006 (UTC)
I don't understand anything about entropy. I came to the page looking for an explanation. Immediately I was in right over my head. I am prepared, however, to suggest ways in which this large subject can be dissected to make it far more digestible for people like me. You only need to ask me! 8-)--Light current 23:17, 23 November 2006 (UTC)

Sorry, but starting with a correct, to-the-point sentence is the sine qua non of an encyclopedia. Note that:

  • hyperlinks can guide the uninitiated to make sense of a technical statement
  • more "this means"-style explanations can follow the definition.

On the issue of what would be the ideal structure of this and related articles, I'm strongly opposed to both extremes:

  • only giving a limited POV in the main article Entropy, as is sometimes argued, e.g. that information entropy shouldn't be here at all
  • only giving a meager intro and having 30 sections of "main article: nada, nada" (+ a very, very short summary of that topic).

Instead IMHO this article should give

  • a correct definition
  • some down-to-earth explanation of the definition, without lying to children
  • some history and name-dropping
  • short, but not stubby sections of the main branches of entropy (thermostatics, statistical, quantum, information)

That's it. Then the next tier of articles should give deeper explanations of these branches and themselves link to even more specialised articles.

Pjacobi 23:50, 23 November 2006 (UTC)

"Sorry, but starting with a correct, to-the-point sentence is the sine qua non of an encyclopedia". It is of course in one sense difficult to disagree with this, but nevertheless, I do disagree and disagree strongly. The first sentence is far too complex for almost anyone starting this article. The first sentences should be correct but also informative, giving a clear idea of what the subject of the article is about. This one just drops people straight in. Above I asked "Do they really have to go off to find out about 'state function', 'extensive', 'irreversibility' and so on before they get past the first sentence?". Do you seriously think the answer to this question is "Yes"? If the answer is "No", that sentence should be lower down. It is not good enough to confuse people in the first sentence and then go on to say what it really means. Have you taught entropy to beginning students? I ask not to press my experience but just to make the point that we all have to put ourselves in the shoes of the reader. Entropy is very, very difficult for most people.

I do not think I really disagree with the rest. It is a matter of degree. This article is already too long. There are a lot of things to cover here if we are going to get a sensible set of branches down to more specialised articles. I do however disagree with the order of the first two bullet points. We should state where entropy fits into the scheme of things and why it is relevant. We should then define it clearly and simply and then give a little expansion of the definition. In an encyclopedia article on a person or place, the reader can immediately see what the person did and why they are important, or where the place is and what it is important for. With complex scientific ideas they usually cannot do this. They cannot pick up a definition and run with it. We are having to teach them about it. --Bduke 00:53, 24 November 2006 (UTC)

I have reduced the information entropy section to sound more like an introduction to the concept. I hope I have represented the pros and cons properly. Almost all of the information removed has been inserted in the information entropy page or the history of entropy page. PAR 16:45, 2 December 2006 (UTC)

A fresh view

With at least some of you (!), I welcome the counsel and guidance of a seasoned teacher of entropy, Bduke, on an article that should correspond to its theme, thermodynamic entropy. There have been 202 print pages of Talk:Entropy since July 2005. The article is almost as ridiculously swollen. Bduke’s experience with real-time student reactions of confusion, discouragement, and anger about their difficulties in trying to understand thermodynamic entropy could assure a moderate and useful Wikipedia Entropy article.

Certainly, the first sentence is totally absurd as an introduction. The referent words - the re-introduction of heat engines and engine cycles to naive Wikipedia users - must be completely deleted. Thus, why not just begin the first sentence with "The concept of energy is central to the first law of…" but end it with "…physical and chemical processes and whether…" The last sentence is better ended as "…dispersal of energy in processes."

As eloquently expressed by Jim62sch below, re “intermolecular molecular frictions..”, we owe much to HappyCamper for deleting it! FrankLambert 23:30, 23 November 2006 (UTC).

I have been very busy on many other matters and have not had much time to look at this article. However, at User:Bduke/Entropy I have taken what is now a slightly old version, and started the process of cutting it back to be much shorter. I hope this gives a clearer indication of what I was proposing. The sections that have another "main article" have been shortened, but they could be shortened further. Let me be quite clear - I do not think the material I have cut should disappear from WP. It needs to be integrated into the relevant specialised articles. My shortened sections do need some more work. I have shortened other areas and removed stuff that is just too complex for here. There are now many fewer equations, and that is how it should be. I have moved together all the various definitions. I repeat my earlier view that I think Introduction to entropy should be merged here. This should be the portal into all things entropy and there should not need to be a simpler article. It is our job to make this both an introduction and a portal into all areas of entropy. My effort needs more work, and I am sorry I have not been able to do that. I thought however that I should draw your attention to it before it gets too old. I believe that my shortened, tighter effort is the basis for moving this article towards GA or FAC. Please give it serious consideration. --Bduke 20:57, 4 December 2006 (UTC)

I guess I am the appointed defender of the link between information entropy and thermodynamic entropy. The connection I am defending can be stated this way:
"If each "possibility" has an equal probability of occurring, then the Shannon entropy (in bits) is equal to the number of yes/no questions you have to ask to determine which possibility actually occurred. The thermodynamic entropy, as defined by Boltzmann, is equal to the number of yes/no questions you have to ask to determine the microstate, given that you know the macrostate, multiplied by Boltzmann's constant times ln(2)."
I know, I know, your eyes just glazed over. BUT - If you have a problem with my defense of this statement, then you need to read it, understand it, and show where it is wrong.
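To see the arithmetic rather than argue about it, here is a minimal sketch in Python, assuming W equally probable microstates (the numbers are toy values, not taken from any source):

 import math

 kB = 1.380649e-23   # Boltzmann's constant, J/K
 W = 1024            # number of equally probable microstates (toy value)

 questions = math.log2(W)                         # yes/no questions to pin down the microstate
 S_boltzmann = kB * math.log(W)                   # S = kB ln W
 S_from_questions = kB * math.log(2) * questions  # kB ln(2) times number of questions

 print(questions)                                    # 10.0
 print(math.isclose(S_boltzmann, S_from_questions))  # True

Whether that identity amounts to an understanding of thermodynamic entropy is, of course, exactly what is in dispute here.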
I cannot and will not defend my strong feeling that my understanding of thermodynamic entropy is enhanced by the above statement, particularly the understanding of Gibbs paradox and the entropy of mixing, because this feeling is not quantitative, not testable. However, this viewpoint has support in the literature (Jaynes, Balian, etc), and needs to be clearly stated in the article.
I believe I understand the informed objections that people have to making a link between the two. The entire dynamics of entropy through its relation to energy and temperature is missing and so the concept of thermodynamic entropy is much larger than just its informational entropy connection. This objection is likewise not quantitative and is therefore indefensible, but it has support in the literature, and needs to be clearly stated in the article.
So, focussing on this narrow subject, I would say that the treatment of information entropy in BDuke's new version is lacking. Consider the statement:
"The thermodynamic interpretations, generally, differ substantially from the information theory interpretation and are only related in namesake, although there is not complete agreement on this issue."
This shows a large amount of prejudice against information entropy. It could have just as easily been written:
"The thermodynamic interpretations and the information theory interpretations are identical, although there is not complete agreement on this issue."
both of which I object to. In addition, in BDuke's section "approaches to understanding entropy" the subsection on information entropy is the shortest and most useless one of all. It should at least give a flavor of what the argument is, like the other subsections do. Now BDuke took this from the subsection that was way too large, from November 27. Could anyone look at the present version of this section to see if it meets the criteria for providing a brief but balanced view of the concept? PAR 05:49, 5 December 2006 (UTC)

I really do not think the issue is whether there is a link or not between thermodynamic entropy and information entropy. I think the fact that it is disputed indicates it should not be here in great detail and that the issue should be discussed elsewhere.

Yes, what I left is taken from the article on that date. Please feel free to rewrite the brief section to be a better description of the connection and what information entropy is. As I said improving all of these shortened sections is just something I have not had the time to do.

Finally I am disturbed that you keep coming back only to this one issue. My concern is to shorten and clarify all sections and make the overall article more readable and more understandable. Informational entropy is just one of many issues. Let us concentrate on the big picture. --Bduke 06:11, 5 December 2006 (UTC)

You and Sadi Carnot and Frank Lambert disagree with me on information entropy, so that gets diminished. Frank Lambert and I disagree with Sadi Carnot on order/disorder, so that gets diminished. Sadi Carnot and I disagree with Frank Lambert on energy dispersal so that gets diminished. I don't agree. These disagreements should be reflected in the article so that a reader can see that there is disagreement, and be motivated to study both sides. The sections which reflect disagreement can be briefly written, and level headed editors dedicated to NPOV can prevent these sections from degenerating into an extended argument. I keep coming back to this issue because it is about the only aspect of your rewrite that I strongly disagree with. Sorry if I left that out in my rant. PAR 06:57, 5 December 2006 (UTC)
Thanks for that. If that is the only part of my rewrite you disagree with, then we are indeed making progress. I am not even sure I disagree with you on information entropy itself. I really do not have a view. I do have a view that this article should make an extremely complex topic as simple as possible. That means that disagreements should be underplayed relative to core material simply described. I then make a judgement that disagreements about energy dispersal and order/disorder do need to be addressed, as they impact directly on how entropy is described in textbooks for physics and chemistry students. I also make the judgement that the disagreement about information entropy is not that central and would be best dealt with elsewhere. I certainly have never seen a mention of it in a physical chemistry text and suspect that it would be given short shrift by a chemistry professor using a text that did cover it to teach a course in physical chemistry. Can you show physics texts that find information entropy as helpful as you do? If not, what sources do you have to suggest that it is helpful? If you think my rewrite has promise in other areas, why not have a go at rewriting other areas to follow my lead? I would however like to hear what others say about it all. --Bduke 09:45, 5 December 2006 (UTC)
The fact that it would be given short shrift by a chem professor doesn't set off any alarm bells in my mind, although perhaps it should. I understand that you are a teacher, so you see this article as a teaching aid for students. I am a researcher and I tend to see it as a quick reference source for researchers. We are both partially wrong and we should realize that. It can and should be both.
I apologize to anyone who has heard me say this a thousand times before, but IMHO the best thermo book bar none is "Thermodynamics and Thermostatistics" by H. Callen. It is the most cited thermo reference in the world, and someday I hope to understand half of it. It has a section on Shannon entropy as an introduction to the order/disorder concept (which I disagree with). The best reference on the info/thermo entropy connection is Jaynes' two 1957 articles, "Information Theory and Statistical Mechanics", Parts 1 and 2. The first article I read was "Gibbs Paradox". It is perhaps more approachable and could be read first.
When you say "have a go at rewriting" I assume you mean rewrite parts of the entropy article with your user page as a set of suggestions. That sounds good. I will wait a while to see what other opinions are. PAR 15:46, 5 December 2006 (UTC)
Let me deal with these three points separately. (3) - entirely agree. It should not be just you and I discussing this. On (1), I have been both a teacher and a researcher. That is, I was an academic. Now I am a part-time researcher. Perhaps we are both partially wrong, but I see the researcher as going to the many more specialised articles on entropy that we already have. Entropy is difficult. This is the portal article to entropy. It should be understandable. It can still have the links for the researcher to move on down to the other articles. On (2), I do not see any of your references being texts that a student starting to learn about entropy would read. So we should be looking at the general physics texts and the general physical chemistry texts, although I guess Frank would want us to also look at the 1st year general chemistry texts. It sounds like Callen should be a good reference for the more specialised articles. If I had come across a text on physical chemistry that I had to use (sometimes you cannot choose your own text) and it included informational entropy, why would I give it short shrift? Simply because I would already not have enough time to get the basics of this difficult subject across to the students. --Bduke 20:41, 5 December 2006 (UTC)

More pruning and explanations

Firstly, thanks for tackling the major task of making this article understandable. Sorry it's been a bit hectic lately at souza acres, so haven't had time to do more than glance at it.

My first impression is that the lists of definitions have to be removed: if they provide useful explanation, that should be incorporated into the text. From memory, many of them don't even appear in the detailed articles, where they belong. The disorder / dispersal and topics in entropy sections also need severe pruning to focus on an elementary explanation. Secondly, as you say, some work is needed to improve the brief explanations: in particular no previous knowledge of maths symbols can be assumed, so for example Δ needs to be described in writing, and ln explained. Agree with PAR that a very brief explanation of the information entropy / statistical position is needed. Will try to find time to comment in more detail, but rather tied up for a while, .. dave souza, talk 09:19, 6 December 2006 (UTC)

What????

"via intermolecular molecular frictions and collisions" •Jim62sch• 23:45, 22 November 2006 (UTC)


"The more states available to the system with higher probability, and thus the greater the entropy." <- is this supposed to mean anything? ;)

Quotes and humor

This isn't Wikiquote. Is there any reason we need a humor-quote section in an article on a serious science topic? I'm unsure if this was the result of some previous consensus discussion or something, so I thought I'd inquire before simply pulling the material. Serpent's Choice 02:01, 9 January 2007 (UTC)

Section removed, though I'm open to discussion regarding means to include the content elsewhere if someone so desires. Serpent's Choice 07:52, 10 January 2007 (UTC)

Definition

Maybe I'm just silly, but I don't feel there is an actual definition of entropy here. The article states that entropy increases or decreases in a given system, and that it is essential to the 2nd law of thermodynamics. But what IS it? If it is already there and I missed it, can somebody point me to it? Thanks 201.134.106.227 Alex B.

Here's a somewhat nontechnical definition from the article: "Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed." - JustinWick 21:26, 23 January 2007 (UTC)
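A worked instance of that smoothing-out, for a quantity of heat Q passing from a hot body at temperature TH to a cold one at TC (a standard textbook case, not taken from the article):

 <math>\Delta S = \frac{Q}{T_C} - \frac{Q}{T_H} > 0 \quad \text{when } T_H > T_C</math>

The increase shrinks to zero as the two temperatures approach each other, which is the sense in which entropy tracks the progress of the smoothing.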

Def of Generalized Entropy. From this article I can't tell what Generalized Entropy is (and I would like to know). What is being generalized?

Any truth in this article?

http://www.1729.com/blog/SecondLawDoesntProhibitEntropyDecrease.html

Any truth in that article, and if so, could it somehow be incorporated into the entropy entry? —The preceding unsigned comment was added by BrandonR (talkcontribs) 21:43, 3 April 2007 (UTC).

It looks dubious, although it is true that statistical mechanics is, well, statistical. However, my "crank alert" goes off with this:
It is this myth of absolute prohibition that underlies the persistence of the creationist second-law-of-thermodynamics argument against the theory of evolution. Until the defenders of evolution recognise that evolution does require the occurrence of entropy decreases within closed systems, and that the Second Law does not prohibit these decreases, this argument will not fade into the obscurity that it deserves.
The second law has little to do with evolution because the earth is not a closed system. —Ben FrantzDale 22:49, 1 May 2007 (UTC)

Awful

Why is the page perpetually terrible? The two definitions of entropy are unbelievably difficult to find in the intro - the statistical definition being findable, the other.. almost not being there at all... Fresheneesz 22:08, 17 April 2007 (UTC)

Formula

Is there a formula for (relative) entropy like there is for temperature?

 <math>\frac{3}{2} k_B T = \left\langle \tfrac{1}{2} m |\vec{v}|^2 \right\rangle</math>

That is, if I have a molecular simulation and know the velocities of all atoms, then I know the velocities of all atoms at another point in time, can I find ΔS? —Ben FrantzDale 22:42, 1 May 2007 (UTC)
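The temperature half of that question, at least, is standard; a minimal sketch (the mass and velocity array are assumed toy values):

 import numpy as np

 kB = 1.380649e-23                            # Boltzmann's constant, J/K
 m = 6.63e-26                                 # particle mass in kg (roughly argon; assumed)
 v = np.random.normal(0.0, 350.0, (1000, 3))  # toy velocity samples, m/s

 N = len(v)
 kinetic = 0.5 * m * np.sum(v**2)             # total kinetic energy
 T = 2.0 * kinetic / (3.0 * N * kB)           # equipartition: (3/2) N kB T = KE
 print(T)

The ΔS half is the hard part: entropy depends on the distribution over microstates, not on a single snapshot of velocities, so no comparably simple formula exists.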

A nice external page?

I have just run across this page, The Laws of Thermodynamics by one John S. Denker, which I think would make a particularly useful "External link". IMO the page is unusually clear and well-written, and would be particularly useful for intermediate-level readers, who may have come across quite a lot written quite loosely about entropy and heat and thermodynamics. I think it could be particularly useful to them, in presenting rather cautiously and carefully what is fundamentally true about entropy, and clarifying it and separating it from notions which are only mostly true, which readers may have picked up from more glib introductory presentations.

In that respect I believe it fits rather well WP's key criterion for external links, of information which is accurate and on-topic, but presented at a level of detail (and in this case perhaps a level of technical nicety) which is more than is appropriate for WP's own article. I think it fits the criterion of a useful additional external source over and above the content which a Wikipedia "Good Article" might be expected to contain.

So, what do people think of the link, both in its own right, and as a candidate for the "External Links" section of this article? -- Jheald 13:49, 22 May 2007 (UTC)


But only for information entropy, not for thermodynamic entropy!

Merely to scroll down this enormous site takes 4 minutes and 40 seconds, but using Denker's 'Contents' (reached in 15 seconds :-)) to quickly access Section 20 "The relevance of entropy" near its end, you will discover his unique ranking of the importance of entropy to various fields of interest: "1. cryptography...4. librarianship...11. chemistry and chemical engineering". A brilliant physicist and published author in aeronautics, John Denker for many years has been writing his own sometimes good, sometimes awful version of information entropy. He does not believe in the relevance of energy to thermodynamic entropy. He says that the Third Law of thermo does not exist. He cites no peer-reviewed publications of his in the field of entropy, nor a single text or any kind of book by others that cite his writing. Thus, to cryptographers or librarians -- but not to those concerned about thermodynamic entropy in chemistry or physics -- Denker could be recommended. FrankLambert 01:26, 8 June 2007 (UTC)

I did not read that section as a ranking. He states that those nearer the top are more informational entropy and those nearer the bottom more thermodynamic entropy. Therefore it is quite proper for chemistry to be at the bottom as the least concerned with informational entropy. I share your other concerns, however, after a brief read of an enormous document. --Bduke 03:11, 8 June 2007 (UTC)
Are we talking about the same page here? Denker's page seems absolutely rooted in physics. That page is overwhelmingly a page about entropy in thermodynamics. I don't see any substantial material on that page that is not relevant to entropy in physics. Jheald 09:00, 8 June 2007 (UTC)

Extensive?

Entropy gap. Some models may cast some doubt, but not as "extensive" as this article makes out. A citation is needed if that "extensive" sentence is going to stay in this article; no citation, no sentence about "extensive doubt".—Preceding unsigned comment added by 74.213.94.207 (talkcontribs) 06:18, 18 June 2007

Exact Differential

There is an error in the integral defining the entropy difference: even if dS is an exact differential (if the transformations are reversible, that is), dQ is NOT an exact differential, and this should be indicated by a barred "d" or a "delta" instead of a simple "d" in the formula. The best would be a barred "d" like the "\dj" in the MiKTeX distributions. I cannot make this correction myself because I don't know LaTeX that well :) —Preceding unsigned comment added by 89.97.102.194 (talk) 09:22, 22 September 2008 (UTC)

According to my knowledge of thermo, there is no such thing as ΔQ, as we can't explain the expression "Q2 − Q1". So using Q1,2 or 1Q2 is, or at least should be, more scientifically correct. --E147387 (talk) 08:42, 27 November 2008 (UTC)

In physics and mathematics we call this "abuse of notation". Abuse of notation is very common in the physics and math literature. Since Wikipedia is supposed to reflect the literature, there is really no problem. What we should do is explain that "dQ" is not to be interpreted as the differential of a function "Q", as such a function does not exist. It merely represents an infinitesimal amount of heat added to the system. Count Iblis (talk) 14:45, 27 November 2008 (UTC)
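For concreteness, the convention being asked for, with δ (or a barred d) marking the inexact differential:

 <math>dS = \frac{\delta Q_\mathrm{rev}}{T}</math>

Integrating δQ over a cycle need not give zero, so there is no state function Q for it to be the differential of, whereas dS is exact.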

Error?

Could someone please correct this: Failed to parse (unknown function\DeltaS): \Delta G=\Delta H - T\DeltaS ? —Preceding unsigned comment added by 81.180.224.38 (talk) 04:39, 22 September 2008 (UTC)

The following code will do what you want. Math markup is in LaTeX form.
 <math>\Delta G=\Delta H - T \Delta S</math>

which renders as ΔG = ΔH − TΔS.

Note the space is required to separate the command \Delta from the S; otherwise LaTeX tries to interpret a nonexistent command "\DeltaS".

New intro

Rudolf Clausius' 1879 book (2nd ed.) Mechanical Theory of Heat (see page 107 for the beginning of the entropy discussion) is now available in Google Books. Thus, I have started updating the intro to the correct presentation, i.e. in Clausius' own words. --Sadi Carnot 21:36, 30 July 2007 (UTC)

I have reverted your last change which was:

"In physics, entropy, symbolized by S, from the Greek τροπή meaning "transformation", is a mathematical function that represents the measure of the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state.[1] In short, entropy is a variable that quantifies the effects of irreversibility in natural processes."

Your paragraph is all true, but it is quite unintelligible to the average reader and far too concise. You also left the lead far too short and no longer a summary of the whole article. However, I recognise that you probably were going to add something more. Also, entropy is no longer what Clausius wrote. We should be describing entropy as it is now understood and used, not its historical roots. Please stop and discuss your changes here. --Bduke 23:28, 30 July 2007 (UTC)

Current lead

Bduke, all I did was move the bulk of the lead to an "overview" section. The current lead paragraph (which is completely unreferenced), shown below, is filled with errors (especially the etymology):

The concept of entropy (Greek: εν (en=inside) + verb: τρέπω (trepo= to chase, escape, rotate, turn) [wrong (τροπή meaning "transformation", )]) in thermodynamics is central to the 2nd law of thermodynamics, which deals with physical processes and whether they occur spontaneously [wrong (the measure of spontaneity is "free energy" as per the combined law of thermodynamics)]. Spontaneous changes occur with an increase in entropy [wrong (only for isolated systems)]. Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed [close (in some cases, but no reference)]. In contrast, the first law of thermodynamics deals with the concept of energy, which is conserved [correct (but does not explain what the connection is to entropy)].

I'll move the bulk of the lead back in, but I'm still correcting all this mess; for instance, all these suppositions need to be referenced. As to your statement "entropy is no longer what Clausius wrote", there is some truth to this (in verbal terms), but entropy, at its core, is what he wrote (in conceptual and mathematical terms). Now that the original paper is available, I intend to include a blend of this, as well as modern views, in the lead. No need to do any further reverting; please work together on this. --Sadi Carnot 07:05, 31 July 2007 (UTC)

It seems good now. I'm guessing, however, that if the lead keeps growing, some of it will have to be moved into a new "overview" section (which is what I was attempting to do before), as per WP:LEAD, which states that the opening section "should contain up to four paragraphs, should be carefully sourced as appropriate, and should be written in a clear, accessible style so as to invite a reading of the full article". --Sadi Carnot 07:16, 31 July 2007 (UTC)

Sadi, please discuss it more here and let us see what others think. I do not agree one little bit that it "seems good now", but I'm not going to revert. The problem is that the new first paragraph is NOT "written in a clear, accessible style so as to invite a reading of the full article". It will completely put off most readers, particularly those who are coming to it from a discipline other than physics but realise that this is a part of physics they need to know about. This has been a long-term problem with this article and particularly its lead, but I just do not seem to be able to convince you and others. I cannot work together with you on it, because it is the very opposite of what I would like to see. Keep the lead simple. Let it attract people with very different views of why they came to read it. Make it intelligible. --Bduke 07:38, 31 July 2007 (UTC)

I agree with you: any subject should be written in the manner that best conveys information digestibly. One should not, however, bend, twist, misconstrue or even misrepresent basic science and logic for the sake of readability. To review, as things currently stand, we are debating the first two sentences in the article. Please explain what your fuss is about (with these two sentences). All I did was to correct wrong information and to add a reference. --Sadi Carnot 08:00, 31 July 2007 (UTC)

Lead comparison

To give you a comparative idea of why the lead is in "good shape" now, below is the current lead for the energy article (with which there seem to be no issues):

In physics, energy (from the Greek ενεργός, energos, "active, working")[2] is a scalar physical quantity, often represented by the symbol E,[3] that is used to describe a conserved property of objects and systems of objects.
In physics, entropy, symbolized by S, from the Greek τροπή meaning "transformation", is a mathematical function that represents the measure of the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state.[4]

There really is no need to make a big fuss over entropy; it's basically the same thing as energy, only in a non-conservative sense. If you think the average reader is going to be "put off" by this sentence, then you might as well go over to the energy article and post a note on that talk page as well, because I see no difference between these two sentences in terms of difficulty. In short, the first sentence has to define the term. This is the way it is in all science articles. --Sadi Carnot 08:08, 31 July 2007 (UTC)

I think the lead to energy could be improved somewhat, but it really is not as difficult or off-putting as the current one to "Entropy". I do not think the first sentence has to define the term. It may do so, but often it is better to say in general terms what it is about, where it is used etc. and define it later. This does not mean being inexact or misleading. I do not want to sound patronising, but I think it is clear that you have never taught entropy or energy to people who are somewhat apprehensive about the topics. If you had, you would see quite clearly what to me "the fuss is about". It has to attract people. It has to be simple, so readers can decide whether they need to go deeper. It currently does not do these things. I am busy with other things, so I am going to leave it to you. When you are done, put it to peer review and try to get it to featured article status. That will bring many others to comment on the article. --Bduke 09:32, 31 July 2007 (UTC)

(Edit conflict note: this was written before seeing Bduke's post above) IMO, the lead sentence is as clear as mud. What is transformation-content? What is "dissipative energy use"? That 19th century quote can be impenetrable to the modern reader. I'd rather have a definition like this one (although it's not perfect either):
"Quantity the change in which is equal to the heat brought to the system in a reversible process at constant temperature divided by that temperature. Entropy is zero for an ideally ordered crystal at 0 K. In statistical thermodynamics, S = k ln W, where k is the Boltzmann constant and W the number of possible arrangements of the system."[1]
This definition has the deficiency of not saying what entropy is good for or what it "is", but it is concrete and clear. Saying what entropy "is" gets into issues of interpretations or analogies, of which everyone has a favorite. --Itub 09:41, 31 July 2007 (UTC)
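As a worked instance of the quoted definition, using standard textbook numbers (not from the article): melting one mole of ice at 273 K absorbs about 6.01 kJ of heat reversibly, so

 <math>\Delta S = \frac{q_\mathrm{rev}}{T} \approx \frac{6010\ \mathrm{J}}{273\ \mathrm{K}} \approx 22\ \mathrm{J/K}</math>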

Dave souza's revert

Sadi, your enthusiasm for obscure historical definitions is noted, but this article is about informing newcomers to the subject and Bduke's considerable expertise on the subject has produced a much better lead than the proposed change, so I've restored it. .. dave souza, talk 09:56, 31 July 2007 (UTC)

Dave, you reverted several of my edits (corrections of errors) just now, not just the definition. I'm flexible on this; however, I want to see a reference (or several) in the opening sentence and I don't want to see sloppy (incorrect) sentences. The sentence "spontaneous changes occur with an increase in entropy" is only correct for isolated systems; the novice reader will think it applies to all situations. The etymology is wrong too; I added an original source reference and you have reverted this too. Also, the lead needs to be four concise paragraphs, and the rest moved to an overview section. Please be considerate of my editing efforts. If you want to blend in a new reference to make it easier to read, then do so. The lead you reverted to:
The concept of entropy (Greek: εν (en=inside) + verb: τρέπω (trepo= to chase, escape, rotate, turn)) in thermodynamics is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy.
is completely incorrect, e.g. see spontaneous process. This is what I am trying to clean up. --Sadi Carnot 16:48, 31 July 2007 (UTC)
Spontaneous process is misleading at best and certainly needs work. It does not clearly say that ΔS is for the system only and that −ΔH/T is also an entropy term - for the surroundings. −ΔG/T is really a measure of the total entropy change. --Bduke 22:32, 31 July 2007 (UTC)
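For reference, the bookkeeping behind that point, for a process at constant temperature and pressure (the surroundings receive −ΔH at temperature T):

 <math>-\frac{\Delta G}{T} = \Delta S_\mathrm{sys} - \frac{\Delta H}{T} = \Delta S_\mathrm{sys} + \Delta S_\mathrm{surr} = \Delta S_\mathrm{total}</math>

so a negative ΔG is precisely the statement that the total entropy increases.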
Well, we can discuss whether the article should sharpen its discussion on the difference between the entropy of the universe and the entropy of the system. But, as the article spontaneous process makes quite clear, chemists define the word spontaneous to mean a process in which the entropy of the universe increases - ie a process allowed by the 2nd law. Jheald 17:17, 31 July 2007 (UTC)
(As a physicist, that usage always makes me uncomfortable -- it seems misguided to me to call a reaction spontaneous, if in practice it doesn't occur spontaneously, because the reaction barrier is too high. But who am I to argue with chemists in full flood?) Jheald 17:20, 31 July 2007 (UTC)
I kind of agree, but the distinction between thermodynamic control and kinetic control of a reaction is a useful one. --Bduke 22:32, 31 July 2007 (UTC)

Etymology

On the subject of the derivation of the word entropy: in Greek, τροπή comes from τρέπω, just as in English "cessation" comes from "cease". The verb τρέπω is the root word, and "chase, escape, rotate, turn" gives a good sense of what it means. The noun τροπή means a τρέπω-ing, hence a turning, a changing, a transformation.

I have to agree, in the strongest terms, with Itub above, when he writes that your proposed lead sentence "is as clear as mud. What is transformation-content? What is "dissipative energy use"?" These terms should be left in the 19th century. He is so right when he writes that they are simply "impenetrable to the modern reader". I have reverted this ancient cruft, and would do so again without hesitation. Jheald 17:43, 31 July 2007 (UTC)

Jheald, I was the one that added the original etymology (from Perrot's A to Z Dictionary of Thermodynamics), and now that I've seen the 2nd edition of the book (page 107), I have corrected it to how Clausius coined it. Thank you. --Sadi Carnot 17:55, 31 July 2007 (UTC)
The difference is that εν + τρέπω understood as the "chasing/ escaping/ rotating/ turning" "inside" the system actually gives quite a helpful steer towards understanding what entropy is. "Transformation" doesn't. Jheald 18:02, 31 July 2007 (UTC)
"A measure of the unavailability of a system’s energy to do work." This actually is rather unhelpful. T_R·S is a measure of the energy unavailable to do work. The dependence on the reservoir temperature T_R is fundamental. If T_R were zero, then all the energy would be available to do work. It is therefore not helpful to suggest that S on its own is a measure of the unavailability of a system’s energy to do work. Jheald 18:07, 31 July 2007 (UTC)
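A short worked form of that objection (a sketch, ignoring pressure-volume terms, for a system exchanging heat with a single reservoir at temperature T_R): the maximum work obtainable as the system goes from state 1 to state 2 is bounded by the decrease of the availability U − T_R·S,

    W_max = (U_1 − U_2) − T_R·(S_1 − S_2),

so the energy rendered unavailable scales with T_R·S, not with S alone; if T_R were zero, none of the energy would be unavailable, whatever the value of the entropy.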
Oh, and while you're at it, please learn enough about information theory to understand why saying Shannon entropy is "attenuation in phone-line signals" is imbecilic. Jheald 18:12, 31 July 2007 (UTC)

I added a second ref note per your request:

  • The etymology of entropy, in modern terms, according to Perrot’s A to Z of Thermodynamics, can be interpreted to mean, “from the Greek root εντροπη, the act of turning around (τροπη, change of direction), implying the idea of reversibility”.

I hope this helps. As to the new 2005 Oxford Dictionary of Physics definition, do you really have to complain about every reference? First Clausius is too historic, now Oxford is too unhelpful. Give me a break. I'm only trying to add references to the article so as to give it credibility, rather than original research. As to Shannon, fix it if you know of a better wording. --Sadi Carnot 18:21, 31 July 2007 (UTC)

As to Shannon, I was rather hoping you might go away and actually learn something, so you don't continue to inflict nonsense like this any more.
As for the opening definitions, no, I'm not going to give you a break. Settling for misleading is not acceptable. "A measure of the unavailability of a system’s energy to do work" is horribly misleading, because that unavailability depends utterly on the reservoir temperature.
Finally, connecting τροπη with the idea of reversibility is a spectacularly unhelpful intuition, even by your standards. What is valuable about the link with τρέπω = "chase, escape, rotate, turn" is that it gives some idea of internal molecular confusion. No, that's not what Clausius was thinking of when he coined the phrase. But it's the most valuable connection today. Clausius's original etymology frankly isn't helpful for the intro. Jheald 18:39, 31 July 2007 (UTC)

Jheald’s comments

Jheald, let me get this straight: from your point of view, I’m an imbecile and you want me to go away? --Sadi Carnot 02:36, 1 August 2007 (UTC)

No, but from time to time, like all of us, you may write things which make you look clueless. At which point, the best solution is to get a clue. I told you 18 months ago that this sort of statement about information entropy was misconceived, and yet you still trot it out. Jheald 08:22, 1 August 2007 (UTC)
In any event, thanks for the nice comments, I've added them to my user page. --Sadi Carnot 03:21, 1 August 2007 (UTC)

Economic entropy

I have changed economic entropy from being a quantitative value to a semi-quantitative value. I would go further and call it qualitative, but people might disagree. I fail to see how it can be quantitative without a mathematical definition, which the article itself concedes by filing it under sociological definitions. I would argue that quantitative measurements must be of a known quantity if they are to be named as such. Thanks User A1 11:26, 6 August 2007 (UTC)

Entropy and the relative number of states

Ω, the number of microstates in S = k ln Ω, might be better interpreted as a relative number of states, which would be a dimensionless quantity for which the logarithm is defined.

On p. 24 of Wolfgang Pauli's Statistical Mechanics (Vol. 4 of Pauli Lectures on Physics) he comments,

"The statistical view also permits us to formulate a definition of entrophy for nonequilibrium states. For two states, 1 and 2 we have

S₂ − S₁ = k log(W₂/W₁);

leaving the additive constant unspecified, we obtain

S = k log W.

Because of the logarithm, and because the probabilities of independent states multiply, the additivity of entropy is maintained." --Jbergquist 18:41, 2 October 2007 (UTC)
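Pauli's additivity remark is easy to verify (a minimal check for two independent subsystems A and B): the numbers of states multiply, W = W_A·W_B, so

    S = k log(W_A·W_B) = k log W_A + k log W_B = S_A + S_B.

Note also that the ratio W₂/W₁ in the difference formula is dimensionless, which is precisely the point of reading Ω as a relative number of states.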

Gibbs entropy as fundamental but not defined?

Under "Miscellaneous definitions", "Gibbs entropy" is described as being the "usual statistical mechanical entropy of a thermodynamic system". However, Gibbs entropy does not appear to be defined in this article, and the linked article on "Gibbs entropy" does not define some of the terms used. 68.111.243.96 20:18, 17 October 2007 (UTC)

Entropy (disambiguation)

Editors of this page might like to look over the recent to-and-fro at Entropy (disambiguation). User:Thumperward seems (IMO) dead set on making the page harder to use. Compared to e.g. this edit, he seems determined to

  • remove the link to Introduction to entropy
  • remove links directing people to find additional entropy articles in the categories
  • reduce the structuring of the page between thermodynamic entropy and information entropy.

-- all of which (IMO) are mistakes. Anyhow, there have been a series of reverts and counter-reverts (I've now had my 3 for the day), and there's discussion on the talk page there, if anybody wants to have a look. Jheald 13:52, 18 October 2007 (UTC)

Never mind, we seem to have come to agreement. Edit war over :-) Jheald 15:47, 18 October 2007 (UTC)

"calculated using the multiplicity function" ????

And if I click on the wiki link for multiplicity function I see the expression for a system of N noninteracting spins :) Also, we should avoid using misleading examples of a system whose energy levels are exactly degenerate. It is better to define Ω as F. Reif does in his textbook: Ω is the number of energy eigenstates with energy between E and E + δE, where δE is a macroscopically small energy interval. The entropy defined in this way depends on the choice of δE, but this dependence becomes negligible in the thermodynamic limit. It cannot be set to zero, because then for generic systems Ω = 1, and the entropy becomes identically zero.

Basically what happens is that if you specify the energy of a system with infinite accuracy, then there can be only one microstate compatible with that energy specification. This entropy is the so-called fine-grained entropy, while the entropy defined with the nonzero δE is the coarse-grained entropy. Count Iblis 15:24, 23 October 2007 (UTC)
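To see why the δE-dependence is negligible (a sketch of Reif's argument): write Ω(E) = ω(E)·δE, where ω(E) is the density of states, so that

    S = k ln ω(E) + k ln δE.

For a macroscopic system the first term is of order N·k with N ~ 10²³, while changing δE even by many orders of magnitude shifts the second term by only a few k, which vanishes in comparison in the thermodynamic limit.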

Clausius Inequality

Shouldn't the inequality be a less than or equal to sign rather than a greater than or equal to sign? That should probably be fixed. --Lee —Preceding unsigned comment added by 24.3.168.99 (talk) 05:39, 21 November 2007 (UTC)

No - the inequality is correct. Read the statement just above the equation where it states that the result should be positive or zero for the extreme situation. This is a basic statement of the 2nd Law of Thermodynamics. But thanks for checking. PhySusie (talk) 13:09, 21 November 2007 (UTC)
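For anyone comparing sign conventions across textbooks (a note on forms, assuming the article states the inequality for a process between two equilibrium states): the cyclic form of the Clausius inequality is

    ∮ δQ/T ≤ 0,

while the process form rearranges to

    ΔS − ∫ δQ/T ≥ 0,

so both a "less than or equal" and a "greater than or equal" version are correct; they are the same statement of the second law read in different directions.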

  1. ^ Clausius, Rudolf (1879). The Mechanical Theory of Heat, 2nd edition. London: Macmillan & Co.
  2. ^ Harper, Douglas. "Energy". Online Etymology Dictionary.
  3. ^ International Union of Pure and Applied Chemistry (1993). Quantities, Units and Symbols in Physical Chemistry, 2nd edition. Oxford: Blackwell Science. ISBN 0-632-03583-8. p. 12. Electronic version.