Talk:Entropy
From Wikipedia, the free encyclopedia
Please put comments in chronological order, top (oldest) to bottom (newest), and sign your name.
Nice 1872 Boltzmann article page
I found this nice page about Boltzmann's contribution to science:
- 1872 Boltzmann Article - Manhattan Rare Book Company
This rare article costs $3,600; I'll pitch in the first ten dollars. --Sadi Carnot 04:56, 22 November 2006 (UTC)
- Coming back to look at the Entropy article after a long hiatus, I am refreshed by Sadi's light comment (although still depressed by the size of the article and the discussion, prior to Bduke!) As Sadi undoubtedly is well aware, an English translation of the 1872 article is available on pp. 88-175 of Volume 2 of "The Kinetic Theory of Gases: An Anthology of Classic Papers with Historical Commentary" by Stephen G. Brush, edited by Nancy S. Hall. London: Imperial College Press; River Edge, NJ: Distributed by World Scientific Pub., c. 2003. This is evidently a reprint of the 1965 Brush publication by Oxford. (The 'Boltzmann equation' to which the ad for the rare article refers is Boltzmann's H equation, of course, not Planck's 1900 or 1906 versions of the equation on Boltzmann's tombstone.) FrankLambert 06:02, 23 November 2006 (UTC)
A review of the whole article
After being away from this for a while, I have been trying to think about the whole article. There has been a lot of good and exacting work, but I am afraid the article has many faults. It is very unclear, difficult to understand and confusing. Where can we begin? Well, an obvious practical place is the warning we get when we try to edit the article:-
- Note: This page is 58 kilobytes long. It may be appropriate to split this article into smaller, more specific articles. See Wikipedia:Article size.
Yes, it is far too large, yet we already have many more specific articles and we refer to them at the top of many sections with "See article: XXX". Yet these headings are followed by far too much material that is often duplicated in the specific article. There should, in these sections, be no more than two sentences that just allow the readers to decide whether they should move on to read the specific article. The "Arrow of time" section is how it should be. All the others are far too long. We should also avoid any equations in these sections and perhaps everywhere. These sections should be trimmed down to one or two sentences with possible changes to the specific articles:-
- History
- Macroscopic viewpoint
- Microscopic viewpoint
- Entropy in chemical thermodynamics
- Second Law
- Entropy and information theory
- Entropy and Life
- Entropy and cosmology
Approaches to understanding entropy should perhaps stay but it needs rewriting. The section on open systems needs a specific article for this material with just a couple of sentences to point to it as for the others. The section at the end on other relations needs careful thought. It could even be deleted.
So the idea is to spawn off more-specific articles, not less-specific articles. If we do what I suggest above we will save a great deal of space and can then merge in the introduction to entropy article and work on getting a clear easily understood introduction to the article. I have thoughts on the introduction article as it now stands, but I will leave them for now except to say that I do not think it is clearly understood.
If people agree, let us work on each section in turn discussing the wording of the one or two sections here along with what needs to go into the specific article. Then we can move to the introduction.
We need rigour but we need clarity. The emphasis in this article should be on clarity with no lack of rigour but the main rigour moved to more specific articles.
Let's have your views. --Bduke 21:09, 22 November 2006 (UTC)
- A good idea might be to change this "Entropy" page into a gateway to all kinds of entropy, and keep thermodynamic entropy in the Entropy (thermodynamic views) article. The thermodynamic views article would still be rather large, but could give more space to an explanation before handing off to another article. I agree that the article size needs to be reduced. By the way, I very much disagree with splitting the macro and micro into separate articles. The macro article would be blind, offering no intuitive understanding of entropy, and the micro article would be lame, having little connection to hands-on reality. It would be like having two articles about the penny, one about the heads side, the other about the tails.
- Although it is a small point, I agree with you about not splitting the macro and micro, but I note that the two sections have different main articles. I will add one of them that you missed to your very useful table. More general stuff to follow below. --Bduke 03:13, 23 November 2006 (UTC)
- Let me offer an expanded list of thermodynamic entropy-related articles, existing and proposed. Please add to it and remove it from here, if anyone wants to change it. PAR 22:40, 22 November 2006 (UTC)
I think the majority of minds [including mine] are too small to summarize the article in its current state. The immediate wrong I see in the article is the phrase "In thermodynamics" at the beginning of the article, which implies to readers that the topic is basically about thermodynamics. However, in fact, not all of the article talks specifically about thermodynamics. The topic, I believe, should be subdivided according to general fields of science such as astronomy, biology, chemistry, computers, physics, sociology, etc. Kmarinas86 22:42, 22 November 2006 (UTC)
- I disagree. Entropy in all natural sciences is about thermodynamics. It is not different. --Bduke 03:13, 23 November 2006 (UTC)
- From what I've seen, most of these fields adopt variations on either thermodynamic entropy or information entropy: I'd like to see this article concisely outline the main fields in a way that ordinary readers can grasp, and include sentences briefly relating the applications such as sociology to the main field concerned. Detail should then go in the relevant articles which will be linked. Entropy (thermodynamic views) at present is pretty impenetrable, there may be a case for a separate Thermodynamic entropy article .. dave souza, talk 23:17, 22 November 2006 (UTC)
- Of course, entropy started out as thermo -- then, because of the intellectual incapacity to understand it as an abstract and the human desire for order, the definition of entropy relying on chaos and disorder and functionally related gibberish was born and appropriated by a number of other disciplines. Sigh. •Jim62sch• 01:22, 23 November 2006 (UTC)
Solution is simple
The solution is to keep this page very simple, and to use it as a WP:summary page such that all the details are contained on other main pages--Light current 22:47, 22 November 2006 (UTC)
- The disambiguation page refers to the entropy article as "thermodynamic entropy". Yet the article goes beyond thermodynamics. I believe the article should be inclusive of the fields of science which talk about entropy and reject the image of being just an article on thermodynamic entropy. Kmarinas86 22:55, 22 November 2006 (UTC)
- Yes, hence the summary page idea! 8-)--Light current 23:02, 22 November 2006 (UTC)
- The whole thing needs to be redone.....from intro to outro. However, that's virtually impossible at the moment. •Jim62sch• 23:09, 22 November 2006 (UTC)
- BTW, it shouldn't be a "summary page", it should be an introduction to entropy -- it needs meat on the bones (a summary is bones). At the moment though, the meat is rancid, and the bones are protruding through the rent flesh. •Jim62sch• 23:12, 22 November 2006 (UTC)
- Wikipedia:Summary style#Levels of desired details sets out the aim pretty well:
- many readers need just a quick summary of the topic's most important points (lead section),
- others need a moderate amount of info on the topic's more important points (a set of multi-paragraph sections), and
- some readers need a lot of detail on one or more aspects of the topic (links to full-sized separate articles).
- Thanks for that. I have been looking for it. It is exactly what we should be considering. --Bduke 03:13, 23 November 2006 (UTC)
- This article should include a moderate amount of info aimed at giving an ordinary reader a basic understanding, with the more specialist and maths-heavy stuff in the linked articles. .. dave souza, talk 23:28, 22 November 2006 (UTC)
- Couldn't agree more! --Light current 23:36, 22 November 2006 (UTC)
- OK, that works. As it is now, this article is worse than it was 2 months ago. •Jim62sch• 23:45, 22 November 2006 (UTC)
- Light current, how about a middle ground? ;) I've asked three friends who are unbiasedly honest (one of my prerequisites for friendship), and who are not wikipedians to read this and give me their opinions. Bottom line being that I might be too involved to be objective. •Jim62sch• 00:05, 23 November 2006 (UTC)
I agree with Dave Souza that the Entropy (thermodynamic view) can use improvement. Would anyone mind if I edited it as if it were THE thermodynamic entropy article? I would like to take the present page, pare it down, and incorporate it into the Entropy (thermodynamic view) page. People will feel freer to pare down the "Entropy" page then, without worrying that some good thermodynamic-entropy stuff will be lost. PAR 00:39, 23 November 2006 (UTC)
I think that begs the question somewhat. We do not want to lose material that is good, but we ought to get the main article right first. Let us see if we can get some consensus first. We have already written too much. We need to reflect on what to change, what to delete and what to expand. Just rushing off and writing even more could lead to even more confusion. In physical and natural sciences entropy is thermodynamic entropy. Everything in the article is about thermodynamic entropy except for the material on informational entropy. Informational entropy is not core to the idea of entropy. Indeed its links are via statistical mechanics and the Boltzmann Equation, not bulk thermodynamics. I think it should be left out of the entropy article, which should have something like this explanation at the top:-
- This article is about the thermodynamic concept of entropy in the physical and natural sciences. For details of informational entropy see informational entropy.
Indeed, I feel a lot of the problems with the article arise from the inordinate fondness of some editors of this article for informational entropy. It leads to a lack of focus and a resulting lack of clarity. Let us bear in mind that for almost all students who meet entropy in a science class, except perhaps for a few mathematically inclined physics students, it is the most difficult concept they ever come across. We cannot afford to make it more difficult for them. If we stuck with thermodynamic entropy, we could write a good clear article. Even then it needs to be a summary article with a lot of clarity, leaving the more in-depth stuff to the other articles. --Bduke 03:35, 23 November 2006 (UTC)
- I reverted the recent destruction of the order of the article. I restored Kenosis edits and Jim62Sch additions of fact templates, but not Jim62Sch laundry templates, since he did not revert their removal in later edits. As for information entropy, I agree this is not the place to develop the concept of information entropy, but it IS the place to describe the application of information entropy to enhance the understanding of thermodynamic entropy. I will work to tighten that up, if certain editors who think I inserted that stuff out of an "inordinate fondness" for info entropy will please try to understand the link between the two. Please read the section "similarities between the two" - really - does this add or detract from the understanding of entropy? PAR 05:15, 23 November 2006 (UTC)
- We are clearly not on the same wavelength. I think it does add to the understanding of entropy, so it should be on WP, but not here. It is just too far into the learning process to be here. This article should just say that a concept of information entropy has been introduced and it does have links to thermodynamic entropy and then point to an article that develops what you say. Ask yourself this question - "Who will benefit from this knowledge and how much other knowledge do they need before they can appreciate it?". --Bduke 05:59, 23 November 2006 (UTC)
- Well, if you have read it and still disagree, let's not get all twisted up in that argument. What do you suggest for the Entropy page? I'm still not clear on that. Should it be a gateway to all types of entropy, where the main discussion takes place on other pages, or should it be an "introduction to thermodynamic entropy" with the harder stuff on another page, or something else? Could you outline a proposal? Anybody who expects that your proposal will be taken seriously will be reluctant to edit until there is a clear consensus. Let's get on with it. PAR 06:27, 23 November 2006 (UTC)
- I do not think we are in that much of a hurry. I made a number of proposals above and, yes, they have a few differences between them. I would like to see whether other people respond. As you say, let us try to get consensus. I will expand my ideas in more detail tomorrow. What I have said, however, and what I hope is fairly clear, is that we should cut back the sections that already point to a main article to no more than two sentences, leaving those main articles to take the weight. I have also said that I would prefer information entropy to be not mentioned other than as a disambiguation at the top. Thus I believe this article under this simple name of "Entropy" should be about core entropy - that is, thermodynamic entropy in the physical and natural sciences. I have also said that the "Introduction to Entropy" article should disappear as this is the place to do the introduction. We should then try to write the introduction in a clear non-technical style that helps readers to understand this extremely difficult concept. As a final point, I am curious: how many of the editors who regularly work on this article have actually taught entropy to students who are beginning the journey to grasp the concept? --Bduke 06:56, 23 November 2006 (UTC)
- I would be fine with such a revision to the article: as I stated earlier, thermo gave birth to entropy. We need this to be helpful to our core audience, (primarily) high school students who are learning and trying to understand the concept of entropy; overloading them with conflicting definitions and the myriad disciplines that have appropriated entropy to serve their own ends can only serve to confuse the issue for our audience. Also, as Dave had pointed out -- much of the math can be deleted from this article -- it is not helpful to one trying to grasp what entropy is. •Jim62sch• 13:48, 23 November 2006 (UTC)
- To Bduke - There are a number of concepts that have been discussed here that are contentious. These include energy dispersal, order/disorder, and now information theory applications. We have more or less settled on the idea that contentious issues should be presented as such - the arguments pro and con should be honestly presented without confusing them with widely accepted viewpoints. Any other viewpoint means war. The fact that you think information entropy should be removed from the article conflicts with my idea that it is a vital part of understanding thermodynamic entropy. This disagreement is reflected in the literature as well. Therefore, the topic should be included, not as received wisdom, but as a point of contention, with the argument briefly and honestly outlined, before linking to a more detailed page.
- Until we gain a consensus on this page, I will combine the "Entropy (thermodynamic views)" and "Entropy (statistical views)" into "Entropy (thermodynamics/statistical mechanics)" and put higher-level material there, but without subtracting it from the "Entropy" article. I hope that is not a problem. I will maintain the distinction between contentious and non-contentious viewpoints. PAR 15:25, 23 November 2006 (UTC)
- PAR, I strongly suggest you do not carry out such a merger. The two pages "Entropy (thermodynamic views)" and "Entropy (statistical views)" are already long enough and complicated enough. Having them as separate pages is a useful factoring. Putting the two together would be IMO overkill for one page - more content than any article should be expected to stand. Jheald 20:43, 30 November 2006 (UTC)
- Yes, I've come to the same conclusion. PAR 22:29, 30 November 2006 (UTC)
- PAR, you appear to be missing the point. Info theory entropy is not necessary to an understanding of entropy at a basic level. I would suggest you do nothing until we decide how we are going to proceed. The combination you propose is not particularly appealing to me as it seems less than helpful, and will likely only serve to detract from our mission of explaining entropy to our audience. You need to remember, PAR, we are not here to write for ourselves, or to show off how much we know -- we are here to provide a service. Key to that provision is clarity in the communication of seminal ideas. •Jim62sch• 15:49, 23 November 2006 (UTC)
- I agree completely with your last two sentences. PAR 15:52, 23 November 2006 (UTC)
The time delay means I miss a lot of debate while sleeping! I agree almost entirely with Jim62sch. I am not sure about teaching entropy to high school students. I do not think that happens in this country. I think we are talking about early undergraduates and particularly those who do not have a strong mathematics background. PAR, I do not disagree about treating informational entropy as you suggest. I disagree about doing it in this article rather than a sub article. However if it eases consensus, I agree to a section on information entropy which points to the longer article. It can mention the controversy but it must be very brief, and the thrust of this article should be thermodynamic entropy.
Now for something new, that I hope illustrates what I believe should be the approach to this article. The very first sentence is "In thermodynamics, entropy is an extensive state function that accounts for the effects of irreversibility in thermodynamic systems, particularly in heat engines during an engine cycle.". This was added in front of a couple of sentences that I added discussing entropy in relation to the 1st and 2nd laws to give context. What can one say about that first sentence? Well, I say that it is completely correct, but completely off-putting to the students mentioned above and completely unclear. It is normal for the first para to have some links to put material in context. Biographies have links to the subject's profession and to dates and places, for example. The links to the two laws and to energy do put entropy in context. The first sentence has 7 links. Only the first puts entropy in context. It is about thermodynamics. The other 6 links are all to more advanced concepts that the student will not understand. Do they really have to go off to find out about 'state function', 'extensive', 'irreversibility' and so on before they get past the first sentence? Of course it has to be mentioned somewhere, but it should be later. I suggest we move that sentence down and since people objected earlier to starting the article with 'While', we should use one of the alternative wordings that were discussed here at that time.
I will try to suggest much briefer wording of some of the sections that link to other 'main' articles, but I am busy. We have an election here tomorrow and I'm involved with that. --Bduke 21:19, 23 November 2006 (UTC)
- Thanks for taking time out from the election to discuss this, your comments basically look right to me. I've no objection to specialist terms or mathematical equations being used where they aid understanding, subject to their intended meaning first being spelled out in plain English for the uninitiated. Readers can't be expected to know mathematical symbols or have any grasp of calculus. Regarding the earlier search for a basic description, this article on Undergraduate students' understandings of entropy and Gibbs Free energy includes "Table 1: Scientifically accepted statements about the concepts of entropy and Gibbs free energy" which is aimed at chemistry students, but may be of some use. ... dave souza, talk 22:13, 23 November 2006 (UTC)
- Thanks to both of you for your comments. (I did have entropy in 10th grade Chem class, but I was in an AP class so I guess that's why). In any case, I guess the age of the student isn't really relevant, as Brian alluded to, but the level of understanding is. To be really odd and use a cooking analogy, this article starts off as a recipe for Beef Wellington, without explaining at all what filet mignon, pâté de foie gras or duxelles are. •Jim62sch• 23:12, 23 November 2006 (UTC)
- I don't understand anything about entropy. I came to the page looking for an explanation. Immediately I was in right over my head. I am prepared, however, to suggest ways in which this large subject can be dissected to make it far more digestible for people like me. You only need to ask me! 8-)--Light current 23:17, 23 November 2006 (UTC)
Sorry, but starting with a correct, to the point, sentence is the sine qua non of an encyclopedia. Note that:
- hyperlinks can guide the uninitiated to make sense of a technical statement
- more "this means"-style explanations can follow the definition.
On the issue of what would be the ideal structure of this and related articles, I'm strongly opposed to both extremes:
- only giving a limited POV in the main article Entropy (as sometimes argued, that information entropy shouldn't be here at all)
- only giving a meager intro and having 30 sections that are nothing but "main article: nada, nada" plus a very, very short summary of the topic.
Instead IMHO this article should give
- a correct definition
- some down-to-earth explanation of the definition, without lying to children
- some history and name-dropping
- short, but not stubby sections of the main branches of entropy (thermostatics, statistical, quantum, information)
That's it. Then the next tier of articles should give deeper explanations of these branches and itself link to even more specialised articles.
Pjacobi 23:50, 23 November 2006 (UTC)
"Sorry, but starting with a correct, to the point, sentence is the sine qua non of an encyclopedia". It is of course in one sense difficult to disagree with this, but nevertheless, I do disagree and disagree strongly. The first sentence is far too complex for almost anyone starting this article. The first sentences should be correct but informative, giving a clear idea about what the subject of the article is about. This just drops people straight in. Above I asked "Do they really have to go off to find out about 'state function', 'extensive', 'irreversibility' and so on before they get past the first sentence?". Do you seriously think the answer to this question is "Yes"? If the answer is "No", that sentence should be lower down. It is not good enough to confuse people in the first sentence and then go on to say what it really means. Have you taught entropy to beginning students? I ask not to press my experience but just to make the point that we all have to put ourselves in the shoes of the reader. Entropy is very very difficult for most people.
I do not think I really disagree with the rest. It is a matter of degree. This article is already too long. There are a lot of things to cover here if we are going to get a sensible set of branches down to more specialised articles. I do however disagree with the order of the first two bullet points. We should state where entropy fits into the scheme of things and why it is relevant. We should then define it clearly and simply and then give a little expansion of the definition. If you look up an encyclopedia article on a person or place, you can immediately see what the person did and why they are important, or where the place is and what it is important for. With complex scientific ideas readers usually cannot do this. They cannot pick up a definition and run with it. We are having to teach them about it. --Bduke 00:53, 24 November 2006 (UTC)
- I have reduced the information entropy section to sound more like an introduction to the concept. I hope I have represented the pro's and the con's properly. Almost all of the information removed has been inserted in the information entropy page or the history of entropy page. PAR 16:45, 2 December 2006 (UTC)
A fresh view
With at least some of you (!), I welcome the counsel and guidance of a seasoned teacher of entropy, Bduke, on an article that should correspond to its theme, thermodynamic entropy. There have been 202 print pages of Talk:Entropy since July 2005. The article is almost as ridiculously swollen. Bduke’s experience with real-time student reactions of confusion, discouragement, and anger about their difficulties in trying to understand thermodynamic entropy could assure a moderate and useful Wikipedia Entropy article.
Certainly, the first sentence is totally absurd as an introduction. The referent words and the re-introduction of heat engines and engine cycles to naive Wikipedia users must be completely deleted. Thus, why not just begin the first sentence with “The concept of energy is central to the first law of…” but end it with “…physical and chemical processes and whether…” The last sentence is better ended as “…dispersal of energy in processes.”
As eloquently expressed by Jim62sch below, re “intermolecular molecular frictions..”, we owe much to HappyCamper for deleting it! FrankLambert 23:30, 23 November 2006 (UTC).
I have been very busy on many other matters and have not had much time to look at this article. However, at User:Bduke/Entropy I have taken what is now a slightly old version, and started the process of cutting it back to be much shorter. I hope this gives a clearer indication of what I was proposing. The sections that have another "main article" have been shortened but they could be shortened further. Let me be quite clear - I do not think the material I have cut should disappear from WP. It needs to be integrated into the relevant specialised articles. My shortened sections do need some more work. I have shortened other areas and removed stuff that is just too complex for here. There are now many fewer equations and that is how it should be. I have moved together all the various definitions. I repeat my earlier view that I think Introduction to entropy should be merged here. This should be the portal into all things entropy and there should not need to be a simpler article. It is our job to make this both an introduction and a portal into all areas of entropy. My effort needs more work, and I am sorry I have not been able to do that. I thought however that I should draw your attention to it before it gets too old. I believe however that my shortened tighter effort is the basis for moving this article towards GA or FAC. Please give it serious consideration. --Bduke 20:57, 4 December 2006 (UTC)
- I guess I am the appointed defender of the link between information entropy and thermodynamic entropy. The connection I am defending can be stated this way:
- "If each "possibility" has an equal probability of occurring, then the Shannon entropy (in bits) is equal to the number of yes/no questions you have to ask to determine what possibility actually occurred. The thermodynamic entropy, as defined by Boltzmann, is equal to the number of yes/no questions you have to ask to determine the microstate, given that you know the macrostate, multiplied by Boltzmann's constant times ln(2)"
- I know, I know, your eyes just glazed over. BUT - If you have a problem with my defense of this statement, then you need to read it, understand it, and show where it is wrong.
- I cannot and will not defend my strong feeling that my understanding of thermodynamic entropy is enhanced by the above statement, particularly the understanding of Gibbs paradox and the entropy of mixing, because this feeling is not quantitative, not testable. However, this viewpoint has support in the literature (Jaynes, Balian, etc), and needs to be clearly stated in the article.
- I believe I understand the informed objections that people have to making a link between the two. The entire dynamics of entropy through its relation to energy and temperature is missing and so the concept of thermodynamic entropy is much larger than just its informational entropy connection. This objection is likewise not quantitative and is therefore indefensible, but it has support in the literature, and needs to be clearly stated in the article.
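- (Aside: the quantitative part of the claim above is easy to check numerically. The sketch below, with an illustrative choice of W = 1024 equiprobable microstates and helper names of my own invention, computes the Shannon entropy in bits and verifies that Boltzmann's S = k_B ln W equals k_B ln(2) times that number of bits.)

```python
import math

# CODATA value of Boltzmann's constant, in J/K
K_B = 1.380649e-23

def shannon_bits(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def boltzmann_entropy(W):
    """Boltzmann entropy S = k_B * ln(W) for W equiprobable microstates."""
    return K_B * math.log(W)

# For W equally likely microstates, H = log2(W): the number of yes/no
# questions needed to pin down which microstate occurred.
W = 1024
H = shannon_bits([1.0 / W] * W)   # 10 bits, since 2**10 = 1024
S = boltzmann_entropy(W)

# PAR's stated relation: S = k_B * ln(2) * H
assert abs(S - K_B * math.log(2) * H) < 1e-30
```

- Each of the 10 yes/no questions halves the set of remaining possibilities, which is why log2 of the microstate count appears; multiplying by k_B ln(2) just converts bits into thermodynamic units (J/K).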
- So, focussing on this narrow subject, I would say that the treatment of information entropy in BDuke's new version is lacking. Consider the statement:
- "The thermodynamic interpretations, generally, differ substantially from the information theory interpretation and are only related in namesake, although there is not complete agreement on this issue."
- This shows a large amount of prejudice against information entropy. It could have just as easily been written:
- "The thermodynamic interpretations and the information theory interpretations are identical, although there is not complete agreement on this issue."
- both of which I object to. In addition, in BDuke's section "approaches to understanding entropy" the subsection on information entropy is the shortest and most useless one of all. It should at least give a flavor for what the argument is, like the other subsections do. Now BDuke took this from the subsection that was way too large, from November 27. Could anyone look at the present version of this section to see if it meets the criteria for providing a brief but balanced view of the concept? PAR 05:49, 5 December 2006 (UTC)
I really do not think the issue is whether there is a link or not between thermodynamic entropy and information entropy. I think the fact that it is disputed indicates it should not be here in great detail and that the issue should be discussed elsewhere.
Yes, what I left is taken from the article on that date. Please feel free to rewrite the brief section to be a better description of the connection and what information entropy is. As I said improving all of these shortened sections is just something I have not had the time to do.
Finally I am disturbed that you keep coming back only to this one issue. My concern is to shorten and clarify all sections and make the overall article more readable and more understandable. Informational entropy is just one of many issues. Let us concentrate on the big picture. --Bduke 06:11, 5 December 2006 (UTC)
- You and Sadi Carnot and Frank Lambert disagree with me on information entropy, so that gets diminished. Frank Lambert and I disagree with Sadi Carnot on order/disorder, so that gets diminished. Sadi Carnot and I disagree with Frank Lambert on energy dispersal so that gets diminished. I don't agree. These disagreements should be reflected in the article so that a reader can see that there is disagreement, and be motivated to study both sides. The sections which reflect disagreement can be briefly written, and level headed editors dedicated to NPOV can prevent these sections from degenerating into an extended argument. I keep coming back to this issue because it is about the only aspect of your rewrite that I strongly disagree with. Sorry if I left that out in my rant. PAR 06:57, 5 December 2006 (UTC)
- Thanks for that. If that is the only part of my rewrite you disagree with, then we are indeed making progress. I am not even sure I disagree with you on information entropy itself. I really do not have a view. I do have a view that this article should make an extremely complex topic as simple as possible. That means that disagreements should be underplayed relative to core material simply described. I then make a judgement that disagreements about energy dispersal and order/disorder do need to be addressed as they impact directly on how entropy is described in textbooks for physics and chemistry students. I also make the judgement that the disagreement about information entropy is not that central and would be best dealt with elsewhere. I certainly have never seen a mention of it in a physical chemistry text and suspect that it would be given short shrift by a chemistry professor using a text that did cover it to teach a course in physical chemistry. Can you show physics texts that find information entropy as helpful as you do? If not, what sources do you have to suggest that it is helpful? If you think my rewrite has promise in other areas, why not have a go at rewriting other areas to follow my lead? I would however like to hear what others say about it all. --Bduke 09:45, 5 December 2006 (UTC)
- The fact that it would be given short shrift by a chem professor doesn't set off any alarm bells in my mind, although perhaps it should. I understand that you are a teacher, so you see this article as a teaching aid for students. I am a researcher and I tend to see it as a quick reference source for researchers. We are both partially wrong and we should realize that. It can and should be both.
- I apologize to anyone who has heard me say this a thousand times before, but IMHO the best thermo book bar none is "Thermodynamics and an Introduction to Thermostatistics" by H. Callen. It is the most cited thermo reference in the world, and someday I hope to understand half of it. It has a section on Shannon entropy as an introduction to the order/disorder concept (which I disagree with). The best reference on the info/thermo entropy connection is Jaynes' two 1957 articles, "Information Theory and Statistical Mechanics", Parts 1 and 2. The first article I read was his "Gibbs Paradox". It is perhaps more approachable and could be read first.
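As a small illustration of the Shannon entropy that the Jaynes articles connect to thermodynamics (this sketch is my own, not drawn from either reference): the information entropy of a distribution is H = −Σ p<sub>i</sub> ln p<sub>i</sub>, and it is maximized by the uniform distribution, mirroring the statistical-mechanical picture in which equally likely microstates give the greatest entropy.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in nats; zero-probability outcomes contribute nothing."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over 4 outcomes maximizes entropy (H = ln 4);
# a sharply peaked distribution has much lower entropy.
uniform = [0.25] * 4
peaked = [0.97, 0.01, 0.01, 0.01]
print(shannon_entropy(uniform))  # ln 4 ≈ 1.386
print(shannon_entropy(peaked))
```

Multiplying H by Boltzmann's constant (and using probabilities over microstates) recovers the Gibbs entropy formula, which is the bridge Jaynes exploits.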
- When you say "have a go at rewriting" I assume you mean rewrite parts of the entropy article with your user page as a set of suggestions. That sounds good. I will wait a while to see what other opinions are. PAR 15:46, 5 December 2006 (UTC)
-
- Let me deal with these three points separately. (3) - entirely agree. It should not be just you and I discussing this. On (1), I have been both a teacher and a researcher. That is, I was an academic. Now I am a part-time researcher. Perhaps we are both partially wrong, but I see the researcher as going to the many more specialised articles on entropy that we already have. Entropy is difficult. This is the portal article to entropy. It should be understandable. It can still have the links for the researcher to move on down to the other articles. On (2), I do not see any of your references being to texts that a student starting to learn about entropy would read. So we should be looking at the general physics texts and the general physical chemistry texts, although I guess Frank would want us to also look at the 1st year general chemistry texts. It sounds like Callen would be a good reference for the more specialised articles. If I had come across a text on physical chemistry that I had to use (sometimes you cannot choose your own text) and it included informational entropy, why would I give it short shrift? Simply because I would already not have enough time to get the basics of this difficult subject across to the students. --Bduke 20:41, 5 December 2006 (UTC)
[edit] More pruning and explanations
Firstly, thanks for tackling the major task of making this article understandable. Sorry it's been a bit hectic lately at souza acres, so haven't had time to do more than glance at it.
My first impression is that the lists of definitions have to be removed: if they provide useful explanation, that should be incorporated into the text. From memory, many of them don't even appear in the detailed articles, where they belong. The disorder / dispersal and topics in entropy sections also need severe pruning to focus on an elementary explanation. Secondly, as you say, some work is needed to improve the brief explanations: in particular no previous knowledge of maths symbols can be assumed, so for example Δ needs to be described in writing, and ln explained. Agree with PAR that a very brief explanation of the information entropy / statistical position is needed. Will try to find time to comment in more detail, but rather tied up for a while, .. dave souza, talk 09:19, 6 December 2006 (UTC)
[edit] What????
"via intermolecular molecular frictions and collisions" •Jim62sch• 23:45, 22 November 2006 (UTC)
"The more states available to the system with higher probability, and thus the greater the entropy." <- is this supposed to mean anything? ;)
[edit] Quotes and humor
This isn't Wikiquote. Is there any reason we need a humor-quote section in an article on a serious science topic? I'm unsure if this was the result of some previous consensus discussion or something, so I thought I'd inquire before simply pulling the material. Serpent's Choice 02:01, 9 January 2007 (UTC)
- Section removed, though I'm open to discussion regarding means to include the content elsewhere if someone so desires. Serpent's Choice 07:52, 10 January 2007 (UTC)
[edit] Definition
Maybe I'm just silly, but I don't feel there is an actual definition of entropy. The article states that entropy increases or decreases in a given system, and that it is essential to the 2nd law of thermodynamics. But what IS it? If it is already there and I missed it, can somebody point me to it? Thanks 201.134.106.227 Alex B.
- Here's a somewhat nontechnical definition from the article: "Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed." - JustinWick 21:26, 23 January 2007 (UTC)
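To make that "smoothing-out" definition concrete (this worked example is my own sketch, not from the article): when two identical bodies of heat capacity C at temperatures T1 and T2 are brought into contact, they equilibrate at Tf = (T1 + T2)/2, and the total entropy change is ΔS = C ln(Tf²/(T1·T2)). By the AM–GM inequality this is always ≥ 0, and it is zero exactly when T1 = T2, i.e. when there is no temperature difference left to smooth out.

```python
import math

def delta_S(T1, T2, C=1.0):
    """Entropy change (in units of C) when two identical bodies with
    constant heat capacity C equilibrate from temperatures T1 and T2 (in K)."""
    Tf = (T1 + T2) / 2  # final common temperature
    return C * math.log(Tf**2 / (T1 * T2))

print(delta_S(300, 400))  # positive: smoothing a temperature difference
print(delta_S(350, 350))  # zero: nothing left to smooth out
```

The larger the initial temperature difference, the larger ΔS, which is the sense in which entropy measures how far the smoothing has progressed.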
Def of Generalized Entropy. From this article I can't tell what Generalized Entropy is (and I would like to know). What is being generalized?