Entropy and the teenager's bedroom
Many people, including yours truly, have a habit of referring to entropy as a process of "running down" – an inexorable slide towards a state of chaos or disorder. Accordingly, we're inclined to use the word entropy when referring to things like teenagers' bedrooms in order to imply that they are ever heading towards total disarray.
But we know from physics that this colloquial description of entropy is incorrect. So what is a more accurate description of this concept?
First, a disclaimer: I am not a physicist. Nonetheless, I have decided to proffer an explanation of the term "entropy" that I hope will be more accessible to the layperson than what is typically out there at present. I do so knowing that my "potted account" is, in many detailed respects, probably going to be "wrong": differences in terms will be glossed over and concepts in physics will be conflated. Nonetheless, I hope that I can give some reasonable lay account of this term that is at least an improvement on the "teenager's bedroom" analogy!
"Entropy" is a concept that arises from the second law of thermodynamics. It describes the process by which energy tends towards an even dispersal over a closed system – until none of that energy is left to effect changes across the system (ie. none of that energy remains available to do work in that system).
But what does that mean?
Well, for example, if your "closed system" comprises a glass of lemonade with ice cubes, you can expect the ice cubes to warm and melt and the lemonade to cool. Eventually all the ice will have melted and you will be left with a glass of evenly mixed water and lemonade at a uniform temperature. None of the energy that melted the ice cubes remains available to do further "work" in the "system".
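To make the lemonade example concrete, here is a minimal Python sketch of the entropy bookkeeping. The masses and temperatures are assumed, illustrative figures (only the latent heat is the standard value for water), and it crudely treats the melting as isothermal and the drink as a simple heat reservoir:

m_ice = 0.05       # kg of ice (assumed)
L_f = 3.34e5       # latent heat of fusion of water, J/kg
T_melt = 273.15    # K – temperature at which the ice melts
T_drink = 293.15   # K – initial lemonade temperature (assumed)

Q = m_ice * L_f            # heat absorbed by the ice as it melts
dS_ice = Q / T_melt        # entropy gained by the melting ice
dS_drink = -Q / T_drink    # entropy lost by the cooling drink

print(round(dS_ice + dS_drink, 2), "J/K")  # prints 4.17 J/K – a net increase

The drink loses less entropy than the ice gains, so the total always comes out positive – the second law in miniature.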
[There is another meaning of "entropy" and that goes to what is known in mathematics as "information theory". The latter defines entropy as "a measure of unpredictability or information content". I'll deal with this a bit later.]
Okay, so how did this concept of "even dispersal of energy over a system" give rise to the more colloquial account of entropy as an "inexorable trend towards chaos"? I've previously noted that this "metaphor" has been decried by physicists who specialise in thermodynamics. Yet the "metaphor" persists.
I think the argument underpinning the continued use of that metaphor in certain introductory physics texts goes something like this:
The tendency of energy in a system to disperse evenly can come to something very similar to "chaos" in our colloquial sense – namely what has been called "disorder" of the energy in that system (principally when one compares, say, a crystalline solid with the rather more "random" arrangement of particles in a liquid or a gas of the same substance). It is argued that this is particularly true when the energy of the relevant system is viewed in totality. I mention "totality" because it is possible for the energy in a system to have moved towards an overall greater "disorder" while the energy in certain parts of the system is still "ordered" (eg. gas from our atmosphere might be forming clouds, ice crystals etc. while the total volume of atmospheric gas is decreasing as some of it escapes and disperses into space).
Yet from the perspective of thermodynamics, terms such as "order" and "disorder" are largely meaningless. It is true that an observer of a closed system that is near maximum entropy (ie. maximum dispersal of energy) might note a decrease in what he or she would colloquially call "order" (eg. the melting of ice in that glass of lemonade). However, while this might be a rather striking "visual and crystallographic impression", it arguably has little to do with "order" in any sense known to physics.
One might counter that "order" is a descriptor of the molecular "degrees of freedom" applicable to the physical state of the system - ie. the minimum number of coordinates required to specify the position of particles comprising that system. When there is little scope for change in that system (eg. there is no more ice to melt in our lemonade glass) the "degrees of freedom" are higher. The higher the degrees of freedom, the greater the number of "bits" of "information" required to specify the system. And the greater the information required, the harder it will be to specify that system. Conversely, the fewer the bits of information, the greater the "order" of the energy in that system.
In other words, this "counter" purports to link the concept of entropy in information theory with entropy in thermodynamic theory. [For an example of how these theories are "linked", consider the following website, which goes through the various equations.]
Leaving aside the exact equations, what is happening here is an unsubstantiated leap: information theory is not a branch of physics but a branch of pure mathematics. There is no reason to suppose that the similarity between the equations of the two theories permits their interchangeability. More specifically, there is no reason to substitute "degrees of molecular freedom in a closed system" for the independent and identically distributed random variables of information theory.
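For the record, the two equations whose formal resemblance drives that leap are Shannon's information entropy, H = -Σ p_i log2 p_i (summed over the probabilities p_i of the possible messages or symbols), and the Gibbs form of thermodynamic entropy, S = -k_B Σ p_i ln p_i (summed over the probabilities of the accessible microstates). The expressions are identical in form up to the constant k_B and the base of the logarithm – which is precisely why the temptation to treat them as interchangeable is so strong.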
This purported link also ignores the fact that any differences between the accessible molecular degrees of freedom from state to state are hardly evidence of relative "order" or "disorder" in any scientific sense. As Frank L. Lambert, Professor Emeritus at Occidental College in Los Angeles, says:
"Crystalline ice at 273 K has an S0 of 41.34 J/K mol, and thus, via S = kB ln W, there are 10 to an exponent of 1,299,000,000,000,000,000,000,000 possible accessible microstates for ice. Because the S0 for liquid water at 273 K = 63.34 J/K mole, there are 10 to an even larger exponent of 1,991,000,000,000,000,000,000,000 accessible microstates for water. Does this clearly show that water is “disorderly” compared to crystalline ice? Of course not."
Nonetheless many respected physicists – eg. Leonard Susskind – continue to link information and thermodynamic theories by reference to the "hidden information" of a system.
For example, Susskind will tell you that black holes represent a state of maximum entropy, given that the "information" relating to the black hole is "hidden". That the "energy dispersal" of the black hole system is also at a maximum seems congruent.
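For what it's worth, the formula behind such claims is the Bekenstein–Hawking entropy, S = k_B A c^3 / (4 G ħ), where A is the surface area of the black hole's event horizon, G the gravitational constant and ħ the reduced Planck constant. A black hole's entropy scales with the area of its horizon rather than its volume – one reason the "hidden information" language has proved so durable.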
And I suppose this sort of analysis is also consistent with entropy as manifested in the continual expansion and cooling of the universe. If, as many postulate, the universe continues to expand forever, it will cool as it does so, leading to what is popularly described as the "Big Freeze" – ie. the final heat death of the universe. At this point all the accessible information relating to the closed system comprising the universe will also be lost. In other words, on both accounts the entropy of the universal system will be at its maximum level.
But congruence in these uses of the term "entropy" does not mean they are the same. Rather, this outcome is almost certainly nothing more than a happy coincidence – in much the same way as I might, for example, find myself appropriating terms from Einstein's theory of General Relativity to describe my interactions with certain family members. However apt my use of the relevant terms might be, it wouldn't suggest a manifestation of Einstein's theory in physics. The use would comprise a metaphor – nothing more.
Despite all of this, I will probably continue to use the term "entropy" colloquially to mean "running down" towards "chaos" or "disorder". Why? Because there is simply no other appropriate word in our language for such an inexorable process.
Perhaps that is why the myths surrounding the meaning of "entropy" continue to exist. Like it or not, the word has been appropriated into both information theory in mathematics and the colloquial English lexicon in a way that fails to reflect its true scientific meaning. [I note that it wouldn't be the first time such a thing has happened: consider how "schizophrenia" has come to be used by many as a substitute for "multiple personality disorder".]
In the end, language is a "living" construct. Certainty and "unchangeability" in terminology, however laudable in science, take a back seat in social interaction. I suspect that this is something to which physicists will just have to become accustomed.
Copyright © 2013 Dejan Djurdjevic