Nearly two years ago, I announced that I was going to teach a one-semester course at Anderson University. It’s one of the few Christian universities at which I am willing to teach, because it doesn’t have a long list of doctrinal beliefs to which you have to agree. Instead, it seems to understand that the quest for truth is important and cannot be hindered by one specific interpretation of the Scriptures that has been developed by fallible people. If we are to learn the truth, we must honestly search the Scriptures, honestly study God’s creation, and honestly explore the various ideas that have emerged throughout the history of Christendom.
It was the first time in 19 years that I had taught a complete, semester-long college course, and I posted a few articles about my experience. I had a great time, and I decided that I wanted to do it again at some point in the future. Because I had some book deadlines with which to contend, however, I couldn’t do that right away. Now that my book deadlines have slowed down a bit, I have decided to go back to the college classroom once again.
This fall, I will be teaching thermodynamics at Anderson University. It is an upper-level course, typically taken by juniors. I use some aspects of thermodynamics in my research as a nuclear chemist, and it is actually one of my favorite topics to teach. As a result, I am really looking forward to it!
Thermodynamics is interesting for many, many reasons. It’s intensely mathematical, requires very precise thinking, and helps us understand a great deal about how Creation works. It also has some implications beyond physics and chemistry. For example, thermodynamics led to an enormous shift in the scientific consensus regarding the nature of the universe. For most of human history, the scientific consensus was that the universe is eternal: it has always existed and will always exist. We now know that’s incorrect. The universe had a beginning and will have an end, and thermodynamics was the first field to demonstrate this.
It’s all thanks to Rudolf Clausius, who formulated the Second Law of Thermodynamics back in 1850. He showed mathematically that while the total energy content of the universe stays the same (because of the First Law of Thermodynamics), the amount of energy in the universe that is available to do work is constantly decreasing. As a result, the universe will eventually reach a point at which there is lots and lots of energy, but hardly any of it is available to be used. He called this situation a state of maximum entropy and wrote:1
The more the universe approaches this limiting condition in which the entropy is maximum, the more do the occasions of further change diminish; and supposing this condition to be at last completely attained, no further change could evermore take place, and the universe would be in a state of unchanging death.
Of course, if the amount of energy in the universe that is available to do work decreases over time, think about what happens if you go backwards in time. The amount of energy available to do work increases. At some time in the past, however, it would reach the point where the amount of available energy equals the total energy of the universe. At that point, you can’t go back any further, so the Second Law of Thermodynamics not only tells us that the universe will end, but it also tells us that the universe had a beginning. British philosopher William Jevons put it this way:2
Now the theory of heat places us in the dilemma either of believing in Creation at an assignable date in the past, or else of supposing that some inexplicable change in the working of natural laws then took place.
Of course, the Bible always said that the universe had a beginning. It just took thermodynamics to convince the scientific community that the Bible is correct on that point.
REFERENCES
1. Rudolf Clausius, “On the Second Fundamental Theorem of the Mechanical Theory of Heat,” Philosophical Magazine and Journal of Science, 35(4):419, 1868.
2. William Stanley Jevons, The Principles of Science: A Treatise on Logic and Scientific Method, Volume II, Macmillan and Company, 1874, p. 439.
I once had a friend suggest that entropy in the universe could increase along an exponential curve. This way, the further back you go in time, the closer the usable energy approaches the total energy, but it never reaches it. Thus an eternal universe would not violate increasing entropy.
My first objection was that this would fail because entropy is quantized, but I’m not sure that’s actually the case. Dr. Wile, what are your thoughts? This is not a field I know much about.
It certainly makes sense for entropy to be quantized, since most fundamental aspects of nature (like matter, energy, and charge) are quantized. Zeilinger’s principle says that information is quantized, which would imply a quantization of entropy as well. While Zeilinger has been successful at using this idea to derive some aspects of quantum mechanics, his principle remains controversial.
The real problem with your friend’s suggestion is that we know entropy can’t behave that way. You can think of entropy as a “tax.” Each time energy is converted from one form to another, you must pay this “tax.” Thus, if energy is being transformed, entropy increases. The more energy gets transformed (and for work to be done, energy must be transformed), the more entropy increases. So, there is no way that entropy can follow an asymptotic curve. As you go back in time, entropy decreases steadily, and you eventually reach the point where there is no entropy. At that point, you can’t go back any further.
I’m not as good at statistical mechanics as I should be, but I kind of doubt that proving the universe had a beginning is that easy, because the thermodynamics of the entire universe is different from that of your ideal system: I think the proper treatments have to use quantum field theory in curved spacetime (which is what you need to derive Hawking radiation). I’m willing to believe there’s a semiclassical proof; do you know of one you could send me? I shouldn’t have any problem reading it.
I don’t understand your reasoning, Jake. Regardless of whether you are discussing things classically or from a quantum point of view, how do you get around the energy argument? QM doesn’t allow you to reverse the general trend of entropy (although you can for certain microsystems over a short time frame), nor does it allow you to change the total energy content of the universe.
There are a couple things I want to check… I’ll say something in a few days.
Congratulations! I hope you have yet another brilliant year teaching.
I must be honest, your comment about doctrinal statements caught my attention more than anything else in this post. Are there places in your blog where you have elaborated a little more your ideas here? Thanks!
I have written a bit more about my aversion to university doctrinal statements here. I want to make it clear why I have this aversion. It’s not because I think doctrine is unimportant. It’s also not because I don’t think Christian organizations should have doctrinal statements. I think it is important to inform people about the way your organization interprets the Bible.
My problem with university doctrinal statements is that, generally speaking, the faculty are required to adhere to them. I understand the reasoning behind the practice; I just don’t agree with it. A university is a very special entity. It is not a church. It is not an evangelical ministry. It is not a Christian charity. It has aspects of those things, but it is not one of them. A university is a place of learning where the faculty are specifically tasked to push the boundaries and investigate things that most people don’t investigate. By agreeing to adhere to a doctrinal statement, a professor immediately restricts the things he or she will investigate, and I think that goes against a university’s mission. Thus, even if I agreed with every point on a university’s doctrinal statement, I wouldn’t teach at that university if I was required to adhere to the statement. I just think that such statements are incompatible with the mission of a university.
Of course, churches, evangelical ministries, Christian charities, etc., are different. Thus, I have no problem with them asking their employees to adhere to doctrinal statements. I just think a university is different.
I see, that makes a lot of sense. I myself struggle with the balance between maintaining what some call “doctrinal purity” and avoiding enforced uniformity. I suppose that’s why your statement caught my attention. Thanks for your thoughts!
I hope you don’t mind me asking for a little thermodynamic assistance.
I have a spring-mass simulator, and I want to calculate the temperature of the system. Should be simple, right? I know the exact position, mass, and velocity of every particle at any given time, as well as the potential energy stored in each spring at any time. So it seems like it should be a simple calculation.
But after many hours of searching and reading web sites on temperature and entropy, it’s not so simple. For example, one definition of temperature relies on the equipartition theorem:
E = (1/2) N k f T

where E is internal energy, N is the total number of particles, k is Boltzmann’s constant, f is the number of degrees of freedom per particle, and T is the absolute temperature in Kelvin. But the equipartition theorem requires the system to be in equilibrium, and my simulator is all about modeling transient behavior! So even though I have an exact known value for internal energy, I still cannot calculate T until my simulation has settled down to a very boring, uninteresting state. I want to calculate temperature change over time.
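For what it’s worth, here is the naive calculation I was attempting, as a Python sketch. The particle data and the degrees-of-freedom count are placeholders I made up, just to show the arithmetic of rearranging that formula:

```python
# A rough sketch of the "instantaneous kinetic temperature" one gets by
# applying the equipartition formula to the kinetic energy alone.
# All particle data here are made-up placeholders.

K_B = 1.380649e-23   # Boltzmann's constant, J/K
F = 2                # degrees of freedom per particle (2-D motion assumed)

# (mass in kg, (vx, vy) in m/s) for each particle
particles = [
    (1.0e-26, (320.0, -150.0)),
    (1.0e-26, (-80.0, 410.0)),
    (2.0e-26, (200.0, 95.0)),
]

# Total kinetic energy: sum of (1/2) m v^2 over all particles
ke = sum(0.5 * m * (vx**2 + vy**2) for m, (vx, vy) in particles)

# Equipartition: E = (1/2) N k f T  =>  T = 2 E / (N k f)
n = len(particles)
temperature = 2.0 * ke / (n * K_B * F)
print(f"Instantaneous kinetic temperature: {temperature:.2f} K")
```

But as I said, that number is only meaningful at equilibrium, which is exactly my problem.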
Temperature definitions based on entropy were even more unusable. The general definition of entropy relies on a concept called “multiplicity,” which seems completely unworkable. But if I somehow figured that out, I could take the partial derivative of entropy with respect to internal energy, which gives the inverse of temperature: 1/T = ∂S/∂U. (Example info source: http://hyperphysics.phy-astr.gsu.edu/hbase/thermo/temper2.html#c2)
So here are my questions:
Am I overlooking something simple and obvious, or is calculating temperature, even of a system for which I know literally everything, really that complex?
I read somewhere that temperature only really has meaning for a system in equilibrium. Is that an accurate statement?
Thanks!
Remember what temperature is. It is a measure of the random motion of the molecules or atoms that make up the system. Thus, when you see “E” in equations for temperature, it generally refers to the internal energy of the system, which is usually represented as “U.” The problem with your system is that you know a lot about it, but you don’t know the random motion of the individual atoms and molecules. That’s what you need to know to calculate temperature.
The random energy of motion of the molecules or atoms (U) depends on two factors: the heat transferred (q) and the work done (w):
U = q + w
Note: Engineers use an odd sign convention where work done by the system is positive. Thus, they would say U = q – w. Chemists think correctly (ha ha), so I will use the chemistry sign convention, which leads to the equation above.
You want to know temperature, so “U” is what has to be tracked. In an oscillator, the system does work for a while (lifting the mass up, for example), but then the surroundings do work on it (as gravity pulls the mass down). When the system does work, “w” is negative, and when the surroundings do work, “w” is positive (for engineers, it’s the reverse). On average, then, “w” is probably zero. Thus, you really just have to keep track of the heat transferred.
If the system is frictionless, “q” is also zero, so the temperature doesn’t change. When we add friction, energy gets taken from the kinetic energy (KE) and potential energy (PE) of the motion and is mostly converted to heat. I think that’s the key here. At each step in the simulation, you should see how much KE and PE is lost due to friction. You can probably assume that the vast majority of that energy is converted to heat, so it becomes “q.” Some will be converted to sound, but I would think it would be small enough to ignore. Assuming it all becomes heat, most of it will be absorbed by the springs (and the masses, if they are metal). Based on the composition and mass of the springs (and masses, if they are metal), you should be able to calculate the change in temperature as:
q = (mass of the material absorbing the heat) × (specific heat capacity of the material) × ΔT
Now, of course, the big issue here is where the heat dissipated by friction goes. As the temperature of the springs and masses increases, they will start to lose heat to the air, lowering their temperature. Thus, this approach will get less and less accurate the more energy that is dissipated, but I would think this would be a good first step.
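If it helps, here is a minimal Python sketch of that bookkeeping. The damping model, the material properties, and the time step are all numbers I made up for illustration; substitute whatever your simulator actually uses:

```python
# A minimal sketch of the approach described above: a damped spring-mass
# step, with the frictional energy loss each step treated as heat "q"
# absorbed by an assumed chunk of material.

K_SPRING = 50.0        # spring constant, N/m (assumed)
MASS = 0.5             # oscillating mass, kg (assumed)
DAMPING = 0.05         # damping coefficient, kg/s (assumed)
DT = 0.001             # time step, s

ABSORBER_MASS = 0.1    # kg of material assumed to absorb the heat
SPECIFIC_HEAT = 500.0  # J/(kg·K), roughly steel (assumed)

x, v = 0.1, 0.0        # initial displacement (m) and velocity (m/s)
delta_t_total = 0.0    # accumulated temperature rise, K

for _ in range(100_000):
    # Semi-implicit Euler update for the damped oscillator
    a = (-K_SPRING * x - DAMPING * v) / MASS
    v += a * DT
    x += v * DT

    # Mechanical energy removed by friction this step: power = c * v^2
    q = DAMPING * v * v * DT   # joules converted to heat

    # q = (absorber mass) × (specific heat) × ΔT
    delta_t_total += q / (ABSORBER_MASS * SPECIFIC_HEAT)

print(f"Temperature rise so far: {delta_t_total:.6f} K")
```

The point is simply that whatever mechanical energy friction removes each step becomes “q,” and q = m × c × ΔT turns it into a temperature rise.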
Temperature only has meaning when the temperature-measuring device is in equilibrium with what is being measured. That’s what the Zeroth Law of Thermodynamics is all about. However, the entire system doesn’t need to be in equilibrium for temperature to have meaning. As long as the system has random motion of its constituent particles, temperature has meaning. If the measuring device is in equilibrium with its surroundings, it is measuring the temperature of that part of the system. The temperature might be different in another part of the system, of course, but that doesn’t mean the concept is without meaning.
Thanks for the reply. Your comment about engineers and the sign of “w” reminds me of one of my old electrical engineering instructors who wanted to make sure we understood the importance of getting the sign of the result correct: “If you’re deer hunting, the sign makes the difference between shooting the deer and shooting yourself!”
For a simulation where I implement friction, your advice makes complete sense. I would expect that the masses would pick up most of the frictional energy as heat, and I could calculate their temperature based on what material I assumed they were made of.
However, I’d like to set up some simulations to demonstrate thermodynamics, where the masses are frictionless and considered to be abstract point particles with no internal details. The only forces acting on them would be the abstract springs, and I give them some initial velocity so they don’t just sit there.
I think what I’ll do is set up two separate groups of connected, vibrating masses, one on the left and one on the right. The left group initially is vibrating at a higher average amplitude than the group on the right. I’ll display the total internal energy of each group in real time. The energy of the left group will be greater than that of the right, and both will remain constant.
Then I’ll connect the two groups by some kind of bridge (probably by connecting new springs from masses in one group to masses in the other group). Energy will flow from the group on the left to the group on the right, until equilibrium is reached.
But equilibrium doesn’t necessarily mean the two groups will have equal energy! And the ratio of the two energies would not just depend on the relative total mass of each group. I think it would also depend on the relative “heat capacities” of the two groups of vibrating masses. And what kinds of things will influence that heat capacity? The stiffness of the springs and the masses of the individual particles?
See, that’s why I wanted to find a way to directly calculate temperature. There has to be some way to normalize the value of the internal energy so as to predict heat flow. 🙂
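In case a sketch helps, here is roughly what I plan to code up, in Python. Every constant and the bridge scheme are just illustrative guesses on my part:

```python
# A crude sketch of the two-group experiment: two sets of independent
# oscillators, later coupled by weak "bridge" springs so energy can
# flow between them. All constants are illustrative guesses.

import random

K_SPRING = 10.0   # spring tying each mass to its rest point (assumed)
K_BRIDGE = 0.5    # stiffness of the bridge springs (assumed)
MASS = 1.0        # mass of each particle (assumed)
DT = 0.001        # time step

def make_group(n, amplitude):
    # Each entry is [displacement, velocity]; displacements are random,
    # scaled by the group's initial amplitude.
    return [[random.uniform(-amplitude, amplitude), 0.0] for _ in range(n)]

def group_energy(group):
    # "Internal energy" of a group: kinetic plus spring potential.
    return sum(0.5 * MASS * v * v + 0.5 * K_SPRING * x * x for x, v in group)

left = make_group(20, 1.0)    # higher initial amplitude
right = make_group(20, 0.2)   # lower initial amplitude

for _ in range(200_000):
    # Update every oscillator under its own spring force
    for group in (left, right):
        for p in group:
            a = -K_SPRING * p[0] / MASS
            p[1] += a * DT
            p[0] += p[1] * DT
    # The "bridge": pair mass i on the left with mass i on the right
    for pl, pr in zip(left, right):
        f = -K_BRIDGE * (pl[0] - pr[0])
        pl[1] += (f / MASS) * DT
        pr[1] -= (f / MASS) * DT

print(f"left: {group_energy(left):.3f} J  right: {group_energy(right):.3f} J")
```

Whether the energies settle to a steady split or just slosh back and forth is exactly what I want to find out.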
Anyway, now I’m curious and I think I have some experiments I can perform. Thanks for humoring me.
Your EE instructor is spot on, David! I am not sure how you will define temperature with everything being point particles. Certainly, you can trace the energy flow, as you suggest. However, if I am understanding your simulation, the average temperature will never change. The temperature in a spring will drop as it does work, but it will rise again when work is done on it. Without friction, there will be no heat transfer (unless the surroundings and springs start off at different temperatures), so in the end, the temperatures will rise and fall periodically, with no average change.