When you read these words, you are receiving information. Some would call the information “too sciency, too nerdy,” but it is information nonetheless. But what, exactly, is information? Is it a real, physical quantity, or is it some esoteric construct of the mind? Rolf William Landauer spent a lot of time thinking about this question. That’s not surprising, because he was a physicist who worked for IBM, a company that deals with lots of information. In 1961, he wrote a paper for the IBM Journal of Research and Development in which he argued that information is a real, physical quantity that is governed by the Second Law of Thermodynamics. As a result, in order to erase information (such as when a file is deleted from a hard drive), a certain amount of energy must be released.1
It is important to understand what Landauer meant. He didn’t simply mean that it takes energy to erase information. For example, if you want to erase the writing on a whiteboard, you have to expend energy wiping the markings off the board. That makes perfect sense, but it’s not what Landauer was referring to. He said that in order for information to be erased, energy must be released into the environment. The very act of information being destroyed, regardless of the method, requires a physical response: a minimum amount of energy must be released. This is because information is a real, physical quantity and is therefore governed by the Second Law of Thermodynamics.
Now the Second Law of Thermodynamics is misunderstood and misused frequently (you can read more about that here and here), so let’s start with what the Second Law actually says. It says that the entropy of the universe must always increase or at least stay the same. It can never decrease. What is entropy? It is a measure of the energy in a system that is not available to do work. However, a more useful definition is that it is a measure of the disorder in a system. The larger a system’s entropy, the “messier” it is. Using this concept of entropy, then, we can say that the disorder of the universe is always increasing or at least staying the same: the universe never gets more ordered.
Now let’s apply this concept to a computer disk. A computer disk has a bunch of bits, and each bit can have a value of either 0 or 1. On a blank disk, all the bits have the same value. Let’s say it’s 0. However, as you start putting information on the disk, the bits change. Some stay at their original value (0), but others change (they become equal to 1). So as more information gets put on the disk, there are more possibilities for the values of the bits. If you were to erase the disk again, you would set all those bits back to 0. When you do that, the disk gets more ordered. While there was information on the disk, it was possible for many of the bits to have a value of 1. When you erase the disk, that’s not possible anymore – all the bits have to have a value of 0. From the point of view of the disk, then, when you erase the information, the disk gets more ordered.
If the disk became more ordered when information was erased and nothing else happened, the universe would become a bit more ordered, but the Second Law of Thermodynamics forbids this. Thus, in order to follow the Second Law, the very act of erasing information must release energy. Furthermore, that energy must be large enough to disorder the universe at least as much as the disk became ordered. That way, the decrease in entropy of the system (the disk) will be offset by the increase in entropy of the disk’s surroundings, so that the total entropy of the universe remains the same or increases. Landauer even used the Second Law of Thermodynamics to predict the minimum amount of energy that must be released for each bit of information that is erased.
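For scale, that minimum (now often called the Landauer limit) works out to kT ln(2) of heat per erased bit, where k is Boltzmann’s constant and T is the temperature. Here is a minimal sketch in Python, added purely for illustration (the room-temperature value and the 1 TB example are my own assumptions), showing just how small that number is:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K

# Landauer limit: minimum heat that must be released per erased bit
landauer_limit = k_B * T * math.log(2)
print(f"Minimum heat per erased bit at {T} K: {landauer_limit:.3e} J")

# Even a huge erasure releases very little at this theoretical minimum:
bits_in_1TB = 8e12
print(f"Erasing 1 TB at the limit: {bits_in_1TB * landauer_limit:.3e} J")
```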
This idea remained theoretical for half a century, but it was recently tested by experiment, and it seems that Landauer was correct.
In the experiment, Antoine Bérut and his colleagues set up a simple information-containing system. A microscopic bead of silica was held in place by a laser beam that produced two “traps” in which the bead could be found. When the settings of the laser were changed, the bead would switch from one trap to the other. So the two traps represent the two values of a bit: 0 or 1. When the bead is in one trap, the bit has one value. When it is in the other trap, it has the other value. Changing the bead from one trap to another is the same as changing the value of the bit.
The researchers carefully measured the position and speed of the bead as it was moved from one trap to the next. This allowed them to measure the heat released by the bead switching traps. If the bead was moved quickly, it released a larger amount of heat. The more slowly the bead was moved, the less heat it released. This is because the speed of the switch affects the friction in the system. The slower the switching, the lower the friction involved in moving the bead from one trap to another. If it were possible to make the switch infinitely slowly, there would be no friction associated with the bead moving between traps.
If information were not a real quantity, then in the absence of friction, there should be no energy released by the bead’s movement. However, if Landauer was right, even in the absence of friction, the bead would still release energy. In the end, the experiment showed that Landauer was right. As the switching time was increased, the energy released got smaller. However, it was not a linear relationship. At first, the energy released decreased a lot as the switching time increased. However, as the switching time was increased further, the energy released started decreasing less and less. The data indicate that even if the switching time were infinitely long, some energy would still be released, and that energy is precisely what Landauer predicted it would be.2 So in the end, it seems that Landauer was correct. Erasing information requires the release of a certain amount of energy.
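One way to picture this asymptote is to model the measured heat as the Landauer bound plus a friction term that shrinks as the switch slows down. The sketch below is illustrative only: the 1/tau form and the coefficient are assumptions chosen to show the shape of the curve, not the paper’s actual fit.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K
A = 1.0e-20          # assumed friction coefficient, J*s (illustrative only)

def heat_released(tau):
    """Toy model: Landauer bound plus a friction term that scales as 1/tau."""
    return k_B * T * math.log(2) + A / tau

for tau in (0.1, 1.0, 10.0, 100.0):  # switching times, in seconds
    print(f"tau = {tau:6.1f} s -> Q = {heat_released(tau):.3e} J")

# As tau grows, Q flattens toward k_B*T*ln(2) instead of falling to zero --
# the same qualitative shape the experiment reported.
```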
Now, of course, this research has practical applications in the field of computing. As the number of bits of information we can store increases, the energy released when those bits are erased increases as well. The heat generated by computing is already a problem, and the heat released by erasing bits of information can make that problem even bigger. Thus, engineers will probably have to take this effect into account as our data storage capabilities grow.
However, this has applications to the creation/intelligent design/evolution issue as well. After all, life is all about information. In order for life to exist in the first place, information has to exist. In order for any physical process to change one life form into a different life form, information must be altered. If information is a real, physical quantity that is subject to the natural laws, then any theories that attempt to explain the origin and diversity of life must take this into account.
REFERENCES
1. Rolf Landauer, “Irreversibility and heat generation in the computing process,” IBM Journal of Research and Development, 5:183–191, 1961.
2. Antoine Bérut, et al., “Experimental verification of Landauer’s principle linking information and thermodynamics,” Nature, 483:187–189, 2012.
In the experiment, the settings on the laser are changed somehow to initiate a switch. The bead is not an isolated system. It seems that to apply the 2nd Law correctly, the laser (and the “finger” that switches the laser) must be included. The same thing can be said for erasing a hard drive. Someone had to push the button to do this. The energy that is converted from usable energy to unusable energy in that button push must be included in any calculation of the total entropy of the system. While the hard drive may have become more ordered in the process, the universe as a whole became more disordered, and I don’t think that the arguments presented pin that increase in disorder to the loss of information. Rather, all physical processes required to make the change have not been considered.
What am I missing? PhD in physics or no, thermodynamics, and especially any discussion of the 2nd Law, has always baffled me.
Shawn, I think you are missing the fact that they were not looking at the total energy inventory of the system. They were only looking at the energy dissipated by the bead. In a totally frictionless environment, which is what you would have if the switch time were infinitely long, the bead would not be expected to dissipate any energy at all. However, it did, and the energy released was what Landauer calculated. It is reasonable to assume that this energy dissipation is related to the fact that in this setup, the position of the bead represents information.
Remember, the energy put into the system is not what is important here. Sure, it takes energy to erase information. That’s not in question. The question is whether energy has to be released in order for information to be erased. The answer seems to be, “Yes.”
This simply doesn’t make sense. There is no loss of information in flipping a boolean variable.
Indeed, depending on how you code it, either value of a boolean variable could be used to represent either state. I could have a database field about people called isMale or one called isFemale. A True value in one is exactly the same as a False value in the other. Therefore, the information content of one state is no higher than the information content of the opposite state.
Even if that somehow weren’t the case, flipping a bit from 0 to 1 should create/destroy the same amount of information as flipping from 1 to 0 destroys/creates.
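A toy sketch of Josiah’s point (using his hypothetical isMale/isFemale fields, coded up in Python purely for illustration):

```python
# Two encodings of the same fact about a person. The stored boolean differs,
# but the information content is identical under either convention.
person = {"sex": "female"}

is_male = person["sex"] == "male"      # False under the isMale convention
is_female = person["sex"] == "female"  # True under the isFemale convention

# isMale == False carries exactly the same information as isFemale == True;
# the meaning lives in the coding convention, not in the bit value itself.
assert is_male == (not is_female)
```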
Josiah, you are right. The actual meaning of the information depends on how you code it. However, the information itself exists. There is either a “yes” or “no” to the question, regardless of what that question is. When you change that, you have erased the previous information, and that must be accompanied by a release of energy.
One assumption I don’t quite accept is that information itself can be considered simply a set of bits of 0 and 1. I can see an extension being made to genetic information, which, instead of 2 states, has 4 at each base pair position. But is this a definition of all information? I don’t think that assumption can be applied to the brain. “What is a thought?” is still an active area of neuroscience, but what we do know is that thoughts are ridiculously complex. No one synapse or collection of synapses is known to be responsible for any individual thought. Applying the state model of information to a system such as abstract thought constructed by a well-functioning brain seems unsupported. Would anyone consider this a valid application of a 0/1 model of information?
What this study seems to confirm is that the statistical mechanics of microstates results in a demand for entropy. Is that how information is defined? It seems like that leap is a little much.
Joshua, I don’t think anyone is trying to compare this information to a thought. As Josiah says, the meaning of a series of ones and zeroes depends on the coding. However, the fact that there is a 1 or 0 in each bit does represent information.
Does this series of 1/0 represent information or just represent a distribution? I’m thinking of this in terms of the statistical mechanics of the system. If I take a series of 4 equivalent booleans and say 2 of them are up, there are a limited number of microstates to match the macrostate (1100, 1010, 1001, 0110, 0101, 0011). But if I take those 4 equivalent booleans and say 4 of them are up, then my number of microstates is limited to 1111. This is a decrease in entropy (2 up/2 down to 4 up).
So linking to what Josiah is saying, absolutely, information is dependent on the coding. By us giving the 1100 state a meaning, we assign information. So wiping a drive causes a change in entropy because we reset every bit to 0 (which has only 1 microstate), not because we have eliminated the information we assigned that collection of bits to represent. The information itself is what we choose the distribution to represent.
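Joshua’s counting can be checked directly. Here is a minimal sketch in Python, added for illustration, that enumerates the microstates of his 4-boolean example and computes the Boltzmann entropy S = k ln W for each macrostate:

```python
import math
from itertools import product

k_B = 1.380649e-23  # Boltzmann constant, J/K

def microstates(n_bits, n_up):
    """All bit strings of length n_bits with exactly n_up ones."""
    return ["".join(str(b) for b in bits)
            for bits in product((0, 1), repeat=n_bits)
            if sum(bits) == n_up]

for n_up in (2, 4):
    states = microstates(4, n_up)
    W = len(states)        # number of microstates matching this macrostate
    S = k_B * math.log(W)  # Boltzmann entropy, S = k ln W
    print(f"{n_up} up: W = {W}, S = {S:.2e} J/K, microstates = {states}")

# "2 up" gives W = 6 (the six strings listed above); "4 up" gives W = 1 and
# S = 0, so resetting every bit is indeed a decrease in entropy.
```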
Joshua, I do think that when you eliminate microstates, you are eliminating information. Virtually no information can be encoded in a system where all the microstates are the same. Lots of information can be coded into a system that has a distribution of microstates. Now…what that information is depends on the code, as both you and Josiah indicate. However, the very fact that the microstates allow for something to be coded indicates that those microstates do represent information.
If microstates represent information, the conclusion from this claim then is that information itself is a fundamental physical quantity (since it can affect another physical quantity, entropy). If information is a fundamental physical quantity, then information, in any form, must affect the physical world in the same way. Electrons, for example, will affect other objects in the same way regardless of what that object is (they have a defined charge, a defined mass). This is why I brought up thoughts as information. If the authors’ claim is true and it is fundamentally information that is linked to thermodynamics, then information in any form, such as a thought, must be treated in an equivalent manner.
And here is where the connection breaks down. Dr. Wile, you implied earlier that the authors were not claiming to make this comparison to thoughts. They do so implicitly by claiming information as a physical quantity. If a physical quantity like the electronegativity of fluorine is the same regardless of the atom it’s bound to, I have to treat information the same way. Can information be fundamentally physical in one situation (in a computer) but not in others (like a thought)? If the physical treatment can’t be applied everywhere, then information isn’t fundamentally physical. If it can, then the claim stands. Unfortunately, the microstate model of information, as represented by 1s and 0s, cannot be applied to thought. Therefore, I must reject the claim. The authors still observed an incredibly important phenomenon. The conversion of a bit from 0 to 1 will release heat at the Landauer limit, which needs to be accounted for in computing. But this heat didn’t come from the information, but instead from the change in distribution of the microstate of the bead (as S = k ln W). Distribution (W) is the fundamental physical quantity being observed.
And this isn’t to say information isn’t real. There are plenty of non-physical things that are absolutely real things. Justice and redemption are two examples. They just aren’t physical and our manner of thinking of them must be non-physical as well.
I’ll have to disagree with you, Joshua. It’s not clear that a thought is a form of information. A thought might contain information, and it certainly acts on information, but I expect that a thought is a lot more than just information. Thus, it is not at all clear that treating information as a real, physical quantity has much at all to do with the nature of thoughts. At best, we could say that the information as stored in the human brain is a real, physical quantity. That doesn’t tell us much at all about the nature of a thought.
I certainly agree with you that there are things that are real that are not physical. However, I do think that this study shows us that information is both real and physical.
Although I don’t agree with the methodology used to reach his conclusions, I do think Stephen C. Meyer has an interesting section on DNA as information in the book Signature in the Cell.
AO, Meyer’s discussion of DNA as information in Signature in the Cell was one of the things I was thinking about when I said that any theory of origins must explain the origin of information. I think Meyer and the ID people are on the right track when it comes to this issue.
Precisely! The brain may contain information, and I cannot imagine that this information is processed in the manner of states and flipping of those states. I expect, as you do, that a lot more is going on. Therefore, it is not at all clear that treating information as a series of states is sufficient to describe all manner of information. The authors presume from the beginning that a bit erasure is by definition information erasure. “In particular, erasure of information, the RESET TO ONE operation, is logically irreversible and leads to an entropy increase of kln(2) per erased bit.” [Paragraph 2] The authors interchangeably use the terms bit and information (as do the cited references). An erasure of a bit is logically irreversible, but there is no evidence that information can be equated to a bit. While they have plenty of evidence to say that changes in state produce an amount of heat equal to kT ln(2), the link to information itself is still reliant on an assumption. The capacity of a bit system to contain information is evident (1 and 0 do represent different things, else we would see no change), but the bit system being information itself is a faulty assumption. We see this as information erasure because we’ve assigned the bit to represent something (e.g., isMale or isFemale). But regardless of the direction of the flip (0 to 1 or 1 to 0), the heat change is the same. Put another way, regardless of the information the system may contain, the heat change is the same. If the information is irrelevant, the heat change is due to a separate property, which I’m claiming is the distribution, W.
Actually, Joshua, I am not at all certain that you are correct when it comes to how the brain stores information. It is entirely possible that the brain stores information in the form of states and the flipping of those states. For example, what we seem to understand about the brain right now is that it stores information by adding connections between neurons. While we aren’t even close to learning the details, it is very possible that the connections between neurons represent some set of microstates, and by altering those microstates, the brain is storing new information. Now I expect that the brain’s method of storing information is far more sophisticated than any computer, but it is entirely possible that the method of information storage rests on the same principle. Now I do agree with you that equating information to a bit is an assumption, but given our experience with computers and how they store information, I think it is a very good assumption.
I still disagree with you that the bit system itself is not information. I agree that the meaning of the information depends on the coding, but the bit itself does represent information. When you change the bit, you are changing the information – erasing old information and putting in new information. For example, in the isMale/isFemale system, let’s suppose the coding is looking at the bit so that a 1 means “male” and a 0 means “female.” If I am switching from 1 to 0, I am changing the information to mean the person is not male but is now female. If I change from 0 to 1, I am changing the information to mean that the person is not female but is now male. Either way, information is erased and new information takes its place. That’s why the experiment shows that heat is released either way – because information is erased either way. Thus, I do think that a system of bits can reasonably be called a system of information. The coding determines the meaning of the information, but the bits themselves do represent information. I agree with you that this is an assumption. I just think it is a good one.
Right, I don’t think there is enough data yet to claim any particular method for how the brain processes information. But I think we can agree that the way a computer processes and outputs information is very different from how the brain processes and outputs information. Because of this, I find the assumptions made for one system of containing information insufficient to apply to another. As much as one can claim the state model can be applied to the brain, an equal (or better) argument can be made that it cannot. Data aren’t available yet. Because current data are insufficient, then the broad conclusion one gleans from the first system must be rejected as well. If data comes out that says the brain does operate in states, I will adjust my thoughts, but I can’t validate an assumption that crosses systems like that when the two systems are so extremely different in their output of information.
What is information without its meaning? I don’t understand how you can divorce the two, and perhaps this is my logical failing with the crucial assumption in this study. ‘Information’ becomes meaningless with this separation. It is either a yes or a no. Or a no or a yes. Or an up or a down. Or a top or bottom. Or a myriad of other binary concepts. You say a bit can just be a yes or a no depending on the question, but without the question how does ‘no’ inform anything, and thus become information? A man walking through a forest crying out ‘no’ has as much information to give as a man crying out ‘yes.’ The content, the meaning of information is what defines it. If there is no difference, if the meaning of the information is irrelevant, how can the entropy be dependent on this irrelevant variable? You are right that it is the transition that is causing the change in entropy, but the physical thing that is changing is the state of the bead. This has the capacity to hold information, once it has meaning, but is not physically information prior to that.
Ultimately, I must continue to disagree with the conclusions. I don’t think the data support the claim the authors make in their title. Their data don’t show information being linked to thermodynamics; they assume that a bit is equivalent to information, then proceed to show data that indicate bit flipping is linked to thermodynamics.
Joshua, you are the only one trying to compare computers to brains. This study doesn’t do that, and I am certainly not doing that. You seem to believe that thoughts are information and therefore anything that is learned about information must apply to thoughts. I don’t see any evidence that thoughts are information, so I don’t see any reason to try to apply this study to thoughts. This study is about information, and the only thing it should be applied to is information.
I do think you have pointed out the problem you are having with the study. Information and its meaning can, indeed, be divorced. For example, suppose you look at a message written in code, but you don’t know the cipher for the code. You will not know the meaning of the coded message, but nevertheless, it contains information. The right analysis can even show that the message has information without knowing what that information is. The information can only be determined once the cipher is applied, but the information is there nevertheless. In the same way, a man running through a forest yelling “no” has as much information as a man running through the forest yelling “yes.” You don’t know the meaning of the information, however, because you don’t know the question they are answering. Each is answering a question, and if you knew the question, you would understand the meaning of the information.
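To make “the right analysis” concrete, Shannon’s measure quantifies how much information a message carries without any reference to its meaning. A minimal sketch in Python, added for illustration (the example message is hypothetical, and Shannon entropy is one possible choice of analysis, not one the post names):

```python
import math
from collections import Counter

def shannon_entropy(message):
    """Average information per character, in bits (Shannon's measure)."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A Caesar-shifted message has the same entropy as its plaintext: the cipher
# changes the symbols but not their statistics, so the information survives
# even if the cipher is lost.
plain = "attack at dawn"
cipher = "dwwdfn dw gdzq"  # the same text with each letter shifted by 3
print(shannon_entropy(plain), shannon_entropy(cipher))  # identical values
```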
I strongly agree with the conclusions of the study, because I think the authors have produced data that show exactly what they claim. The peer-reviewers and editors of the journal apparently agree with me.
How can you say that there is no evidence that thoughts are information? The very fact that we are having this conversation and are capable of typing it out into words is evidence that our thoughts contain information. Thoughts are certainly more than information, but they absolutely contain it. We have no idea of the method, but the information is clearly contained within our brains. And as I said in an earlier post, if one claims that information is itself fundamentally physical, that means it must be applied to any case where information is present.
I remain unconvinced this divorce is tenable. Without the cipher existing somewhere, the message is gibberish. An exquisitely created code holds no information if it cannot inform anyone or anything. I may need the cipher to understand the meaning, but ciphers only lead to understanding, not to inherent meaning. Does a cipher contain meaning if there is no code (information)?
I appreciate your appeal to reviewers, but as your many other posts on evolution have shown, peer review is a necessary condition for studying science, not a sufficient one. The data the authors present here is reliant on an assumption (information = bits); they proceed to show data about the result of that assumption (bits affect thermodynamics), then claim the data directly impacts the subject of that assumption (information affects thermodynamics). This is what evolutionary biologists do. Assume evolution is true, show data about the result of that assumption, and then claim the data proves evolution is true. The actual data of this study only informs us about the bits, and any speculation beyond that is reliant on that assumption. Especially in a case like this, when there is a well-studied physical quantity that already answers the question (distribution, W, as in my previous posts).
At this point, I believe I must bow out of the conversation. If I continue, I will simply be repeating my arguments. I found your blog many months ago. I have enjoyed reading both the original posts and ensuing discussions greatly, and I thank you for letting me participate in one (sidenote: More posts for the chemists!). I found this topic very stimulating to think about this week. I enjoyed discussing it, and hope you did as well, rather than finding me a burdensome poster like some others of note recently.
Joshua, you hit the nail on the head – “our thoughts CONTAIN information.” An automobile contains gasoline, but an automobile is not gasoline. That’s the key. As I said earlier, thoughts may well contain information, they certainly act on information, but they are not information. Using the same analogy, the way gasoline reacts to the physical laws is quite different from the way an automobile reacts to the physical laws. Thus, to say that this experiment demonstrates that the laws of thermodynamics apply to information says nothing about them applying to thoughts.
I understand that you remain unconvinced, but I remain thoroughly convinced. Let’s suppose I use a cipher to make a coded message. Then I burn the ONLY existing cipher. Did I destroy the information in the message? Of course not. It is still there. Information exists independent of its actual meaning. You ask, “Does a cipher contain meaning if there is no code (information)?” No. That’s because the cipher simply tells you the meaning of the information. Without information, the cipher is useless and thus lacks meaning. Without the cipher, you don’t know the meaning of the information, but the information is still there.
I agree that peer-review doesn’t guarantee good science. Indeed, the failings of peer review have been demonstrated time and time again. However, I simply mentioned the peer reviewers and editors to make it clear that this is not just my opinion and the authors’ opinion. It has clearly convinced a lot of other scientists. That doesn’t mean it is right, but it does mean it is worth consideration.
I would correct your statement: “The data the authors present here is reliant on an assumption (information = bits).” That is quite false. The data are independent of the assumption. The authors’ conclusion rests on the assumption that bits are one kind of information, and based on everything we know, that is a very reasonable assumption. However, the data exist independent of the assumption, and that is quite different from much of the data associated with evolution.
I am glad that you found this topic stimulating, and I thoroughly enjoyed discussing it with you. It allowed me to think more deeply about the study as well. I hope you continue to read, and I hope you comment when you have something to add to the discussion. I expect Josiah has followed this thread and enjoyed it. I expect he even agrees with you. Thus, I expect that this thread has been beneficial to my readers as well!
Oh, I will certainly continue to read. This is one of my more enjoyable lunchtime diversions. And I will gladly post when I can, particularly on the more chemistry-related topics! Good weekend to you and other readers following along!
This is really fascinating! Thanks for sharing this.