Feature Article - August 2006
by Do-While Jones
Studies about information and thermodynamics both include the concept of entropy, which is the enemy of evolution.
The fields of thermodynamics and information theory both present strong challenges to the theory of evolution. Unfortunately, these are complex subjects that are not easily understood. That’s why we are glad that evolutionist Charles Seife has done such a good job of explaining the basic concepts in the introduction and first four chapters of his new book, Decoding the Universe. The last five chapters deal with quantum physics and other subjects not relevant to the theory of evolution; but the first 118 pages of the book contain some of the best, most accurate, and easiest to understand explanations of thermodynamics and information theory.
Since he is an evolutionist, he doesn’t draw the obvious conclusions and tries to skate around them as well as he can; but he does do a pretty good job of setting the stage for those conclusions.
In the introduction he begins by saying,
Information is not just an abstract concept, and it is not just facts or figures, dates or names. It is a concrete property of matter and energy that is quantifiable and measurable. It is every bit as real as the weight of a chunk of lead or the energy stored in an atomic warhead, and just like mass and energy, information is subject to a set of physical laws that dictate how it can behave—how information can be manipulated, transferred, duplicated, erased or destroyed. And everything in the universe must obey the laws of information, because everything in the universe is shaped by the information it contains. 1
He’s really nailed it. We might nit-pick and say that information isn’t a “property of matter and energy” but rather a “property like matter and energy.” Nevertheless, we agree in principle. One can always dispute the best way to express a concept; the choice of “of” or “like” is not that important.
It is important, on the other hand, that in the list of things that can be done to information (it “can be manipulated, transferred, duplicated, erased or destroyed”) the obvious word that is missing is “created.” Information may be discovered, but it can’t be created. Since it exists now, and can’t be created now, it must have existed since “the beginning” (whenever, and whatever that beginning was).
The laws of information are beginning to reveal the answers to some of the most profound questions of science, but the answers are, in some ways, more disturbing and bizarre than the paradoxes they solve. Information leads to a picture of a universe speeding toward its own demise … and of an incredibly byzantine cosmos made up of an enormous collection of parallel universes. 2
His last conclusion about parallel universes is based more on quantum theory than information theory, and is irrelevant to our discussion of evolution. But the first four chapters of his book build a very solid foundation of information theory and thermodynamics upon which he builds his byzantine cosmos. Although the last part of his book is controversial, the foundation is good solid science that is relevant to our understanding of the physical world in general and the theory of evolution in particular.
Information theory and the laws of thermodynamics both lead “to a picture of a universe speeding toward its own demise,” not one evolving from chaos to order.
Information theory and thermodynamics are difficult subjects to explain in any amount of space, and especially difficult when limited to a “six-page newsletter,” which is already eight pages again this month. That’s why we recommend you read the introduction and first four chapters of Seife’s book for yourself.
In Chapter One he explains that information is useless unless it moves from place to place. He uses a wartime example. If soldiers can’t communicate they don’t know what to do.
There would be no point for us to write this essay if you could not read it. Somehow the information in our computer has to be transferred to you. It has to be printed on paper and sent to you through the mail, or it has to be converted to electrical signals and sent to you over the Internet.
In the same way, the information in a DNA molecule is of no value if it isn’t read by some biological process that knows what to do with the information. Information is of use only when it flows from the information source to the information destination. It is information flow that gets the job done. Everything that happens is the result of information flow; and everything that happens causes information to flow. (That’s a consequence of the Heisenberg Uncertainty Principle, but let’s not go there.)
The same thing is true of heat, so in Chapter Two, he switches from information to thermodynamics. He correctly observes that energy can be thought of in terms of heat, and that useful work only gets done when heat flows from one place to another. So, there is an analogy between information flow and heat flow.
There’s a lot more important information in Chapter Two. He talks about the conservation of energy, Carnot cycles, and Maxwell’s demon. It’s all good, correct, and important. We can’t repeat it all here. Buy the book and read it for yourself. Everything in Chapter Two leads to what “was inscribed on Boltzmann’s grave: S = k log W, the formula for the entropy of a container full of gas.” 3
In Chapter Three he switches back to information theory and Claude Shannon’s discovery of the concept of entropy in information systems, and Shannon’s finding that the equation for informational entropy takes exactly the same form, k log W.
Just because Shannon entropy—the measure of information—looks, mathematically, exactly the same as Boltzmann entropy—the measure of disorder—it doesn’t necessarily mean the two are physically related. Lots of equations look the same and have little to do with each other; mathematical coincidences abound in science. But, in fact, Shannon entropy is a thermodynamic entropy as well as an information entropy. Information theory, the science of the manipulation and transmission of bits, is very closely tied to thermodynamics, the science of the manipulation and transfer of energy and entropy. 4 [italics in the original]
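The shared formula can be made concrete with a short calculation. The following Python sketch is our own illustration (it does not appear in Seife’s book): it computes Shannon entropy and shows that for W equally likely messages the entropy reduces to log₂ W, the same mathematical form as the S = k log W inscribed on Boltzmann’s grave.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# For W equally likely messages, H reduces to log2(W) --
# the same "log W" form as Boltzmann's S = k log W.
W = 8
uniform = [1 / W] * W
print(shannon_entropy(uniform))                 # prints 3.0 (= log2 of 8)

# A biased source carries less information per symbol.
biased = [0.9, 0.05, 0.025, 0.025]
print(round(shannon_entropy(biased), 3))        # prints 0.619
```

The biased source shows why the two entropies behave alike: the more evenly the probabilities (or the gas molecules) are spread out, the higher the entropy.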
People have long known that it is hard to keep hot and cold separated. Hot things cool down and cold things naturally warm up. That’s why the ice melts in your refrigerator if the power goes out. Entropy is a measure of how evenly distributed heat is. As the ice melts in your refrigerator, the entropy of the refrigerator, and the air in your kitchen surrounding it, increases. The same thing is true of information.
Once you stop applying energy, though, that stored information leaks out into the environment—for Nature, it seems, attempts to dissipate stored information just as it attempts to increase entropy; the two ideas are exactly the same. 5
He is right. He correctly reaches this brilliant conclusion at the end of Chapter Three.
The laws of information had already solved the paradoxes of thermodynamics; in fact, information theory consumed thermodynamics. The problems in thermodynamics can be solved by recognizing that thermodynamics is, in truth, a special case of information theory. Now we see that information is physical; by studying the laws of information we can figure out the laws of the universe. And just as all matter and energy is subject to the laws of thermodynamics, all matter and energy is subject to the laws of information. Including us.
Though living beings seem as if they are inherently different from computers and boxes of gas, the laws of information theory still apply. We human beings store information in our brains and our genes just as computers store information in their hard drives, and in fact, it seems that the act of living can be seen as the act of replicating and preserving information despite Nature’s attempts to dissipate and destroy it. Information theory is revealing the answer to the age-old question, What is life? That answer is quite disturbing. 6
It is disturbing to him because he is an evolutionist. Science is leading him down a path he doesn’t want to go.
We rarely quote creationist sources because we know they have little or no credibility with our target audience. So, we will continue to let evolutionist Charles Seife reveal the “disturbing” answer to the age-old question, What is life? But creationist George Javor gives exactly the same answer in his book, Evidences for Creation. Frankly, Javor does a better job of it because he doesn’t have to work hard to avoid the obvious conclusion. Javor says,
The key argument emerges from the biochemical understanding of what life and death are. Chapters 13 and 14 describe some of the unique properties of living matter in preparation for the main argument presented in chapter 15, which is that in living matter, all of the hundreds of linked chemical reactions must be in states of nonequilibria. Death occurs when the biochemical reactions of the cell reach their end point, equilibria. 7
Evolutionist Seife reaches exactly the same conclusion in Chapter Four of his book.
Looking at it from a physicist’s perspective, a living organism, Schrödinger noticed, is continuously fighting off decay. It maintains its internal order despite a universe that is increasing entropy. By eating food, by consuming energy that ultimately comes from the sun, the organism is able to keep itself far from equilibrium: from death. And though Schrödinger didn’t use the phrasing that information theorists use—he was speaking before information theory was born, after all—he explained that life was a delicate dance of energy, entropy, and information. 8
This is not the way Nature usually behaves. Entropy naturally increases in a system that is left to its own devices. A box of gas quickly settles into equilibrium. Information tends to dissipate; stored information eventually diffuses throughout the universe. Information spreads out, especially in big, complex, warm systems like living creatures. And once a creature dies, it immediately begins to decay; its flesh falls apart and so do the molecules that make up that flesh. With them, the creature’s genetic code is, over time, scattered to the winds. Somehow, being alive allows living beings to preserve their information, seemingly flouting entropy for a short time. Once a creature dies, though, that ability is lost forever, and entropy wins as the creature’s information is scattered. 9
We have previously written about how hard it is to define “life.” We can say that something alive is something that is not dead, but then we have the problem of defining what “dead” means. These two authors (one a creationist, and one an evolutionist) have the same understanding of death. Javor actually does a better job of explaining it because he goes into more biological details. We will summarize what both say in our own words.
One rather disrespectful way of describing a dead person is to say that he has “assumed room temperature.” Disrespectful as it is, it is nevertheless true. The second law of thermodynamics says that heat will distribute itself as evenly as it can. Therefore, the corpse will assume room temperature. Living bodies don’t.
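The rate at which a body “assumes room temperature” is described by Newton’s law of cooling: the temperature gap to the surroundings shrinks exponentially. Here is a minimal Python sketch of that law, our own illustration rather than anything from either book; the rate constant k and the temperatures are hypothetical round numbers chosen for clarity.

```python
import math

def temperature(t_minutes, t_start=37.0, t_room=20.0, k=0.01):
    """Newton's law of cooling: T(t) = T_room + (T_start - T_room) * e^(-k*t).
    k is a hypothetical rate constant (per minute), for illustration only."""
    return t_room + (t_start - t_room) * math.exp(-k * t_minutes)

# The gap to room temperature halves roughly every ln(2)/k, about 69 minutes here.
for t in (0, 60, 240, 1440):
    print(f"{t:5d} min: {temperature(t):.1f} C")
```

After a day (1440 minutes) the computed temperature is indistinguishable from room temperature, which is exactly the equilibrium the second law predicts.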
This is why life is such a big problem for evolutionists. Information, stored anywhere in general, and in DNA molecules in particular, tends to dissipate.
Information collects in libraries because people spend energy to gather that information from all over and consciously transport it to the library. When uncontrolled energy was added to the library at Alexandria in the form of heat, it did not increase the information contained in it. When the library burned down, the information in all those books dissipated into ashes and was lost.
There is information in a DNA molecule. How did it collect there? The scientific laws of thermodynamics and information theory tell us that information doesn’t just collect naturally. It dissipates naturally. Genetic mutations cause information to be lost, which generally results in disease or death.
Living things get information passed down from their parents through DNA. The parents got that information from the grandparents. You can keep going back generation after generation, but eventually you run out of parents.
Where did the first living things get their information? Evolutionists have to believe (despite all scientific evidence to the contrary) that somehow information naturally collected in a cell, and that information caused the cell to use energy in metabolic and reproductive processes. And then, somehow, information for making cardiovascular systems just accidentally appeared. Random, unguided processes somehow produced a brain, complete with nerves capable of sensing the environment, and algorithms capable of reacting favorably to that environment. The information for building many different kinds of eyes just happened to flow into the DNA molecule.
Natural laws (the laws of information theory and thermodynamics) tell us that energy and information naturally tend to dissipate. We see that this is exactly what happens when something dies. For life to arise naturally, these laws would have to run in reverse. It is difficult (in fact, it is impossible) for a scientist to reconcile the theory of evolution with the laws of thermodynamics and information theory. Remember,
… the act of living can be seen as the act of replicating and preserving information despite Nature’s attempts to dissipate and destroy it. Information theory is revealing the answer to the age-old question, What is life? That answer is quite disturbing. 10
It is only disturbing if one wants to continue to believe in evolution, in spite of all the evidence against it. The only logical answer is that all the information required for life had to be present at the very beginning. It could not have collected gradually over time.
1 Charles Seife, Decoding the Universe, 2006, page 2
2 ibid., page 3
3 ibid., page 55
4 ibid., page 77
5 ibid., page 80
6 ibid., page 87
7 George Javor, Evidences for Creation, Review and Herald Publishing Association, 2005, page 97, (Cr+)
8 Charles Seife, Decoding the Universe, 2006, page 89
9 ibid., pages 90-91
10 ibid., page 87