EN7: The 2nd Law of Thermodynamics & Evolution
Evolutionist: “The 2nd law of thermodynamics does not apply to the origin of life because (1) the earth is an open system and the 2nd law applies only to isolated systems, and (2) any reduction in entropy from the origin of the first cells is counter-balanced by an overall increase of entropy in the Earth.”
Creationist: “Apparently, you don’t understand how to apply the 2nd law at all, do you? Even though the 2nd law is defined precisely in terms of isolated systems, it has wonderful applicability to all kinds of chemical and mechanical systems, which can be approximated – in a strong engineering sense – as isolated systems. Furthermore, what happens in a test tube in Michigan is not typically affected by whether the sun is shining in Madagascar, so your second objection is specious.”
In the debate between Ken Ham and Bill Nye, on February 4, 2014, Nye the evolutionist made the typical outrageous assertion that the flow of energy from the Sun to the Earth was sufficient to explain the origin of life. All it takes is energy, after all. How ignorant! Sunlight degrades and destroys. Sunlight ages and wrecks skin, wears out roofing materials, and destroys paint jobs . . . have you noticed? Out in the countryside, have you noticed an old shell of an automobile sitting beside a barn? Is sunlight enhancing that car’s performance? For sunlight to be useful in producing electricity, you need well-designed solar cells and efficiently designed circuitry. For sunlight to sustain and multiply plant life, you require the incredibly complex nano-machinery of photosynthesis. If I want to travel from Chicago to Phoenix, I need a lot of energy to make the trip, right? I could pour sixty gallons of gasoline on myself and light a match. That would produce a lot of energy. But to make the trip to Phoenix, I must employ a carefully designed automobile to efficiently transform that heat energy into controlled and directed motion. You need machines — designed machines — to make functional use of raw energy!
I’ve discovered a marvelous example of how otherwise intelligent people lose their rationality when it comes to their religious commitment to evolution. Bad science is the result. In this case, we find that physicists who are experts in statistical thermodynamics – who understand the mathematics and physics of the 2nd law at a very deep level – fail to apply it in the most important case of all. When it comes to the fantasy of evolution, they warp the logic of physics in ways that should shame a college freshman.
In other articles on this site, notably Probabilities vs. Impossibilities and EN4: Shannon’s Theory and its Relevance to DNA, I address the probabilistic slam dunks against evolution. In this article I emphasize that the 2nd law of thermodynamics is at its core a probabilistic principle by which our universe operates. It can be applied directly to biochemical systems, including the systems invoked in evolutionary speculations about the origin of life and its alleged subsequent development.
Here is one of many equivalent statements of the 2nd law:
The total entropy of an isolated system will increase over time until equilibrium is reached. At equilibrium, the entropy is maximum and remains constant.
What is entropy? The entropy, “S”, is given by S = k ln W, a result derived by physicist Ludwig Boltzmann in the 19th century. In this expression, k is a constant (Boltzmann’s constant), “ln” is the natural logarithm, and “W” is the number of possible “Ways” that atoms (or other particles) can arrange themselves within the physical constraints they experience. Any particular “macrostate” – like a gas at a pressure of 1 atmosphere and a temperature of 72 degrees (F) – can be represented by a large variety of “microstates”, or arrangements of the individual atoms that give you the same macro readings. “Macro” means we are looking at large scale measurements. “Micro” implies that we are tracking what the individual atoms are doing.
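To make Boltzmann’s formula concrete, here is a minimal numerical sketch in Python, using illustrative numbers of my own choosing rather than anything measured. For an ideal gas the positional part of W scales like V to the power N, so letting N molecules expand from a small volume into a larger one raises the entropy by N k ln(V_after/V_before), which is exactly what happens in the gas-release example below.

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K

# Illustrative numbers of my own choosing (not from the article):
# N molecules released from a 1-liter bottle into a 100-liter
# evacuated space at constant temperature.  The positional part of W
# scales as V**N, so W_after / W_before = (V_after / V_before)**N and
#   delta_S = k_B * ln(W_after / W_before) = N * k_B * ln(V_after / V_before)
N = 2.7e24            # molecules (roughly 4.5 moles)
volume_ratio = 100.0  # V_after / V_before

delta_S = N * k_B * math.log(volume_ratio)
print(f"Entropy increase: about {delta_S:.0f} J/K")   # ~170 J/K
```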
The discussion above may have seemed abstract if you have never had a course in statistical physics. So let me offer some simple examples that make the point as clearly as anyone needs in order to understand the issues:
1. Pump the air out of a small room until you have a total vacuum. Insert a bottle pressurized with air, but still closed. The bottled gas is under high pressure, let’s say 100 atmospheres (atm), and is at 72 degrees F. It is clear there are a lot of ways the molecules in the bottle can be arranged and you would still measure 100 atm and 72 degrees. But not an infinite number of ways. Each molecule is constrained to the relatively small volume of the bottle. Now let the gas escape slowly and wait until the gas fills the room. Once everything settles down, and we are at equilibrium, we measure the room conditions to be 1 atm and 72 degrees. Now it is obvious that there are MANY more choices of location for each molecule in the large room. The number of “ways” is clearly much bigger now than before. Thus the entropy has increased. By the way, the very large number of ways is easier to handle after we take the logarithm of the big number, as Boltzmann prescribed. Also, if we increase the temperature in the room, the entropy increases. A higher temperature means that the average speed of the molecules increases. So each molecule has a wider range of speeds that it can “choose” from, or alternatively, that it will experience after a number of collisions.
Now, is it possible for the molecules to randomly bounce around in such a way that they accidentally all wind up back in the open bottle? And then you could seal it up again, perhaps! The answer is that it is physically possible, but probabilistically impossible. It wouldn’t happen if you waited a trillion years. So you don’t have to worry that all the air in your living room might just randomly crowd into a corner some evening, leaving you with nothing to breathe.
Yet it is physically possible for such a crazy event to happen. It would not violate the 1st law of thermodynamics, since energy could still be conserved. Momentum and angular momentum conservation are also safe, not to mention such conservation laws as those that govern charge, spin, baryon number, lepton number, etc. All of those are hard and fast laws of physics: you won’t ever see an exception, even in detail. The 2nd law, however, is probabilistic. Any given atom could bounce around to any location in the room, or back into the bottle. But when large numbers of atoms are involved, like the air in the room, they will move inexorably toward an equilibrium condition, evenly distributed in space, if you leave them alone. The odds against all of the molecular positions and velocities lining up just right to asphyxiate you are ridiculous. You have a much better chance of winning the lottery every day for the next ten years. (A back-of-the-envelope calculation of just how lopsided these odds are appears after these examples.)
2. Put ice cubes into a glass of warm water. The entropy is low to start with, because a lot of the atoms are locked into ice crystals and so there aren’t a lot of ways to rearrange them. Over time, though, the ice melts, the water cools, and any given water molecule can be found anywhere in the glass. Now there are a lot of ways to rearrange the molecules and still have – as far as you can measure with such instruments as a thermometer – a glass of uniformly cool water. The entropy is now much higher, in fact maximized once equilibrium is achieved. Is it possible for the ice to reappear without external help? Maybe by chance the heat left in some cubical regions of the water will spontaneously leap into the environment and the water will freeze. How could that happen? Maybe the molecules will luckily move in such a coordinated way that the motion (heat) is directed outward, so that little cubes of cold water freeze. Nope. The 2nd law stipulates that it will never happen. It is ridiculously improbable for a system to move from a high entropy state to a low entropy state.
3. You fill the bottom of your cup with coffee, seal it off with a plastic disk, and then pour milk into the cup above the disk. The milk molecules are constrained to the top half of the cup and the coffee to the lower half. The entropy is relatively low. Now remove the disk and watch it all mix. When you have equilibrium, the entropy is large and maxed out. How long would you have to wait for the coffee and milk to unmix? It will never happen. The 2nd law wins again.
4. Your son’s room gets messier and messier as the week goes on. You made him organize it on the weekend, but without attention, it deteriorates quickly. Now, this is not “thermodynamic entropy,” but is related because this also is a probability issue. The room will never spontaneously organize into “neatness.” The mess is high entropy because there are a lot of ways that a room can be messy, but very few, in comparison, that represent a low entropy “neat” configuration. A snapshot of the room on Thursday shows a somewhat different microstate than on Friday. But both microstates represent the same macrostate – “messy.” Analogously, in example 1, the macrostate (1 atmosphere, 72 degrees) can be represented by a gazillion microstates, with molecules arranged in a lot of different ways.
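Here is the back-of-the-envelope calculation promised in example 1: a sketch, using the same illustrative bottle-and-room numbers assumed in the earlier snippet, of why the released gas never crowds back into the bottle on its own.

```python
import math

# Same illustrative numbers as the sketch above: a 1-liter bottle in a
# 100-liter room.  For a well-mixed (ideal) gas, the chance that any one
# molecule happens to sit inside the bottle at a given instant is about
# V_bottle / V_room, so the chance that ALL N molecules do so at once is
# that ratio raised to the power N.  The result is far too small to store
# directly, so we work with its base-10 logarithm.
N = 2.7e24                 # molecules
p_one = 1.0 / 100.0        # V_bottle / V_room

log10_p_all = N * math.log10(p_one)
print(f"P(all molecules back in the bottle) ~ 10^({log10_p_all:.2g})")
# ~ 10^(-5.4e+24): not merely unlikely, but unthinkably far beyond
# anything that could happen in a trillion years of waiting.
```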
Let’s begin to approach the issue at hand. Creationists contend that evolution, from the alleged chemical origin of life right on through the supposed development of complex life forms, violates the 2nd law at every stage. Evolutionists, who normally respect the 2nd law, insist that this cannot be so, and therefore claim that the 2nd law is not violated. Now everyone understands that life, from the molecular level (proteins and DNA), to cellular substructures (chromosomes, mitochondria, filtering membranes), to cells (structures of city-like complexity), to tissues, to organs, to creatures with trillions of cells (like humans), is incredibly complex. How such nanomachines could arise from air, dirt, and water must be explained. From the viewpoint of the 2nd law, it is easy to explain why gas expands, why ice cubes melt, why liquids mix, and why dead bodies decay into dust. Dead bodies into dust – sure – entropy increases with decay. But dust into living creatures? That’s high entropy moving toward very low entropy, just chemically, even before we worry about the information content in functional proteins, for example.
If you were paying attention above, you would have realized that none of the four examples I used portrayed an isolated system. The gas in any room on this Earth will be affected to some degree by the environment beyond the walls. The walls may get hotter or cooler, or someone may open the curtains and let sunlight in through a window. The cool water in example 2 is likely in a warmer room, so the water will continue to warm up over time. The cup of coffee and milk may be sitting under a fluorescent light, and someone may even blow across the surface to promote faster cooling. Would such environmental influences change the basic results? Of course not. The concept of the “isolated system” is employed to make the mathematics pure and precise. But there is NO SUCH THING as a perfectly isolated system – anywhere in this universe. Yet the 2nd law applies to all of these cases. To make detailed calculations of the entropy, scientists and engineers always have to keep track of other variables and make decisions regarding which effects are more or less important. The evolutionist cannot just dismiss the 2nd law with regard to the origin of life by claiming that it occurred in a somewhat open system, any more than he would dare to claim that the 2nd law is not applicable in the cases cited above.
So, guess what argument the evolutionist uses to dismiss the 2nd law as an issue regarding evolution? He says that as improbable as the first cellular life was, no matter how much the entropy had to decrease in that warm little pond where the first cells popped into existence, the rest of the Earth enjoyed an increase in entropy that more than balanced that decrease. After all, he points out, the Earth is not an isolated system. Sunlight comes in, proto-life forms can make use of that energy, and the Earth radiates much infrared energy away. No matter how big a decrease in entropy is required in that little pond, everything is hunky dory as long as the opposite side of the Earth is shooting out infrared photons. In other words, he doesn’t want to define “the system” as the pond, which must go from high entropy to low entropy. Since a miracle must occur in the pond, he redefines “the system” as the entire Earth. As if the 2nd law cannot be used for something small, like a pond, or a coffee cup, or a glass of ice water.
You might think that such arguments come from mere biologists who took the non-calculus physics track. Au contraire! I have heard this argument from some of the brightest Ph.D. physicists in the country. For example, perhaps the best textbook author on the subject of statistical physics is Ralph Baierlein. His text, Thermal Physics, is brilliant and lucid throughout . . . except when he applies the 2nd law to evolution, as above. Also, noted Caltech physicist Sean Carroll, in his video course, Mysteries of Modern Physics: Time, uses the same argument.
I’ve given you a couple of minutes to cogitate. Can you see the fallacy yet? The 2nd law has enormous utility at the small-system level. If it could only be applied to planet-sized problems, it wouldn’t be of much use for everyday science and engineering. In addition to the “life-sized” examples above, let’s work through some more that lead directly to the question of life’s origin and development.
A. The 19th-century French engineer Sadi Carnot developed the foundational theory for the study and design of heat engines. Your automobile engine burns fuel to produce hot gas. The hot gas performs mechanical work, using up much of its energy to push the car (via pistons, etc.) against frictional forces. The somewhat cooler exhaust gas is expelled, still retaining a good bit of thermal energy (the exhaust pipe is hot), and the cycle starts over again. The efficiency of an engine is the fraction of the hot gas’s thermal energy that is converted into work. Efficiencies are always less than one. You can’t get more energy into your drive train than you start with after combustion. In fact, you can’t even get close to 100% of it. Efficiencies can be calculated, and the results are tied intimately to the 2nd law.
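To attach a number to that ceiling, here is a small sketch of the Carnot limit, the 2nd-law bound on efficiency, evaluated with round temperatures of my own choosing that stand in loosely for combustion and exhaust gas; real engines fall well below even this ideal figure.

```python
def carnot_efficiency(T_hot_K, T_cold_K):
    """2nd-law ceiling on the fraction of input heat a heat engine can
    convert into work: eta = 1 - T_cold / T_hot (temperatures in kelvin)."""
    return 1.0 - T_cold_K / T_hot_K

# Round, illustrative temperatures of my own choosing (not measured data):
T_hot = 1800.0    # K, gas just after combustion
T_cold = 900.0    # K, gas at exhaust

print(f"Carnot ceiling: {carnot_efficiency(T_hot, T_cold):.0%}")
# -> 50%.  Real engines, with friction and other losses, do considerably
# worse, and an efficiency above 100% is flatly impossible.
```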
During my career I’ve supervised and hired quite a number of scientists and engineers. Suppose you listen in on the end of an interview I’m conducting with a Ph.D. who wants to join our research team.
Me: “I wonder if you could explain something that has been a bit of a thermodynamic mystery to me.”
Hopeful: “Sure. Thermodynamics is my specialty. I did my dissertation on the subject and have published widely.”
Me: “Well, it’s quite remarkable, actually. For the last several months I’ve been able to run my car without adding any fuel at all. I’ve been quite careful to make sure that nobody is sneaking fuel into my tank, and I’ve put several thousand miles on the car. It seems the efficiency of my engine, plus drive train and the rest, actually exceeds unity. The car must be sucking thermal energy from the environment and funneling it into the combustion chambers to drive the pistons. Is that even possible?”
Hopeful: “I’ve never heard of such a thing before. Let me think. It doesn’t necessarily violate the law of energy conservation. As long as the air outside is cooling off as heat is getting sucked into your engine, there isn’t any new energy being created.”
Me: “But what about the 2nd law?”
Hopeful: “Ah, I have it. I know that evolution has occurred, with dramatic reductions in entropy counter-balanced by increases in the entropy of the overall Earth system. This must be something similar. There is no problem with a particular heat engine that violates the 2nd law, since entropy is still increasing much more rapidly across the rest of the Earth. Besides, your engine is not an isolated system, you know. Sunlight is beating on your car and noise from other traffic penetrates the system.”
Me: “So what you’re saying is that I can enjoy a statistical miracle under my hood as long as the rest of the planet is falling apart.”
Would you hire the fellow? I wouldn’t. What should he have said? How about, “That’s impossible, based on the 2nd law. I don’t believe you . . . respectfully, sir.”
B. Refrigerators are heat engines running in reverse. You’re certainly aware that a refrigerator in your house actually heats up your house somewhat. The fridge is burning energy in a carefully designed way to reduce entropy inside (lower temperature, slower moving molecules, not as many ‘ways’ or speeds for the molecules to choose from), but at the expense of inefficiently throwing off a lot of heat outside the machine, raising the room temperature. The outside entropy increases a good bit more than the inside entropy decreases. You can’t help but add more heat outside than you want to. You can’t beat the 2nd law.
You see, you can DESIGN a machine to consistently reduce entropy in a local region. It takes intelligence to do so. Even so, of course, the Earth’s total entropy increases when you turn on the fridge because you release heat to the environment. You don’t have to depend on entropy increasing in Thailand to compensate for your freezer’s decrease. Your refrigerator, all by itself, by design, reduces entropy inside while generating even more outside.
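That bookkeeping can be written down explicitly. Here is a sketch under textbook idealizations, with heat and temperature values chosen purely for illustration: the entropy removed from the cold interior is more than repaid by the entropy dumped into the kitchen, because the rejected heat equals the extracted heat plus the electrical work.

```python
# Entropy bookkeeping for an idealized refrigerator, with numbers chosen
# purely for illustration.
Q_cold = 100.0        # J of heat pulled out of the food compartment
W = 40.0              # J of electrical work driving the compressor
Q_hot = Q_cold + W    # J of heat rejected to the kitchen (energy conservation)

T_cold = 275.0        # K, roughly 2 C inside the box
T_room = 295.0        # K, roughly 22 C in the kitchen

dS_inside = -Q_cold / T_cold   # entropy DECREASE inside (by design)
dS_room = Q_hot / T_room       # entropy INCREASE dumped into the room
dS_total = dS_inside + dS_room

print(f"inside: {dS_inside:+.3f} J/K, room: {dS_room:+.3f} J/K, "
      f"total: {dS_total:+.3f} J/K")
# The total comes out positive (about +0.11 J/K here), exactly as the
# 2nd law requires: the designed machine lowers entropy locally only by
# raising it even more elsewhere.
```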
C. You’re teaching a high school chemistry lab. Each of 20 students is performing electrolysis, using an electric current to dissociate water, producing gaseous hydrogen and oxygen. Nineteen of the twenty students show results just like the textbook promises: the water level drops and separate flasks fill up with hydrogen and oxygen gas. The twentieth, known as the class maverick, is having a problem. When you ask him how he screwed up . . .
Maverick: “Well, it was working right for a while, but then it started going backwards. Look! The water level is higher now than when it started. The hydrogen and oxygen flasks were filling up, but now they’re empty. And when I put a current meter on the circuit, it showed the current flowing backwards!”
Teacher: “Ok, wise guy, have you got an explanation for this?”
Maverick: “Yeah, I’ve been thinking about it and realized that this must be one of those fluctuations in entropy. You know, like the entropy reductions that my biology teacher said must have occurred when evolution got started. So I figure that the gases must have spontaneously started recombining into water. And when the system ran out of H2 and O2, the system must have sucked more out of the atmosphere! So if the entropy is going down here, that must mean that it’s just going up over in Australia, or on Jupiter, or somewhere.”
Teacher: “So what’s driving the current backwards?”
Maverick: “Hmm. That’s kind of a mystery, I admit.”
Teacher: “Maybe a simpler explanation is that you just topped off the water level when I wasn’t looking and you’re lying about the rest. What would you bet on?”
We could multiply scenarios indefinitely, of course. The evolutionary scenario starts with a random collection of luckily chosen chemicals, perhaps in a pond, or near a deep sea vent, or on hot clay. (For our purposes we won’t start with the REAL evolutionary scenario, namely that nothing turned into an expanding cloud of hydrogen and helium, and somehow, incredibly well-organized and dynamically balanced galaxies, stars, and planets fluctuated into existence.)
The formation of protein chains, nucleotide chains, and the wealth of other necessary structures would require violations of the 2nd law at every step! Chemically, such chains tend to break apart spontaneously; the reactions flow in the direction opposite to the one evolution needs. Then, if we consider the precisely specified information content in a functional gene (or protein), the challenge to evolution is much more severe. In other articles, for example, I’ve discussed the impossibility of information-rich structures developing from random processes. Information (precisely specified complexity) is essentially the inverse of entropy (lots of random choices that don’t work): low information means high entropy, and vice versa. The information content of the DNA code can be quantified. It is related to what is called “configurational entropy,” the entropy (or, inversely, the information) associated with the specific ordering of the nucleotides that code for protein construction or a cell’s regulatory processes.
Both chemically and informationally, there are vastly more ways – microstates – for a collection of amino acids to wander around in solution than the one way they can chain-link together in just the right configuration for usefulness. (Random mixture of amino acids – high entropy. Functional chain – low entropy.) But, they say, this happens in an open system, not an isolated system. Nope: these scenarios can be analyzed just like heat engines or chemicals in test tubes, which are also open systems in a technical sense. Simple analysis can tell you whether a stray photon from the sun or any other environmental influence will change the chemical dynamics. Any evolutionary origin of life scenario is subject to the same 2nd law constraints as the rest of the universe.
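To put rough numbers on the “many ways versus one way” comparison, here is a counting sketch. The chain lengths, the assumption that every sequence is equally accessible, and the treatment of a single target sequence are all simplifying choices of mine made for illustration (real proteins tolerate some sequence variation), but they convey the scale of the configurational problem described here.

```python
import math

# Counting sketch with illustrative assumptions: a 150-residue protein
# built from the 20 standard amino acids, and a 1000-nucleotide gene
# built from the 4 DNA bases, with every sequence treated as equally
# accessible.  (Simplifications for scale only, not a chemical model.)
protein_length, amino_acids = 150, 20
gene_length, bases = 1000, 4

log10_protein_space = protein_length * math.log10(amino_acids)   # log10(20**150)
log10_gene_space = gene_length * math.log10(bases)                # log10(4**1000)

print(f"Possible 150-residue sequences ~ 10^{log10_protein_space:.0f}")  # ~10^195
print(f"Possible 1000-base sequences   ~ 10^{log10_gene_space:.0f}")     # ~10^602
print(f"Information to specify one protein sequence ~ "
      f"{protein_length * math.log2(amino_acids):.0f} bits")             # ~648 bits
```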
Only by using the marvelously sophisticated machinery of a cell, with its construction scaffolds and energy conversion tools, can amino acids form into long chains, which are then folded by the cell into complex 3-D tools: proteins. Just as your refrigerator uses its designed machinery to reduce entropy in a specified volume, so the cell can construct low entropy configurations, like proteins. The cell introduces additional entropy into its environment (waste products and heat, just like any engine) while it is producing a low entropy interior.
It takes good engineers to construct reasonably efficient heat engines and refrigerators. How much more brilliance is required to construct the nanomachinery of life? The technology is well beyond what humans can do . . . microbiologists are still struggling to visualize the 3-D structures of proteins. Exactly how the processes of cellular life work is a set of mysteries that will engage researchers for decades to come. Imagine that a refrigerator (plus a generator with fuel) is introduced to a primitive non-tech culture. The best minds of that culture may eventually figure out what some of the machinery is for and how it works. They would still be a long way from designing and manufacturing their own refrigerators. You could extend this analogy to computer systems. The tribal researchers would face an even greater challenge in analyzing a computer’s microsystems. The evolutionary microbiologist has yet to understand much of what goes on inside the cell, which functions via . . . not microsystems, but nanosystems. Yet he imagines that this system of nanomachines, which is far beyond his ability to comprehend, came into existence by luck, all the key formative reactions running against probability, against the 2nd law. What foolishness! No tribal culture would be so stupid, so arrogant.
If you still believe that there is science behind evolution, I dare you to find any published research paper that lists and analyzes – quantitatively – the chemical reactions that would be necessary to go from non-life to life. You won’t find any such paper. You find a lot of word games with maybe a little equation here or there. If it is even possible for chemistry to produce life, then why not write down a plausible system of reactions? The reason is that every biochemist knows that any reaction that he would offer runs in the wrong direction! He cannot offer even a speculative scenario . . . apart from vague word games. Conclusion – there is NO THEORY of evolution. No science, no experiments, no theory, zip, nada, nothing. Your faith is truly blind.
Chemistry is going on within you, around you, and all over the world. Your body burns fuel to keep you alive. Your power plant burns fuel to provide electricity. Manufacturing plants employ chemical processes to produce the products and packaging that line the shelves of the stores. Chemistry is going on in the soil and in the atmosphere. The 2nd law of thermodynamics comes into play continually. A given reaction is more likely to proceed forward than backward because there are more microstates – ways to be – accessible to the products. If we are surprised by the result of a chemistry experiment, then we didn’t take into account some key variable.
The very existence of the field of physical chemistry, and all of the technologies and industries associated with it, depends on reliable calculations tied to the 2nd law. Light a match in the natural gas flow in your backyard grill and you know what happens. The reaction goes one way. You could wait forever and you won’t see that reaction go backward. Exhaust gases and water vapor won’t ever climb back into your grill, suddenly suck a huge dollop of heat out of the environment, and reconstitute the natural gas you started with.
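The one-way character of such a reaction can be put in standard thermodynamic terms. The sketch below uses the textbook relation between the Gibbs free energy change and the equilibrium constant, K = exp(-ΔG/RT); the Gibbs energy value for methane combustion is the approximate standard-state figure I recall from thermodynamic tables, so treat the precise number as illustrative rather than authoritative.

```python
import math

R = 8.314       # gas constant, J/(mol*K)
T = 298.15      # K, room temperature

# Approximate standard Gibbs free energy change for burning methane,
#   CH4 + 2 O2 -> CO2 + 2 H2O,
# taken from memory of standard tables (about -818 kJ/mol); treat the
# exact figure as illustrative.
delta_G = -818e3    # J/mol

ln_K = -delta_G / (R * T)            # ln K = -delta_G / (R T)
log10_K = ln_K / math.log(10)
print(f"Equilibrium constant K ~ 10^{log10_K:.0f}")
# ~ 10^143: at equilibrium essentially no methane remains, and the reverse
# reaction (exhaust and water vapor re-forming natural gas) is, for all
# practical purposes, never observed.
```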
In summary, you can apply the 2nd law to the supposed chemical origin of life, just like for any other application. The 2nd law precludes all fantasies that involve the undesigned growth of complex structures, via chemical reactions that simply cannot proceed in that direction. Protein chains dissociate under natural conditions. They don’t grow spontaneously. You have to design a system to enable the construction of functional machinery.
What about evolution from single-celled creatures to large animals, like mammals? Can’t random mutations and natural selection just take over, once you have a single cell to work with? No. Actually, the situation gets even worse. I discuss this subject at some length in my book, Creation vs. Evolution: No Contest!, which can be downloaded from my free e-book store. Briefly, the probability problems – the 2nd law barriers – are yet greater. The number and complexity of information-rich structures for large animals are multiplied well beyond those of an autonomous single cell. Additionally, time becomes an even larger problem. In a test tube of mixed chemicals, you can, in principle, observe millions of reactions to see how many go in an expected direction. Whenever molecules collide, there is an opportunity for a reaction.
In the mutation / natural selection scenario, a mutation represents a single reaction in the DNA chain that produces an unexpected product. You then have to wait a generation time (or more) to see whether there is any effect. Trillions of years could go by without anything interesting happening. By the way, as an indication of how this all works out, observe that the fossil record includes quite a number of extinct creatures. Mutations (plus environmental calamities) can kill not only individuals, but entire species. The increasing mutational load in the human genome (there are more diseases now than ever before) is yet another example of increasing entropy. There are no species that are evolving. We are all devolving. The genome of every species on Earth is deteriorating – increasing in entropy.
Meanwhile, we don’t observe new kinds of creatures popping into existence. Yes, I know that evolution never seems to happen when anyone is looking. Maybe it’s simply not happening. The Author of the 2nd law of thermodynamics would affirm just that. In fact, He did! This creation was formed in a low entropy state – Genesis 1. Since Genesis 3, entropy has been increasing and it all continues to fall apart. Even stars explode on occasion. Yet there is a New Heaven and a New Earth in our future – 2 Peter 3, Revelation 21 and 22. You don’t have to worry about putting a stop to the lethal increases in entropy you see all around. God will do that. You had better make sure that you’re on His side when He does so.
– end