I’ve heard some people (Father Chad Ripperger for example) say that the second law of thermodynamics (entropy) occurred after and because of The Fall. It’s an interesting idea, but I don’t know of any scientific evidence that supports this. What do you think?
The simplest way I can explain it is by saying that thermodynamic entropy is a physical property of matter, a function of how heat energy is distributed within a system; it depends on the substance, its temperature, and its pressure. Given a system of two bodies at different temperatures and/or pressures, the entropy of that system will have some value. If you allow heat to flow from one to the other until they stabilize at a single temperature and pressure, the final entropy of the system will be higher.
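If you want to see that in numbers, here’s a quick Python sketch (my own illustration, assuming two identical, incompressible bodies with the same constant heat capacity and no heat lost anywhere else; the function name and values are made up):

```python
import math

def entropy_change_two_bodies(T1, T2, C=1.0):
    """Entropy change (J/K) when two identical bodies with constant heat
    capacity C (J/K), starting at temperatures T1 and T2 (kelvin), are put
    in contact and allowed to reach a common final temperature Tf.
    dS = C*ln(Tf/T1) + C*ln(Tf/T2), with Tf = (T1 + T2) / 2."""
    Tf = (T1 + T2) / 2.0
    return C * math.log(Tf / T1) + C * math.log(Tf / T2)

# Two 1000 J/K bodies at 300 K and 400 K end up at 350 K:
print(entropy_change_two_bodies(300.0, 400.0, 1000.0))  # ~ +20.6 J/K
```

The result is never negative: the cooler body gains more entropy than the hotter body loses, so the total always goes up.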
Saying that entropy exists as a consequence of the Fall doesn’t make much sense; you might as well say the same thing about density, electrical resistance, or thermal conductivity.
Most of the people at AIG fall into the error of thinking that, because thermodynamic entropy has something to do with disorder (that is, the arrangement of heat energy in a system), any change in something that looks ordered must indicate a change in thermodynamic entropy. They totally miss that thermodynamic entropy has to do with the ordering of the heat energy of the system; it has nothing to do with the colors of marbles or tiles that one might mix up, how messy your room gets, or with what they call “information”. Thermodynamic entropy is a function of what something is made of, its temperature, and its pressure. Period.
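To make that last sentence concrete, here’s a small sketch for the one case where the formula is simple, an ideal gas with constant heat capacity (again, the function name and the numbers are just illustrative assumptions):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def ideal_gas_entropy_change(n, Cp, T1, T2, P1, P2):
    """Entropy change (J/K) for n moles of an ideal gas with constant molar
    heat capacity Cp (J/(mol*K)) taken from state (T1, P1) to (T2, P2):
        dS = n * (Cp * ln(T2/T1) - R * ln(P2/P1))
    Nothing enters but the substance (through n and Cp), the temperatures,
    and the pressures."""
    return n * (Cp * math.log(T2 / T1) - R * math.log(P2 / P1))

# 1 mol of a diatomic gas (Cp ~ 29.1 J/(mol*K)) heated from 300 K to 600 K
# at constant pressure:
print(ideal_gas_entropy_change(1.0, 29.1, 300.0, 600.0, 101325.0, 101325.0))  # ~ +20.2 J/K
```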
A deeper explanation requires some math, and I’ll skip that unless you ask for it. I will, however, mention that the math includes something called the
information function, which has wider use than just thermodynamics and is a very interesting area of mathematics. It was developed by a guy named C. E. Shannon, who was trying to understand the limits of data compression and the uncertainties in data transmission over a noisy channel. It turns out that random strings (which don’t have much meaning) have more information than nonrandom strings (which do); the character string “nesweb_ll_,c_hmtwstatmo__a_rf_la_dataseyalweheiii_s_” has more information and less meaning than the string “mary_had_a_little_lamb,_its_fleece_was_white_as_snow” (which uses the exact same characters). That’s one of the things that the AIG crowd gets exactly backwards.
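If you want to play with that yourself, here’s a rough Python sketch (my own illustration, not anything from Shannon’s paper) that uses compressed size as a crude stand-in for information content. Since the two strings above use exactly the same characters, simple letter-frequency entropy can’t tell them apart, so the sketch repeats the rhyme many times and compares it against a random shuffle of the very same characters:

```python
import math
import random
import zlib
from collections import Counter

def entropy_per_char(s):
    """Shannon's H = -sum(p * log2(p)) over the character frequencies of s,
    in bits per character."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

rhyme = "mary_had_a_little_lamb,_its_fleece_was_white_as_snow"

ordered = rhyme * 10                 # meaningful and highly predictable
chars = list(ordered)
random.shuffle(chars)                # same characters, random order
scrambled = "".join(chars)

for label, s in [("ordered", ordered), ("scrambled", scrambled)]:
    print(label,
          f"per-char entropy = {entropy_per_char(s):.3f} bits,",
          f"zlib size = {len(zlib.compress(s.encode()))} bytes")
```

Both strings report the same per-character entropy, because the letter frequencies are identical, but the compressor squeezes the ordered one down to a fraction of the scrambled one’s size: the predictable, meaningful string carries less information in Shannon’s sense.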