Absurdity. Shannon was an electrical engineer who earned the title “father of information theory” for his groundbreaking work on information.
Yep. His stuff works. Yours doesn’t. That’s why you’re unwilling to show us.
But one thing is abundantly clear: Shannon was NOT a biologist, and he never established anything with regard to biology.
Not surprisingly, you’re completely wrong about that, too:
**Claude Shannon: Biologist
The Founder of Information Theory Used Biology to Formulate the Channel Capacity
THOMAS D. SCHNEIDER
Shannon’s crucial concept was that the spheres must not intersect in a communications system, and from this he built the channel capacity formula and theorem. But, at its root, the concept that the spheres must be separated is a biological criterion that does not apply to physical systems in general. Although it is well known that Shannon’s uncertainty measure is similar to the entropy function, the channel capacity and its theorem are rarely, if ever, mentioned in thermodynamics or physics, perhaps because these aspects of information theory are about biology, so no direct application could be found in those fields. Since he used a property of biology to formulate his mathematics, I conclude that Claude Shannon was doing biology and was therefore, effectively, a biologist—although he was probably unaware of it.
It is not surprising that Shannon’s mathematics can be fruitfully applied to understanding biological systems [7], [8], [14]. Models built with information theory methods can be used to characterize the patterns in DNA or RNA to which proteins and other molecules bind [15]-[19] and even can be used to predict if a change to the DNA will cause a genetic disease in humans [20], [21]. Further information about molecular information theory is available at the Web site
ccrnp.ncifcrf.gov/~toms/.
What are the implications of the idea that Shannon was doing biology? First, it means that communications systems and molecular biology are headed on a collision course. As electrical circuits approach molecular sizes, the results of molecular biologists can be used to guide designs [22], [23]. We might envision a day when communications and biology are treated as a single field. Second, codes discovered for communications potentially teach us new biology if we find the same codes in a biological system. Finally, the reverse is also to be anticipated: discoveries in molecular biology about systems that have been refined by evolution for billions of years should tell us how to build new and more efficient communications systems.**
IEEE Eng Med Biol Mag. 2006; 25(1): 30–33.
On the other hand, I have shown that insertions and mutations DO NOT ADD information…
Barbarian observes:
No, you simply asserted that, but you haven’t shown us any reason to believe it.
Sorry, that’s wrong. Let’s get started…
The genetic information in a population, for any specific gene:
H = -K Σᵢ₌₁ⁿ pᵢ log(pᵢ)

In other words, the negative sum, over all n alleles of that gene, of each allele’s frequency multiplied by the log of that frequency (K is a positive constant).
So, for example, if we have two alleles, each at frequency 0.5, the information for that gene is about 0.30 (taking K = 1 and base-10 logs). Suppose a new mutation arises, and soon there are three alleles, each at about 0.33. Now the information is about 0.48.
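The arithmetic above can be checked directly. A minimal sketch, assuming K = 1 and base-10 logarithms (the bases under which the quoted figures of 0.30 and 0.48 come out); the function name `allele_information` is my own label, not from any source:

```python
import math

def allele_information(freqs, base=10):
    """Shannon information H = -sum(p_i * log(p_i)) over allele
    frequencies, with K = 1 and base-10 logs to match the text."""
    return -sum(p * math.log(p, base) for p in freqs if p > 0)

# Two alleles at frequency 0.5 each:
print(round(allele_information([0.5, 0.5]), 2))       # -> 0.3
# After a new mutation, three alleles at ~1/3 each:
print(round(allele_information([1/3, 1/3, 1/3]), 2))  # -> 0.48
```

The new allele spreads the frequency mass over more alternatives, so H rises: by this measure, the mutation added information.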
That’s how it works. Do you know of any process required for evolution that is prohibited by “information”?
Barbarian observes:
Birds used to have teeth. Is the loss of teeth a loss of information, or a gain of information?
According to evolution, most likely either a decrease in information, or simply an exchange of information.
No, that’s wrong. There’s only one more choice. Would you like to try again?
Joking aside, birds lack teeth because there’s a new allele that suppresses the formation of teeth. They still have the genes for teeth. Hence, this mutation added information.
Therefore, order is not the same thing as information, nor is order sufficient to define information (contrary to what Barbarian was attempting to claim).
You’ve gotten confused again. I told you that information was a measure of uncertainty. You may find that objectionable, but nevertheless, if you use that definition, it’s possible to greatly compress data and see that it gets transmitted with minimum error. In other words, it works.
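The point that this definition “works” can be illustrated: the Shannon entropy of a memoryless source is the lower bound, in bits per symbol, on any lossless compression of its output. A sketch under assumed inputs (the four-symbol alphabet, the weights, and the use of zlib are my illustrative choices, not from the discussion):

```python
import math
import random
import zlib
from collections import Counter

def entropy_bits_per_symbol(data: bytes) -> float:
    """Empirical Shannon entropy in bits per symbol: the
    theoretical floor for lossless compression of an i.i.d. source."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random.seed(0)
# A skewed memoryless source: 'a' twice as likely as 'b', etc.
msg = bytes(random.choices(b"abcd", weights=[8, 4, 2, 2], k=20000))

h = entropy_bits_per_symbol(msg)          # roughly 1.75 bits/symbol
rate = 8 * len(zlib.compress(msg, 9)) / len(msg)
print(f"entropy floor: {h:.2f} bits/symbol")
print(f"zlib achieves: {rate:.2f} bits/symbol")
```

A real compressor gets close to the entropy figure but, for a source like this, not below it, which is exactly the sense in which uncertainty-as-information “works.”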
That’s a pretty good thing, hm?