Hooberus: And let's have a small talk about scholarship. You can read the following in the last article you cited:
For example, under Shannon information, which the NCSE would claim is “the sense used by information theorists,” the following two strings contain identical amounts of information:
String A:
SHANNONINFORMATIONISAPOORMEASUREOFBIOLOGICALCOMPLEXITY
String B:
JLNUKFPDARKSWUVEYTYKARRBVCLTLODOUUMUEVCRLQTSFFWKJDXSOB
Both String A and String B are composed of exactly 54 characters, and each string has exactly the same amount of Shannon information—about 254 bits. Yet clearly String A conveys much more functional information than String B, which was generated using a random character generator.
Further down he writes:
"In contrast, proponents of intelligent design would define “new” genetic information as a new stretch of DNA which actually performs some different, useful, and new function. For example, consider the following string:
DUPLICATINGTHISSTRINGDOESNOTGENERATENEWCSI
This 42-character string has ~197 bits of Shannon information. Now consider the following longer string:
DUPLICATINGTHISSTRINGDOESNOTGENERATENEWCSIDUPLICATINGTHISSTRINGDOESNOTGENERATENEWCSI
This procedure just added 42 “new” characters, but no new function has been produced."
This sounds really interesting, and I tried to read about the author, Casey Luskin:
"Casey Luskin is an attorney with graduate degrees in both science and law. He earned his B.S. and M.S. in Earth Sciences from the University of California, San Diego. ... Casey has published in both law and science journals, including Journal of Church and State; Montana Law Review; Geochemistry, Geophysics, and Geosystems; Hamline Law Review; and Progress in Complexity, Information, and Design. He has published in print and online popular media such as Research News and Opportunities in Science and Theology; Human Events; U.S. News & World Report, BeliefNet; Salvo Magazine; Touchstone Magazine; the Tampa Tribune; the San Diego Union Tribune; the Washington D.C. Examiner, and the Philadelphia Inquire"
Now something seemed quite odd about his vocabulary: he mixes up terms that are not the same. But I thought that a man you quoted, who apparently has a degree and has published in this field, would know the terms he described; surely even a creationist would not make such a stupid-ass mistake as to write about Shannon entropy without knowing what it is. Boy was I surprised! The thing about Shannon entropy is that it is defined with respect to a probability distribution. It is clear from the last quote that he calculates it using the uniform distribution over the 26 letters, and thus arrives at 42 * log2(26) ≈ 197 bits for the string:
"DUPLICATINGTHISSTRINGDOESNOTGENERATENEWCSI"
However this is completely bogus. If you want to calculate the information in a signal, you should use the probability distribution the signal is actually generated from, i.e. one you can estimate from the signal itself. Doing that, you arrive at something more interesting. First off, the actual amount of information in his example string is:
"DUPLICATINGTHISSTRINGDOESNOTGENERATENEWCSI" : 157 bits.
and for the other strings:
A : 211 bits
B : 232 bits
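For anyone who wants to check my numbers, here is the same kind of Python sketch for the empirical calculation (the distribution is estimated from the letter frequencies of the string itself):

from collections import Counter
import math

def empirical_bits(s):
    # Estimate the symbol distribution from the string itself, then
    # sum -log2(p) over all characters; this equals len(s) * entropy.
    n = len(s)
    return sum(-c * math.log2(c / n) for c in Counter(s).values())

print(empirical_bits("DUPLICATINGTHISSTRINGDOESNOTGENERATENEWCSI"))              # ~157.3 bits
print(empirical_bits("SHANNONINFORMATIONISAPOORMEASUREOFBIOLOGICALCOMPLEXITY"))  # ~211.7 bits (A)
print(empirical_bits("JLNUKFPDARKSWUVEYTYKARRBVCLTLODOUUMUEVCRLQTSFFWKJDXSOB"))  # ~232.6 bits (B)

English text repeats letters like E, T and S far more often than a random generator does, which is why the meaningful strings come out with less entropy than the random one.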
As we expect, Shannon entropy allows us to tell which signal was generated by a random character generator and which was not. This is how it is usually used. And if you want to argue that I should use the uniform distribution, keep in mind that he is talking about detecting design: the uniform distribution is exactly what you get if you assume no design, i.e. completely random characters, so Shannon information computed that way just gives the no-design answer for every string of the same length. I challenge anyone to find a good argument for that use, or any example in genetics where it is used like that. Shannon information is always used the way I described above when you want a meaningful measure on text, image, or sound data. Except, apparently, when you are Dr. Casey.
Why on earth can a creationist so-called expert on entropy not properly explain what it is, or use it to calculate the entropy of very simple text strings, yet still publish papers on the subject and be seen as an authority?