OgreMkV
Posts: 3626 Joined: Oct. 2009

Quote (Joe G @ Mar. 11 2012,10:41)  Quote (OgreMkV @ Mar. 09 2012,11:44)  While my pizza is heating, I figured I'd delve into a more common definition of information.
Quote  Information theory is based on probability theory and statistics. The most important quantities of information are entropy, the information in a random variable, and mutual information, the amount of information in common between two random variables. 
Well, that first sentence lets Joe out right there.
Of course, the second sentence probably screws him up too. (Hint: This isn't the same entropy as in the second law of thermodynamics, which probably explains the creationist attempt to say that information can't increase, because entropy can't decrease... in a closed system.)
Quote  entropy, which is usually expressed by the average number of bits needed to store or communicate one symbol in a message. 
Now, in our two DNA sequences... there are four characters in use. Doesn't matter what they are, just that there are four of them. We could spell them out, we could assign numbers, or code phrases (adenine = swamp ass). But it doesn't matter, because there are still only 4 choices.
Therefore it only takes 2 bits to unambiguously identify those 4 options. Two bits is also nice because it has no excess, i.e. no redundancy (unlike how UCU, UCA, UCG, UCC, AGU, and AGC all stand for serine when translating mRNA into amino acids). So 2 bits per letter really is the shortest encoding in which we can describe these two molecules... without compression. But since the tension is already so thick... nevermind.
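If anyone wants to check the bits-per-symbol arithmetic, here's a minimal sketch in plain Python (nothing assumed beyond the four-letter alphabet):

```python
import math

# Four equally likely symbols -> log2(4) = 2 bits per symbol.
alphabet = ["A", "C", "G", "T"]
bits_per_symbol = math.log2(len(alphabet))
print(bits_per_symbol)  # 2.0
```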
Now, we have a total of 1698 characters in each sequence. At 2 bits per character, each sequence can be described with 3,396 bits.
Again, any changes in the sequence don't matter, because every bit is equivalent to every other bit. 01 is not somehow more important than 00. So the point mutation shown in bold does not (cannot) affect the actual information content of the sequence.
Since each sequence is 1698 characters, each sequence is 3396 bits.
They contain the same amount of information.
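To make that concrete, here's a sketch using two hypothetical stand-ins for the alleles (the actual sequences aren't reproduced here; these are just same-length strings differing by one point mutation):

```python
import math

def shannon_bits(seq, alphabet_size=4):
    # Fixed-width encoding: log2(alphabet size) bits per character.
    return len(seq) * math.log2(alphabet_size)

# Hypothetical stand-ins: 1698 characters each, one point mutation apart.
allele_a = "ACGT" * 424 + "AC"                     # 1698 characters
allele_b = allele_a[:100] + "G" + allele_a[101:]   # point mutation at position 100

print(shannon_bits(allele_a))  # 3396.0
print(shannon_bits(allele_b))  # 3396.0
```

Same length, same alphabet, same bit count. The mutation changes which sequence you have, not how many bits it takes to write it down.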
Joe, do you agree or disagree? If you disagree, state clearly why.
Again, because I know that you don't understand this. This question has nothing to do with ID or any other notions that you may hold dear. This is a simple question.
Using Shannon information theory, these two sequences have the same amount of information, correct? 
Kevin, as I have already told you, Shannon is useless here, as Shannon only refers to mere complexity and information-carrying capacity. Meyer goes over that in "Signature in the Cell".
"Information" as IDists use it is the same as in information technology: it has to convey meaning or have a function.
And as I have also told you variational tolerance is key because if any polypeptide can perform the function then it isn't specified.
Information. The information age. Information technology. Information theory.
When IDists speak of complex specified information they are using it in the following sense:
information the attribute inherent in and communicated by one of two or more alternative sequences or arrangements of something (as nucleotides in DNA or binary digits in a computer program) that produce specific effects
It is producing those specific effects which makes the information specified!
When Shannon developed his information theory he was not concerned about "specific effects": Quote  The word information in this theory is used in a special mathematical sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning. Warren Weaver, one of Shannon's collaborators 
And that is what separates mere complexity (Shannon) from specified complexity. 
And now we see why Joe refuses to answer my simple question. Because he MUST be able to dodge the bullet (as it were).
Because here's what you said, Joe, on your blog on January 19th of this year:
Quote  Complex specified information is a specified subset of Shannon information. That means that complex specified information is Shannon information of a specified nature, ie with meaning and/ or function, and with a specified complexity.
Shannon's tells us that since there are 4 possible nucleotides, 4 = 2^2 = 2 bits of information per nucleotide. Also there are 64 different coding codons, 64 = 2^6 = 6 bits of information per amino acid, which, is the same as the three nucleotides it was translated from.
Take that and for example a 100 amino acid long functioning protein a protein that cannot tolerate any variation, which means it is tightly specified and just do the math 100 x 6 + 6 (stop) = 606 bits of specified information minimum, to get that protein. That means CSI is present and design is strongly supported.
Now if any sequence of those 100 amino acids can produce that protein then it isn't specified. IOW if every possible combo produced the same resulting protein, I would say that would put a hurt on the design inference.
The variational tolerance has to be figured in with the number of bits.

In other words, I did EXACTLY the same thing that you did, but I'm wrong, because... well... I'm the one doing it, I guess.
You see Joe, the problem is that both of those sequences do exactly the same thing, except one of them does something else too.
So, by this measure, since one allele does something the other does not, does that make it more complex? (That's a yes/no question for you Joe)
What's very interesting is that there are some 2030 alleles that are exactly the same except for one nucleotide... and almost all of them perform the same function.
Yet, there are another 10^(some really big number) of sequences that long that DO NOT perform that function.
But, yet again, none of this has to do with the actual question, which Joe is much too chicken to answer.
Actually, it does have a lot to do with what Joe talks about. He's just cunning enough to realize this is a trap, but much too dumb to realize that no matter how he answers, he's wrong... because his entire metric is stupid anyway.
But there we go.
More quotes from JoeG on this: Quote  Specified Information is Shannon Information with meaning/ function. 
Quote  Shannon's theory and algorithmic information theory are about complexity, not content 
Tell me, Joe, what is the context of these two alleles? Have you figured it out yet? I know; do you?
Tell me, Joe, what is the meaning of these two alleles? Have you figured it out yet? I know; do you?
 Ignored by those who can't provide evidence for their claims.
http://skepticink.com/smilodo....retreat
