Joined: Oct. 2009
hmmm... apparently the information content of the webpage was too much for my old PC to calculate... or something like that.
I still call BS on the calculation.
|A simple character count reveals 202 characters which translates into 1010 bits of information/ specified complexity..|
Those are the only two numbers in the whole darn post. There is no explanation of how 202 characters = 1010 bits.
The implication is that 5 bits = 1 character, but why?
Standard ASCII is actually a 7-bit code (128 symbols covering all the letters, numbers, punctuation, control characters, and the space), and it's usually stored as 8 bits per character.
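My best guess at where 5 bits/character comes from: if you assume every symbol is equally likely, the bits per character is just log2 of the alphabet size, and 32 symbols gives you exactly 5. That's pure speculation on my part, since the post never says. A quick sketch:

```python
import math

# Bits per character for a few alphabet sizes, assuming all
# symbols are equally likely (which English text is not):
for n in (26, 32, 128, 256):  # letters, 2^5 symbols, ASCII, 8-bit bytes
    print(n, "symbols ->", math.log2(n), "bits/char")

# The post's arithmetic: 202 characters at an assumed 5 bits each
print(202 * 5)  # 1010
```

Note the assumption doing all the work there: uniform, independent symbols. Drop that and the number changes completely, which is the whole point.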
Because this is from an ID blog and there's really no chance that they understand these types of things, I'll quote from wiki
Wikipedia - Entropy (Information Theory)
|A fair coin has an entropy of one bit. However, if the coin is not fair, then the uncertainty is lower (if asked to bet on the next outcome, we would bet preferentially on the most frequent result), and thus the Shannon entropy is lower. Mathematically, a coin flip is an example of a Bernoulli trial, and its entropy is given by the binary entropy function. A long string of repeating characters has an entropy rate of 0, since every character is predictable. The entropy rate of English text is between 1.0 and 1.5 bits per letter, or as low as 0.6 to 1.3 bits per letter, according to estimates by Shannon based on human experiments.|
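For what it's worth, here's a sketch of the zeroth-order version of that calculation (a hypothetical `entropy_per_char` helper of my own, not anything from the blog post). It only looks at character frequencies, ignoring the context between letters, so it still *overestimates* the ~1 bit/letter Shannon figures quoted above:

```python
from collections import Counter
from math import log2

def entropy_per_char(text):
    """Zeroth-order Shannon entropy in bits per character:
    H = -sum(p * log2(p)) over observed character frequencies.
    Ignores inter-character correlations, so for real English
    this is an upper bound on the true entropy rate."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * log2(c / total) for c in counts.values())

print(entropy_per_char("aaaa"))  # 0.0 -- a repeating string is fully predictable
print(entropy_per_char("abab"))  # 1.0 -- one fair-coin bit per character
```

Run it on any real English passage and you get well under 5 bits/character even before accounting for context, which is exactly the Wikipedia point.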
So, Joe, please do explain how you arrived at your 'calculation'*: why you used the values that you did, and what the point of the exercise was (other than showing off 3rd grade level math skills).
* I hesitate to call multiplying a three-digit number by 5 a 'calculation'. In the strictest sense of the word it is one, but it is so trivial that any reasonably competent 3rd grader could accomplish the same thing, and with more explanation of 'why' he/she performed the calculation.
Ignored by those who can't provide evidence for their claims.