Topic: AFDave's UPDATED Creator God Hypothesis 2
ericmurphy



Posts: 2460
Joined: Oct. 2005

(Permalink) Posted: Oct. 23 2006,05:50   

Quote (afdave @ Oct. 23 2006,10:33)
ABSURDITY ILLUSTRATED -- WINSTON CHURCHILL VS. WHITE NOISE

This has got to be the all-time winner for dumb questions on this thread.  I won't embarrass the person who asked it by naming him ... I'll just use it to illustrate how diseased a brain can become when affected by years of "glue sniffing" from the "Brown Bag of Darwinism."

Here's the question (addressed to me) ...     "Which has more information: a digital recording of a Winston Churchill speech; or an equally-long digital recording of broadband white noise?"

Words fail me ...

Were it not for the seemingly level-headed types like Cory, Drew Headley, Russell and a few others who have at least had some good questions, all might be lost.

OK, here's an answer ... My answer to this poor questioner would be "Ask the Germans" ... I can assure you that the Nazi Intelligence Division didn't spend even a nanosecond trying to decide the answer to this question.  It was quite obvious to them that Churchill's words contained information; white noise did not.  It should be obvious to you also.  I can only hope for your sake that you agree.  If you do not, God help you!

The source of this poor guy's confusion lies in his misunderstanding of Shannon Information Theory.  He seems to think that "Information = Randomness" which, of course, is the exact opposite of the truth.  Here is what Dr. Thomas Schneider of the National Institutes of Health has to say ...    

Dave, you're an idiot. You don't know what the definition of "information" is, which is why it is utterly pointless to discuss anything about "information" in the context of genetics.

The proper answer to the question I posed is "b) a digital recording of broadband white noise." In fact, broadband white noise is the type of signal that has maximal Shannon information: it is impossible to construct a signal of the same length that carries more information than white noise.

The reason you got this question wrong, Dave, is because you're too freaking lazy to do any research that involves looking anywhere other than AiG. If you had looked up the terms "Shannon," or "Kolmogorov," and "information," you could have found the answer to this question in less than five minutes. But instead, in your colossal ignorance, you assumed you knew the answer. You were wrong. As usual.

Now, your next homework assignment is to explain why it is true that a digital recording of broadband white noise contains more information than a Winston Churchill speech of the same length. Are you up to the challenge, or are you too lazy?
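For anyone who'd rather check this than argue about it, here's a quick sketch (my own illustration, plain stdlib Python; the periodic "speech-like" tone is just a hypothetical stand-in for a structured signal): estimate per-sample Shannon entropy from a histogram, for uniform white noise versus a repetitive signal of the same length.

```python
import math
import random

def empirical_entropy(samples, bins=256):
    """Estimate Shannon entropy (bits per sample) from a histogram
    of samples quantized into `bins` equal-width cells over [0, 1)."""
    counts = [0] * bins
    for s in samples:
        counts[min(int(s * bins), bins - 1)] += 1
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

random.seed(0)
n = 100_000
white_noise = [random.random() for _ in range(n)]   # uniform white noise
# A periodic tone: it only ever revisits the same few dozen values.
structured = [0.5 + 0.4 * math.sin(2 * math.pi * i / 50) for i in range(n)]

h_noise = empirical_entropy(white_noise)
h_tone = empirical_entropy(structured)
print(f"white noise: {h_noise:.2f} bits/sample")
print(f"structured:  {h_tone:.2f} bits/sample")
```

The uniform noise comes out near the 8-bit maximum for 256 bins; the periodic signal, which cycles through the same handful of values forever, comes out far lower.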

 
Quote
There are many many statements in the literature [Talk Origins, for example, although, as I am learning, we unjustly honor this source if we call it "literature"] which say that information is the same as entropy. The reason for this was told by Tribus. The story goes that Shannon didn't know what to call his measure so he asked von Neumann, who said `You should call it entropy ... [since] ... no one knows what entropy really is, so in a debate you will always have the advantage' (Tribus 1971).

Shannon called his measure not only the entropy but also the "uncertainty". I prefer this term because it does not have physical units associated with it. If you correlate information with uncertainty, then you get into deep trouble. Suppose that:

information ~ uncertainty

but since they have almost identical formulae:

uncertainty ~ physical entropy

so

information ~ physical entropy

BUT as a system gets more random, its entropy goes up:

randomness ~ physical entropy

so

information ~ physical randomness

How could that be? Information is the very opposite of randomness!

The confusion comes from neglecting to do a subtraction:

Information is always a measure of the decrease of uncertainty at a receiver (or molecular machine).

If you use this definition, it will clarify all the confusion in the literature.

Note: Shannon understood this distinction and called the uncertainty which is subtracted the 'equivocation'. Shannon (1948) said on page 20:


R = H(x) - Hy(x)

"The conditional entropy Hy(x) will, for convenience, be called the equivocation. It measures the average ambiguity of the received signal."

The mistake is almost always made by people who are not actually trying to use the measure. [like this poor guy here at ATBC and the poor guy at Talk Origins who snookered him]


As a practical example, consider the sequence logos. Further discussion on this topic is in the bionet.info-theory FAQ (http://www.ccrnp.ncifcrf.gov/~toms/bionet.info-theory.faq.html) under the topic "I'm Confused: How Could Information Equal Entropy?"

For a more mathematical approach, see the Information Theory Primer.

Some questions and answers might make these issues more clear.

http://www.lecb.ncifcrf.gov/~toms/information.is.not.uncertainty.html

BTW, I found this link simply by reading up a little on Shannon Information Theory here http://en.wikipedia.org/wiki/Shannon_information and following the external links at the bottom of the article.

It is also quite obvious from the example in the Wikipedia article of colored balls that Shannon's "uncertainty" measure most certainly DOES NOT equate to "maximum information" but is in fact the opposite.

This has got to be one of the most serious gaffes of ATBC members on my threads to date ... even worse than the "Green eyes are the result of a mutation" gaffe.
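Shannon's formula quoted above, R = H(x) - Hy(x), is easy to work through on a toy example (my own sketch, not Schneider's): a fair-coin source sent over a binary symmetric channel that flips 10% of the bits. The equivocation Hy(x) is just the binary entropy of the flip probability, and the information actually delivered is what survives the subtraction.

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.1                 # crossover (bit-flip) probability of the channel
H_x = h2(0.5)           # source uncertainty H(x): fair coin = 1 bit/symbol
Hy_x = h2(p)            # equivocation Hy(x): uncertainty left after reception
R = H_x - Hy_x          # rate of actual transmission, per Shannon (1948)
print(f"H(x) = {H_x:.3f}, Hy(x) = {Hy_x:.3f}, R = {R:.3f} bits/symbol")
```

With no noise (p = 0) the equivocation vanishes and R = 1 bit; with p = 0.5 the channel delivers nothing at all. That is exactly the "decrease of uncertainty at a receiver" Schneider is talking about.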

Nice try, Dave. Now, when you do your little research project I assigned you, you might want to look up the term "compressibility," and see what that has to do with Shannon Information. And remember, Dave: you don't get to make up definitions for things that are already defined in the literature. Shannon Information is a well-defined term, it's used constantly in information theory, and you don't get to change it to serve your purposes.
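The compressibility connection takes about ten lines to see (my own sketch; any general-purpose compressor makes the same point): random bytes are already at maximal entropy, so a compressor can't shrink them at all, while redundant English text collapses to a small fraction of its size.

```python
import os
import zlib

n = 100_000
noise = os.urandom(n)                                    # incompressible random bytes
text = (b"We shall fight on the beaches. " * 4000)[:n]   # highly redundant text

ratio_noise = len(zlib.compress(noise, 9)) / n
ratio_text = len(zlib.compress(text, 9)) / n
print(f"noise compresses to {ratio_noise:.1%} of its original size")
print(f"text  compresses to {ratio_text:.1%} of its original size")
```

High Shannon entropy per symbol means low compressibility, which is why the white-noise recording carries more information than the speech, not less.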

--------------
2006 MVD award for most dogged defense of scientific sanity

"Atheism is a religion the same way NOT collecting stamps is a hobby." —Scott Adams

  
4989 replies since Sep. 22 2006,12:37