Topic: Official Uncommonly Dense Discussion Thread

Posts: 182
Joined: June 2006

Posted: Feb. 22 2007,10:56

He suggests the complete works of Shakespeare are both specified and complex.

Ah, but here Dembski says that if "METHINKS..." is the outcome of an evolutionary algorithm, then it has a complexity of zero:  
It follows that Dawkins's evolutionary algorithm, by vastly increasing the probability of getting the target sequence, vastly decreases the complexity inherent in that sequence. As the sole possibility that Dawkins's evolutionary algorithm can attain, the target sequence in fact has minimal complexity (i.e., the probability is 1 and the complexity, as measured by the usual information measure, is 0).
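The "usual information measure" in the quote is Shannon self-information, I = -log2(p). A minimal sketch of the point, assuming the standard Weasel setup (28-character target "METHINKS IT IS LIKE A WEASEL" over a 27-symbol alphabet of A-Z plus space; the figures below are illustrative, not from the post):

```python
import math

def complexity_bits(p):
    """Shannon self-information I = -log2(p): the 'usual
    information measure' referred to in the quote."""
    return -math.log2(p)

# Under a pure random-typing hypothesis, each of the 28 positions
# is an independent uniform draw from 27 symbols.
p_random = (1 / 27) ** 28
print(complexity_bits(p_random))  # ~133 bits

# Under Dembski's characterization of Dawkins's algorithm, the
# target is the sole attainable outcome, so p = 1.
print(complexity_bits(1.0))       # 0 bits
```

The same string scores ~133 bits or 0 bits depending on which causal hypothesis fixes p, which is the post's point: the measure is a property of the assumed generating process, not of the sequence alone.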

So if I hand great_ape the complete works of Shakespeare, he can't tell me whether they have CSI unless I tell him how they were generated. CSI is useless for detecting design, because you have to know the causal story before you can determine whether something has CSI.

"I wasn't aware that classical physics had established a position on whether intelligent agents exercising free were constrained by 2LOT into increasing entropy." -DaveScot

29999 replies since Jan. 16 2006,11:43


