
  Topic: Peer review in the intelligent design community

Posts: 15
Joined: May 2005

(Permalink) Posted: April 17 2012,09:41   

There is a new paper up at bio-complexity, the Disco Institute website that claims to be a peer-reviewed publication. However, a quick read of the text shows that any peer review it has received is so poor it doesn't even qualify as proofreading.

The most glaring example is the caption to figure 12, which gets the contents of the two graphs the wrong way around. Figures 7 & 8 are also dubious, in that they give the exact same probability for finding a good solution as for finding a merely adequate one, which is not only obviously suspect but also contradicts the rest of their data (cf. figure 4). Another example is in the section headed "Effect of selection skew", where it is written: "As noted, it is easy to obtain a solution with a score of 1246. The optimal solution has a cost of 1212. Consequently, there is a very small range of possible costs that are of interest, especially when contrasted with the range of possible costs, 0 - 100000." The problem here is that 0 - 100000 is not the range of possible costs. No solution can have a cost less than the optimal 1212, and the upper value of 100000 is an arbitrary high number assigned to invalid configurations in order to exclude them from consideration. An actual calculated cost cannot* exceed 33000.

But possibly the worst claim made in the paper is this: "Every search algorithm terminates with a solution that has a specific cost." That is complete nonsense, and would raise alarm bells for anyone with any familiarity with genetic algorithms and evolutionary simulations, let alone with the Steiner tree search algorithm being discussed in the paper. Elsewhere in the paper the actual termination condition for this algorithm is stated, and it isn't a solution with a specific cost - it's a set number of generations. It could instead have been implemented to stop either when several generations produced no improvement, or on user command, both of which are widely used in other genetic search algorithms. But it could not have been implemented with the termination condition that the authors claim all search algorithms stop at, since the simulation being discussed can be, and has been, used to find Steiner trees for arrangements whose optimal solution is unknown - and you can't program an algorithm to stop at a solution with a specific cost if you don't know what that cost is.
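The point about termination conditions can be illustrated with a toy genetic-algorithm loop. To be clear, this is an illustrative sketch, not the paper's code or the actual Steiner-tree simulation: the names (`evolve`, `cost`) and the mutation/selection scheme are made up for the example. It shows the two termination conditions mentioned above - a fixed generation budget and stagnation-based stopping - and neither requires knowing the optimal cost in advance.

```python
import random

def evolve(fitness, init_pop, max_gens=200, stagnation_limit=None):
    """Minimal GA loop with two common termination conditions:
    a fixed generation budget, and (optionally) stopping after
    several generations with no improvement. Note that neither
    condition refers to any 'specific cost'."""
    pop = init_pop[:]
    best = min(pop, key=fitness)
    gens_without_improvement = 0
    for gen in range(max_gens):                 # fixed-budget termination
        # toy variation step: perturb every individual slightly
        pop = [x + random.uniform(-1.0, 1.0) for x in pop]
        # toy selection: keep the better half, refill by duplication
        pop.sort(key=fitness)
        pop = pop[: len(pop) // 2] * 2
        if fitness(pop[0]) < fitness(best):
            best = pop[0]
            gens_without_improvement = 0
        else:
            gens_without_improvement += 1
        # alternative termination: stop once progress stalls
        if stagnation_limit and gens_without_improvement >= stagnation_limit:
            break
    return best

random.seed(0)
# toy cost function; the algorithm never needs to know its optimum
cost = lambda x: (x - 3.0) ** 2
result = evolve(cost, [random.uniform(-10.0, 10.0) for _ in range(20)])
```

The loop happily optimises `cost` without ever being told that the minimum is at 3, which is exactly why a "terminates at a solution with a specific cost" condition cannot describe searches whose optimum is unknown.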

Any proper attempt at peer review would pick up these points, and other less glaring errors. The obvious conclusion is that papers published in 'Bio-Complexity' are peer-reviewed not for quality, but for conformity.


*This is an estimate based on the worst case I can think of; I haven't done an exhaustive calculation.


Posts: 2088
Joined: April 2007

(Permalink) Posted: April 17 2012,10:44   

You might want to move your interesting post to the already existing Bio-Complexity thread.

"[...] the type of information we find in living systems is beyond the creative means of purely material processes [...] Who or what is such an ultimate source of information? [...] from a theistic perspective, such an information source would presumably have to be God."

- William Dembski -

  1 replies since April 17 2012,09:41

