rthearle
Posts: 10 Joined: May 2005

There is a new paper up at BIO-Complexity, the Disco Institute website that claims to be a peer-reviewed publication. However, a quick read of the text shows that any peer review it has received is so poor that it doesn't even qualify as proofreading.
The most glaring example is the caption to figure 12, which gets the contents of the two graphs the wrong way around. Figures 7 & 8 are also dubious, in that they give the exact same probability for finding a good solution as for finding a merely adequate one, which is not only obviously suspect, but also contradicts the rest of their data (cf. figure 4).

Another example is in the section headed "Effect of selection skew", where it is written: "As noted, it is easy to obtain a solution with a score of 1246. The optimal solution has a cost of 1212. Consequently, there is a very small range of possible costs that are of interest, especially when contrasted with the range of possible costs, 0 - 100000." The problem here is that that is not the range of possible costs. No solution can have a cost less than the optimal 1212, and the upper value of 100000 is an arbitrary high number assigned to invalid configurations in order to exclude them from consideration. An actual calculated cost cannot* exceed 33000.
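To make the point concrete, here's a toy sketch (not the paper's code; the function name and structure are my own illustration) of how a penalty value like 100000 typically works: it's a sentinel attached to invalid configurations, not an achievable cost, so it says nothing about the range of real costs.

```python
# Numbers from the paper as quoted above; WORST_ESTIMATE is my rough
# upper bound (see footnote), PENALTY is the paper's sentinel value.
OPTIMAL = 1212          # cost of the optimal solution
EASY = 1246             # cost of an easily obtained solution
WORST_ESTIMATE = 33000  # estimated worst actual calculated cost
PENALTY = 100000        # arbitrary value assigned to invalid configurations

def effective_cost(raw_cost, valid):
    """Invalid configurations get the sentinel so selection discards them;
    valid ones keep their calculated cost."""
    return raw_cost if valid else PENALTY

# The achievable range is roughly [1212, 33000] -- nothing like 0..100000.
assert OPTIMAL <= EASY <= WORST_ESTIMATE < PENALTY
```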
But possibly the worst claim made in the paper is this: "Every search algorithm terminates with a solution that has a specific cost." That's complete nonsense that would raise alarm bells with anyone having any familiarity with genetic algorithms and evolutionary simulations at all, let alone with the Steiner tree search algorithm being discussed in the paper. Elsewhere in the paper the actual termination condition for this algorithm is stated, and it isn't arriving at a solution with a specific cost; it's completing a set number of generations. It could instead have been implemented to stop either when several generations produced no improvement, or on user command, both of which are widely used in other genetic search algorithms. But it could not have been implemented with the termination condition that the authors claim all search algorithms stop at, since the simulation being discussed can be, and has been, used to find Steiner trees for arrangements for which the optimal solution is unknown, and you can't program an algorithm to stop at a solution with a specific cost if you don't know what that specific cost is.
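A minimal sketch of what I mean (this is generic GA pseudocode in Python, not the paper's implementation; all names are illustrative): the realistic termination conditions are a generation budget or stagnation, and there is simply nowhere to write "stop at cost X" when X is unknown.

```python
import random

def evolve(population, step, cost, max_generations=1000, stall_limit=None):
    """Generic GA loop. Terminates after max_generations (the paper's
    algorithm's actual condition), or earlier if stall_limit generations
    pass with no improvement (the common alternative)."""
    best = min(population, key=cost)
    stalled = 0
    for _gen in range(max_generations):     # fixed-generation termination
        population = step(population)       # selection/crossover/mutation
        candidate = min(population, key=cost)
        if cost(candidate) < cost(best):
            best, stalled = candidate, 0
        else:
            stalled += 1
        if stall_limit is not None and stalled >= stall_limit:
            break                           # no-improvement termination
    # Note there is no possible line like "if cost(best) == optimum: break"
    # when the optimum is unknown -- which is the whole point.
    return best

# Toy usage: minimize |x| by random perturbation.
random.seed(0)
mutate = lambda pop: [x + random.uniform(-1, 1) for x in pop]
best = evolve([10.0, -5.0, 3.0], mutate, abs, max_generations=50, stall_limit=10)
```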
Any proper attempt at peer review would pick up these points and other less glaring errors. The proposed conclusion is that papers published in 'Biocomplexity' are peer-reviewed not for quality, but for conformity.
Roy
*This is an estimate based on the worst case I can think of; I haven't done an exhaustive calculation.
