Wesley R. Elsberry
Posts: 4936 Joined: May 2002

Quote (midwifetoad @ Feb. 06 2009,09:41)  The salesman's route problem seems to have been well researched, even to the point where corporations are using genetic algorithms in daily planning. As someone Egnorant of the underlying details, I'd appreciate a layman's overview of the methods used to avoid a brute force search.
How does this relate to Dembski's claims?
And how soon before I can haz a GPS that will plan a multistop route for me? 
The TSP is an NP-complete problem; proposed candidate solutions can be checked efficiently, but no efficient algorithm is known that yields the optimal tour for the TSP in general. This is one reason I've consistently used the TSP as an example: human design of algorithms to solve the TSP hasn't managed an efficient solution. Coming up with an efficient solution to the TSP would prove that P=NP, a major outstanding issue in mathematics and computer science. Evolutionary computation approaches are based upon finding approximate solutions to the TSP rather than finding the optimal tour. For many practical applications, a tour that is close to optimal is good enough; being within a few percent of the optimal length can help a lot.
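The "easy to check" half of that can be sketched in a few lines (a minimal illustration of my own; the point data and function names are hypothetical, not from any real solver):

```python
import math

# Why the TSP is easy to CHECK: verifying that a proposed tour visits
# every city exactly once and beats a length bound takes one O(n) pass,
# even though FINDING the best tour is intractable in general.
def verify_tour(tour, pts, bound):
    if sorted(tour) != list(range(len(pts))):
        return False  # invalid certificate: a city is missed or repeated
    length = sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
                 for i in range(len(tour)))
    return length <= bound

square = [(0, 0), (0, 1), (1, 1), (1, 0)]
print(verify_tour([0, 1, 2, 3], square, 4.001))  # perimeter tour, length 4 -> True
```

That asymmetry, cheap verification versus expensive search, is what "NP" captures.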
Brute-force search becomes impractical even for small values of N; Wikipedia gives the practical brute-force limit as under N=20. Exact solutions have been found for TSP instances with tens of thousands of nodes, though the times to develop those seem to be given in units of "CPU years". Going to approximate methods helps get good solutions for tours with up to millions of nodes. There are a variety of approaches that use iterative improvement to get better solutions. In a genetic algorithm approach, genetic operators that preserve valid tours include inversion and 2-opt moves, the same moves noted as good means of iterative improvement in conventional approximate methods.
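As a concrete sketch of that kind of iterative improvement (my own toy illustration, not anything from a production solver), here is 2-opt on a small random Euclidean instance. Note that every segment reversal leaves a valid tour, with each city still visited exactly once, which is exactly the property that makes inversion usable as a genetic operator:

```python
import math
import random

def tour_length(tour, pts):
    # total length of the closed tour, wrapping back to the start
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, pts):
    """Repeatedly reverse tour segments while doing so shortens the tour.
    Recomputing the full length each time is O(n) per check; a real
    implementation would compare just the two changed edges."""
    improved = True
    while improved:
        improved = False
        n = len(tour)
        for i in range(n - 1):
            for j in range(i + 2, n):
                cand = tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]
                if tour_length(cand, pts) < tour_length(tour, pts):
                    tour, improved = cand, True
    return tour

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(12)]
start = list(range(12))          # arbitrary starting tour
better = two_opt(start, pts)     # locally optimal under segment reversal
```

The result is only a local optimum, which is the whole point of "approximate": good enough, cheaply, rather than provably best.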
I don't know when your GPS may feature an approximate TSP solver.
Now, about Dembski and the NFL...
NFL is not about whether an algorithm can find a solution. It is assumed that all algorithms can find solutions. NFL is about the comparative efficiency of algorithms when applied across every cost function of a problem.
So for the TSP, a problem is a specific tour length and set of data, and a cost function is one particular way of pairing each possible tour with a cost value. You are likely to think of the TSP cost function where the shortest tour gets the least cost and every tour is assigned a cost proportional to its total distance... but that is just one of the cost functions that are possible. Now think of your set of possible tours as X, and the set of possible costs as Y. The NFL result says that any algorithm performs just as well as any other when performance is averaged over all the ways that the elements of set X (the tours) can be paired with the elements of set Y (the costs). Most of those cost functions will not pair costs with tours in proportion to distance, so any algorithm that depends on that sort of cost function will do poorly except in the small number of cases where the cost function comes close to the one that does assign costs proportional to distance. Worse, there will be cost functions that assign things exactly or nearly exactly in reverse, pairing higher costs with shorter tours. An algorithm that minimizes costs will perform very badly indeed, worse than blind search, on those cost functions of the problem. That's why, in terms of the NFL theorems, all algorithms perform on average just as well, or as badly, as blind search.
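That averaging claim is small enough to check exhaustively on a toy problem (a sketch of my own construction; the two "algorithms" here are just two fixed deterministic visit orders over X, and performance is how many evaluations each needs to first see the minimum cost):

```python
from itertools import product

# Toy NFL demonstration: search space X of 4 points, costs Y = {0, 1, 2}.
X_SIZE, Y_VALUES = 4, range(3)

def evals_to_min(order, f):
    """Evaluations a fixed, non-retracing search order needs before it
    first sees the minimum cost of f (f is a tuple indexed by x)."""
    best = min(f)
    for i, x in enumerate(order, start=1):
        if f[x] == best:
            return i
    return len(order)

# Two different deterministic algorithms = two different visit orders.
alg_a = [0, 1, 2, 3]
alg_b = [3, 1, 0, 2]

# Sum performance over ALL cost functions f: X -> Y (3^4 = 81 of them).
functions = list(product(Y_VALUES, repeat=X_SIZE))
total_a = sum(evals_to_min(alg_a, f) for f in functions)
total_b = sum(evals_to_min(alg_b, f) for f in functions)
assert total_a == total_b  # NFL: averaged over all f, no algorithm wins
```

On any single cost function one order may beat the other badly; summed over the whole set of cost functions, the bookkeeping always comes out even.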
Response to Dembski from 1999/09/14:
Quote
WAD>This conclusion may seem counterintuitive, especially given
WAD>all the marvelous properties that evolutionary algorithms do
WAD>possess. But the conclusion holds. What's more, it is
WAD>consistent with the "no free lunch" (NFL) theorems of David
WAD>Wolpert and William Macready, which place significant
WAD>restrictions on the range of problems genetic algorithms can
WAD>solve.
The "conclusion" cannot be said to hold in advance of the argument that would lead to it.
I have the Wolpert and Macready paper, "No Free Lunch Theorems for Search", obtained via ftp from the Santa Fe Institute, beside me here. I can find no reference to NFL limiting or restricting the range of problems *any* algorithm can solve. So far as I can tell, the NFL results are about comparing the efficiency of algorithms, not decreeing which problems can or cannot be solved by any particular algorithm. For example, there is this quote to be found therein:
"As another example, even if one's goal is to find a maximum of the cost function, hillclimbing and hilldescending are equivalent, on average."
One's intuition is not to deploy a hill-descending algorithm in order to find maxima. This is an indication that Wolpert and Macready's findings are not about whether hill-descent is *capable* of finding maxima; it is taken as a given that it is.
"Restriction" is a different question from comparison of efficiency. How does Dembski reconcile his statement of NFL determining a restriction upon range of problems for the particular case of evolutionary algorithms when the central result of NFL is that *all* algorithms are equivalent when their performance is averaged over all cost functions?
The central question that Dembski poses concerning evolutionary algorithms is not one of comparative efficiency, but rather that of essential capacity. I find it difficult to see on what grounds Dembski advances NFL as a relevant finding concerning the capability of evolutionary algorithms to perform tasks. I ask Dembski to clarify his reasoning on this point.

And from a bit later in 1999:
Quote  Dembski's invocation of Wolpert and Macready's "No Free Lunch" theorem suffers from the same error as his last use of it in "Explaining Specified Complexity". Wolpert and Macready's results are about comparative efficiency, not essential capacity. As mentioned before, Wolpert and Macready treat all algorithms as having the capacity to solve the problem at hand on every possible cost function for that problem. The example that they give of hill-descending algorithms solving hill-climbing problems illustrates this point nicely.
One can characterize the fitness functions that cause some evolutionary algorithm to become less efficient than other algorithms or blind search: the fitness function is "misleading". That is, nearby candidate solutions in genetic space map to worse-performing points when evaluated by the fitness function, and thus away from the solution that would terminate the search. What Dembski needs to do is show that biological genetics instantiates such a situation. Unfortunately for Dembski, the diversity of protein variants that perform the same functions would tend to indicate that, in general, biological fitness functions do not match the relevant features of misleading cost functions.
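What "misleading" means can be made concrete with a toy deceptive fitness function (my own illustration on a simple bitstring search; this is the standard "trap" construction from the GA literature, not anything Dembski analyzes):

```python
import random

N = 10  # bitstring length

def trap(bits):
    """Deceptive fitness: the all-ones string is the global optimum
    (fitness N + 1), but every other string is rewarded for having
    MORE zeros, so the local gradient points AWAY from the optimum."""
    ones = sum(bits)
    return N + 1 if ones == N else N - ones

def hill_climb(bits, steps=500):
    """Flip a random bit; keep the flip only if fitness does not drop."""
    for _ in range(steps):
        i = random.randrange(N)
        before = trap(bits)
        bits[i] ^= 1
        if trap(bits) < before:
            bits[i] ^= 1  # reject the worsening move
    return trap(bits)

random.seed(0)
# Once at the deceptive local optimum (all zeros, fitness N), every
# single-bit flip looks worse, so the climber can never reach the
# global optimum's fitness of N + 1.
stuck = hill_climb([0] * N)
```

On a function like this, blind search at least samples the global optimum with nonzero probability, while the local-improvement algorithm is systematically led away from it. The question for Dembski is whether biological fitness landscapes actually look like this.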

NFL applies to all algorithms, not, as Dembski's rhetoric often misleadingly implies, just algorithms instantiated by evolutionary computation.
 "You can't teach an old dogma new tricks."  Dorothy Parker
