N.Wells
Posts: 1836 Joined: Oct. 2005
But your program doesn't do any of the major things mentioned there.
Quote | An EA uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. |
Yours does none of that.
Quote | Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions... Evolution of the population then takes place after the repeated application of the above operators. |
You don't have candidate solutions compete against each other. You don't have a fitness function that compares multiple individuals/solutions and then determines which of them get to generate new variations — which is what drives the direction of evolution of the population.
In more detail, Quote | Implementation of biological processes:
- Generate the initial population of individuals randomly - first generation
- Evaluate the fitness of each individual in that population
- Repeat on this generation until termination (time limit, sufficient fitness achieved, etc.):
  - Select the best-fit individuals for reproduction - parents
  - Breed new individuals through crossover and mutation operations to give birth to offspring
  - Evaluate the individual fitness of new individuals
  - Replace least-fit population with new individuals |
None of those have any relation to anything you do.
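For contrast, the loop those steps describe fits in a few lines. Here is a toy sketch of my own (maximizing the number of 1-bits in a bit-string, the standard "OneMax" exercise — all names and parameters here are illustrative, not anyone's actual code):

```python
import random

def evolve(pop_size=20, genome_len=16, generations=50, mutation_rate=0.05):
    """Toy evolutionary algorithm: evolve bit-strings toward all 1s (OneMax)."""
    fitness = lambda g: sum(g)  # fitness = number of 1-bits
    # Generate the initial population of individuals randomly - first generation
    pop = [[random.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        # Select the best-fit individuals for reproduction (truncation selection)
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        # Breed new individuals through crossover and mutation
        offspring = []
        while len(offspring) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, genome_len)       # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < mutation_rate) for bit in child]
            offspring.append(child)
        # Replace the least-fit half of the population with the new offspring
        pop = parents + offspring
    return max(pop, key=fitness)

best = evolve()
```

Every commented step above maps directly onto a line of the quoted pseudocode; a program that can't be annotated that way isn't an EA.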
Let's look at Ferreira's paper, which is one of the models that you invited us to examine: Quote | Gene expression programming, a genotype/phenotype genetic algorithm (linear and ramified), is presented here for the first time as a new technique for the creation of computer programs. Gene expression programming uses character linear chromosomes composed of genes structurally organized in a head and a tail. The chromosomes function as a genome and are subjected to modification by means of mutation, transposition, root transposition, gene transposition, gene recombination, and one- and two-point recombination. The chromosomes encode expression trees which are the object of selection. The creation of these separate entities (genome and expression tree) with distinct functions allows the algorithm to perform with high efficiency that greatly surpasses existing adaptive techniques. The suite of problems chosen to illustrate the power and versatility of gene expression programming includes symbolic regression, sequence induction with and without constant creation, block stacking, cellular automata rules for the density-classification problem, and two problems of boolean concept learning: the 11-multiplexer and the GP rule problem. |
Yours doesn't have any of that.
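To make concrete what Ferreira means by a head-and-tail chromosome encoding an expression tree: the linear string is read left to right and filled into a tree breadth-first (Karva notation), with the tail long enough (t = h(n-1)+1 for head length h and maximum arity n) that every function always finds its arguments. A minimal illustrative sketch of that decoding step — my code, not Ferreira's implementation:

```python
# Arity of each symbol: functions take arguments, terminals take none.
ARITY = {'+': 2, '-': 2, '*': 2, '/': 2, 'a': 0, 'b': 0, 'c': 0}

def decode(chromosome):
    """Breadth-first (Karva-style) decoding of a linear chromosome into a tree."""
    nodes = [[sym, []] for sym in chromosome]   # [symbol, children]
    queue = [nodes[0]]                          # the first symbol is the root
    i = 1
    while queue:
        node = queue.pop(0)
        for _ in range(ARITY[node[0]]):         # claim the next symbols as children
            child = nodes[i]; i += 1
            node[1].append(child)
            queue.append(child)
    def to_tuple(n):
        return n[0] if not n[1] else (n[0],) + tuple(to_tuple(c) for c in n[1])
    return to_tuple(nodes[0])

# Head '*+' (h=2), tail 'cab' (t = 2*(2-1)+1 = 3); decodes to (a+b)*c.
tree = decode('*+cab')
```

Mutating any symbol of the string still yields a valid tree — that genotype/phenotype separation is the whole point of the paper, and it presupposes mutation, recombination, and selection acting on a population of such chromosomes.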
Quote | The most prominent effort is developmental genetic programming (DGP) [4] where binary strings are used to encode mathematical expressions. The expressions are decoded using a five-bit binary code, called genetic code. Contrary to its analogous natural genetic code, this “genetic code”, when applied to binary strings, frequently produces invalid expressions (in nature there is no such thing as an invalid protein). Therefore a huge amount of computational resources goes toward editing these illegal structures, which limits this system considerably. |
Well, that's certainly not your model.
Let's look at Clune et al. They talk about a project Quote | However, this project used a simple model of a six-legged insect that had only two degrees of freedom per leg | Hey, they're modelling insects with all six legs. Apparently they bothered to ground-truth their model.
Quote | As Fig. 3 reports, while both encodings are able to improve over time, HyperNEAT vastly outperforms FT-NEAT in every generation (p<.0001 comparing the fitness of the best organism from each encoding for each generation. This and all future p values come from a non-parametric Wilcoxon rank sum test). | Oh, you don't have any statistical tests and you don't compare your model quantitatively to any other model.
Quote | each plotted group is significantly different from every other (last generation p<.0001), except that the fraction of offspring better than both parents for each encoding was statistically indistinguishable (last generation p>.41). |
Your model doesn't produce offspring, let alone improved ones, and the only p's in all your writings came during your bathroom breaks.
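For the record, the kind of comparison Clune et al. report is not hard to set up: given best-of-generation fitness values from two models, a significance test tells you whether the difference could be chance. Here is a minimal sketch using a permutation test on the difference of means (a stdlib stand-in for their Wilcoxon rank sum test; the fitness numbers below are fabricated purely for illustration):

```python
import random

def permutation_test(xs, ys, trials=10000, seed=0):
    """Two-sided permutation test on the difference of means of two samples."""
    rng = random.Random(seed)
    observed = abs(sum(xs) / len(xs) - sum(ys) / len(ys))
    pooled = list(xs) + list(ys)
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)                     # random relabeling of the pooled data
        a, b = pooled[:len(xs)], pooled[len(xs):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            hits += 1
    return hits / trials                        # p-value: fraction at least as extreme

# Made-up best-of-generation fitness values for two hypothetical encodings
model_a = [9.1, 8.7, 9.4, 9.0, 8.9, 9.2]
model_b = [7.2, 7.8, 7.5, 7.1, 7.6, 7.4]
p = permutation_test(model_a, model_b)
```

If you had two runs of two models, this is all it would take to report a p value. You have neither the second model nor the test.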
Quote | .... This paper demonstrates that HyperNEAT, a new and promising generative encoding for evolving neural networks, can evolve quadruped gaits without an engineer manually decomposing the problem. Analyses suggest that HyperNEAT is successful because it employs a generative encoding that can more easily reuse phenotypic modules. It is also one of the first neuroevolutionary algorithms that exploits a problem's geometric symmetries, which may aid its performance. We compare HyperNEAT to FT-NEAT, a direct encoding control, and find that HyperNEAT is able to evolve impressive quadruped gaits and vastly outperforms FT-NEAT. Comparative analyses reveal that HyperNEAT individuals are more holistically affected by genetic operators, resulting in better leg coordination. Overall, the results suggest that HyperNEAT is a powerful algorithm for evolving control systems for complex, yet regular, devices, such as robots. |
By all means correct me if I am wrong, but you don't seem to have genetic operators or generative encoding.
Short version: those papers don't seem to have anything to do with your model or your coding.
More groundless assertion from Gary - why is this not surprising?