Topic: Evolutionary Computation, Stuff that drives AEs nuts
deadman_932
Posts: 3094
Joined: May 2006
Posted: June 12 2009,11:47

Quote (mammuthus @ June 12 2009,10:58)
Jorge Fernandez at TWeb is in contact with Sanford.  He just posted the following from Sanford:

   
Quote
Hi Jorge - I have been traveling ...The comment...about "cooking the books" is, of course, a false accusation. The issue has to do with memory limits. Before a Mendel run starts it allocates the memory needed for different tasks. With deleterious mutations this is straight-forward - the upper range of mutation count is known. With beneficials it is harder to guess final mutation count - some beneficials can be vastly amplified. Where there is a high rate of beneficials they can quickly exhaust RAM and the run crashes. Wesley Brewer [one of the creators of Mendel] has tried to avoid this by placing certain limits - but fixing this is a secondary priority and will not happen right away. With more RAM we can do bigger experiments. It is just a RAM issue.

Best - John


This is in response to: "Wes Elsberry made a comment that I think could be a good title, 'Mendel's Accountant cooks the books.'" I assume that they're talking about the failure of the program to increase fitness when a high number of beneficial mutations are specified...
[snip]

Sanford also says:
 
Quote
"The fact that our runs crash when we run out of RAM is not by design. If someone can help us solve this problem we would be very grateful. We typically need to track hundreds of millions of mutations. Beneficials create a problem for us because they amplify in number. We are doing the best we can. I would urge your colleagues [Heaven help me - John is under the impression that you people are my colleagues ... brrrrrrrr!] to use more care. In science we should be slow to raise claims of fraud without first talking to the scientist in question to get their perspective. Otherwise one might unwittingly be engaging in character assassination."

http://www.theologyweb.com/campus....unt=131
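
For anyone wondering what "it is just a RAM issue" boils down to, here's a toy sketch of the pattern John is describing: a mutation buffer whose size is fixed before the run starts, which is fine when the mutation count is predictable but gets overrun when beneficials amplify. This is purely my own illustration in C, not anything taken from Mendel itself, and the buffer size is just the 400-million figure from their ICC paper quoted further down.

Code Sample
/* Toy illustration of a fixed, pre-allocated mutation buffer -- not Mendel's code. */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define MAX_MUTATIONS 400000000u  /* budget fixed before the run starts: ~1.6 GB at 4 bytes each */

static uint32_t *mutations;       /* one packed 32-bit record per tracked mutation */
static size_t    n_mutations;

/* Append a mutation record; refuse instead of writing past the end of the buffer. */
static int add_mutation(uint32_t packed)
{
    if (n_mutations >= MAX_MUTATIONS) {
        fprintf(stderr, "mutation buffer full after %zu entries\n", n_mutations);
        return -1;  /* caller has to cap, discard, or abort -- the awkward part */
    }
    mutations[n_mutations++] = packed;
    return 0;
}

int main(void)
{
    mutations = malloc((size_t)MAX_MUTATIONS * sizeof *mutations);
    if (!mutations) {
        fprintf(stderr, "not enough RAM for the requested run size\n");
        return 1;
    }
    /* ...the simulation loop would call add_mutation() as new mutations arise;
       deleterious counts are bounded, but amplified beneficials can hit the cap... */
    free(mutations);
    return 0;
}

A container that grows on demand would trade the hard cap for less predictable memory use, which sounds like the trade-off behind the "certain limits" Brewer is said to have put in.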

That's interesting, because the 2008 ICR Proceedings of the Sixth International Conference on Creationism (pp. 87–98) has a "paper" by John Baumgardner, John Sanford, Wesley Brewer, Paul Gibson and Wally ReMine.

The title of that paper is "Mendel's Accountant: A New Population Genetics Simulation Tool for Studying Mutation and Natural Selection" (.PDF link)

So what does John Sanford say there? Well, he says this:  

 
Quote
Mendel represents an advance in forward-time simulations by incorporating several improvements over previous simulation tools...
Mendel is tuned for speed, efficiency and memory usage to handle large populations and high mutation rates....
We recognized that to track millions of individual mutations in a sizable population over many generations, efficient use of memory would be a critical issue – even with the large amount of memory commonly available on current generation computers. We therefore selected an approach that uses a single 32-bit (four-byte) integer to encode a mutation’s fitness effect, its location in the genome, and whether it is dominant or recessive. Using this approach, given 1.6 gigabytes of memory on a single microprocessor, we can accommodate at any one time some 400 million mutations...This implies that, at least in terms of memory, we can treat reasonably large cases using a single processor of the type found in many desktop computers today.


I await the actual achievement of these claims with un-bated breath. All emphases are mine.
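
For reference, the "single 32-bit integer" trick they describe is ordinary bit-packing. Here's roughly what it looks like -- my own sketch with guessed field widths, since the excerpt above doesn't say how the 32 bits are divided, and not their actual layout:

Code Sample
/* Packing one mutation into 32 bits. Field widths are my guesses, not Mendel's layout:
 *   bit  31     : dominant (1) or recessive (0)
 *   bits 30..16 : 15-bit code for the mutation's fitness effect
 *   bits 15..0  : 16-bit position in the genome (e.g. a linkage-block index)
 */
#include <stdint.h>
#include <stdio.h>

static uint32_t pack_mutation(int dominant, uint32_t effect_code, uint32_t position)
{
    return ((uint32_t)(dominant & 1) << 31)
         | ((effect_code & 0x7FFFu) << 16)
         |  (position    & 0xFFFFu);
}

static void unpack_mutation(uint32_t m, int *dominant, uint32_t *effect_code, uint32_t *position)
{
    *dominant    = (int)((m >> 31) & 1u);
    *effect_code = (m >> 16) & 0x7FFFu;
    *position    =  m        & 0xFFFFu;
}

int main(void)
{
    uint32_t m = pack_mutation(1, 1234u, 56789u);
    int dom; uint32_t eff, pos;
    unpack_mutation(m, &dom, &eff, &pos);
    printf("dominant=%d effect_code=%u position=%u\n", dom, eff, pos);

    /* Their arithmetic: 400 million 4-byte records is about 1.6 GB. */
    printf("400e6 mutations * 4 bytes = %.1f GB\n", 400e6 * 4.0 / 1e9);
    return 0;
}

The 1.6 GB / 400 million figure is just 4 bytes per mutation, so that much of the arithmetic checks out; whether the program actually delivers on runs that size is the part I'm not holding my breath for.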

--------------
AtBC Award for Thoroughness in the Face of Creationism

  