  Topic: Uncommonly Dense Thread 2, general discussion of Dembski's site
franky172



Posts: 158
Joined: Jan. 2007

Posted: April 30 2009,09:31

Allen_MacNeill in response to Nakashima

Quote
Your analogy between Occam’s Razor and the “least squares” method of line-fitting is one I have never read before, but one that seems quite useful. With your permission, I would like to use it in my forthcoming book on evolution.


Nakashima makes a very good point: the preference for simple models in science is a form of Occam's razor, and least-squares (LS) regression is a special case of it.
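To make the connection concrete, here is a toy sketch (my own example, using numpy) of LS line-fitting. The fit criterion alone never prefers the simpler model: a higher-degree polynomial always matches the training data at least as well, so the simplicity preference has to be imposed from outside, which is exactly Occam's razor.

```python
import numpy as np

# Toy data: a noisy line y = 2x + 1.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)

# Closed-form least-squares fit of the simple model y = a*x + b.
A = np.column_stack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)

# In-sample squared error of a degree-`deg` polynomial fit.
def sse(deg):
    coeffs = np.polyfit(x, y, deg)
    return np.sum((np.polyval(coeffs, x) - y) ** 2)

print(a, b)                 # close to the true (2.0, 1.0)
print(sse(1) >= sse(5))     # the more complex model never fits worse in-sample
```

The second print shows why goodness-of-fit alone can't choose between models: complexity must be penalized separately.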

Allen, if you are interested, there is a wide literature on mathematical models of simplicity that generally follow Occam's razor (often in the form of "sparseness" regularizers).  For examples, see Support Vector Machines and Relevance Vector Machines for classification and regression.
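A minimal sketch of what a sparseness penalty does (a hand-rolled lasso via iterative soft-thresholding, my own toy example rather than an SVM or RVM): an L1 penalty drives most weights to exactly zero, automatically selecting the simplest model consistent with the data.

```python
import numpy as np

# Toy regression: 10 candidate features, only 2 actually matter.
rng = np.random.default_rng(1)
n, d = 50, 10
X = rng.normal(size=(n, d))
true_w = np.zeros(d)
true_w[:2] = [3.0, -2.0]
y = X @ true_w + rng.normal(scale=0.1, size=n)

def lasso_ista(X, y, lam=5.0, iters=500):
    """Minimize ||Xw - y||^2 + lam*||w||_1 by iterative soft-thresholding."""
    w = np.zeros(X.shape[1])
    step = 1.0 / (2.0 * np.linalg.norm(X.T @ X, 2))  # safe step size
    for _ in range(iters):
        grad = 2.0 * X.T @ (X @ w - y)
        z = w - step * grad
        # Soft-threshold: small weights are set to exactly zero.
        w = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return w

w = lasso_ista(X, y)
print(w)  # only the two genuinely relevant weights remain large
```

The recovered weight vector is sparse: Occam's razor implemented as an optimization penalty.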

Also, many Bayesian treatments of regression problems can be viewed as applications of Occam's razor - see "Pattern Recognition and Machine Learning" by Bishop.  The first section of Chapter 1 presents a nice overview of the problem of model inference and mentions ridge regression, shrinkage, and other approaches to constraining model complexity.  Although "Occam's Razor" is not mentioned there explicitly, see the chapter by MacKay: http://www.cs.toronto.edu/~mackay/itprnn/ps/345.357.pdf , or Google "Bayesian Occam's Razor".
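A short sketch of the ridge-regression case (my own toy example in the spirit of Bishop's curve-fitting chapter, not his code): a zero-mean Gaussian prior on the weights turns maximum a posteriori estimation into penalized least squares, and the prior shrinks the wildly overfit coefficients of an over-flexible model toward zero.

```python
import numpy as np

# Noisy samples of sin(2*pi*x), fit with an over-flexible degree-9 polynomial.
rng = np.random.default_rng(2)
x = np.linspace(0, 1, 15)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.1, size=x.size)
X = np.vander(x, 10)  # degree-9 polynomial basis

def ridge(X, y, alpha):
    """MAP estimate under a Gaussian prior: (X^T X + alpha I)^-1 X^T y."""
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

w_ml = ridge(X, y, 0.0)    # unregularized: coefficients blow up
w_map = ridge(X, y, 1e-3)  # the prior shrinks them toward zero

print(np.linalg.norm(w_ml) > np.linalg.norm(w_map))  # True
```

The ridge penalty is the simplest Bayesian form of the razor: among models that fit comparably well, prefer the one with smaller weights.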

Enjoy.

  
  14997 replies since July 17 2008,19:00
