Joined: Jan. 2007
Allen_MacNeill in response to Nakashima
|Your analogy between Occam’s Razor and the “least squares” method of line-fitting is one I have never read before, but one that seems quite useful. With your permission, I would like to use it in my forthcoming book on evolution.|
Nakashima makes a very good point: the preference for simple models in science is a form of Occam's razor, and least-squares regression is a special case of it.
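To make the least-squares connection concrete, here is a minimal sketch in Python/NumPy (the data values are made up for illustration): the "simplest" model consistent with the data is the line minimizing the sum of squared residuals.

```python
import numpy as np

# Noisy samples from a known line y = 2x + 1 (illustrative values).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.05, size=x.size)

# Least-squares fit: choose slope and intercept minimizing the
# sum of squared residuals.
A = np.column_stack([x, np.ones_like(x)])
(slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)
print(slope, intercept)  # close to the true values 2 and 1
```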
Allen, if you are interested, there is a wide literature on mathematical models of simplicity that generally follow Occam's razor (often in the form of "sparseness" regularizers). For examples, see Support Vector Machines and Relevance Vector Machines for classification and regression.
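A toy sketch of how a sparseness penalty enforces simplicity (my own illustration, not from those books: an L1 penalty solved by iterative soft-thresholding, with made-up data where only two of ten features matter):

```python
import numpy as np

# Data where only the first 2 of 10 features actually matter.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 10))
true_w = np.zeros(10)
true_w[:2] = [3.0, -2.0]
y = X @ true_w + rng.normal(scale=0.1, size=100)

# L1-regularized least squares via proximal gradient descent (ISTA):
# each step is a gradient step on the squared error followed by
# soft-thresholding, which pushes small coefficients to zero.
lam = 0.1
step = 1.0 / np.linalg.norm(X, ord=2) ** 2  # 1 / Lipschitz constant
w = np.zeros(10)
for _ in range(500):
    grad = X.T @ (X @ w - y)
    w = w - step * grad
    w = np.sign(w) * np.maximum(np.abs(w) - step * lam * X.shape[0], 0.0)

# The recovered w is sparse: the eight irrelevant coefficients are
# driven to (essentially) zero, so the simpler model wins.
print(np.round(w, 2))
```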
Also, many Bayesian treatments of regression problems can be viewed as applications of Occam's razor - see "Pattern Recognition and Machine Learning" by Bishop. The first section of Chapter 1 presents a nice overview of the problem of model inference and mentions ridge regression, shrinkage, and other approaches to constraining model complexity (although "Occam's Razor" is not explicitly mentioned there; see instead the chapter by MacKay: http://www.cs.toronto.edu/~mackay/itprnn/ps/345.357.pdf , or a Google search for "Bayesian Occam's Razor").
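For reference, ridge regression (one of the complexity-constraining approaches Bishop covers) has a simple closed form; this sketch, on made-up data, shows how the penalty shrinks the weights relative to ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 5))
y = X @ np.array([1.0, -1.0, 0.5, 0.0, 2.0]) + rng.normal(scale=0.5, size=30)

def ridge(X, y, lam):
    """Ridge solution: w = (X^T X + lam * I)^(-1) X^T y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

w_ols = ridge(X, y, 0.0)     # lam = 0 recovers ordinary least squares
w_ridge = ridge(X, y, 10.0)  # larger lam -> smaller ("shrunk") weights
print(np.linalg.norm(w_ridge) < np.linalg.norm(w_ols))  # True
```

The penalty term `lam * np.eye(p)` is what trades goodness of fit against model complexity; in the Bayesian view it corresponds to a zero-mean Gaussian prior on the weights.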