Random Forests vs MARS vs Linear regression - AnalyticBridge
https://www.analyticbridge.datasciencecentral.com/forum/topics/random-forests-vs-mars-vs-linear-regression

Comment by Vincent Granville (https://www.analyticbridge.datasciencecentral.com/profile/VincentGranville), 2015-02-26:
<p>Linear regression is difficult to interpret, subject to over-fitting, sensitive to outliers, and only works in contexts in which associations are nearly linear. If you only have a few variables and a few hundred observations, it might be enough.</p>
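The outlier sensitivity mentioned above is easy to demonstrate. A minimal sketch (synthetic data, not from the discussion): an ordinary least-squares fit on a near-linear relationship, refit after corrupting a single observation.

```python
import numpy as np

# Illustrative data: a near-linear relationship y ≈ 2x with small noise
rng = np.random.default_rng(0)
x = np.arange(10, dtype=float)
y = 2 * x + rng.normal(scale=0.1, size=10)

# Ordinary least-squares slope via a degree-1 polynomial fit
slope_clean = np.polyfit(x, y, 1)[0]

# Corrupt one observation with a large outlier and refit
y_out = y.copy()
y_out[-1] += 50.0
slope_out = np.polyfit(x, y_out, 1)[0]

print(slope_clean, slope_out)  # the single outlier pulls the slope far from 2
```

One bad point shifts the estimated slope by more than the entire true effect size, which is the kind of fragility the comment is pointing at.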
Comment by Danny W. Stout (https://www.analyticbridge.datasciencecentral.com/profile/DannyWStout), 2014-03-17:
<p>There's no way to give you a good answer within a forum post, so I'll summarize my thoughts in a few sentences. RF is a very powerful modeling approach but is pretty much a black box. In linear-regression terms, it is like building 200 linear regression models, with predictors and data chosen at random for each tree, and letting the overall prediction be the average (or voted) prediction of all 200 models. With linear regression, you have one model, built on all predictors or on predictors chosen by a selection approach such as forward selection, stepwise, or best subsets. You can also see from that example how different the prediction equations would be: the linear regression equation is fairly easy to understand, while with RF...well...there really isn't an equation per se. The utility really comes down to what your purpose is. Are you primarily focused on accurate predictions? If so, RF may be your answer. Do you need to understand how the variables work together toward a prediction? If so, you may need linear regression (or another easily interpretable model).</p>
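The "200 randomized models, averaged" analogy above can be sketched literally. This is not a real random forest (trees are replaced by small linear fits, per the analogy) and the data, counts, and helper names are illustrative: each of 200 models sees a bootstrap sample of the rows and a random subset of the predictors, and predictions are averaged.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic data: 3 predictors, true linear signal y = 1*x0 + 2*x1 + 3*x2
n, p = 200, 3
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, 2.0, 3.0]) + rng.normal(scale=0.1, size=n)

def fit_ols(Xs, ys):
    # Least-squares coefficients with an intercept column
    A = np.column_stack([np.ones(len(Xs)), Xs])
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coef

def fit_bagged(X, y, n_models=200):
    # The analogy from the comment: 200 linear models, each trained on a
    # bootstrap sample of rows and a random subset of 2 of the 3 predictors
    models = []
    for _ in range(n_models):
        rows = rng.integers(0, n, size=n)            # bootstrap rows
        cols = rng.choice(p, size=2, replace=False)  # random predictor subset
        models.append((cols, fit_ols(X[rows][:, cols], y[rows])))
    return models

def predict_bagged(models, Xnew):
    # Overall prediction = average of all 200 per-model predictions
    preds = [coef[0] + Xnew[:, cols] @ coef[1:] for cols, coef in models]
    return np.mean(preds, axis=0)

models = fit_bagged(X, y)
yhat = predict_bagged(models, X)
print(np.corrcoef(y, yhat)[0, 1])
```

The ensemble predicts well, but notice there is no single equation to report: the "model" is the list of 200 coefficient vectors plus the averaging rule, which is exactly the interpretability trade-off the comment describes.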