# AnalyticBridge

A Data Science Central Community

# All Blog Posts Tagged 'regression' (7)

### Linear Models Don’t have to Fit Exactly for P-Values To Be Accurate, Right, and Useful

There is no need to confuse multiple linear regression, the generalized linear model, and the general linear model. The general linear model, or multivariate regression model, is a statistical linear model written as Y = XB + U, where Y is a matrix of responses, X a design matrix, B a matrix of coefficients to be estimated, and U a matrix of errors.

The general linear model encompasses a number of other statistical models, such as ANOVA, ANCOVA, MANOVA, MANCOVA, ordinary linear regression, the t-test, and the F-test. The GLM is a generalization of multiple…
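As a concrete sketch (not from the post), the model Y = XB + U can be estimated by ordinary least squares. The snippet below simulates hypothetical data in Python with NumPy and recovers B; all names and values are made up for illustration:

```python
import numpy as np

# Hypothetical data for the general linear model Y = XB + U
rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one predictor
B_true = np.array([2.0, 3.0])
Y = X @ B_true + 0.1 * rng.normal(size=n)              # U is the error term

# Least-squares estimate of B: minimizes ||Y - XB||^2
B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
```

With little noise, `B_hat` lands close to the true coefficients (2, 3).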

Added by Chirag Shivalker on November 2, 2017 at 11:30pm — 1 Comment

### Regression, Logistic Regression and Maximum Entropy

One of the most important tasks in machine learning is classification (a form of supervised machine learning). Classification uses a model constructed from a training set to make an accurate prediction of the class of entries in a test set (a dataset whose entries have not yet been labelled). You could think of classifying crime in the field of pre-policing, classifying patients in the health sector, or classifying houses in the real-estate sector.…
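The post's topic, logistic regression, can be sketched in a few lines. This illustrative Python/NumPy example (simulated data, my own parameter names) fits the model by gradient descent on the average log-loss:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Simulated binary classification data (hypothetical, for illustration)
rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one feature
w_true = np.array([-1.0, 2.0])
y = (rng.random(n) < sigmoid(X @ w_true)).astype(float)

# Logistic regression fit by gradient descent on the average log-loss
w = np.zeros(2)
for _ in range(2000):
    grad = X.T @ (sigmoid(X @ w) - y) / n
    w -= 0.1 * grad

acc = np.mean((sigmoid(X @ w) > 0.5) == (y == 1))      # training accuracy
```

The gradient of the log-loss has the same simple form, X'(p − y), that makes logistic regression and maximum-entropy models so closely related.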

Added by ahmet taspinar on March 29, 2016 at 8:00am — No Comments

### Cross-validation in R: a do-it-yourself and a black box approach

In my previous post, we saw that R-squared can lead to a misleading interpretation of the quality of our regression fit in terms of prediction power. One thing that R-squared offers no protection against is overfitting. Cross-validation, on the other hand, inherently offers protection against overfitting, because it allows the cases in our testing set to differ from the cases in our training set.

1. Do-it-yourself leave-one-out cross-validation in R.

In this type…
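The post carries this out in R; as an illustrative sketch of the same leave-one-out idea (with made-up data, in Python for consistency with the other snippets here), a do-it-yourself version looks like this:

```python
import numpy as np

# Made-up data for a simple linear regression
rng = np.random.default_rng(2)
n = 30
x = rng.uniform(0, 10, n)
y = 1.5 * x + rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

# Leave-one-out: fit on n-1 cases, predict the single held-out case
errors = []
for i in range(n):
    mask = np.arange(n) != i
    b, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
    errors.append(y[i] - X[i] @ b)

loocv_mse = np.mean(np.square(errors))  # estimate of out-of-sample error
```

Because every prediction is made on a case the model never saw, `loocv_mse` estimates out-of-sample error rather than in-sample fit.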

Added by Theophano Mitsa on May 22, 2013 at 8:06am — 2 Comments

### Use PRESS, not R squared to judge predictive power of regression

R squared, also known as coefficient of determination, is a popular measure of quality of fit in regression. However, it does not offer any significant insights into how well our regression model can predict future values. Instead, the PRESS statistic (the predicted residual sum of squares) can be used as a measure of predictive power. The PRESS statistic can be computed in the leave-one-out cross validation…
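As an illustrative sketch (hypothetical data, Python rather than the post's setting): PRESS is the sum of squared leave-one-out prediction errors, and for ordinary least squares it can be computed without refitting, using the leverages from the hat matrix:

```python
import numpy as np

# Hypothetical data for an OLS fit
rng = np.random.default_rng(3)
n = 40
x = rng.uniform(0, 5, n)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=n)
X = np.column_stack([np.ones(n), x])

# Hat matrix H = X (X'X)^{-1} X'; its diagonal holds the leverages h_ii
H = X @ np.linalg.inv(X.T @ X) @ X.T
h = np.diag(H)
resid = y - H @ y                       # ordinary residuals

# Each leave-one-out residual equals e_i / (1 - h_ii), so
PRESS = np.sum((resid / (1 - h)) ** 2)
```

The shortcut is exact for OLS: the result matches an explicit leave-one-out loop, at a fraction of the cost.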

Added by Theophano Mitsa on May 12, 2013 at 9:00am — 4 Comments

### Read the Preface for Siegel’s Book – Predictive Analytics

Here is the preface for Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die

By Eric Siegel, with a foreword from Tom Davenport

(Wiley, February 2013)

To order the book:…

Added by Eric Siegel on February 8, 2013 at 6:30am — No Comments

### Iterative Algorithm for Linear Regression

I am trying to solve the regression Y=AX where Y is the response, X the input, and A the regression coefficients. I came up with the following iterative algorithm:

A_{k+1} = cYU + A_k(I − cXU),

where:

• c is an arbitrary constant
• U is an arbitrary matrix such that YU has the same dimension as A, for instance U = transposed(X)…
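A quick numerical check of this scheme (my own Python sketch, with c chosen from the spectrum of XX' so that I − cXU is a contraction) converges to the least-squares solution of Y = AX:

```python
import numpy as np

rng = np.random.default_rng(4)
p, n = 3, 50
X = rng.normal(size=(p, n))             # inputs, one observation per column
A_true = np.array([[1.0, -2.0, 0.5]])
Y = A_true @ X                          # noiseless response Y = AX

U = X.T                                 # the suggested choice U = transposed(X)
M = X @ U                               # X X', so that A(I - cXU) = A(I - cM)
c = 1.0 / np.linalg.eigvalsh(M).max()   # puts every eigenvalue of I - cM in [0, 1)

A = np.zeros((1, p))
for _ in range(500):
    A = c * (Y @ U) + A @ (np.eye(p) - c * M)

# At the fixed point, A X X' = Y X' -- the least-squares normal equations
```

With c small enough, the iteration is a fixed-point map whose stationary point satisfies AXU = YU, i.e. the normal equations of the least-squares problem.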
Added by Vincent Granville on July 30, 2008 at 6:00pm — No Comments

### Nonparametric regression: the LOESS procedure

PROC LOESS implements a nonparametric method for estimating local regression surfaces pioneered by Cleveland (1979); also refer to Cleveland et al. (1988) and Cleveland and Grosse (1991). This method is commonly referred to as loess, which is short for local regression.

PROC LOESS allows greater flexibility than traditional modeling tools because you can use it in situations where you do not know a suitable parametric form of the regression surface. Furthermore, PROC LOESS is…
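SAS code aside, the local-regression idea behind loess can be sketched in Python. This is my own simplified version (tricube weights over a fraction of nearest neighbours, degree-1 local fits), not PROC LOESS itself, which offers far more options:

```python
import numpy as np

def loess_point(x, y, x0, frac=0.2):
    """Locally weighted degree-1 fit at x0 -- a simplified loess sketch."""
    k = max(2, int(np.ceil(frac * len(x))))
    d = np.abs(x - x0)
    idx = np.argsort(d)[:k]                        # the k nearest neighbours of x0
    w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3    # tricube weights
    Xl = np.column_stack([np.ones(k), x[idx]])
    # Weighted least squares on the local neighbourhood
    beta = np.linalg.solve(Xl.T @ (w[:, None] * Xl), Xl.T @ (w * y[idx]))
    return beta[0] + beta[1] * x0

# Smooth noisy data whose true regression surface is unknown to the fit
rng = np.random.default_rng(5)
x = np.sort(rng.uniform(0, 2 * np.pi, 120))
y = np.sin(x) + rng.normal(scale=0.1, size=120)
smooth = np.array([loess_point(x, y, x0) for x0 in x])
```

No parametric form is assumed anywhere: each fitted value comes from its own small weighted regression, which is exactly what makes loess useful when the shape of the surface is unknown.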

Added by Vincent Granville on March 31, 2008 at 11:00pm — No Comments
