# AnalyticBridge

A Data Science Central Community

# Vincent Granville's Blog – May 2019 Archive (4)

### Gentle Approach to Linear Algebra, with Machine Learning Applications

This simple introduction to matrix theory offers a refreshing perspective on the subject. Starting from a basic concept that leads to a simple formula for the power of a matrix, we see how it can be applied to time series, Markov chains, linear regression, data reduction, principal component analysis (PCA), and other machine learning problems. These problems are usually solved with more advanced matrix calculus, including eigenvalues, diagonalization, generalized inverse matrices, and other types of…
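The teaser above is truncated, but the matrix-power idea it alludes to can be sketched briefly. A minimal illustration for the Markov-chain case, assuming a hypothetical 2-state transition matrix of my own (not taken from the post): the n-step transition probabilities are simply the n-th power of the one-step matrix, and repeated application converges to the stationary distribution.

```python
import numpy as np

# Hypothetical 2-state Markov transition matrix (each row sums to 1).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# n-step transition probabilities are the matrix power P^n.
P_10 = np.linalg.matrix_power(P, 10)

# Iterating pi <- pi P converges to the stationary distribution,
# regardless of the starting distribution, for this regular chain.
pi = np.ones(2) / 2
for _ in range(1000):
    pi = pi @ P
```

For this particular matrix the stationary distribution is (5/6, 1/6), which the iteration recovers without computing eigenvalues explicitly.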


Added by Vincent Granville on May 28, 2019 at 9:00pm — No Comments

### New Book: Classification and Regression In a Weekend (in Python)

We have added a new free book to our selection, available exclusively to DSC members. See the first entry below to get started with machine learning in Python.

1. Book: Classification and Regression In a Weekend

This tutorial began as a series of weekend workshops created by Ajit Jaokar and Dan Howarth. The idea was to work through a specific (longish) program, exploring as much of it as possible in one weekend. This book is an attempt to take that idea online…


Added by Vincent Granville on May 16, 2019 at 6:24pm — No Comments

### Confidence Intervals Without Pain, with Excel

We propose a simple, model-free solution for computing any confidence interval and for extrapolating these intervals beyond the observations available in your data set. In addition, we propose a mechanism to sharpen the confidence intervals, reducing their width by an order of magnitude. The methodology works with any estimator (mean, median, variance, quantile, correlation, and so on), even when the data set violates the classical requirements necessary to make traditional statistical techniques…
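The post's own re-sampling mechanism is behind the link, but the model-free spirit can be conveyed with a standard percentile bootstrap. This is a sketch under my own assumptions (simulated skewed data, the median as the estimator), not the post's specific method:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative skewed sample where normal-theory intervals would be suspect.
data = rng.exponential(scale=2.0, size=200)

# Percentile bootstrap: resample with replacement, recompute the estimator,
# and read the confidence bounds directly off the bootstrap distribution.
boot = np.array([
    np.median(rng.choice(data, size=data.size, replace=True))
    for _ in range(2000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])  # 95% confidence interval
```

No distributional assumption about the estimator is needed; swapping `np.median` for any other statistic (mean, quantile, correlation) gives an interval for that statistic instead.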


Added by Vincent Granville on May 9, 2019 at 11:30am — No Comments

### Re-sampling: Amazing Results and Applications

This crash course features a new fundamental statistics theorem -- even more important than the central limit theorem -- and a new set of statistical rules and recipes. We discuss concepts related to determining the optimum sample size, the optimum k in k-fold cross-validation, bootstrapping, new re-sampling techniques, simulations, tests of hypotheses, confidence intervals, and statistical inference using a unified, robust, simple…
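One of the building blocks mentioned above, k-fold cross-validation, is easy to sketch from scratch. A minimal version, assuming synthetic data and simple linear regression of my own choosing (the post's unified framework is behind the link):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100)
y = 2.0 * x + rng.normal(scale=0.5, size=100)  # true slope 2, noise sd 0.5

def kfold_mse(x, y, k):
    """Out-of-fold mean squared error of a simple linear fit under k-fold CV."""
    idx = np.arange(x.size)
    rng.shuffle(idx)
    errors = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        slope, intercept = np.polyfit(x[train], y[train], 1)
        pred = slope * x[fold] + intercept
        errors.append(np.mean((y[fold] - pred) ** 2))
    return np.mean(errors)

mse5 = kfold_mse(x, y, 5)
```

Repeating this for several values of k (and several random shuffles) is one simple way to study how the choice of k affects the error estimate, which is one of the questions the post raises.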


Added by Vincent Granville on May 4, 2019 at 12:30pm — No Comments
