A Data Science Central Community
Date: Monday, March 22, 2010, 6:30 pm
Cost: Free
Speaker: Andrea Montanari, Stanford Professor in Electrical Engineering and Statistics
Title: “Large Matrices beyond Singular Value Decomposition”
A number of data sets are naturally described in matrix form; examples range from micro-arrays to collaborative filtering data. In many of these examples, singular value decomposition (SVD) provides an efficient way to construct a low-rank approximation, thus achieving a large dimensionality reduction. SVD is also an important tool in the design of approximate linear algebra algorithms for massive data sets. It is a recent discovery that, for ‘generic’ matrices, SVD is sub-optimal and can be significantly improved upon. There has been considerable progress on this topic over the last year, partly spurred by interest in the Netflix challenge. I will give an overview of this progress.
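To make the abstract concrete, here is a minimal sketch of the low-rank approximation it refers to: truncating the SVD to the top k singular values, which by the Eckart–Young theorem gives the best rank-k approximation in Frobenius norm. The toy matrix and the choice k = 2 are illustrative assumptions, not from the talk.

```python
import numpy as np

# Toy data matrix (think users x items, as in collaborative filtering).
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 5))

# Full (thin) SVD: A = U @ diag(s) @ Vt, singular values in s sorted descending.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rank-k truncation: keep only the k largest singular values and vectors.
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Eckart-Young: A_k minimizes ||A - B||_F over all rank-k matrices B,
# and the error equals the energy in the discarded singular values.
err = np.linalg.norm(A - A_k, "fro")
print(np.linalg.matrix_rank(A_k), err)
```

Storing `U[:, :k]`, `s[:k]`, and `Vt[:k, :]` instead of `A` is the dimensionality reduction mentioned above: O(k(m+n)) numbers instead of O(mn).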
Andrea Montanari received a Laurea degree in Physics in 1997 and a Ph.D. in Theoretical Physics in 2001, both from Scuola Normale Superiore in Pisa, Italy. He has been a postdoctoral fellow at the Laboratoire de Physique Théorique de l’École Normale Supérieure (LPTENS) in Paris, France, and at the Mathematical Sciences Research Institute in Berkeley, USA. From 2002 he held a Chargé de Recherche position (a permanent research position with the Centre National de la Recherche Scientifique, CNRS) at LPTENS. In September 2006 he joined Stanford University as an Assistant Professor in the Departments of Electrical Engineering and Statistics.
He was co-awarded the ACM SIGMETRICS best paper award in 2008. He received the
CNRS bronze medal for theoretical physics in 2006 and the National Science
Foundation CAREER award in 2008.
Tags: Andrea Montanari, collaborative filtering, data mining, machine learning, Netflix contest, singular value decomposition, dimensionality reduction, large data sets, Stanford, statistics