You should feel comfortable with the basics of probability: joint densities, conditional distributions, etc. An undergraduate level is fine; there will not be much measure theory, and such background is not required. There will be some basic linear algebra, e.g., solving linear systems and thinking about eigenvalues/eigenvectors. We will regularly use multivariable calculus.
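As a quick self-check of the linear-algebra prerequisite, the following sketch shows the kind of computation involved: solving a small linear system and finding eigenvalues/eigenvectors. It uses Python with NumPy purely for illustration (the matrix and vector here are made up; any of the course's allowed languages would do equally well):

```python
import numpy as np

# A small symmetric example matrix and right-hand side (illustrative only).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

# Solve the linear system A x = b.
x = np.linalg.solve(A, b)
print(x)  # [2. 3.]

# Eigenvalues/eigenvectors of the symmetric matrix A.
eigvals, eigvecs = np.linalg.eigh(A)

# Each eigenpair satisfies A v = lambda v.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)
```

If code like this looks familiar, the linear-algebra background should be sufficient.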
Kevin P. Murphy. Machine Learning: A Probabilistic Perspective. MIT Press, 2012.
Christopher M. Bishop. Pattern Recognition and Machine Learning. Springer, 2006.
This course is about learning to extract statistical structure from data, for making decisions and predictions, as well as for visualization. The course will cover many of the most important mathematical and computational tools for probabilistic modeling, as well as specific models from the literature and how they can be applied to particular types of data. There will be a heavy emphasis on implementation. You may use Matlab, Python or R. Each of the five assignments will involve some amount of coding, and the final project will almost certainly require running computer experiments.