Bayesian Machine Learning
- Classroom: Engineering Building x, Room xxx
- Textbooks
- Kevin Patrick Murphy, Machine Learning: a Probabilistic Perspective, MIT Press, 2012 [link][pdf] [pdf2]
- Carl Edward Rasmussen and Christopher K. I. Williams, Gaussian Processes for Machine Learning, MIT Press, 2006 [link][pdf]
- D. Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press, 2012 [link][pdf]
- Christopher Bishop, Pattern Recognition and Machine Learning, Springer, 2007 [link]
- References
- Martin J. Wainwright and Michael I. Jordan, Graphical Models, Exponential Families, and Variational Inference, Foundations and Trends in Machine Learning, 2009 [pdf]
- University of Toronto's ML lecture [link]
- Mathematics summary sheet [pdf]
- Matrix Differential Calculus with Applications in Statistics and Econometrics [pdf]
- Python code for probabilistic machine learning [link]
- Edward: A library for probabilistic modeling, inference, and criticism [link]
- TensorFlow Distributions [pdf]
- Z. Ghahramani, Probabilistic machine learning and artificial intelligence, Nature '15 [pdf]
- Lecture 1: Mathematics for Bayesian machine learning [pdf]
- Lecture 2: Generative models - overview [pdf]
- Lecture 3: Generative models for discrete data [pdf][pdf][pdf]
- Read Murphy Chap 3
- Ref. Generative models: Beta-Binomial, Dirichlet-Multinomial [pdf]
- Bayesian classification [pdf]
- Binomial and multinomial distributions [pdf]
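The Beta-Binomial conjugacy covered in this lecture (and in Murphy Chap. 3) can be sketched in a few lines; the coin-flip counts below are illustrative, not from the book:

```python
# Beta-Binomial conjugate update, a minimal sketch of the model in Murphy Chap. 3.
# With a Beta(a, b) prior on the coin bias theta and N1 heads / N0 tails observed,
# the posterior is Beta(a + N1, b + N0); the posterior predictive probability of
# "heads" is the posterior mean (a + N1) / (a + b + N1 + N0).

def beta_binomial_update(a, b, n_heads, n_tails):
    """Return the posterior Beta parameters after observing the counts."""
    return a + n_heads, b + n_tails

def posterior_predictive_heads(a, b):
    """Probability that the next flip is heads under a Beta(a, b) posterior."""
    return a / (a + b)

# Uniform Beta(1, 1) prior, then 7 heads and 3 tails observed.
a_post, b_post = beta_binomial_update(a=1.0, b=1.0, n_heads=7, n_tails=3)
print(a_post, b_post)                               # 8.0 4.0
print(posterior_predictive_heads(a_post, b_post))   # 8/12 ≈ 0.667
```

The Dirichlet-Multinomial case is the same update applied per category: add each category's count to its Dirichlet pseudo-count.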
- Lecture 3: Gaussian Models [pdf][pdf]
- Read Murphy Chap 4
- Summary - Classification: Generative Models [pdf]
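A generative Gaussian classifier in the spirit of Murphy Chap. 4 can be sketched as follows; the per-class diagonal covariance and the synthetic two-blob data are illustrative simplifications, not the chapter's full QDA treatment:

```python
import numpy as np

# Generative classification with class-conditional Gaussians: fit each class by
# MLE (mean and per-dimension variance here, an assumed simplification), then
# classify by the largest log posterior: log p(y=c) + log p(x | y=c).

def fit_gaussian_classifier(X, y):
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        # (class mean, diagonal variances, class prior); 1e-6 avoids zero variance
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-6, len(Xc) / len(X))
    return params

def log_likelihood(x, mu, var):
    """Log density of a diagonal-covariance Gaussian at x."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def predict(params, x):
    scores = {c: np.log(prior) + log_likelihood(x, mu, var)
              for c, (mu, var, prior) in params.items()}
    return max(scores, key=scores.get)

# Two well-separated 2-d classes (synthetic data for illustration).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
params = fit_gaussian_classifier(X, y)
print(predict(params, np.array([0.0, 0.0])))  # 0
print(predict(params, np.array([5.0, 5.0])))  # 1
```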
- Lecture 4: Linear regression [pdf]
- Ref. Linear regression [pdf]
- Lecture 5: Logistic regression [pdf]
- Ref. Logistic regression [pdf]
- Lecture 6: Mixture models and the EM algorithm [pdf]
- Lecture 7: Latent linear models [pdf]
- Lecture 8: Gaussian Processes [pdf][pdf]
- Lecture x: Basics of Parameter Estimation in Probabilistic Models [pdf]
- Lecture y: Bayesian basics [pdf]
- Assignment 1: Solve problem sets
- MLAPP: Exercises 2.3, 2.6, 2.7, 2.10, 2.11, 2.15, 2.16
- MLAPP: Exercises 3.1-3.4
- Assignment 2: Solve problem sets
- MLAPP: Exercises 3.6, 3.7, 3.9, 3.10, 3.11, 3.12, 3.13, 3.14, 3.17, 3.20, 3.21, 3.22
- Assignment 3: Programming and experiments - Document classification [pdf]
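A common approach to this assignment (following the Dirichlet-Multinomial material of Murphy Chap. 3) is multinomial naive Bayes with Dirichlet smoothing; the sketch below uses an invented four-word vocabulary and toy counts, so treat it as a starting point rather than the required solution:

```python
import numpy as np

# Multinomial naive Bayes for document classification with add-alpha (Dirichlet)
# smoothing. Rows of X are bag-of-words count vectors; y holds class labels.

def train_nb(X, y, alpha=1.0):
    """Return (classes, log class priors, log word probabilities per class)."""
    classes = np.unique(y)
    log_prior = np.log(np.array([(y == c).mean() for c in classes]))
    counts = np.array([X[y == c].sum(axis=0) + alpha for c in classes])
    log_lik = np.log(counts / counts.sum(axis=1, keepdims=True))
    return classes, log_prior, log_lik

def predict_nb(model, x):
    classes, log_prior, log_lik = model
    # Log posterior (up to a constant): log prior + sum of word-count-weighted
    # log word probabilities.
    return classes[np.argmax(log_prior + x @ log_lik.T)]

# Toy vocabulary: ["ball", "game", "vote", "party"]; 0 = sports, 1 = politics.
X = np.array([[3, 2, 0, 0], [2, 3, 0, 1], [0, 0, 3, 2], [0, 1, 2, 3]])
y = np.array([0, 0, 1, 1])
model = train_nb(X, y)
print(predict_nb(model, np.array([2, 1, 0, 0])))  # 0
print(predict_nb(model, np.array([0, 0, 1, 2])))  # 1
```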
- Assignment 4: Solve problem sets
- MLAPP: Exercises 4.5, 4.7, 4.10, 4.11, 4.14, 4.15, 4.16, 4.18, 4.20, 4.22, 4.23
- Assignment 5: Solve problem sets
- MLAPP: Exercises 7.2, 7.6, 7.7, 7.8, 7.9, 7.10
- Implement problems 7.7 (e) and (f)
- Implement Eq. 7.84 to compute the statistics in Table 7.2 on the caterpillar data
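The exact statistics of Eq. 7.84 and Table 7.2 should be taken from the book; as a starting point, a minimal sketch of the conjugate Gaussian posterior for linear regression (Murphy Chap. 7) looks like this, assuming a zero-mean isotropic Gaussian prior and known noise variance (both illustrative assumptions):

```python
import numpy as np

# Conjugate posterior for Bayesian linear regression, y = X w + Gaussian noise:
#   precision = X^T X / sigma^2 + I / tau^2
#   cov       = precision^{-1}
#   mean      = cov X^T y / sigma^2

def posterior(X, y, sigma2=1.0, tau2=10.0):
    """Return posterior mean and covariance of the weights w."""
    d = X.shape[1]
    precision = X.T @ X / sigma2 + np.eye(d) / tau2
    cov = np.linalg.inv(precision)
    mean = cov @ X.T @ y / sigma2
    return mean, cov

# Synthetic check: recover known weights from noisy observations.
rng = np.random.default_rng(1)
w_true = np.array([2.0, -1.0])
X = rng.normal(size=(200, 2))
y = X @ w_true + 0.1 * rng.normal(size=200)
mean, cov = posterior(X, y, sigma2=0.01)
print(np.round(mean, 2))  # close to [ 2. -1.]
```

The posterior standard deviations (the square roots of `cov`'s diagonal) are the kind of per-coefficient statistics a table like 7.2 reports.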
- Assignment 6: Solve problem sets
- MLAPP: Exercises 8.3, 8.4, 8.5, 8.6
- Implement the approximation for Bayesian logistic regression (Draw Figure 8.6 (a)-(c))
- 1) MAP of Eq. (8.58) for Eq. (8.60)
- 2) Monte Carlo approximation - Eq. (8.61)
- 3) Obtain Figure 8.6 (a)-(c) given two-class data in 2d (using Gaussian distributions)
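The three steps above can be sketched as follows (equation and figure numbers refer to MLAPP Sec. 8.4; the synthetic 2-d two-class data and the Newton-based MAP fit are illustrative assumptions, and the plotting itself is left out):

```python
import numpy as np

# (1) MAP weights of logistic regression with a Gaussian prior (L2 penalty),
# (2) Laplace approximation N(w_map, H^{-1}) at the mode,
# (3) Monte Carlo average of the predictive sigmoid over posterior samples.

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def fit_map(X, y, lam=1.0, iters=100):
    """MAP by Newton's method: minimize NLL + (lam/2)||w||^2; returns w, Hessian."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = sigmoid(X @ w)
        g = X.T @ (mu - y) + lam * w                                  # gradient
        H = X.T @ (X * (mu * (1 - mu))[:, None]) + lam * np.eye(len(w))
        w = w - np.linalg.solve(H, g)                                 # Newton step
    return w, H

def mc_predictive(x, w_map, H, n_samples=1000, rng=None):
    """Monte Carlo estimate of p(y=1|x) under the Laplace posterior."""
    rng = rng or np.random.default_rng(0)
    W = rng.multivariate_normal(w_map, np.linalg.inv(H), size=n_samples)
    return sigmoid(W @ x).mean()

# Two-class 2-d data (synthetic, for illustration).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
w_map, H = fit_map(X, y)
print(mc_predictive(np.array([2.0, 2.0]), w_map, H))  # close to 1
```

Evaluating `mc_predictive` on a grid of 2-d points gives the predictive surface needed for plots like Figure 8.6 (a)-(c).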
- Assignment 7: Solve problem sets
- MLAPP: Exercises 11.2, 11.3, 11.5, 11.11, 12.5, 12.8
- Implement EM for GMMs, apply it under the setting of Sec. 11.4.2.4, and draw plots like Figure 11.11
- Implement latent semantic indexing and apply it to real documents (problem 12.8)
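The EM-for-GMM part of the assignment can be sketched as below; the spherical (isotropic) covariances, explicit mean initialization, and synthetic two-blob data are illustrative simplifications of the full Sec. 11.4 treatment, and the Figure 11.11-style plots would be drawn from the responsibilities and parameters this loop computes:

```python
import numpy as np

# EM for a spherical Gaussian mixture: alternate the E-step (responsibilities)
# and the M-step (weights, means, variances) until the parameters settle.

def em_gmm(X, k, iters=50, mu0=None, rng=None):
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    mu = mu0 if mu0 is not None else X[rng.choice(n, k, replace=False)]
    var = np.full(k, X.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: r[i, j] ∝ pi_j * N(x_i | mu_j, var_j I), computed in log space.
        sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        log_r = np.log(pi) - 0.5 * d * np.log(2 * np.pi * var) - 0.5 * sq / var
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted re-estimates of mixing weights, means, variances.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        sq_new = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        var = (r * sq_new).sum(axis=0) / (d * nk) + 1e-6
    return pi, mu, var

# Two synthetic 2-d blobs; initialize one mean in each blob for a stable demo.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.5, (100, 2)), rng.normal(4, 0.5, (100, 2))])
pi, mu, var = em_gmm(X, k=2, mu0=X[[0, 100]])
print(np.round(np.sort(mu[:, 0]), 1))  # means near 0 and 4
```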