
Machine Learning Pages

These pages have been compiled by members of the CBU Learning Machine Learning (LML) group.

Learning Machine Learning Course

1. Introduction (applications, supervised, unsupervised, semi-supervised, reinforcement learning, Bayes' rule, probability theory, randomness), attachment:Presentation1_LML.ppt , 27 May 2008, Eleftherios Garyfallidis.

2. Further Introduction (what is ML, Bayes' rule, Bayesian regression, entropy, relative entropy, mutual information), attachment:Presentation2_LML.ppt , 3 June 2008, Eleftherios Garyfallidis.

3. Maximum Likelihood vs Bayesian Learning (notes available upon request), attachment:Presentation3_LML.ppt , 10 June 2008, Hamed Nili.

4. Factor Analysis, PCA and pPCA, attachment:Presentation4_LML.ppt , 17 June 2008, Hamed Nili.

5. Independent Component Analysis (ICA), attachment:Presentation5_LML.pdf , 24 June 2008, Jason Taylor.

6. ICA and Expectation Maximization (EM), 1 July 2008, Eleftherios Garyfallidis.

7. Graphical Models 1, 8 July 2008, Ian Nimmo-Smith.

8. Graphical Models 2, 9 September 2008, Ian Nimmo-Smith.

9. Graphical Models 3, 16 September 2008, Ian Nimmo-Smith.

10. Monte Carlo Sampling 1, 23 September 2008, Eleftherios Garyfallidis.

11. Monte Carlo Sampling 2 (MCMC), 30 September 2008, Eleftherios Garyfallidis.

12. Variational approximations (KL divergences, mean field, expectation propagation).

13. Model comparison (Bayes factors, Occam's razor, BIC, Laplace approximations).

14. Reinforcement Learning, Decision Making and MDPs (value functions, value iteration, policy iteration, Bellman equations, Q-learning, Bayesian decision theory).

15. Reinforcement Learning 2.

16. General Discussion Q & A.

Proposed next topics:

  • Non-parametric Methods (kernel density estimators, nearest-neighbour methods).
  • Gaussian Processes 1.
  • Gaussian Processes 2.
  • Sparse Kernel Machines 1 (support vector machines (SVM)).
  • Sparse Kernel Machines 2 (relevance vector machines (RVM)).
  • Boosting.
  • Overview of clustering methods (k-means, EM, hierarchical clustering).
  • Mutual Information with applications to registration and neuronal coding.
  • Random Field Theory with applications in fMRI.
  • Artificial Neural Networks from a probabilistic viewpoint.
  • Artificial Neural Networks 2.
  • Machine Learning methods used in SPM.
  • Machine Learning methods used in FSL.
  • Signal processing basics.
  • Fourier Transform.
  • Wavelets.
  • Spherical Harmonics.
  • Spherical Deconvolution.
  • SNR in MRI experiments.
  • and many meetings for discussion of specific papers. Any other suggestions?

Books

1. Pattern Recognition and Machine Learning, C. M. Bishop, 2006. (Copy in our Library)

2. Information Theory, Inference, and Learning Algorithms, D. J. C. MacKay, 2003. (Available online)

3. Netlab: Algorithms for Pattern Recognition, I. T. Nabney, 2001.

4. Gaussian Processes for Machine Learning, C. E. Rasmussen and C. K. I. Williams, 2006. (Available online)

Reading

EM

An online demo with mixtures of lines or mixtures of Gaussians: http://lcn.epfl.ch/tutorial/english/gaussian/html/
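The alternation the demo animates can be sketched in a few lines. The snippet below fits a two-component 1-D Gaussian mixture with plain EM; it is a minimal illustration in pure Python (not the demo's own code), with crude initialisation from the data range.

```python
import math
import random

def em_gmm_1d(data, n_iter=50):
    """EM for a two-component 1-D Gaussian mixture (illustrative sketch)."""
    # crude initialisation: means at the data extremes, unit variances, equal weights
    mu = [min(data), max(data)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [w[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 / math.sqrt(2 * math.pi * var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate means, variances and mixing weights
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk + 1e-6
            w[k] = nk / len(data)
    return mu, var, w

random.seed(0)
data = ([random.gauss(0.0, 1.0) for _ in range(200)]
        + [random.gauss(5.0, 1.0) for _ in range(200)])
mu, var, w = em_gmm_1d(data)
```

On this synthetic data the fitted means land near the true cluster centres 0 and 5, and the mixing weights near 0.5 each.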

ICA

An online demonstration of the concept is given at http://www.cis.hut.fi/projects/ica/icademo/

A tutorial is given at http://www.cs.helsinki.fi/u/ahyvarin/papers/NN00new.pdf

ICA vs PCA

A simple graphical illustration of the differences is given at http://genlab.tudelft.nl/~dick/cvonline/ica/node3.html
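The contrast in one sentence: PCA finds orthogonal directions of maximum variance using only second-order statistics (the covariance matrix), whereas ICA seeks statistically independent, generally non-orthogonal directions and must exploit higher-order statistics. The PCA half is small enough to sketch exactly for 2-D data, using the closed-form eigen-decomposition of the 2x2 covariance matrix (a minimal pure-Python illustration, with made-up test data):

```python
import math
import random

def pca_2d(xs, ys):
    """Principal axes of 2-D data from the 2x2 covariance matrix."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cxx = sum((x - mx) ** 2 for x in xs) / n
    cyy = sum((y - my) ** 2 for y in ys) / n
    cxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    # eigenvalues of [[cxx, cxy], [cxy, cyy]] in closed form
    tr, det = cxx + cyy, cxx * cyy - cxy * cxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc
    # angle of the leading eigenvector: tan(2*theta) = 2*cxy / (cxx - cyy)
    angle = 0.5 * math.atan2(2 * cxy, cxx - cyy)
    return (l1, l2), angle

random.seed(1)
# a correlated cloud stretched along the 45-degree line
xs = [random.gauss(0, 2) for _ in range(1000)]
ys = [x + random.gauss(0, 0.5) for x in xs]
(l1, l2), angle = pca_2d(xs, ys)
```

Here the leading axis comes out close to 45 degrees with one dominant eigenvalue. For a non-Gaussian source mixture, PCA would still only decorrelate the data; recovering the independent sources is exactly the extra step ICA provides.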

MCMC

Christophe Andrieu, Nando de Freitas, Arnaud Doucet and Michael I. Jordan (2003). [attachment:Andrieu2003.pdf An Introduction to MCMC for Machine Learning.] Machine Learning, 50, 5-43.
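The simplest member of the MCMC family surveyed in that paper is random-walk Metropolis: propose a Gaussian step, accept with probability min(1, p(proposal)/p(current)), and the chain's samples approximate the target. A minimal sketch (the target here is a standard normal known only up to a constant; step size and burn-in are illustrative choices):

```python
import math
import random

def metropolis(logp, x0, n_samples, step=1.0, burn=500):
    """Random-walk Metropolis sampler (minimal MCMC sketch)."""
    x, samples = x0, []
    lp = logp(x)
    for i in range(n_samples + burn):
        prop = x + random.gauss(0.0, step)     # symmetric Gaussian proposal
        lp_prop = logp(prop)
        # accept with probability min(1, p(prop)/p(x)), done in log space
        if math.log(random.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        if i >= burn:
            samples.append(x)                  # rejected steps repeat x
    return samples

random.seed(2)
# unnormalised log-density of a standard normal
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
```

The sample mean and variance come out close to 0 and 1, the moments of the target. Note that logp only needs the density up to a constant, which is what makes MCMC useful for intractable posteriors.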

Bayes Rule

Chapter 1.2 of Bishop's book is highly recommended.

http://plato.stanford.edu/entries/bayes-theorem/

[http://cocosci.berkeley.edu/tom/papers/tutorial2.pdf Thomas Griffiths, Alan Yuille. A Primer on Probabilistic Inference. ]

[http://yudkowsky.net/bayes/bayes.html An Intuitive Explanation of Bayesian Reasoning Bayes' Theorem By Eliezer Yudkowsky]

[http://homepages.wmich.edu/~mcgrew/Bayes8.pdf Eight versions of Bayes' theorem]
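For a binary hypothesis the rule discussed in all of the links above is a one-liner: P(H|E) = P(E|H)P(H) / (P(E|H)P(H) + P(E|~H)P(~H)). The snippet below works it through on the classic diagnostic-test example (the base rate, sensitivity and false-positive rate are illustrative numbers, not from any of the papers):

```python
def posterior(prior, likelihood, likelihood_alt):
    """Bayes' rule for a binary hypothesis H given evidence E:
    P(H|E) = P(E|H)P(H) / (P(E|H)P(H) + P(E|~H)P(~H))."""
    evidence = likelihood * prior + likelihood_alt * (1 - prior)
    return likelihood * prior / evidence

# illustrative numbers: 1% base rate, 95% sensitivity, 5% false-positive rate
p = posterior(prior=0.01, likelihood=0.95, likelihood_alt=0.05)
```

Despite the accurate test, the posterior is only about 0.16, because the low prior dominates; this base-rate effect is the standard cautionary example in the Yudkowsky tutorial linked above.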

Bayesian Methods in Neuroscience

[http://www.gatsby.ucl.ac.uk/~pel/papers/ppc-06.pdf Ma, W.J., Beck, J.M., Latham, P.E. & Pouget, A. (2006) Bayesian inference with probabilistic population codes. Nature Neuroscience. 9:1432-1438]

[http://cocosci.berkeley.edu/tom/papers/bayeschapter.pdf Griffiths,Kemp and Tenenbaum. Bayesian models of cognition.]

[http://www.cvs.rochester.edu/knill_lab/publications/TINS_2004.pdf Knill, D. C., & Pouget, A. (2004). The Bayesian brain: the role of uncertainty in neural coding and computation. Trends Neurosciences, 27(12), 712-719.]

[http://www.inf.ed.ac.uk/teaching/courses/mlsc/HW2papers/koerdingTiCS2006.pdf Kording, K. & Wolpert, D.M. (2006) Bayesian decision theory in sensorimotor control. TRENDS in Cognitive Sciences,10, 319-326]

Online Demos for ML

http://homepages.inf.ed.ac.uk/rbf/IAPR/researchers/PPRPAGES/pprdem.htm

Software

Public code for machine learning:

http://homepages.inf.ed.ac.uk/rbf/IAPR/researchers/MLPAGES/mlcode.htm

None: MachineLearning (last edited 2013-03-08 10:28:25 by localhost)