Diff for "MachineLearning" - Methods
location: Diff for "MachineLearning"
Differences between revisions 17 and 18
Revision 17 as of 2008-06-26 12:34:32
Size: 4470
Comment:
Revision 18 as of 2008-06-26 13:00:20
Size: 4757
Comment:
Deletions are marked like this. Additions are marked like this.
Line 57: Line 57:
23. Random Field Theory. 23. Random Field Theory with applications in fMRI.
Line 59: Line 59:
24. Neural Networks. 24. Artificial Neural Networks from a probabilistic viewpoint.
Line 61: Line 61:
25. Neural Networks 2. 25. Artificial Neural Networks 2.
Line 63: Line 63:
26. SPM and Machine Learning. 26. Machine Learning methods used in SPM.
Line 65: Line 65:
27. FSL and Machine Learning. 27. Machine Learning methods used in FSL.
Line 67: Line 67:
28. Signal processing basics.
Line 68: Line 69:
29. Fast Fourier Transform.

30. Wavelets.

31. Discussion about specific papers ...

32. Spherical Deconvolution.

33. SNR in MRI experiments.

Machine Learning Pages

These pages have been compiled by members of the CBU Learning Machine Learning (LML) Group.

Machine Learning Course

1. Introduction (applications, supervised, unsupervised, semi-supervised and reinforcement learning, Bayes' rule, probability theory, randomness), attachment:Presentation1_LML.ppt, 27 May 2008, Eleftherios Garyfallidis.

2. Further Introduction (what is ML, Bayes' rule, Bayesian regression, entropy, relative entropy, mutual information), attachment:Presentation2_LML.ppt, 3 June 2008, Eleftherios Garyfallidis. A short NumPy sketch of these information-theoretic quantities appears after this list.

3. Maximum Likelihood vs Bayesian Learning (notes available upon request), attachment:Presentation3_LML.ppt, 10 June 2008, Hamed Nili.

4. Factor Analysis, PCA and pPCA, attachment:Presentation4_LML.ppt, 17 June 2008, Hamed Nili.

5. Independent Component Analysis (ICA), attachment:Presentation5_LML.pdf, 24 June 2008, Jason Taylor.

6. ICA and Expectation Maximization (EM), Eleftherios Garyfallidis, 1 July 2008.

7. Graphical Models 1, 8 July 2008, Ian Nimmo-Smith.

8. Graphical Models 2, 9 September 2008, Ian Nimmo-Smith.

9. Graphical Models 3, 16 September 2008, Ian Nimmo-Smith.

10. Monte Carlo Sampling 1, 23 September 2008, Eleftherios Garyfallidis.

11. Monte Carlo Sampling 2 (MCMC), 30 September 2008, Eleftherios Garyfallidis.

12. Variational approximations (KL divergences, mean field, expectation propagation).

13. Model comparison (Bayes factors, Occam's razor, BIC, Laplace approximations).

14. Reinforcement Learning, Decision Making and MDPs (value functions, value iteration, policy iteration, Bellman equations, Q-learning, Bayesian decision theory).

15. Reinforcement Learning 2.

16. General Discussion and Q &amp; A.
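
To make the information-theoretic quantities mentioned in session 2 concrete, here is a minimal NumPy sketch (a toy illustration assuming NumPy is available; the joint distribution below is invented for the example and is not taken from any of the presentations):

{{{#!python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits for a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                              # treat 0 * log 0 as 0
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits; assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def mutual_information(joint):
    """I(X; Y) = D( p(x, y) || p(x) p(y) ) for a joint probability table."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)     # marginal over rows
    py = joint.sum(axis=0, keepdims=True)     # marginal over columns
    return kl_divergence(joint.ravel(), (px * py).ravel())

# Toy joint distribution of two binary variables
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print(entropy(joint.sum(axis=1)))             # H(X) = 1 bit
print(mutual_information(joint))              # I(X; Y) is about 0.278 bits
}}}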

Proposed next topics:

17. Non-parametric Methods (kernel density estimators, nearest-neighbour methods).

18. Gaussian Processes 1.

19. Gaussian Processes 2.

20. Sparse Kernel Machines (support vector machines, SVM).

21. Sparse Kernel Machines 2 (relevance vector machines, RVM).

22. Boosting.

23. Overview of clustering methods (k-means, EM, hierarchical clustering); a short k-means sketch follows this list.

24. Mutual Information with applications to registration and neuronal coding.

25. Random Field Theory with applications in fMRI.

26. Artificial Neural Networks from a probabilistic viewpoint.

27. Artificial Neural Networks 2.

28. Machine Learning methods used in SPM.

29. Machine Learning methods used in FSL.

30. Signal processing basics.

31. Fast Fourier Transform.

32. Wavelets.

33. Discussion of specific papers ...

34. Spherical Deconvolution.

35. SNR in MRI experiments.
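
As a companion to the clustering overview above, here is a minimal from-scratch k-means sketch (assuming NumPy; the two-blob data set is invented for illustration, and in practice a library routine such as sklearn.cluster.KMeans would normally be used instead):

{{{#!python
import numpy as np

def kmeans(X, k, n_iter=100, seed=None):
    """Plain k-means: alternate nearest-centre assignment and mean updates."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)]   # initial centres
    for _ in range(n_iter):
        # Assignment step: label each point with its nearest centre
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Update step: move each centre to the mean of its assigned points
        new_centres = np.array([X[labels == j].mean(axis=0)
                                if np.any(labels == j) else centres[j]
                                for j in range(k)])
        if np.allclose(new_centres, centres):                # converged
            break
        centres = new_centres
    return centres, labels

# Two well-separated 2-D blobs as a toy example
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
centres, labels = kmeans(X, k=2, seed=0)
print(np.round(centres, 1))    # roughly (0, 0) and (5, 5)
}}}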

Books

1. Pattern Recognition and Machine Learning, C. M. Bishop, 2006.

2. Information Theory, Inference, and Learning Algorithms, D. J. C. MacKay, 2003.

3. NETLAB: Algorithms for Pattern Recognition, I. T. Nabney, 2001.

Reading

ICA vs PCA

A simple graphical representation of the differences is given in http://genlab.tudelft.nl/~dick/cvonline/ica/node3.html
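
For a quick hands-on version of the same comparison, the sketch below mixes two known sources and unmixes them with both methods (a minimal illustration assuming NumPy and scikit-learn are installed; the sinusoid/sawtooth sources and the mixing matrix are invented for the example):

{{{#!python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# Two independent sources: a sinusoid and a sawtooth
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(2 * t), np.mod(t, 1.0) - 0.5]
A = np.array([[1.0, 0.5],          # mixing matrix
              [0.5, 1.0]])
X = S @ A.T                        # observed mixtures

# PCA finds orthogonal directions of maximum variance ...
X_pca = PCA(n_components=2).fit_transform(X)
# ... while ICA looks for statistically independent, non-Gaussian sources
X_ica = FastICA(n_components=2, random_state=0).fit_transform(X)

# Correlation of each ICA component with the true sources: close to +/-1
# (up to sign and ordering), which PCA does not achieve on this mixture
print(np.round(np.corrcoef(S.T, X_ica.T)[0:2, 2:4], 2))
}}}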

ICA

A tutorial is given in http://www.cs.helsinki.fi/u/ahyvarin/papers/NN00new.pdf

MCMC

Christophe Andrieu, Nando de Freitas, Arnaud Doucet and Michael I. Jordan (2003). [attachment:Andrieu2003.pdf An Introduction to MCMC for Machine Learning.] Machine Learning, 50, 5–43.
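
A minimal random-walk Metropolis sketch of the basic idea (assuming NumPy; the two-component Gaussian mixture target and the step size are arbitrary choices for illustration, not taken from the paper):

{{{#!python
import numpy as np

def metropolis(log_target, n_samples=10000, x0=0.0, step=1.0, seed=None):
    """Random-walk Metropolis: propose x' ~ N(x, step^2) and accept with
    probability min(1, target(x') / target(x))."""
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x = x0
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal()
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal               # accept the move
        samples[i] = x                 # otherwise keep the current state
    return samples

# Toy target: an unnormalised mixture of two unit-variance Gaussians at -2 and 2
log_target = lambda x: np.log(np.exp(-0.5 * (x + 2) ** 2) +
                              np.exp(-0.5 * (x - 2) ** 2))
samples = metropolis(log_target, seed=0)
print(samples.mean(), samples.std())   # roughly 0 and sqrt(5) for a long chain
}}}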

Bayes Rule

Section 1.2 of the first chapter of Bishop's book is highly recommended.

http://plato.stanford.edu/entries/bayes-theorem/

[http://cocosci.berkeley.edu/tom/papers/tutorial2.pdf Thomas Griffiths and Alan Yuille. A Primer on Probabilistic Inference.]

[http://yudkowsky.net/bayes/bayes.html An Intuitive Explanation of Bayes' Theorem, by Eliezer Yudkowsky]

[http://homepages.wmich.edu/~mcgrew/Bayes8.pdf Eight versions of Bayes' theorem]
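
The point most of these introductions make can be reproduced with a few lines of arithmetic (a toy medical-test example; the prevalence and error rates are invented for illustration and are not taken from the links above):

{{{#!python
# Bayes' rule: P(disease | positive) =
#     P(positive | disease) * P(disease) / P(positive)
p_disease = 0.01               # prior prevalence
p_pos_given_disease = 0.95     # test sensitivity
p_pos_given_healthy = 0.05     # false-positive rate

p_pos = (p_pos_given_disease * p_disease +
         p_pos_given_healthy * (1 - p_disease))
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))   # about 0.161: a positive test still leaves
                                       # the disease unlikely, because the prior is low
}}}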

Bayesian Methods in Neuroscience

[http://www.gatsby.ucl.ac.uk/~pel/papers/ppc-06.pdf Ma, W.J., Beck, J.M., Latham, P.E. & Pouget, A. (2006) Bayesian inference with probabilistic population codes. Nature Neuroscience. 9:1432-1438]

[http://cocosci.berkeley.edu/tom/papers/bayeschapter.pdf Griffiths, Kemp and Tenenbaum. Bayesian models of cognition.]

[http://www.cvs.rochester.edu/knill_lab/publications/TINS_2004.pdf Knill, D. C., & Pouget, A. (2004). The Bayesian brain: the role of uncertainty in neural coding and computation. Trends in Neurosciences, 27(12), 712-719.]

[http://www.inf.ed.ac.uk/teaching/courses/mlsc/HW2papers/koerdingTiCS2006.pdf Kording, K. & Wolpert, D. M. (2006). Bayesian decision theory in sensorimotor control. Trends in Cognitive Sciences, 10, 319-326.]

Software

Public code for machine learning:

http://homepages.inf.ed.ac.uk/rbf/IAPR/researchers/MLPAGES/mlcode.htm
