Diff for "MachineLearning" - Methods
location: Diff for "MachineLearning"
Differences between revisions 19 and 20
Revision 19 as of 2008-06-26 13:06:20
Size: 4910
Comment:
Revision 20 as of 2008-06-26 13:16:55
Size: 4943
Comment:
Deletions are marked like this. Additions are marked like this.
Line 37: Line 37:
17. General Discussion Q & A. 16. General Discussion Q & A.
Line 41: Line 41:
16. Non-parametric Methods ( Kernel density estimators, nearest-neighbour methods). 17. Non-parametric Methods ( Kernel density estimators, nearest-neighbour methods).
Line 43: Line 43:
17. Gaussian Processes 1. 18. Gaussian Processes 1.
Line 45: Line 45:
18. Gaussian Processes 2. 19. Gaussian Processes 2.
Line 47: Line 47:
18. Sparse Kernel Machines (support vector machines (SVM)) 20. Sparse Kernel Machines (support vector machines (SVM))
Line 49: Line 49:
19. Sparse Kernel Machines 2 (relevance vector machines (RVM)) 21. Sparse Kernel Machines 2 (relevance vector machines (RVM))
Line 51: Line 51:
20. Boosting. 22. Boosting.
Line 53: Line 53:
21. Overview of clustering methods ( k-means, EM, hierarchical clustering). 23. Overview of clustering methods ( k-means, EM, hierarchical clustering).
Line 55: Line 55:
22. Mutual Information with applications to registration and neuronal coding. 24. Mutual Information with applications to registration and neuronal coding.
Line 57: Line 57:
23. Random Field Theory with applications in fMRI. 25. Random Field Theory with applications in fMRI.
Line 59: Line 59:
24. Artificial Neural Networks from a probabilistic viewpoint. 26. Artificial Neural Networks from a probabilistic viewpoint.
Line 61: Line 61:
25. Artificial Neural Networks 2. 27. Artificial Neural Networks 2.
Line 63: Line 63:
26. Machine Learning methods used in SPM. 28. Machine Learning methods used in SPM.
Line 65: Line 65:
27. Machine Learning methods used in FSL. 29. Machine Learning methods used in FSL.
Line 67: Line 67:
28. Signal processing basics. 30. Signal processing basics.
Line 69: Line 69:
29. Fast Fourier Transform. 31. Fourier Transform.
Line 71: Line 71:
30. Wavelets. 32. Wavelets.
Line 73: Line 73:
31. Discussion about specific papers ... 33. Spherical Harmonics.
Line 75: Line 75:
32. Spherical Deconvolution. 34. Spherical Deconvolution.
Line 77: Line 77:
33. SNR in MRI experiments. 35. SNR in MRI experiments.
Line 79: Line 79:
and many meetings for discussion of specific papers.

Machine Learning Pages

These pages have been compiled by members of the CBU Learning Machine Learning (LML) Group.

Machine Learning Course

1. Introduction (applications, supervised, unsupervised, semi-supervised, reinforcement learning, Bayes' rule, probability theory, randomness), attachment:Presentation1_LML.ppt, 27 May 2008, Eleftherios Garyfallidis.

2. Further Introduction (what is ML, Bayes' rule, Bayesian regression, entropy, relative entropy, mutual information), attachment:Presentation2_LML.ppt, 3 June 2008, Eleftherios Garyfallidis.

3. Maximum Likelihood vs Bayesian Learning (notes available upon request), attachment:Presentation3_LML.ppt, 10 June 2008, Hamed Nili.

4. Factor Analysis, PCA and pPCA, attachment:Presentation4_LML.ppt, 17 June 2008, Hamed Nili.

5. Independent Component Analysis (ICA), attachment:Presentation5_LML.pdf, 24 June 2008, Jason Taylor.

6. ICA and Expectation Maximization (EM), 1 July 2008, Eleftherios Garyfallidis.

7. Graphical Models 1, 8 July 2008, Ian Nimmo-Smith.

8. Graphical Models 2, 9 September 2008, Ian Nimmo-Smith.

9. Graphical Models 3, 16 September 2008, Ian Nimmo-Smith.

10. Monte Carlo Sampling 1, 23 September 2008, Eleftherios Garyfallidis.

11. Monte Carlo Sampling 2 (MCMC), 30 September 2008, Eleftherios Garyfallidis.

12. Variational approximations (KL divergences, mean field, expectation propagation).

13. Model comparison (Bayes factors, Occam's razor, BIC, Laplace approximations).

14. Reinforcement Learning, Decision Making and MDPs (value functions, value iteration, policy iteration, Bellman equations, Q-learning, Bayesian decision theory).

15. Reinforcement Learning 2.

16. General Discussion Q & A.

Proposed next topics :

17. Non-parametric Methods (kernel density estimators, nearest-neighbour methods).

18. Gaussian Processes 1.

19. Gaussian Processes 2.

20. Sparse Kernel Machines (support vector machines (SVM)).

21. Sparse Kernel Machines 2 (relevance vector machines (RVM)).

22. Boosting.

23. Overview of clustering methods (k-means, EM, hierarchical clustering); see the k-means sketch after this list.

24. Mutual Information with applications to registration and neuronal coding.

25. Random Field Theory with applications in fMRI.

26. Artificial Neural Networks from a probabilistic viewpoint.

27. Artificial Neural Networks 2.

28. Machine Learning methods used in SPM.

29. Machine Learning methods used in FSL.

30. Signal processing basics.

31. Fourier Transform.

32. Wavelets.

33. Spherical Harmonics.

34. Spherical Deconvolution.

35. SNR in MRI experiments.

and many meetings for discussion of specific papers.
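
As a taster for proposed topic 23, here is a minimal sketch of Lloyd's algorithm for k-means clustering. It is not taken from any of the course materials; the toy data are invented, only numpy is assumed, and it is meant as an illustration rather than a reference implementation.

{{{#!python
# Minimal sketch of Lloyd's algorithm for k-means (toy illustration only).
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    rng = np.random.RandomState(seed)
    # Initialise the centroids with k randomly chosen data points.
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: each point goes to its nearest centroid.
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points
        # (an empty cluster simply keeps its old centroid).
        new_centroids = np.array([X[labels == j].mean(axis=0)
                                  if np.any(labels == j) else centroids[j]
                                  for j in range(k)])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels

# Invented toy data: three Gaussian blobs in 2D.
rng = np.random.RandomState(1)
X = np.vstack([rng.randn(100, 2) + c for c in ([0, 0], [5, 5], [0, 5])])
centroids, labels = kmeans(X, 3)
print(centroids)
}}}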

Books

1. Pattern Recognition and Machine Learning, C. M. Bishop, 2006. (Copy in our Library)

2. Information Theory, Inference, and Learning Algorithms, D. J. C. MacKay, 2003. (Available online)

3. Netlab: Algorithms for Pattern Recognition, I. T. Nabney, 2001.

4. Gaussian Processes for Machine Learning, C. E. Rasmussen and C. K. I. Williams, 2006. (Available online)

Reading

ICA vs PCA

A simple graphical representation of the differences is given in http://genlab.tudelft.nl/~dick/cvonline/ica/node3.html
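
For a hands-on feel of the difference, here is a minimal sketch (not taken from the linked page) that mixes two independent non-Gaussian sources and then applies PCA and ICA to the mixture; it assumes numpy and scikit-learn are installed, and the sources and mixing matrix are invented.

{{{#!python
# Minimal PCA vs ICA sketch on an invented two-source mixing problem.
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.RandomState(0)
n = 2000
t = np.linspace(0, 8, n)

# Two statistically independent, non-Gaussian sources.
s1 = np.sign(np.sin(3 * t))   # square wave
s2 = rng.laplace(size=n)      # heavy-tailed noise
S = np.c_[s1, s2]

# Observed data are unknown linear mixtures of the sources.
A = np.array([[1.0, 0.5],
              [0.5, 1.0]])
X = S @ A.T

# PCA finds orthogonal directions of maximal variance (decorrelation only).
X_pca = PCA(n_components=2).fit_transform(X)

# ICA looks for components that are statistically independent, which
# (up to scale and ordering) tends to recover the original sources.
X_ica = FastICA(n_components=2, random_state=0).fit_transform(X)

def best_match(recovered, source):
    # Largest absolute correlation between a true source and any
    # recovered component (recovery is only defined up to sign and order).
    return max(abs(np.corrcoef(source, comp)[0, 1]) for comp in recovered.T)

print("PCA best match to the square wave:", best_match(X_pca, s1))
print("ICA best match to the square wave:", best_match(X_ica, s1))
}}}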

ICA

A tutorial is given in http://www.cs.helsinki.fi/u/ahyvarin/papers/NN00new.pdf

MCMC

Christophe Andrieu, Nando de Freitas, Arnaud Doucet and Michael I. Jordan (2003). [attachment:Andrieu2003.pdf An Introduction to MCMC for Machine Learning.] Machine Learning, 50, 5–43.
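
The following minimal sketch illustrates the random-walk Metropolis sampler, the simplest of the algorithms covered in the tutorial above; the target density (an unnormalised two-component Gaussian mixture) and all settings are invented for illustration, and only numpy is assumed.

{{{#!python
# Minimal random-walk Metropolis sketch on an invented 1-D target density.
import numpy as np

def log_target(x):
    # Unnormalised log-density of an equal-weight mixture of two
    # unit-variance Gaussians centred at -2 and +2.
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

def metropolis(n_samples, step=1.0, seed=0):
    rng = np.random.RandomState(seed)
    x = 0.0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        # Gaussian random-walk proposal around the current state.
        proposal = x + step * rng.randn()
        # Accept with probability min(1, p(proposal) / p(current)).
        if np.log(rng.rand()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x
    return samples

draws = metropolis(10000)
print("sample mean:", draws.mean(), "sample std:", draws.std())
}}}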

Bayes Rule

Highly recommended: Section 1.2 (Probability Theory) of the first chapter of Bishop's book.

http://plato.stanford.edu/entries/bayes-theorem/

[http://cocosci.berkeley.edu/tom/papers/tutorial2.pdf Thomas Griffiths, Alan Yuille. A Primer on Probabilistic Inference. ]

[http://yudkowsky.net/bayes/bayes.html An Intuitive Explanation of Bayes' Theorem, by Eliezer Yudkowsky]

[http://homepages.wmich.edu/~mcgrew/Bayes8.pdf Eight versions of Bayes' theorem]
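
A tiny worked example of Bayes' rule may help fix ideas; the numbers below are invented for illustration (a rare disease and an imperfect diagnostic test).

{{{#!python
# Worked Bayes' rule example with invented numbers.
prior = 0.01           # P(disease)
sensitivity = 0.95     # P(positive test | disease)
false_positive = 0.05  # P(positive test | no disease)

# P(positive test), by the law of total probability.
evidence = sensitivity * prior + false_positive * (1 - prior)

# Bayes' rule: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
posterior = sensitivity * prior / evidence
print(posterior)  # ~0.16: a positive test raises the 1% prior to about 16%
}}}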

Bayesian Methods in Neuroscience

[http://www.gatsby.ucl.ac.uk/~pel/papers/ppc-06.pdf Ma, W.J., Beck, J.M., Latham, P.E. & Pouget, A. (2006) Bayesian inference with probabilistic population codes. Nature Neuroscience. 9:1432-1438]

[http://cocosci.berkeley.edu/tom/papers/bayeschapter.pdf Griffiths,Kemp and Tenenbaum. Bayesian models of cognition.]

[http://www.cvs.rochester.edu/knill_lab/publications/TINS_2004.pdf Knill, D. C., & Pouget, A. (2004). The Bayesian brain: the role of uncertainty in neural coding and computation. Trends in Neurosciences, 27(12), 712-719.]

[http://www.inf.ed.ac.uk/teaching/courses/mlsc/HW2papers/koerdingTiCS2006.pdf Kording, K. & Wolpert, D.M. (2006) Bayesian decision theory in sensorimotor control. Trends in Cognitive Sciences, 10, 319-326]

Software

Public code for machine learning:

http://homepages.inf.ed.ac.uk/rbf/IAPR/researchers/MLPAGES/mlcode.htm
