Course Syllabus
The syllabus on this page is synchronized with the NTU online course information system.
Course Information
| Item | Content |
| Course title | Neural Networks |
| Semester | 113-2 |
| Designated for | Graduate Institute of Brain and Mind Sciences, Program of Neurobiology and Cognitive Science |
| Instructor | JOSHUA GOH OON SOO |
| Curriculum No. | GIBMS 7015 |
| Curriculum Id No. | 454 M0390 |
| Class | |
| Credit | 3 |
| Full/Half Yr. | Half |
| Required/Elective | Elective |
| Time | Friday 2,3,4(9:10~12:10) |
| Place | 基醫508 (Basic Medical Science Building, Room 508) |
| Remarks | The course is conducted in English. |
Course Syllabus
| Item | Content |
| Course Description | This course introduces basic principles of neural networks in relation to human cognition, with applied, practical programming of simple neural networks. Students will read three modeling papers and apply the neural network models in these papers to create their own neural networks, in addition to regular class assignments. Four examples of network implementation will be covered: 1) the basic perceptron, 2) attractor networks (Hopfield, 1982), 3) backpropagation (the multi-layered perceptron; Rumelhart et al., 1986), and 4) unsupervised learning (Von der Malsburg, 1973). Assignments: In addition to homework to aid understanding, there will be three larger course assignments: to program the three neural networks from the papers, using the software listed under Course Requirement, and to apply them to real-life problems or simulations of human cognition. Grading: Students will be graded on the quality of their assignments in terms of model success and the comprehensiveness with which the models are evaluated as examples of a real cognitive phenomenon. Homework, where given, counts toward bonus credit. Short illustrative code sketches of these models appear after the tables below. |
| Course Objective | a) To learn the basic principles of how neural network models work. b) To build one's own simple neural networks. c) To learn how to evaluate neural network models. |
| Course Requirement | Students in the Graduate Institute of Brain and Mind Sciences; confidence in computer programming. Jupyter Notebook (https://jupyter.org/) and R (https://www.r-project.org/; with RStudio, https://www.rstudio.com/) are free to download and install, and are recommended for the modeling work in this course; the course will use Python and R code for different models. Students should bring their own computer with the above software installed and ready to use. Officially, there will be no auditing unless a very special reason is given; other interested students will be considered on a case-by-case basis. |
| Expected weekly study hours before and/or after class | A person with average coding proficiency might expect to spend about 24 hrs/wk outside of class time to complete the assignments. |
| References | The book Neural Networks and Deep Learning: A Textbook (Charu C. Aggarwal, 2018, Springer, Cham, Switzerland) is a useful resource for this course and for neural network modeling in general. |
| Designated Reading | 1. Jordan, M. I. (1986). An introduction to linear algebra in parallel distributed processing. In D. E. Rumelhart & J. L. McClelland (Eds.), Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations (pp. 365–422). MIT Press, Cambridge, MA, USA. 2. Aggarwal, C. C. (2018). Neural networks and deep learning: A textbook. Springer, Cham, Switzerland. Ch. 1. 3. Hopfield, J. J. (1982). Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences USA, 79, 2554–2558. 4. Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1986). Learning internal representations by error propagation. In D. E. Rumelhart & J. L. McClelland (Eds.), Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations. MIT Press, Cambridge, MA, USA. 5. Von der Malsburg, C. (1973). Self-organization of orientation sensitive cells in the striate cortex. Kybernetik, 14, 85–100. |
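As a flavour of the programming involved (see Week 4 in the schedule below), here is a minimal, hypothetical sketch in Python with NumPy of a single-layer perceptron learning the logical AND function. It is purely illustrative, not course-provided code; the learning rate, seed, and number of epochs are arbitrary choices.

```python
# Minimal, illustrative perceptron sketch (assumption: Python + NumPy, as recommended above).
import numpy as np

# Inputs for logical AND, with a trailing bias column of 1s, and the target outputs.
X = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]])
y = np.array([0, 0, 0, 1])

rng = np.random.default_rng(0)
w = rng.normal(size=3)       # weights, including the bias weight
eta = 0.1                    # learning rate (arbitrary choice)

for epoch in range(50):
    for xi, target in zip(X, y):
        out = 1 if xi @ w > 0 else 0        # step activation
        w += eta * (target - out) * xi      # perceptron learning rule

print([1 if xi @ w > 0 else 0 for xi in X])  # should print [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron learning rule is guaranteed to converge; XOR, by contrast, requires the multi-layered networks covered later in the course.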
Progress
| Week | Date | Topic |
| Week 1 | 2025/02/21 | Introduction: Biology, why model, and general approach. |
| Week 2 | 2025/02/28 | Peace Memorial Day (no class) |
| Week 3 | 2025/03/07 | Linear algebra: Vectors, matrices, and matrix operations; reading Jordan (1986). [HW 1] |
| Week 4 | 2025/03/14 | Perceptrons: Nomenclature, general neural network framework, application in logic problems; reading Aggarwal (2018), Ch. 1. [HW 2] |
| Week 5 | 2025/03/21 | Attractor networks 1: Introduction to the principles of autoencoding and memory; reading Hopfield (1982). |
| Week 6 | 2025/03/28 | Attractor networks 2: Simple autoencoder architecture and learning rule to instantiate content-addressable memory, attractor properties. |
| Week 7 | 2025/04/04 | Children's Day, Tomb Sweeping Day (no class) |
| Week 8 | 2025/04/11 | Attractor networks 3: Evaluating and describing the autoencoder model. [Assignment 1] |
| Week 9 | 2025/04/18 | Backpropagation 1: Introduction to the principles of multi-layered perceptrons and error-based learning; reading Rumelhart et al. (1986). |
| Week 10 | 2025/04/25 | Backpropagation 2: Simple multi-layered perceptron to instantiate error-based learning, non-linear input-output mappings. |
| Week 11 | 2025/05/02 | Backpropagation 3: Evaluating and describing the multi-layered perceptron model. [Assignment 2] |
| Week 12 | 2025/05/09 | Unsupervised learning 1: Introduction to the principles of functional self-organization and convolution in V1 orientation selectivity; reading Von der Malsburg (1973). |
| Week 13 | 2025/05/16 | Unsupervised learning 2: Unpacking the neural network model in Von der Malsburg (1973). |
| Week 14 | 2025/05/23 | Unsupervised Learning 3: Evaluating the Von der Malsburg model. [Assignment 3] |
| Week 15 | 2025/05/30 | Recurrent neural networks (reading: Aggarwal, 2018, Ch. 2) and convolutional neural networks (reading: Aggarwal, 2018, Ch. 3). |
| Week 16 | 2025/06/06 | Exploding and vanishing gradients, overfitting, regularization. |
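For the attractor-network material (Weeks 5–8), the sketch below is a minimal, hypothetical Hopfield-style network in Python/NumPy: it stores two binary (+1/-1) patterns with a Hebbian outer-product rule and recalls one of them from a corrupted cue (cf. Hopfield, 1982). The pattern values, network size, and synchronous update are illustrative simplifications, not the assignment solution.

```python
# Illustrative Hopfield-style attractor network: Hebbian storage and pattern completion.
import numpy as np

patterns = np.array([[ 1, -1,  1, -1,  1, -1],
                     [ 1,  1, -1, -1,  1,  1]])   # two toy patterns to memorize
n = patterns.shape[1]

# Hebbian outer-product weight matrix, with self-connections removed
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0)

def recall(state, steps=10):
    """Iterate the network until (in practice) it settles into a stored attractor."""
    state = state.copy()
    for _ in range(steps):                        # synchronous updates for simplicity
        state = np.where(W @ state >= 0, 1, -1)
    return state

cue = patterns[0].copy()
cue[0] *= -1                                      # corrupt the cue by flipping one bit
print(recall(cue))                                # settles back to patterns[0]
```

The corrupted cue is pulled back to the stored pattern because the stored pattern is a fixed point (attractor) of the update rule; this content-addressable behaviour is the focus of Assignment 1.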
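For the backpropagation material (Weeks 9–11), here is a minimal sketch of a two-layer perceptron trained by gradient descent on XOR, a non-linear input-output mapping a single-layer perceptron cannot learn (cf. Rumelhart et al., 1986). The sigmoid activation, squared-error loss, hidden-layer size, learning rate, and epoch count are assumptions chosen for illustration only.

```python
# Illustrative multi-layered perceptron trained with backpropagation on XOR.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # input -> hidden weights and biases
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # hidden -> output weights and biases

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
eta = 1.0                                        # learning rate (arbitrary choice)

for epoch in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of squared error, using the sigmoid derivative s*(1-s)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates
    W2 -= eta * h.T @ d_out;  b2 -= eta * d_out.sum(axis=0)
    W1 -= eta * X.T @ d_h;    b1 -= eta * d_h.sum(axis=0)

print(out.round(2).ravel())   # with most initializations this approaches [0, 1, 1, 0]
```

The hidden layer learns an internal representation that makes the XOR mapping linearly separable at the output; evaluating such a model is the focus of Assignment 2.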
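For the unsupervised-learning material (Weeks 12–14), the sketch below is a drastically simplified illustration of the Hebbian-learning-plus-weight-normalization idea, using winner-take-all selection as a crude stand-in for the lateral excitation/inhibition in Von der Malsburg (1973). The toy stimuli, network size, and learning rate are arbitrary assumptions; the full model of orientation selectivity is considerably richer.

```python
# Illustrative Hebbian self-organization: units become selective to different inputs.
import numpy as np

rng = np.random.default_rng(2)

# Four toy "stimulus" patterns, roughly one-hot, that the network sees repeatedly
stimuli = np.eye(4) + 0.1 * rng.random((4, 4))

n_units = 4
W = rng.random((n_units, 4))
W /= W.sum(axis=1, keepdims=True)             # keep each unit's total synaptic weight fixed

eta = 0.1
for step in range(500):
    x = stimuli[rng.integers(len(stimuli))]   # present a random stimulus
    activity = W @ x
    winner = np.argmax(activity)              # crude stand-in for lateral competition
    W[winner] += eta * x                      # Hebbian growth for the most active unit
    W[winner] /= W[winner].sum()              # renormalize (synaptic resource constraint)

# Which unit responds most strongly to each stimulus after learning;
# ideally each stimulus recruits a different unit, though simple
# winner-take-all learning can leave some units unused.
print(np.argmax(W @ stimuli.T, axis=0))
```

No target outputs are given anywhere in the loop; selectivity emerges from the interplay of Hebbian growth, normalization, and competition, which is the theme of Assignment 3.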
Makeup Class Information
| No. | Date | Start Time | End Time | Location or Method |
Grading
| No. | Item | Percentage | Explanation of criteria |
| 1 | Assignment 1: Attractor model | 30% | Properly structured and commented code, with an organized report on the implementation/simulations and adequate descriptions of work processes, hypotheses, analytical reasoning, results, and conclusions. |
| 2 | Assignment 2: Multi-layered perceptron model | 30% | Properly structured and commented code, with an organized report on the implementation/simulations and adequate descriptions of work processes, hypotheses, analytical reasoning, results, and conclusions. |
| 3 | Assignment 3: Unsupervised learning model | 40% | Properly structured and commented code, with an organized report on the implementation/simulations and adequate descriptions of work processes, hypotheses, analytical reasoning, results, and conclusions. |
Adjustment methods for students
| Adjustment method | Description |
| Teaching methods | |
| Assignment submission methods | Extension of the deadline for submitting assignments |
| Exam methods | |
| Others | Negotiated between the instructor and students |
Office Hour
| Remarks | None |