Instructor: Emre Neftci
Email: firstname.lastname@example.org Place: SBSG 2200.
Time: Fri 1:00PM – 3:50PM
Office Hours: Tu 11:00 - 12:30 or by appointment at SBSG 2308
Discussion Group: nmilabedu.slack.com
Recently, machine learning and artificial intelligence have repeatedly broken new ground in solving complex cognitive tasks (e.g. AlphaGo). The relevance of such feats to the cognitive sciences is evident, and makes one wonder whether machine learning can also help us understand how the building blocks of the brain subserve cognition, or even guide the design of novel brain-inspired learning machines. This course will explore the algorithmic bases of cognitive behavior and construct bridges from machine learning to biological processes of learning in the brain.
The course will start by introducing the tools necessary for modeling the dynamics of neurons and synapses and for understanding the basics of machine learning and neural networks. In particular, we will present simplified models of neurons that are amenable to analysis under a machine learning framework. Using this framework, we will construct neural systems for inference and learning tasks relevant to cognition. These models will include pattern recognition with deep learning, sequence learning with reservoir computing, competitive learning, dimensionality reduction, and probabilistic inference using neural Monte Carlo sampling. Each class will include a lecture component, followed by hands-on experimentation using software simulations of neurons. This course is ideal for students interested in doing research in computational neuroscience and machine learning. The course is intended to be accessible to students from a broad range of disciplines, with varying background knowledge in the field. However, quantitative reasoning skills, including basic calculus and computer programming, are necessary to understand and implement the core concepts of machine learning and neural dynamics. In case of doubt, interested students are encouraged to e-mail the instructor.
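To give a flavor of the simplified neuron models mentioned above, the sketch below simulates a leaky integrate-and-fire (LIF) neuron in plain Python/NumPy. This is an illustrative example only, not course-provided code; the parameter values (`tau`, `v_thresh`, etc.) are arbitrary choices for demonstration.

```python
import numpy as np

def simulate_lif(i_input, dt=1e-3, tau=20e-3,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Euler-integrate dv/dt = (v_rest - v + i) / tau, spiking at v_thresh.

    Illustrative sketch of a leaky integrate-and-fire neuron; parameter
    values are arbitrary examples, not values used in the course.
    """
    v = v_rest
    v_trace, spike_times = [], []
    for t, i in enumerate(i_input):
        v += (dt / tau) * (v_rest - v + i)   # leaky integration step
        if v >= v_thresh:                    # threshold crossing -> spike
            spike_times.append(t)
            v = v_reset                      # reset after the spike
        v_trace.append(v)
    return np.array(v_trace), spike_times

# A constant suprathreshold input current produces regular spiking.
v_trace, spike_times = simulate_lif(np.ones(1000) * 1.5)
```

Because the steady-state voltage for this input (1.5) exceeds the threshold (1.0), the neuron fires periodically, which is the classic regular-spiking regime of the LIF model.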
The programs distributed for the hands-on exercises and assignments will be written in Python and Tensorflow. If necessary, the instructor will introduce basic Python programming and Tensorflow concepts during the second week. For a Python tutorial, refer to these resources: http://www.scipy-lectures.org/intro/index.html (sections 1.1 through 1.4).
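Students unsure whether their Python background is sufficient can check themselves against a short NumPy snippet like the following, which uses the kind of array creation, elementwise math, and reductions covered in the linked tutorial sections. This is an illustrative self-test, not an assignment.

```python
import numpy as np

# Array creation, elementwise operations, and a reduction --
# the basic NumPy idioms used throughout the hands-on exercises.
x = np.linspace(0.0, 1.0, 5)   # array([0., 0.25, 0.5, 0.75, 1.])
y = x ** 2                     # elementwise square
total = y.sum()                # 0 + 0.0625 + 0.25 + 0.5625 + 1 = 1.875
print(total)                   # -> 1.875
```

If each line here reads naturally, the tutorial sections above will mostly be review.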
Students will be evaluated on an individual project of their choice, to be presented orally to the class in week 10. Ideally, this project should relate to the student's own research and, if applicable, use data already collected. The course is based on material from the following books, which may provide useful complementary information.
- Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning. MIT Press, 2016.
- M. A. Nielsen. Neural Networks and Deep Learning. 2015.
- Wulfram Gerstner, Werner M. Kistler, Richard Naud, and Liam Paninski. Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, 2014.
- Mark F. Bear, Barry W. Connors, and Michael A. Paradiso. Neuroscience. Vol. 2. Lippincott Williams & Wilkins, 2007.
- C. M. Bishop. Pattern Recognition and Machine Learning. Springer-Verlag, New York, NJ, USA, 2006.
- P. Dayan and L. F. Abbott. Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. MIT Press, 2001.
Grading: Assignments (50%), Project (50%). Reports and assignments must be submitted before the deadline posted with each assignment sheet. There will be no more than 4 individually graded assignments. The overall grade for the assignments will be calculated using the best 3.