Introduction to Computational Cognitive Neuroscience

Alexander A. Petrov, Ohio State University
apetrov [at] alexpetrov.com, http://alexpetrov.com/

Course Overview

How does cognition emerge from the brain? This course introduces you to the new and exciting field of Computational Cognitive Neuroscience (CCN) that provides important pieces of the answer to this question. We focus on simulations of cognitive and perceptual processes, using neural network models that bridge the gap between biology and behavior. We adopt the Leabra framework of Randy O'Reilly and Yuko Munakata and use their 2000 book Computational Explorations in Cognitive Neuroscience as the main text for the course.

We first consider the basic biological and computational properties of individual neurons and networks of neurons, as well as their idealized Leabra counterparts. We discuss their role in basic processing mechanisms such as spreading activation, inhibition, and multiple constraint satisfaction. We then discuss learning mechanisms that allow networks of neurons to build internal models of their environments and perform complex tasks. Models illustrating these ideas will be demonstrated in class and discussed further in the afternoon sessions. We complement the simple demos with a case study of a full-blown model of visual object recognition and selective attention.

Next, we turn to big-picture issues and present (an outline of) a comprehensive connectionist proposal of a cognitive architecture. We discuss how different brain systems (e.g., hippocampus, parietal cortex, frontal cortex) specialize to solve difficult computational tradeoffs. Finally, we discuss how neural networks can learn to do complicated symbol-manipulation tasks once thought to be the exclusive domain of symbolic architectures such as ACT-R. As each of these topics requires a course of its own, all we can do in a week is introduce the key ideas, illustrate how they fit together, and provide pointers to the literature. We will also introduce the PDP++ neural network simulator, which will enable you to explore the variety of models that come with the book.
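To give a flavor of what "spreading activation, inhibition, and multiple constraint satisfaction" mean in practice, here is a minimal sketch in plain Python. This is not PDP++ or Leabra code; the weights, learning-free update rule, and parameter values are all illustrative assumptions. Three units are linked by symmetric weights that encode constraints: units 0 and 1 support each other, and both are incompatible with unit 2. Repeated updating lets the network settle into a state that satisfies the constraints.

```python
import numpy as np

# Toy constraint-satisfaction network (illustrative only, not Leabra).
# Symmetric weights encode constraints: positive = mutual support,
# negative = inhibition between incompatible units.
W = np.array([[ 0.0,  0.8, -0.8],
              [ 0.8,  0.0, -0.8],
              [-0.8, -0.8,  0.0]])

act = np.array([0.6, 0.1, 0.5])        # initial activations in [0, 1]

for _ in range(50):                     # settle by repeated updating
    net = W @ act                       # spreading activation
    act = np.clip(act + 0.1 * net, 0.0, 1.0)  # bounded activation update

print(act)  # units 0 and 1 win; the inhibited unit 2 is driven to zero
```

Real Leabra networks use a biologically grounded point-neuron activation function and k-winners-take-all inhibition rather than this crude clipping, but the settling dynamic is the same basic idea.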

Prerequisites

This course is open to all participants in the Summer School. However, we will cover a lot of ground and you are strongly advised to attend the introductory connectionist course taught by Bob French during the first week of the Summer School. We will take for granted many concepts (e.g., perceptrons, Hebbian learning) covered in Bob's course. Obviously, the stronger your background in computational modeling and neuroscience, the better. Knowledge of statistics and cognitive psychology is a great asset too. I will do my best to make the course useful to you even if you know relatively little in these areas. We will all see whether this is realistic. While the models we will be using are mathematically based, only algebra and some simple calculus-level concepts are involved. Computer programming experience is not required because the models are accessible via a graphical interface.
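As a taste of one concept we will take for granted, here is Hebbian learning in its simplest form: strengthen the weight between two units in proportion to how strongly they are active together. The sketch below is a generic illustration in Python, not the specific Hebbian variant used in Leabra; the pattern sizes and learning rate are arbitrary assumptions.

```python
import numpy as np

# Plain Hebbian learning (illustrative; sizes and rate are arbitrary).
W = np.zeros((2, 3))               # weights from 3 input units to 2 output units
lrate = 0.5

x = np.array([1.0, 0.0, 1.0])      # an input pattern
y = np.array([1.0, 0.0])           # the output pattern it co-occurs with

# Hebb's rule: dW[i,j] = lrate * y[i] * x[j]
W += lrate * np.outer(y, x)

print(W @ x)  # after learning, the input pattern evokes the paired output
```

After a single pairing, the input already activates the output unit it co-occurred with and leaves the other output silent.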

Textbook

O'Reilly, R. C. & Munakata, Y. (2000). Computational Explorations in Cognitive Neuroscience: Understanding the Mind by Simulating the Brain. Cambridge, MA: MIT Press.

This book is a valuable addition to your library. I strongly advise you to obtain a copy of your own and read as much as possible before the Summer School begins. As a former student in several summer schools at NBU, I assure you from personal experience that once the school starts everything happens awfully fast. Do not fool yourselves that you will be able to read anything after a full day of classes in the morning, labs in the afternoon, and parties in the evening (and night).

Afternoon Lab

The textbook comes with dozens of pre-built neural network models that illustrate key principles and phenomena. They are all built in the Leabra framework using the PDP++ simulator. First-hand experience with these models is the best way to understand how neural networks really work. We will demonstrate several models during the morning lectures, but will not have time to sink our teeth into them. The afternoon lab provides an opportunity for more detailed exploration. We can change various parameters and see how the model's behavior changes, discuss alternative network topologies, and so forth. Those of you who have laptops will be able to install the simulator on your own machines and explore on your own. The afternoon lab will be useful for all students in the course and indispensable for those who take it for credit.

Assessment

Students who take the course for credit will be asked to write a brief (5-7 pages) paper that explores, evaluates, compares, and/or criticizes some of the models discussed in class or in the book. The paper can also relate the issues discussed in class to some alternative model or experimental result.

Plan of the Lectures

14th International Summer School in Cognitive Science
New Bulgarian University, Sofia, Bulgaria
Week 2, July 16-22, 2007.

1. Foundations. Neurons and Networks

Required reading:

Optional readings:

2. Learning in Neural Networks

Required reading:

Optional readings:

3. Object Recognition and Selective Attention

Required reading:

Optional readings:

4. Leabra: A Connectionist Cognitive Architecture

Required readings:

Optional reading:

5. High-Level Cognition in Neural Networks

Required readings:

Optional reading:

About the Instructor

Alexander Petrov received his Ph.D. in 1998 from New Bulgarian University under the supervision of Boicho Kokinov. He has been a postdoc in John Anderson's lab at Carnegie Mellon University, where he learned firsthand about ACT-R. He has also been a postdoc in Randy O'Reilly's lab at the University of Colorado at Boulder, where he learned firsthand about Leabra. He is involved in a DARPA-sponsored project titled "Biologically Inspired Cognitive Architectures" and is really excited at the prospect of combining his two favorite architectures. At present, he is an Assistant Professor in the Department of Psychology at The Ohio State University, Columbus, OH. His bread-and-butter research there involves models of perceptual learning and memory-based scaling. More information is available on-line at http://alexpetrov.com

http://alexpetrov.com/teach/nbu07/ Page maintained by Alex Petrov
Created 2007-03-07, last updated 2007-03-07.