Neuronal Dynamics for Embodied Cognition 2022

Our annual summer school gives students an in-depth look at dynamic field theory over the course of a single week. The summer school covers everything from the basics of dynamic field theory up to our latest research projects. 

This virtual edition of our summer school will consist of two parts: a live lecture series and a hands-on workshop.

The lecture series will be held as a video conference and will provide a step-by-step introduction to Dynamic Field Theory. It is open to everyone; you only need to register with your email address. Lectures will take place from August 15 to August 20 between 3:00 and 6:00 PM (UTC+2).

The two-and-a-half-day project workshop gives students the opportunity to put their newly acquired skills to use in a concrete, hands-on modeling project. Students solve the task in our open-source simulation environment under the guidance of a personal tutor. Because of this one-on-one tutoring, the number of workshop participants is limited. To apply, please fill out the online application form, which asks for a cover letter, a CV, and some other information that we use to prepare projects for you. We encourage applications by small groups of participants, for example two or three colleagues who will work together locally on the same project and may share a tutor. Each member of such a group should apply separately and list their potential group partners in the application form. The workshop will take place from August 18 to 20. Personal tutoring via video conference will be available on each workshop day at flexible hours.

Participation in any part of the school is free of charge.

Who is this summer school for?

The summer school is aimed at students at the advanced undergraduate or graduate level, postdocs, and faculty members in the areas of embodied cognition, cognitive science, developmental science, cognitive neuroscience, developmental robotics, autonomous robotics, and cognitive robotics, as well as at anyone who wants to learn about dynamic field theory.

Structure of the school

The school is led by Prof. Gregor Schöner, who lectures on the basic concepts of DFT in the live lecture series. Workshop sessions provide hands-on training in working with dynamic field models in COSIVINA and CEDAR, our interactive simulation environments. Students learn to create a dynamic field model from the ground up and to apply the concepts of DFT to their home domain of interest.

Software

For the hands-on work in the exercises and projects, we use numerical simulators. COSIVINA is a Matlab-based simulator for dynamic field theory that supports the quick assembly of DFT models, for instance to account for human behavioral data.

Models based on larger field architectures can be implemented in the C++-based simulator CEDAR, which features a graphical user interface in which core elements of DFT can be assembled without in-depth programming knowledge. For more advanced projects in CEDAR, basic knowledge of C++ is required.
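
To give a flavor of what such a model looks like under the hood, here is a minimal sketch of a one-dimensional dynamic neural field, written in plain NumPy rather than in the COSIVINA or CEDAR API; all parameter values are illustrative.

```python
# Minimal sketch of a one-dimensional dynamic neural field (an Amari-type
# equation), integrated with the Euler method. This is plain NumPy, not the
# COSIVINA or CEDAR API; it only illustrates the kind of model those
# simulators let you assemble. All parameter values are illustrative.
import numpy as np

field_size = 100       # number of spatial samples
tau = 20.0             # time constant (in units of the time step)
h = -5.0               # resting level
beta = 4.0             # steepness of the sigmoid nonlinearity
dt = 1.0               # Euler time step

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-beta * u))

# Lateral interaction kernel: narrow local excitation, broader inhibition.
k = np.arange(-25, 26)
kernel = 2.0 * np.exp(-0.5 * (k / 5.0) ** 2) - 1.0 * np.exp(-0.5 * (k / 12.0) ** 2)

# Localized external input, strong enough to push the field through the
# detection instability so that a peak of activation forms.
x = np.arange(field_size)
stimulus = 7.0 * np.exp(-0.5 * ((x - 30) / 5.0) ** 2)

u = np.full(field_size, h)   # field activation, starts at the resting level
for _ in range(500):
    interaction = np.convolve(sigmoid(u), kernel, mode="same")
    u += dt / tau * (-u + h + stimulus + interaction)

print("peak location:", int(np.argmax(u)), "peak activation:", round(float(u.max()), 2))
```

With these (illustrative) parameters, a single localized peak of activation should form over the stimulated location; the exercises and workshop projects build up such behavior, and much larger architectures, in COSIVINA and CEDAR.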

Lecturers

The core lectures are given by Prof. Gregor Schöner. Special lectures and case studies are presented by Raul Grieben, Daniel Sabinasz, Cora Hummert, and Jan Tekülve, with guest lectures by Mathis Richter (Intel Labs) and John Spencer (University of East Anglia).

Topics

Topics covered by the summer school include:

  • Neural dynamics and basic attractor states
  • Dynamic fields
  • Introduction to COSIVINA and CEDAR
  • Links to neurophysiology and embodiment
  • Multi-dimensional fields and multi-layer dynamics
  • Higher cognition 
  • Sequence generation 

Schedule

Lecture series

2022-08-15

03:00–03:30 PM (UTC+2): Welcome [Gregor Schöner]

03:30–05:30 PM (UTC+2): DFT Core Lecture - Embodied Cognition [Gregor Schöner]

05:30–06:00 PM (UTC+2): Preparation of Exercises

2022-08-16

03:00–03:15 PM (UTC+2): Exercise Feedback

03:15–04:45 PM (UTC+2): DFT Core Lecture - Higher Cognition [Gregor Schöner]

04:45–06:00 PM (UTC+2): Introduction to Software Frameworks

2022-08-17

03:00–03:15 PM (UTC+2): Exercise Feedback

03:15–05:30 PM (UTC+2): DFT Core Lecture - Autonomy [Gregor Schöner]

05:30–06:00 PM (UTC+2): Workshop Projects Overview

2022-08-18

03:00–04:00 PM (UTC+2): Special Lecture: Scene Representation and Visual Search [Raul Grieben]

04:00–05:00 PM (UTC+2): Special Lecture: Models of Grounded Cognition [Daniel Sabinasz]

05:00–06:00 PM (UTC+2): Guest Lecture: Neuromorphic Computing and Neural Dynamics [Mathis Richter, Intel Labs]

2022-08-19

03:00–04:00 PM (UTC+2): Guest Lecture: The WOLVES Model of Word-Object Learning via Visual Exploration in Space [John Spencer, University of East Anglia]

04:00–04:30 PM (UTC+2): Case Study: Using Mouse Tracking to Study Visual Search [Cora Hummert]

04:30–05:30 PM (UTC+2): Special Lecture: Neural Process Models of Intentionality [Jan Tekülve]

05:30–06:00 PM (UTC+2): General Discussion [Gregor Schöner]

Project workshop

2022-08-18

10:00–11:00 AM (UTC+2): Workshop Introduction

11:00 AM–03:00 PM (UTC+2): Exercise Work

2022-08-19

10:00–10:30 AM (UTC+2): Q&A Session

10:30 AM–03:00 PM (UTC+2): Exercise Work

2022-08-20

03:00–05:00 PM (UTC+2): Workshop Project Presentation

Core lectures

Lecture slides DFT 1: Neural dynamics, neural fields, their dynamic instabilities and dynamic regimes

This first of three core lectures of the DFT summer school introduces the foundational ideas of neural dynamics and dynamic neural fields. At their core are attractor states and their characteristic instabilities for detection, selection, and memory. A case study illustrates how DFT models are used to account for behavioral and neural data; a second case study illustrates how DFT models ultimately link to sensory and motor systems.
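
For reference, the field dynamics introduced in this lecture are typically written as an Amari-type integro-differential equation; the notation here is generic and may differ in detail from the lecture slides.

```latex
\tau \, \dot{u}(x,t) = -u(x,t) + h + s(x,t)
  + \int w(x - x')\, \sigma\bigl(u(x',t)\bigr)\, dx' ,
\qquad
\sigma(u) = \frac{1}{1 + e^{-\beta u}}
```

Here u(x,t) is the activation of the field over the dimension x, h < 0 is the resting level, s(x,t) is external input, and w is a lateral interaction kernel with local excitation and surround inhibition. Detection, selection, and memory arise as instabilities of this dynamics as the input changes.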

Video DFT 1: Neural dynamics, neural fields, their dynamic instabilities and dynamic regimes
Lecture slides DFT 2: Toward higher cognition: binding, coordinate transforms, grounding, mental mapping

This second core lecture of the DFT school looks into dynamic neural fields that combine different types of feature dimensions and provide the basis for flexibly extracting and binding features, enabling visual search. Binding through space enables higher-dimensional representations within the localist neural representations of DFT. Coordinate transforms exploit the binding of source and target coordinate frames to a space that steers the transform. Two examples illustrate how higher cognition emerges: perceptual grounding and mental mapping of relational phrases.
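
As an illustration of the coordinate-transform mechanism mentioned above, one schematic way to write the read-out of a two-dimensional transformation field that binds a source frame (e.g., retinal position r) to a steering parameter (e.g., gaze direction g) is the following; this is generic notation, not necessarily that of the lecture.

```latex
s_{\mathrm{target}}(x) = \int \sigma\bigl(u_{\mathrm{trans}}(r,\, x - r)\bigr)\, dr
```

That is, the target representation over x = r + g is obtained by summing the transformation field's output along the diagonal, so that the steering dimension effectively shifts the source representation into the target frame.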

Video DFT 2: Toward higher cognition: binding, coordinate transforms, grounding, mental mapping
Lecture slides DFT 3: Sequences and general discussion

This third core lecture discusses the mechanism in neural dynamics for the autonomous generation of sequences of behavioral or mental acts. It is followed by a general discussion: why DFT architectures work as promised; the role of embodiment in DFT; and the relationships between DFT and connectionism/deep neural networks, symbolic computation, and Vector Symbolic Architectures.
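
To make the sequence-generation mechanism more concrete, below is a minimal sketch, in plain Python/NumPy, of ordinal "intention" and "memory" nodes whose transitions are triggered by a condition-of-satisfaction (CoS) signal. The CoS pulses are supplied at fixed times here rather than detected from sensory input, and all parameters are illustrative rather than taken from a published model.

```python
# Minimal sketch of the ordinal-node mechanism for the autonomous
# generation of a sequence, in the spirit of DFT models of serial order.
# Each step i has an "intention" node d[i] and a "memory" node m[i];
# a condition-of-satisfaction (CoS) pulse ends the current step, and the
# memory nodes ensure that the next step is selected.
import numpy as np

n_steps, T = 3, 600
tau, dt, beta = 20.0, 1.0, 4.0
h = -5.0            # resting level of all nodes
task_input = 2.0    # constant boost that keeps the sequence running

def sig(u):
    return 1.0 / (1.0 + np.exp(-beta * u))

d = np.full(n_steps, h)   # intention nodes (which step is active now)
m = np.full(n_steps, h)   # memory nodes (which steps are already done)

# External CoS pulses: each one terminates the currently active step.
cos_times = [(150, 170), (300, 320), (450, 470)]

active_trace = []
for t in range(T):
    cos = 10.0 if any(a <= t < b for a, b in cos_times) else 0.0
    sd, sm = sig(d), sig(m)
    # "previous memory" input: a virtual start signal for node 0,
    # otherwise the memory node of the preceding step
    prev = np.concatenate(([1.0], sm[:-1]))
    d_dot = (-d + h + task_input
             + 6.0 * sd                    # self-excitation
             - 8.0 * (sd.sum() - sd)       # competition among intention nodes
             + 6.0 * prev                  # excitation from preceding memory
             - 8.0 * sm                    # inhibition from own memory node
             - cos)                        # CoS ends the current step
    m_dot = -m + h + 7.0 * sm + 6.0 * sd   # memory: self-sustained once its step ran
    d += dt / tau * d_dot
    m += dt / tau * m_dot
    active = int(np.argmax(d)) if d.max() > 0 else None
    if active is not None and (not active_trace or active_trace[-1] != active):
        active_trace.append(active)

print("order in which steps became active:", active_trace)   # expected: [0, 1, 2]
```

With these parameters the intention nodes should become active in order, each step ending when its CoS pulse arrives while the memory nodes prevent completed steps from reactivating.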

Video DFT 3: Sequences and general discussion

Exercises

Exercises Exercise 1: Neural node dynamics
Exercises Exercise 2: Neural field dynamics
Video Survey of Cedar and DFF

Jan Tekülve and Daniel Sabinasz present the software frameworks Cedar and DFF for DFT simulations. 

Exercises Exercise 3: Field architectures

Tutorial and Exercise with the Cedar Simulator

Document Cedar FAQ

Special lectures

Lecture slides Raul Grieben: Bridging DFT and Deep Neural Networks

In this special lecture, Raul Grieben presents part of his work on a neural theory of visual search. His presentation emphasizes how a deep neural network that supports classification is mapped onto the localist representations of DFT and how guidance features for visual search can be extracted from the deep network. He demonstrates how the model accounts for scene grammar by explaining how visual anchors work.
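
As a generic illustration of feature-based guidance (not the specific model presented in this lecture), the sketch below combines feature maps from a hypothetical convolutional layer with target-specific weights into a single guidance map that could serve as input to a two-dimensional space field.

```python
# Generic sketch of feature-based guidance for visual search, not the
# model presented in the lecture: feature maps from a (hypothetical)
# convolutional layer are combined with target-specific weights into a
# single guidance map, which can then serve as input to a 2D space field.
import numpy as np

rng = np.random.default_rng(0)
n_channels, height, width = 64, 32, 32

# Placeholder for the activations of one convolutional layer.
feature_maps = rng.random((n_channels, height, width))

# Target template: how strongly each channel speaks for the search target
# (in a real model these weights would come from the target's appearance).
target_weights = rng.normal(size=n_channels)

# Weighted sum over channels, rectified: high values mark locations whose
# local features resemble the target.
guidance_map = np.maximum(0.0, np.tensordot(target_weights, feature_maps, axes=1))

# This 2D map would enter the space (attention) field of a DFT visual
# search architecture as localized input biasing peak formation.
print(guidance_map.shape)  # (32, 32)
```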

Video Raul Grieben: Bridging DFT and Deep Neural Networks
Lecture slides Daniel Sabinasz: Models of Grounded Cognition

In this special lecture, Daniel Sabinasz reviews the notion of grounded cognition. He points to the challenges that a neural theory of grounding must face and outlines the DFT approach to solving these problems. The highlight of his talk is an outline of his recent theoretical work on neural representations of conceptual structure and an account of the grounding of nested relations.

Video Daniel Sabinasz: Models of Grounded Cognition
Lecture slides Mathis Richter: Neuromorphic computing and DFT

In this guest lecture, Mathis Richter shows how neuromorphic computing can be used to implement DFT architectures in the form of genuinely neural computation. His emphasis is on architectures that link to sensory inputs, but he also reviews neuromorphic computation more broadly.

Video Mathis Richter: Neuromorphic computing and DFT
Lecture slides John P Spencer: Word-Object Learning via Visual Exploration in Space (WOLVES): A Neural Process Model of Cross-Situational Word Learning
Lecture slides Cora Hummert: Using mouse tracking to study visual search

In this shorter presentation, Cora Hummert reviews an experimental study that was inspired by the DFT accounts of visual search and of reaching movements outlined earlier in the course and, for some of the participants, studied in the projects workshop. In a visual search task, participants moved a computer mouse to point at an object, presented at varying locations, that matched the search cues of color and orientation. A distractor object that matched only the color attracted the path of the mouse.

Video Special Lecture Cora Hummert: Using mouse tracking to study visual search
Lecture slides Jan Tekülve: Neural Process Models of Intentionality

In this final special lecture of the DFT summer school 2022, Jan Tekülve introduces the notion of "intentionality" as defined by philosophers of mind as a way to characterize what a neurally grounded account of the "mind" needs to achieve. He then presents a neural process model within DFT that realizes intentionality in six psychological modes: perception, memory, and belief as well as action, planning, and goal, which together sample the two directions of fit of intentionality, mind-to-world and world-to-mind. He makes this concrete in a simple toy scenario in which a robotic vehicle explores an environment, builds a model of it, and transforms objects by "painting" them (in a highly simplified way). The robot learns contingencies of its world (how colors combine when painting over colored surfaces) and uses them to achieve its goals.

Video Special Lecture Jan Tekülve: Neural Process Models of Intentionality

Workshop

Video Introduction of the workshop projects

Jan Tekülve and Daniel Sabinasz present the projects that participants can do in Cedar, Cosivina, or DFF during the workshop portion of the summer school.

Exercises Visual Search Project (Cedar)

All Cedar projects should start with this project.

Exercises Spatial Language Project (Cedar)
Exercises Sequence Replay Project (Cedar)
Exercises Reaching Project (Cedar)

This project might only run on Windows; try the template before starting.

Exercises Simon Effect Project (Cosivina)
Configuration files Cedar Template Files

Template architecture files for all Cedar projects. The reaching project needs the _simulator file, while all other projects can be done with the _picture file.