A computational model of spatial memory anticipation during visual search


Some visual search tasks require memorizing the locations of stimuli that have already been scanned. The nature of eye movements raises the question of how we are able to maintain a coherent memory despite the frequent and drastic changes in perception they induce. In this article, we present a computational model that anticipates the consequences of eye movements on visual perception in order to update a spatial memory.
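The core idea of updating a retinotopic spatial memory across a saccade can be illustrated with a minimal sketch. This is not the model described in the chapter (which is a neural, anticipatory architecture); it only shows the geometric principle: when the eye moves by a given vector, remembered locations shift by the opposite vector in eye-centered coordinates, and regions entering the field of view are unknown. The function name and grid representation are illustrative assumptions.

```python
import numpy as np

def anticipate_memory_shift(memory, saccade):
    """Predictively remap an eye-centered spatial memory map across a saccade.

    memory  : 2-D array; nonzero entries mark remembered stimulus locations
              in retinotopic (eye-centered) coordinates.
    saccade : (dy, dx) eye displacement in grid cells.

    A stimulus at retinal position p before the saccade appears at
    p - saccade afterwards, so the whole map is shifted by -saccade.
    """
    dy, dx = saccade
    shifted = np.roll(memory, shift=(-dy, -dx), axis=(0, 1))
    # np.roll wraps around; locations newly entering the visual field
    # have no memory yet, so clear the wrapped rows/columns.
    if dy > 0:
        shifted[-dy:, :] = 0
    elif dy < 0:
        shifted[:-dy, :] = 0
    if dx > 0:
        shifted[:, -dx:] = 0
    elif dx < 0:
        shifted[:, :-dx] = 0
    return shifted

# A remembered stimulus at (2, 2); after a downward saccade of 1 cell,
# it is predicted to lie one row higher in eye-centered coordinates.
memory = np.zeros((5, 5))
memory[2, 2] = 1.0
updated = anticipate_memory_shift(memory, (1, 0))
```

The key design point, matching the abstract, is that the memory is updated from the motor command alone, before the new percept arrives, so it stays coherent across the perceptual disruption caused by each eye movement.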

In Anticipatory Behavior in Adaptive Learning Systems: From Brains to Individual and Social Behavior, M.V. Butz, O. Sigaud, G. Baldassarre and G. Pezzulo (Eds.), Springer-Verlag LNAI 4520, pp. 170-188