Generalisation of prior information for rapid Bayesian time estimation

Roach, Neil, McGraw, Paul, Whitaker, David and Heron, James (2016) Generalisation of prior information for rapid Bayesian time estimation. Proceedings of the National Academy of Sciences, 114 (2). pp. 412-417. ISSN 1091-6490

Full text not available from this repository.


To enable effective interaction with the environment, the brain combines noisy sensory information with expectations based on prior experience. There is ample evidence showing that humans can learn statistical regularities in sensory input and exploit this knowledge to improve perceptual decisions and actions. However, fundamental questions remain regarding how priors are learned and how they generalise to different sensory and behavioural contexts. In principle, maintaining a large set of highly specific priors may be inefficient and restrict the speed at which expectations can be formed and updated in response to changes in the environment. On the other hand, priors formed by generalising across varying contexts may not be accurate. Here we exploit rapidly induced contextual biases in duration reproduction to reveal how these competing demands are resolved during the early stages of prior acquisition. We show that observers initially form a single prior by generalising across duration distributions coupled with distinct sensory signals. In contrast, they form multiple priors if distributions are coupled with distinct motor outputs. Together, our findings suggest that rapid prior acquisition is facilitated by generalisation across experiences of different sensory inputs, but organised according to how that sensory information is acted upon.
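The Bayesian combination described above can be illustrated with a minimal sketch. This is not the paper's model: it assumes a Gaussian prior over durations and Gaussian measurement noise (a common simplification in this literature), and all parameter values are hypothetical. The posterior-mean estimate is a reliability-weighted average of the measurement and the prior mean, which produces the contextual (central-tendency) biases the abstract refers to.

```python
# Illustrative sketch only (not the authors' model): combining a Gaussian
# prior over durations with a noisy Gaussian measurement. All numbers are
# hypothetical.

def bayes_estimate(measurement, prior_mean, prior_sd, noise_sd):
    """Posterior-mean estimate when a Gaussian prior N(prior_mean, prior_sd^2)
    is combined with a Gaussian likelihood centred on the measurement."""
    # Weight on the measurement grows as the prior broadens or the noise shrinks.
    w = prior_sd**2 / (prior_sd**2 + noise_sd**2)
    return w * measurement + (1 - w) * prior_mean

# Estimates are pulled toward the prior mean (400 ms here), more strongly
# for extreme durations: the classic central-tendency bias.
for true_ms in (300, 400, 500):
    est = bayes_estimate(true_ms, prior_mean=400, prior_sd=50, noise_sd=100)
    print(true_ms, round(est, 1))
```

In this toy setting, learning a single generalised prior versus multiple context-specific priors amounts to fitting one `prior_mean` across all stimulus contexts versus one per context, which is the distinction the experiments probe.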

Item Type: Article
Keywords: Bayesian inference, Time perception, Sensorimotor learning
Schools/Departments: University of Nottingham, UK > Faculty of Science > School of Psychology
Identification Number:
Depositing User: Roach, Neil
Date Deposited: 04 Jan 2017 14:15
Last Modified: 04 May 2020 18:19
