A comparative analysis of response times shows that multisensory benefits and interactions are not equivalent
Abstract
Multisensory signals allow faster responses than their unisensory components. While this redundant signals effect (RSE) has been studied widely with diverse signals, no modelling approach has explored the RSE systematically across studies. For a comparative analysis, here, we propose three steps: The first quantifies the RSE compared to a simple, parameter-free race model. The second quantifies processing interactions beyond the race mechanism: history effects and so-called violations of Miller’s bound. The third models the RSE at the level of response time distributions using a context-variant race model with two free parameters that account for the interactions. Mimicking the diversity of studies, we tested different audio-visual signals that target the interactions using a 2 × 2 design. We show that the simple race model provides a strong overall prediction of the RSE. Regarding interactions, we found that history effects do not depend on low-level feature repetition. Furthermore, violations of Miller’s bound seem linked to transient signal onsets. Critically, the latter dissociates from the RSE, demonstrating that multisensory interactions and multisensory benefits are not equivalent. Overall, we argue that our approach, as a blueprint, provides both a general framework and the precision needed to understand the RSE when studied across diverse signals and participant groups.
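The parameter-free race model and Miller's bound mentioned in the abstract are standard quantities computed from the unisensory response-time distributions: assuming independent channels, the predicted multisensory CDF is F_race(t) = F_A(t) + F_V(t) − F_A(t)·F_V(t), and Miller's bound is min(1, F_A(t) + F_V(t)). A minimal sketch of both, using hypothetical simulated response times (the sample sizes, means, and spreads are illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical unisensory response-time samples in ms (illustrative only).
rt_a = rng.normal(300, 40, 10_000)  # auditory
rt_v = rng.normal(320, 50, 10_000)  # visual

# Empirical CDFs on a common time grid.
t = np.linspace(150, 500, 200)
F_a = np.array([(rt_a <= x).mean() for x in t])
F_v = np.array([(rt_v <= x).mean() for x in t])

# Parameter-free race model prediction assuming independent channels.
F_race = F_a + F_v - F_a * F_v

# Miller's bound: the upper limit any race model can reach;
# an observed multisensory CDF above it "violates" the bound.
F_miller = np.minimum(1.0, F_a + F_v)
```

An observed audio-visual CDF would then be compared against F_race (to quantify the RSE) and against F_miller (to test for violations); by construction F_race never exceeds F_miller.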
Citation
Innes, B. R. & Otto, T. U. 2019, 'A comparative analysis of response times shows that multisensory benefits and interactions are not equivalent', Scientific Reports, vol. 9, 2921. https://doi.org/10.1038/s41598-019-39924-6
Publication
Scientific Reports
Status
Peer reviewed
ISSN
2045-2322
Type
Journal article
Description
This work was supported by the Biotechnology and Biological Sciences Research Council (BBSRC, grant number: BB/J01446X/1).
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.