Focus:
Nathan is an assistant professor at Utrecht University with 10+ years of research and teaching experience. He heads the Multisensory Space Lab in the Department of Experimental Psychology and maintains many national and international academic and clinical collaborations. In addition to using state-of-the-art neuroscientific and psychological methods, Nathan uses multisensory games and VR to study how the brain makes optimal use of sensory information. He recently released his first action-adventure game and works as a freelance artist on various games that have been published or are in development.
Nathan’s research focuses on unraveling the mechanisms underlying multisensory perception, the maintenance of sensory unity, and its restoration in the face of brain damage and/or sensory impairments. By now, it is well established that our senses do not operate independently. For example, what we hear can influence what we see, and vice versa. The integration of information from different senses (multisensory integration) is crucial for optimal perception of the environment and leads to better localisation, identification, and detection.
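As an illustrative aside (the standard reliability-weighted averaging model from the multisensory literature, not a description of this lab's specific methods): the benefit of combining the senses can be expressed by weighting each sensory estimate by its precision. Given an auditory location estimate ŝ_A with variance σ_A² and a visual estimate ŝ_V with variance σ_V², the statistically optimal combined estimate is
ŝ_AV = w_A·ŝ_A + w_V·ŝ_V, where w_A = (1/σ_A²) / (1/σ_A² + 1/σ_V²) and w_V = 1 − w_A.
The variance of this combined estimate, σ_AV² = (σ_A²·σ_V²)/(σ_A² + σ_V²), is never larger than the variance of the more reliable sense alone, which is why integrating vision and hearing can yield sharper localisation than either sense by itself.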
His lab uses innovative approaches to advance our understanding of multisensory integration in health and disease and to inform new rehabilitation methods. One of his recent ambitions is to apply his expertise in multisensory research to the development of new multisensory diagnostic tests and treatments, focusing on sensory hypersensitivity/overload problems after acquired brain damage. He is active as an expert consultant in a new multidisciplinary diagnostic team for people with acquired brain damage at Bartiméus, an institute for people with visual impairments.
For more information about my lab, see www.multisensoryspacelab.com
Please feel free to contact me if you have any questions about (our) research, would like to collaborate, or are looking for a research internship in the lab.
Topics of interest: Multisensory perception, multisensory calibration, spatial attention, hearing loss.
Grants & Awards
Expertisefunctie Zintuiglijk Gehandicapten research and innovation grant (2022)
Project: A personal sensory navigation profile: A novel tool to predict successful use of sensory information during navigation in visually impaired individuals.
Every day, we must navigate the world around us to get to work, the supermarket, a friend's house, the kitchen, and so on. Even with normal hearing and vision, this is not always an easy task; think, for example, of how hard it can be to find the right platform at a train station or the right gate at an airport. Individuals with visual impairments therefore often receive orientation and mobility training aimed at improving navigation using other senses, such as hearing and touch. The severity and type of visual impairment, together with personal navigation preferences and skills, determine the challenges that someone faces during navigation. It is therefore important to adapt orientation and mobility training to an individual's visual impairment and characteristics. In practice, a mobility and navigation trainer does this by asking about and observing personal navigation strategies and preferences. However, standardized and evidence-based guidelines on which types of sensory information and strategies lead to the best navigation performance for distinct types of visual impairment and for different individuals are lacking.
Most navigation research focuses on congenitally blind and late-blind individuals, which means that we know little about how low vision affects navigation and how to best use the sensory information that remains available. In this project we will (1) develop and validate a personalized sensory navigation profile questionnaire to identify navigation strengths, weaknesses, and preferences for use in mobility training for a wide range of visually impaired individuals, (2) investigate which types of (multi)sensory signals (internal, external, and technological) people successfully use when navigating, independent of their preferences, and (3) identify how the sensory navigation profile relates to the sensory signals that can be successfully used to facilitate navigation in the visually impaired.
Expertisefunctie Zintuiglijk Gehandicapten research and innovation grant (2020)
Project: Visual overload after acquired brain damage. What it is and how we can measure it.
Many patients with acquired brain damage suffer from sensory overload. Although this is a widespread phenomenon, we understand very little about its causes. The primary aim of the project is to define visual overload and to make its underlying causes measurable. This project is a collaboration between Bartiméus and the Multisensory Space Lab in the Department of Experimental Psychology at Utrecht University. The research can provide key insights into how sensory information is processed in health and disease (fundamental science) and can help patients and clinicians diagnose and potentially treat perceptual problems after brain damage (utilization).
NWO VENI Grant (2017)
Project: Restoring sensory unity: Unifying spatial vision and hearing through multisensory recalibration
Our brain combines what we see and hear to enhance spatial perception. Hearing loss can bring hearing and vision into conflict. In this project, the researchers will investigate whether the brain's plasticity can be harnessed to restore sensory unity, improve spatial perception, and rehabilitate people with hearing loss.
Neuroscience and Cognition Utrecht seed money grant (2016):
Grant to visit the multisensory lab of Prof. Dr. Mark Wallace and set up new collaborations.
Neuroscience and Cognition Utrecht seed money grant (2015):
Grant for alleviating teaching duties.
Short Stay PhD Fellowship Grant Utrecht University (2013):
Research fellowship on multisensory spatial perception in the Crossmodal Research Lab of Prof. Dr. Charles Spence.
INS Oslo Travel Award, International Neuropsychological Society/APA (2012):
Travel award for the study "Exploring space: Dissociations and interactions between neglect in near and far regions of space," presented at the student symposium of the International Neuropsychological Society mid-year meeting held in Oslo, Norway, June 27-30, 2012.
Peer-reviewed publications:
See the lab website or my Google Scholar profile for an up-to-date list.
2020
Van der Stoep, N., & Alais, D. (2020). Cross-modal motion perception: Auditory motion encoded in a visual motion area. Current Biology, 30(13), R775-R778
Van der Stoep, N., Colonius, H., Noel, J.-P., Wallace, M. T., & Diederich, A. (2020). Audiovisual Integration in Depth: Modeling the Effect of Distance and Stimulus Effectiveness Using the TWIN model. Journal of Mathematical Psychology, 99, 102443
Van der Stigchel, S., Schut, M. J., Fabius, J., & Van der Stoep, N. (2020). Trans-saccadic perception is affected by saccade landing point deviations after saccadic adaptation. Journal of Vision, 20(9):8, 1-12
Elshout, J. A., Van der Stoep, N., Nijboer, T. C. W., & Van der Stigchel, S. (2020). Motor congruency and multisensory integration jointly facilitate visual information processing before movement execution. Experimental Brain Research.
2018
Schut, M. J., Van der Stoep, N., Fabius, J. H., & Van der Stigchel, S. (2018). Feature integration is unaffected by saccade landing point, even when saccades land outside of the range of regular oculomotor variance. Journal of Vision, 18(7):6, 1-17. Open Access
Schut, M. J., Van der Stoep, N., & Van der Stigchel, S. (2018). Auditory spatial attention is encoded in a retinotopic reference frame across eye-movements. PloS ONE, 13(8), e0202414. [PDF] | Open Access
Noel, J.-P., Modi, K., Wallace, M. T., & Van der Stoep, N. (2018). Audiovisual Integration in Depth: Multisensory Binding and Gain as a Function of Distance. Experimental Brain Research. DOI: 10.1007/s00221-018-5274-7 [PDF] | Open Access