We experience the world through our senses, a constant torrent of sights, sounds, smells, and more. Our brains take these signals and process them, giving rise to our individual perceptions of the world. But sometimes our senses play tricks on us, most notably in the case of perceptual illusions.
Now, Caltech researchers have developed two new illusions that reveal how the senses can influence each other — in particular, how sound can give rise to visual illusions. These illusions occur so quickly that they illustrate a phenomenon called postdiction (as opposed to prediction) in which a stimulus that occurs later can retroactively affect our perceptions of an earlier event.
The Caltech work is among the first to show this kind of time-traveling illusion across multiple senses.
The work was done in the laboratory of Shinsuke Shimojo, Gertrude Baltimore Professor of Experimental Psychology and affiliated faculty member of the Tianqiao and Chrissy Chen Institute for Neuroscience at Caltech. A paper describing the research appears in the October 3 issue of the journal PLOS ONE.
“Illusions are a really interesting window into the brain,” says first author Noelle Stiles (PhD ’15), a visitor in biology and biological engineering and a postdoctoral scholar-research associate at USC. “By investigating illusions, we can study the brain’s decision-making process. For example, how does the brain determine reality with information from multiple senses that is at times noisy and conflicting? The brain uses assumptions about the environment to solve this problem. When these assumptions happen to be wrong, illusions can occur as the brain tries to make the best sense of a confusing situation. We can use these illusions to unveil the underlying inferences that the brain makes.”
The two illusions in this study were developed to illustrate how stimuli that occur later can affect the perception of stimuli that have already occurred. Postdictive processing has been demonstrated within individual senses, but this work focuses on how the phenomenon can bridge multiple senses. The key to both of the new illusions is that the audio and visual stimuli occur rapidly, in under 200 milliseconds (one-fifth of a second). The brain, trying to make sense of this barrage of information, synthesizes the stimuli from both senses to determine the experience, using postdiction to do so.
The first illusion is called the Illusory Rabbit. To produce the illusion, a short beep and a quick flash are first presented nearly simultaneously on a computer, with the flash appearing at the left side of the screen. Next, 58 milliseconds after the first beep, a lone beep is played. Finally, 58 milliseconds after the second beep, a second nearly simultaneous beep-flash pair occurs, but with the flash appearing on the right side of the screen. The beep location is always central and does not move. Though only two flashes are actually presented, most people viewing the illusion perceive three flashes, with an illusory flash coinciding with the second beep and appearing to be located in the center of the screen.
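To make the timing concrete, here is a minimal sketch of the Illusory Rabbit stimulus schedule. The 58-millisecond intervals and the left/right flash positions come from the description above; the data structure, names, and treatment of "nearly simultaneous" onsets as identical onsets are purely illustrative.

```python
# Illustrative event schedule for the Illusory Rabbit stimulus.
# Timings (58 ms gaps) follow the article; the representation is hypothetical.

from dataclasses import dataclass

@dataclass
class Event:
    time_ms: int   # onset relative to the first beep
    kind: str      # "beep" or "flash"
    position: str  # screen position for flashes; beeps are always central

ILLUSORY_RABBIT = [
    Event(0,   "beep",  "center"),  # beep 1, paired with...
    Event(0,   "flash", "left"),    # ...flash 1 on the left
    Event(58,  "beep",  "center"),  # beep 2: no flash is actually shown
    Event(116, "beep",  "center"),  # beep 3, paired with...
    Event(116, "flash", "right"),   # ...flash 2 on the right
]

if __name__ == "__main__":
    for e in ILLUSORY_RABBIT:
        print(f"{e.time_ms:>4} ms  {e.kind:<5} @ {e.position}")
```

The whole sequence fits in 116 milliseconds, well under the 200-millisecond window mentioned above; observers typically report an extra, illusory flash at the time of the unpaired second beep.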
The fact that the illusory flash is perceived in between the left and right flashes is the key evidence that the brain is using postdictive processing.
“When the final beep-flash pair is later presented, the brain assumes that it must have missed the flash associated with the unpaired beep and quite literally makes up the fact that there must have been a second flash that it missed,” explains Stiles. “This already implies a postdictive mechanism at work. But even more importantly, the only way that you could perceive the shifted illusory flash would be if the information that comes later in time — the final beep-flash combination — is being used to reconstruct the most likely location of the illusory flash as well.”
The second illusion is called the Invisible Rabbit. In this related illusion, three flashes are shown on the screen, the first on the left, the second in the middle, and the third on the right, with only the first and third flashes coinciding with beeps. In this case, most people do not see the second flash, the one without a corresponding sound, at all. Because the second flash is not paired with a beep, the brain decides after the fact that there was no flash, even though it was in fact present.
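A corresponding sketch for the Invisible Rabbit is below. The flash positions and beep pairings follow the description above; the 58-millisecond spacing is an assumption carried over from the first illusion, and the representation is again hypothetical.

```python
# Illustrative event schedule for the Invisible Rabbit stimulus.
# Positions and beep pairings follow the article; the 58 ms spacing is
# assumed to match the first illusion, and the format is hypothetical.

INVISIBLE_RABBIT = [
    (0,   "flash", "left"),    # flash 1, paired with a beep
    (0,   "beep",  "center"),
    (58,  "flash", "middle"),  # flash 2: no beep; often not perceived
    (116, "flash", "right"),   # flash 3, paired with a beep
    (116, "beep",  "center"),
]

for t, kind, pos in INVISIBLE_RABBIT:
    print(f"{t:>4} ms  {kind:<5} @ {pos}")
```

The two schedules are mirror images of each other: the Illusory Rabbit has a beep with no flash, and the Invisible Rabbit has a flash with no beep, which is what lets sound either create or suppress a visual percept after the fact.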
By showing that a sound can induce a visual illusion, the researchers have uncovered new clues as to how the brain combines the senses across space and time to form an integrated perception.
“The significance of this study is twofold,” says Shimojo. “First, it generalizes postdiction as a key process in perceptual processing for both a single sense and multiple senses. Postdiction may sound mysterious, but it is not — one must consider how long it takes the brain to process earlier visual stimuli, during which time subsequent stimuli from a different sense can affect or modulate the first. The second significance is that these illusions are among the very rare cases where sound affects vision, not vice versa, indicating dynamic aspects of neural processing that occur across space and time. These new illusions will enable researchers to identify optimal parameters for multisensory integration, which is necessary for both the design of ideal sensory aids and optimal training for low-vision individuals.”
The paper is titled “What You Saw is What You Will Hear: Two New Illusions with Audiovisual Postdictive Effects.” In addition to Stiles and Shimojo, co-authors are former Caltech undergraduate Monica Li, Carmel Levitan of Occidental College, and former Caltech postdoc Yukiyasu Kamitani of Kyoto University and ATR Computational Neuroscience Laboratories. Funding was provided by the National Institutes of Health, the Philanthropic Educational Organization Scholar Award Program, the Japan Science and Technology Agency’s Core Research for Evolutional Science and Technology program, and the Japan Society for the Promotion of Science’s KAKENHI program.