New research suggests you’ll do better at a video game with the sound on.
“We live in a multisensory world,” says Robert Sekuler, professor of psychology and of neuroscience at Brandeis University, who studies how the human senses, especially sight and hearing, work with and compete with one another.
“How does the brain combine information from different senses? How does it decide when stimuli should and shouldn’t be combined?”
To help answer these questions, Sekuler, computer science professor Tim Hickey, and coauthor Yile Sun created a video game they call “Fish Police!!”
In the game, a boat’s bow peeks out from the bottom of the screen. Above it, fish swim horizontally against a watery turquoise background. Players see two kinds of fish. One is a “good” fish, which slowly oscillates in size as it swims across the screen. The other is a “bad” fish, which oscillates more rapidly.
Over the course of the five-minute game, good and bad fish appear randomly, one after the other. Players must press one button on the joystick whenever they see a good fish and another button whenever they spot a bad fish. They earn a point every time they respond correctly within two seconds.
The game also has an auditory element. While a fish swims, the game plays a tone whose loudness fluctuates, either slowly or rapidly.
Players performed best when the rate at which a fish fluctuated in size matched the rate at which the sound fluctuated in loudness.
When the cues were mismatched—if the sound’s loudness changed rapidly but the fish’s size changed slowly, for example—players performed less well. And when the game was on mute, players fared even worse than they did when their eyes and ears were both engaged.
The research suggests humans perform best when multiple senses are involved and sensory inputs agree with one another. “Any time you have multiple sources of information and they’re correlated, performance improves,” Sekuler says.
The findings appear in the Journal of the Acoustical Society of America.
Source: Brandeis University
Original Study DOI: 10.1121/1.4979470
Article by Lawrence Goodman, Brandeis University, via Futurity