Scientists Develop New Theory to Understand Why Our Perception Is Biased
Researchers examined decades of data to create a unifying theory that explains biases in perception.
Photo-Illustration: Martha Morales/The University of Texas at Austin
How humans perceive the world around them is a complex dance among input from the various senses, how the brain encodes that information, and how it all interacts with previous experiences. What we perceive is often systematically different from reality, leading to what is known as perceptual bias.
Neuroscientists’ work to understand perceptual bias involves exploring how the brain processes information about different stimuli, including color, movement, size, and the number and orientation of objects. Decades of experimental data have yielded several insights about common perceptual biases, but those insights sometimes contradict one another.
Researchers at The University of Texas at Austin and Saarland University in Germany have created a new unifying theory of perceptual biases that combines decades of data and reconciles even contradictory phenomena in a model that can go so far as to predict the biases of individuals.
The theory is outlined in a paper out this week in Nature Neuroscience.
“Perception is not simply about understanding the environment around us as it is, but about how our brains reconstruct the environment around us,” said Xue-Xin Wei, assistant professor of neuroscience and psychology at UT Austin. “This theory allows us to understand how humans see the world and predict how they will see it, as well as how they may behave.”
One example of perceptual bias is that a slightly tilted bar is often perceived as more tilted than it is. People perceive objects at a distance as being smaller than they are. People may perceive colors differently based on the color of objects nearby or colors they were shown previously.
Bias in perception traces its roots to many sources: previous experiences, irrelevant sensory information (often called sensory noise), how frequently something is observed in the environment, and even how our brains penalize errors in our estimations. The new theory accounts for all of these.
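To make those ingredients concrete, here is a minimal Monte Carlo sketch of an efficient-coding Bayesian observer in Python. It illustrates the general modeling approach rather than the paper’s actual model: the exponential prior, the noise level, the grid, and the squared-error loss are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Toy efficient-coding Bayesian observer. The three ingredients named above:
#   1. how often a stimulus occurs in the environment -> the prior
#   2. irrelevant sensory information                 -> encoding noise
#   3. how the brain penalizes estimation errors      -> the loss function
#      (squared error here, so the estimate is the posterior mean)

theta = np.linspace(-45.0, 45.0, 1801)    # orientation grid, in degrees
prior = np.exp(-np.abs(theta) / 10.0)     # assumption: near-vertical tilts are common
prior /= prior.sum()                      # discrete probabilities on the grid
F = np.cumsum(prior)                      # cumulative distribution of the prior

true_theta = 5.0                          # a slightly tilted bar
F_true = np.interp(true_theta, theta, F)  # its position in encoding space
noise_sd = 0.06                           # sensory noise in encoding space

# Efficient encoding: measurements live in the prior's cumulative (quantile)
# space, so frequent stimuli get more representational resolution.
n_trials = 20000
m = F_true + rng.normal(0.0, noise_sd, n_trials)

estimates = np.empty(n_trials)
for i in range(n_trials):
    likelihood = np.exp(-(m[i] - F) ** 2 / (2.0 * noise_sd ** 2))
    posterior = likelihood * prior
    posterior /= posterior.sum()
    estimates[i] = theta @ posterior      # posterior mean (squared-error loss)

print(f"true tilt {true_theta:.1f} deg, "
      f"mean perceived tilt {estimates.mean():.2f} deg, "
      f"bias {estimates.mean() - true_theta:+.2f} deg")
```

In this toy setup the average decoded tilt comes out slightly larger than the true tilt: the asymmetry that efficient encoding induces in the likelihood pushes estimates away from the common vertical orientation, echoing the tilted-bar example above.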
“This work has implications not only for basic scientific understanding of perception, but also for mental disorders, as people with certain types of psychiatric conditions have been reported to exhibit different perceptual biases,” Wei said.
The work is also relevant to social science, in particular neuroeconomics, the study of how we make economic decisions. The theory could be applied, for example, to better understand how humans perceive the value of an item, as in product design. Understanding how humans perceive the item itself, the biases that may be present, how the brain calculates the item’s value, and the behaviors that follow could influence how goods are priced and how products are marketed.
“We were mainly dealing with the perception of simple stimuli such as color and magnitude in this study, but the same principles can be applied to more complex variables,” Wei said. “For example, we can use a similar approach to study how we perceive emotions, such as happiness or sadness.”
Given the complexities of biases in perceptual decisions, some may wonder: Is there a path forward to reduce or even eliminate these biases?
“According to the theory, the best way to reduce the biases in perceptual decisions is to reduce the noise, or in other words, to gather more data before the decision is made. More information, less bias,” Wei said.
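That intuition can be checked in the toy sketch above (again, under the same illustrative assumptions, not the paper’s analysis): shrinking the encoding noise shrinks the bias.

```python
# Continues the sketch above (reuses np, rng, theta, prior, F, F_true, true_theta).
def simulated_bias(noise_sd, n_trials=20000):
    """Average estimation bias of the toy observer at a given noise level."""
    m = F_true + rng.normal(0.0, noise_sd, n_trials)
    estimates = np.empty(n_trials)
    for i in range(n_trials):
        posterior = np.exp(-(m[i] - F) ** 2 / (2.0 * noise_sd ** 2)) * prior
        posterior /= posterior.sum()
        estimates[i] = theta @ posterior
    return estimates.mean() - true_theta

for sd in (0.12, 0.06, 0.03):             # repeatedly halve the sensory noise
    print(f"noise sd {sd:.2f} -> bias {simulated_bias(sd):+.2f} deg")
```

With these illustrative settings the printed bias falls as the noise falls, in line with Wei’s “more information, less bias.”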
Michael Hahn of Saarland University in Germany was also an author of the paper. Funding for the research was provided by UT Austin, and computing for the project was provided by Saarland University.