When you see a bag of carrots at the supermarket, does your mind turn to potatoes and parsnips, or buffalo wings and celery?
It depends, of course, on whether you are preparing a hearty winter stew or getting ready to watch the Super Bowl.
Most scientists agree that categorising an object – such as thinking of a carrot as either a root vegetable or a party snack – is the responsibility of the prefrontal cortex, the part of the brain that governs reasoning and other advanced functions that make us intelligent and social.
In this view, the eyes and the visual regions of the brain act rather like a security camera, collecting data and processing it in a standard way before handing it over for analysis.
However, a new study led by biomedical engineer and neuroscientist Nuttida Rungratsameetaweemana, an assistant professor at Columbia Engineering, shows that the brain’s visual regions are actively involved in interpreting information.
Importantly, the way these regions process what we see depends on what the rest of the brain is focused on.
If it is Super Bowl Sunday, the visual system will recognise those carrots on a vegetable platter before the prefrontal cortex even registers their presence.
Published earlier this month in Nature Communications, the study offers some of the clearest evidence yet that early sensory systems play a role in decision making – and that they adapt in real time.
It also points to new ways of designing artificial intelligence systems that can adjust to new or unexpected situations.