When you pick up a bag of carrots at the grocery store, your mind might quickly leap to thoughts of other vegetables like potatoes and parsnips. Or perhaps, if you’re preparing for a party or Super Bowl Sunday, your thoughts will turn to celery and dip. What might seem like a simple, automatic association is actually a fascinating display of how the human brain categorizes and interprets the world around us.
For years, scientists have held a commonly accepted view: the prefrontal cortex—the region of the brain responsible for reasoning, decision-making, and complex cognitive functions—is the final arbiter when it comes to categorizing objects. In this traditional model, the brain’s visual regions are seen as passive participants, essentially serving as data collectors that pass along information to the prefrontal cortex for processing.
However, a groundbreaking study led by Nuttida Rungratsameetaweemana, an assistant professor at Columbia Engineering, has revealed a surprising twist to this story. Published in Nature Communications on April 11, this research suggests that the visual system is far more active in decision-making than previously thought. It also opens up exciting new avenues for understanding cognitive flexibility and designing more adaptable artificial intelligence (AI) systems. According to Rungratsameetaweemana and her team, the brain’s visual regions don’t simply process visual data in a mechanical, preordained way. Instead, they dynamically adjust their interpretations based on the brain’s ongoing tasks and goals. In other words, your brain’s response to a carrot might differ dramatically depending on whether you’re preparing a healthy stew or gearing up for a snack-filled sporting event.
A New View of the Visual System
One of the most striking revelations of Rungratsameetaweemana’s research is that the brain’s visual regions, once thought to play a passive role, actively shape the way we understand and interpret visual information. This discovery challenges the established view that visual areas, such as the primary and secondary visual cortices, simply collect raw visual input and relay it to higher-order regions like the prefrontal cortex for categorization and decision-making.
In the new framework proposed by Rungratsameetaweemana, the visual system isn’t just a passive observer. It plays an active role in adapting and reshaping its interpretation of the environment, depending on what the brain is trying to accomplish at any given moment. Whether you’re identifying a carrot as part of a healthy meal or as a snack on a veggie tray, your brain’s visual system tunes its response in real-time based on the context in which the object is encountered.
This flexibility of visual processing is a profound insight into the brain’s ability to adapt and perform complex tasks. It offers a refreshing new perspective on cognitive flexibility, a hallmark of human intelligence. This work also opens doors for advancing artificial intelligence systems to become more adaptable and responsive to unpredictable or changing contexts, similar to the human brain’s remarkable ability to navigate complex and varied environments.
The Experiments Behind the Discovery
The research conducted by Rungratsameetaweemana and her colleagues employed advanced neuroimaging techniques to explore how the visual system adapts to different decision-making tasks. The team used functional magnetic resonance imaging (fMRI) to observe brain activity in participants as they engaged in categorization tasks.
In these experiments, participants were asked to categorize a series of shapes. The challenge, however, lay in the fact that the rules for categorizing the shapes were constantly shifting. In one trial, the participants might have been asked to sort the shapes based on color, and in the next, they might have been asked to categorize them by size or shape. This constant switching of rules provided a unique opportunity to investigate how the brain’s visual system responds to changing tasks and how it adapts its processing accordingly.
Rungratsameetaweemana and her team analyzed the fMRI data using machine-learning tools, including multivariate pattern classifiers. These tools allowed them to examine the complex patterns of brain activation as participants sorted shapes according to different criteria. The results revealed a striking insight: activity in the brain’s visual areas, including the primary and secondary visual cortices, shifted dynamically depending on the categorization task at hand.
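The multivariate approach described above can be illustrated with a toy example. The sketch below is not the authors’ analysis pipeline; it uses synthetic “voxel” patterns and a simple nearest-centroid classifier to show the basic idea of multivariate decoding: if activity patterns in a brain region carry category information, a classifier trained on those patterns can predict each trial’s category well above chance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "voxel" patterns: 60 trials x 50 voxels (purely illustrative data).
# Each trial's category nudges the pattern along a fixed direction in voxel
# space, mimicking category information embedded in visual-cortex activity.
n_trials, n_voxels = 60, 50
labels = rng.integers(0, 2, n_trials)            # category (0 or 1) per trial
signal_axis = rng.normal(size=n_voxels)          # direction carrying the signal
patterns = rng.normal(size=(n_trials, n_voxels)) # noise
patterns += np.outer(labels - 0.5, signal_axis) * 2.0

def nearest_centroid_accuracy(X, y):
    """Leave-one-out decoding with a nearest-centroid classifier."""
    correct = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i            # hold out trial i
        c0 = X[mask & (y == 0)].mean(axis=0)     # mean pattern, category 0
        c1 = X[mask & (y == 1)].mean(axis=0)     # mean pattern, category 1
        pred = int(np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0))
        correct += pred == y[i]
    return correct / len(y)

acc = nearest_centroid_accuracy(patterns, labels)
print(f"decoding accuracy: {acc:.2f}")  # well above the 0.5 chance level
```

In real fMRI studies, classifiers of this kind are applied to measured voxel patterns, and above-chance decoding is taken as evidence that the region represents the decoded variable; the study’s key finding was that what could be decoded from visual cortex changed with the active categorization rule.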
What was even more remarkable was that the brain’s visual system seemed to reconfigure itself most effectively when the task was most challenging. For example, when participants were asked to sort shapes that were near the boundary between categories—those shapes that were particularly difficult to distinguish—the brain’s visual system became more active and organized, indicating that it was using extra processing power to solve the task.
The Brain’s Real-Time Adaptability
The findings from this study point to a fascinating new perspective on cognitive flexibility. Historically, cognitive flexibility—the ability to shift between different mental tasks or strategies—has been thought to be managed primarily by the prefrontal cortex. The results of this research suggest that the visual system, often considered a more basic or passive sensory processing system, is much more engaged in flexible decision-making than previously believed.
This has profound implications for understanding how the brain adapts to new or unexpected situations. The ability to switch between different strategies and adjust to shifting rules is a hallmark of human cognition. In fact, it is one of the key characteristics that set us apart from other species and allow us to thrive in dynamic environments. Whether we’re navigating social interactions or responding to changes in our physical environment, our brains are constantly adjusting and recalibrating, ensuring that we stay agile and adaptable.
For example, imagine you’re at a party with a platter of carrots and celery in front of you. If you’re in the mindset of enjoying a healthy snack, you might immediately recognize the carrots as part of the veggie tray. But if you’re at a Super Bowl party and the carrots are part of a larger platter of snack foods, your brain may shift its interpretation to categorize the carrots as a part of a more indulgent spread.
What this study shows is that the brain’s visual regions are not merely passively taking in information, but are instead actively tuning their responses to align with the current task or goal. This flexibility in sensory processing allows us to adapt quickly to changing circumstances, and it gives us a deeper understanding of how the brain is able to perform such complex tasks effortlessly.
Implications for AI and Cognitive Disorders
Beyond its implications for understanding human cognition, this research also has exciting potential applications in the field of artificial intelligence. AI systems, particularly those involved in image recognition and categorization, have made tremendous strides in recent years. However, one area where AI still struggles is in adapting to new or unexpected contexts. Unlike humans, who can easily switch between tasks or categories based on the situation, AI systems often require extensive retraining when faced with new inputs or shifting goals.
By studying how the human brain adjusts to changing contexts and adapts its visual processing strategies, researchers may be able to design more flexible and adaptive AI systems. These AI systems could become better equipped to handle real-time decision-making and respond dynamically to unpredictable situations, much like the human brain does.
In addition, the study’s findings may have important implications for understanding cognitive flexibility in individuals with disorders such as ADHD or autism. These conditions are often characterized by difficulties in shifting attention or adapting to changing rules or environments. By exploring how the visual system plays a role in flexible cognition, researchers may be able to develop new approaches for diagnosing and treating cognitive disorders that affect flexibility and adaptive thinking.
What’s Next? Exploring the Neural Circuits Behind Flexibility
As the team led by Rungratsameetaweemana continues its research, they are now moving into even deeper territory: studying the neural circuits responsible for flexible coding at the level of individual neurons. While fMRI allows researchers to examine large populations of neurons, it doesn’t provide the resolution needed to explore the intricacies of neural circuit activity. In a follow-up study, the team is recording neural activity directly from inside the skull, allowing them to explore how specific neurons and circuits contribute to flexible, goal-directed behavior.
In addition to advancing our understanding of the human brain, these insights may help improve AI systems. Researchers are now working to design artificial models that can better mimic the brain’s flexibility, allowing them to adapt not just to new inputs, but to new contexts and tasks. The ultimate goal is to create AI that is more fluid, more adaptable, and better able to navigate the complexities of the real world.
Conclusion: A New Era of Cognitive Flexibility
This groundbreaking research is reshaping our understanding of how the brain works. Far from being a passive recipient of visual information, the visual system plays a dynamic and active role in shaping our decisions, perceptions, and actions. By examining how the brain adapts to shifting goals and contexts, Rungratsameetaweemana and her team are not only shedding light on the extraordinary flexibility of the human brain, but they are also paving the way for advances in AI and treatments for cognitive disorders.
As we continue to unravel the mysteries of the brain, one thing is clear: the human mind is far more adaptable and resourceful than we ever imagined. From simple grocery store decisions to complex problem-solving, our brains are constantly at work, reshaping the way we interact with the world around us.
Reference: Margaret M. Henderson et al., “Dynamic categorization rules alter representations in human visual cortex,” Nature Communications (2025). DOI: 10.1038/s41467-025-58707-4