Is perception really reality? Philosophical considerations aside, scientists agree that perception certainly dictates reality. Throughout our lifetime, our brains build up a world based on what our senses detect. Often, these senses process environmental stimuli without us even realizing it. For example, the subconscious brain can pick up the rhythm of conversations or notice behavioral patterns that our conscious mind rarely pauses to consider.
How exactly the brain transforms sensory input into the neural activity that drives perception and behavior remains a mystery. Understanding this process could have huge implications for how therapists treat conditions ranging from autism to strokes and even traumatic brain injuries. However, the implications aren’t only biological. Research in this area could help improve artificial intelligence systems. After all, these systems are built on our understanding of how the human brain uses, processes, and learns from sensory inputs.
At Cold Spring Harbor Laboratory (CSHL), an energetic group of neuroAI researchers has been making exciting breakthroughs in the neuroscience of perception. Their discoveries could open the door to more sensible AI and new therapeutic strategies.
Timing is everything
Verbal communication is one meaningful way we make sense of the world. CSHL Assistant Professor Arkarup Banerjee has long been interested in how the auditory and motor systems work together to enable free-flowing conversations. In the average discussion, only 200 milliseconds (about one-fifth of a second) pass between when one person stops talking and another responds.
“That’s really, really fast,” Banerjee says. “Think of all that needs to happen in one-fifth of a second. You need to hear and process the auditory input. You need to access your memory, trying to understand what those words mean. Then you need to decide whether you want to respond or not. Then you have to plan for it and move your muscles appropriately in order to speak the words you want to say.”
To better understand how this works in the brain, Banerjee focuses on a species of mouse that can “sing” uptempo duets. Alston’s singing mice take turns vocalizing with each other, using songs that can last many seconds and may contain up to about 100 human-audible notes. There are often only 200 to 300 milliseconds between each call and response—a pace similar to human conversation.
CSHL Assistant Professor Arkarup Banerjee demonstrates the abilities of Alston’s singing mice and discusses what they may be able to teach us about human communication.
Banerjee’s team measured electrical activity in the mouse brains and examined neurons in a region called the orofacial motor cortex to see what they could uncover about the rodents’ chatter. Naturally, during sing-offs, the mice vary the length of their performances. They may start with a six-second song, followed by a 12-second song, then a seven-second song.
When examining their data, Banerjee and his colleagues found something unexpected. The mice’s neural activity seemed to “bend” time to help them generate songs of varying tempos and communicate more effectively. Instead of ticking off time in fixed increments like a clock, the mice’s brains measured time in relative intervals: their neural activity stretched or compressed to match each song’s length.
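To make that distinction concrete, here is a minimal Python sketch, an illustration of relative versus clock-like time coding rather than the lab’s actual analysis. A toy firing-rate profile is defined over the fraction of the song completed, so the same trajectory stretches for a longer song and compresses for a shorter one.

```python
import numpy as np

def neural_trajectory(duration_s, n_samples=200):
    """Toy firing-rate profile defined over relative time (song phase), not seconds."""
    phase = np.linspace(0.0, 1.0, n_samples)      # fraction of the song completed
    rate = np.exp(-((phase - 0.6) ** 2) / 0.02)   # made-up profile peaking at 60% of the song
    time_s = phase * duration_s                   # map phase back onto real seconds
    return time_s, rate

# A 6-second and a 12-second song trace out the same shape, just stretched in time:
# the signature of relative (tempo-scaled) coding rather than clock-like coding.
for dur in (6.0, 12.0):
    t, r = neural_trajectory(dur)
    peak = t[np.argmax(r)]
    print(f"{dur:>4.1f} s song: peak activity at {peak:.1f} s ({peak / dur:.0%} of the song)")
```

In this scaled scheme, the 12-second song is not “twice as much” neural activity; it is the same pattern played more slowly.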

Human brains similarly bend time without us even realizing it. An hour spent doing a boring task can feel much longer than an hour hanging out with friends. Banerjee finds that in mice, this manipulation enables greater vocal flexibility.
“One reason we have a brain is because it allows flexibility in our behavior. We can change things on the fly. We can adapt,” he says. “To study that, you have to choose a suitable behavior in a model system. In this case, we work on vocal interactions in the singing mouse.”
Banerjee says this research not only advances our understanding of hearing and communication but also offers a new framework for thinking about how the brain manages social interactions. Breakthroughs like this could one day inform speech therapy strategies for children with autism and for people recovering from strokes or other brain injuries.
Seeing is believing
CSHL Assistant Professor Benjamin Cowley is trying to understand how our brain processes information from a different sense: our sight. And he has a fascinating way of going about it. Cowley builds AI-powered models to predict animal behavior.
Because the human brain is so complex, Cowley’s research focuses on fruit fly brains, which are organized similarly to ours but are far simpler. Human brains have almost 100 billion neurons; fruit flies have only around 100,000. Yet the fruit fly’s visual system must keep pace with flight. It can respond to a visual input in less than five milliseconds, while humans often take about 200 milliseconds.

Cowley investigates how male fruit flies respond to visual cues from females during the courtship process. Courtship rituals are a serious affair for the little flies. Upon spotting a female, a male will chase her down, tap her abdomen with his foreleg, and vibrate his wings to perform a series of “love songs.” In the wild, these displays last only around five seconds. However, in a lab, researchers can extend the “dates” to around 30 minutes. For a fly that lives only about three weeks, a half-hour is a very long time. “This is like a month-long date at Starbucks,” Cowley says. “That gives us a lot of rich data to be able to see this courtship behavior unfold.”
In a recent study, Cowley and colleagues silenced specific visual neurons in male flies and trained an AI model to detect subtle changes in the animals’ behavior after they were presented with a female. The team conducted multiple rounds of experiments, silencing a different visual neuron each time until the model could accurately predict how a real-life fly would respond to its female companion.
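The logic of that experiment-and-model loop can be sketched in a few lines of Python. Everything below is a hypothetical skeleton with placeholder data and generic neuron labels, not the published pipeline: for each silenced cell type, a small network is fit to predict the male’s moment-to-moment behavior from the visual features of the female he sees.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def load_trials(neuron_type):
    """Placeholder for real recordings: per-frame visual features of the female
    (position, size, motion) and the male's behavioral response (e.g., turning)."""
    X = rng.normal(size=(1000, 4))                              # hypothetical visual features
    y = X @ rng.normal(size=4) + rng.normal(scale=0.1, size=1000)
    return X, y

# Generic labels only; in the study, a different visual neuron type was silenced each round.
models = {}
for neuron_type in ["control", "silenced_type_A", "silenced_type_B"]:
    X, y = load_trials(neuron_type)
    model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    models[neuron_type] = model.fit(X, y)
    print(f"{neuron_type}: training R^2 = {model.score(X, y):.2f}")
```

Comparing how the fitted models’ predictions change when a given neuron type is silenced is what points to which neurons shape which behaviors.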
The team figured out that flies process visual data using groups of neurons rather than a single neuron connecting each visual feature to one action. That finding could mark a major shift in how we think about sight. However, it’s still unclear just how the brain processes visual cues.
“How does a stimulus—an image—transform into this useful neural code? What are the transformations that do that?” Cowley wonders. “We don’t yet understand that in the fruit fly. And we don’t understand that in the human visual system as well. There are some ideas. But we haven’t nailed down the exact computations.”
Ever wish you could be a fly on the wall? CSHL Assistant Professor Benjamin Cowley explores the brain’s visual cortex through the eyes of the fruit fly.
Figuring out these computations could lead to incredible advances in AI models that process visual stimuli. Imagine, for example, AI that can distinguish between tumors and noncancerous growths. Other applications could be even closer in sight. Think of AI systems that allow self-driving cars to respond to visual cues more quickly, safely, and reliably.
But that’s not all. Cowley’s fruit flies could also tell us something about energy efficiency. Consider that flies’ brains enable them to walk, respond to mates, fend off competitors, and escape predators—all using minuscule amounts of energy. For all its computational power, today’s AI is nowhere near that level of efficiency.
What makes us, us
Cowley and Banerjee are just two CSHL neuroscientists at the cutting edge of perceptual science. CSHL Professor Florin Albeanu and Associate Professor Saket Navlakha have uncovered fascinating insights into the world of odor. Meanwhile, Assistant Professor Gabrielle Pouchelon considers how early environmental factors like sensory cues affect the developing brain. And Assistant Professor Helen Hou investigates behavioral responses through facial expressions. Additionally, in 2024, CSHL welcomed two new labs specializing in visual computation. Assistant Professor David Klindt came to CSHL from Stanford University following a stint at Meta. And Mitra Javadzadeh joined the CSHL Fellows program after completing postdoctoral research at University College London.
Together, their work could one day enable us to build artificial sensory technologies that restore lost senses or even heighten our existing abilities. But perhaps even more exciting are the questions that will be addressed along the way.
“Our brain is the seat of all our intelligence—all emotions, thoughts, and memories,” says Banerjee. “In a very real sense, this is what makes us, us. The question then becomes: ‘How does a relatively small organ manage all of these complex and crucial tasks?’” For Banerjee, “The answer lies in being able to understand how neurons in the brain connect to each other. How does the electrical activity in the brain allow humans and other animals to do what they do? It’s the grandest challenge in the universe.”
Banerjee, Cowley, and their colleagues are up for the challenge and excited to see where it takes them next.
Written by: Margaret Osborne, Science Writer | publicaffairs@cshl.edu | 516-367-8455