All Together Now
TCCI Blog | Jul 17, 2025



Summary
AI Prize finalist Aditya Nair uses machine learning to uncover hidden patterns in brain activity related to emotions like anger and fear. While individual neurons show little correlation to emotional states, Nair’s model analyzes collective neural activity—like listening to an orchestra—to detect persistent emotional signals. His work revealed a “line attractor” pattern in the hypothalamus, offering new ways to quantify emotional intensity over time. This breakthrough could improve psychiatric drug development and expand how researchers interpret brain data. Nair is also building tools to help scientists apply AI models more easily, making advanced neural analysis accessible across the research community.
———————————————————————————————————————————————————————————————————————————————
Full Article
When you experience an emotion like hunger, anger, or fear, specific circuits in your brain light up. But look at individual cells within those circuits, and there’s often startlingly little correlation between the cell’s activity and a person’s emotional experience.
“It’s a big paradox: we know from animal studies that these circuits control emotions, because if we silence the circuits, animals stop displaying affective behavior,” says Chen Institute and Science AI Prize finalist Aditya Nair. “But when we take recordings from individual neurons, there’s no direct connection to the behavior in question.”
To untangle that riddle, Nair fed neural activity data from an area of the hypothalamus involved in emotions such as aggression into a machine learning model capable of detecting subtle patterns that emerge as neurons interact with one another. “It’s like listening to a symphony: nothing makes sense if you listen to a single instrument,” he explains. “You need to hear the entire orchestra in order to understand the melody.”
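To make the “orchestra” intuition concrete, here is a minimal, purely illustrative sketch in Python, using hypothetical simulated data rather than Nair’s recordings or method: each simulated neuron carries only a weak, noisy trace of a latent “emotion intensity” signal, so no single cell correlates strongly with it, yet a simple population-level readout recovers the signal clearly.

```python
# Illustrative only: hypothetical simulated data, not Nair's recordings or model.
# Each simulated neuron carries a weak, noisy trace of a latent "emotion
# intensity" signal, so single-cell correlations are small, but a simple
# population-level readout (the first principal component) recovers the signal.
import numpy as np

rng = np.random.default_rng(0)
T, N = 1000, 200                              # time points, neurons
intensity = np.cumsum(rng.normal(size=T))     # slowly drifting latent signal
intensity /= intensity.std()

weights = rng.normal(scale=0.2, size=N)       # each cell couples only weakly
rates = np.outer(intensity, weights) + rng.normal(size=(T, N))  # noisy activity

# Single neurons: each correlates only weakly with the latent signal
single_r = [abs(np.corrcoef(rates[:, i], intensity)[0, 1]) for i in range(N)]
print("median |r| per neuron:", round(float(np.median(single_r)), 2))

# Population readout: project activity onto its first principal component
rates_c = rates - rates.mean(axis=0)
_, _, Vt = np.linalg.svd(rates_c, full_matrices=False)
pop_mode = rates_c @ Vt[0]
print("|r| of population mode:", round(abs(np.corrcoef(pop_mode, intensity)[0, 1]), 2))
```

In a typical run the median single-neuron correlation stays small while the population mode tracks the hidden signal closely, which is the orchestra effect Nair describes: the melody is only audible from the ensemble.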
The algorithm Nair used revealed that as choruses of neurons trade and recirculate signals, they settle into a complex relationship known as a line attractor—a sort of sliding scale, common in digital neural networks but seldom previously observed in the living brain, that allows continuous variables to be stored over time. “That’s important because emotions have two key properties: they vary in intensity, and they persist over time,” Nair says. “By fitting AI models to neural activity, we’ve found a previously undetectable signal in the brain with both these qualities.”
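The line-attractor idea can be sketched the same way, again with made-up numbers rather than the model actually fitted in the study: a toy two-neuron linear network whose recurrent weights have one eigenvalue at 1 integrates inputs along that eigenvector, so a brief input shifts the population state to a new point on a line, and that position, a continuous intensity value, persists after the input ends.

```python
# Illustrative only: a toy two-neuron line attractor, not the model fitted in the study.
# The recurrent weights have eigenvalues 1.0 (a persistent mode) and 0.5 (a decaying
# mode), so the network integrates inputs along the slow eigenvector: each brief
# input shifts the state to a new point on the line, and that position persists.
import numpy as np

W = np.array([[0.75, 0.25],
              [0.25, 0.75]])                  # eigenvalues: 1.0 and 0.5
u = np.array([1.0, 1.0]) / np.sqrt(2)         # the slow (persistent) eigenvector

T = 200
pulses = np.zeros(T)
pulses[20], pulses[80] = 1.0, 2.0             # two brief inputs of different strength

x = np.zeros(2)                               # population state
position = np.zeros(T)                        # state projected onto the attractor line
for t in range(T):
    x = W @ x + pulses[t] * u
    position[t] = x @ u

# The projected value steps up after each pulse and then holds steady, storing a
# continuous "intensity" over time instead of decaying back to zero.
print(position[[10, 30, 70, 100, 199]].round(2))   # -> [0. 1. 1. 3. 3.]
```

In practice, models like the ones Nair fits infer such structure from recorded activity rather than assuming it, but the toy network captures why a line attractor can hold a graded, persistent emotional signal.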
That’s a key breakthrough with potentially important implications for drug discovery. “Animals can’t self-report emotions, and only show affect through a handful of different behaviors,” Nair explains. “Now, we can use brain activity to quantify how hungry, or angry, or fearful a subject is—which should help us develop and test psychiatric treatments much more effectively.”
Nair’s methods could one day help neuroscientists to detect emergent signals from much broader patterns of brain activity. “So far, we’ve only studied a sliver of the hypothalamus,” Nair says. “But we’re developing toolkits to model activity across larger brain regions, and read the hidden signals that emerge when hundreds of thousands of neurons perform multiple computations simultaneously.”
To accelerate that process, Nair has developed a large language model that can rapidly process brain activity data to reveal emergent signals. The goal: to let even researchers with limited expertise in AI run standardized analyses quickly. “We’re making AI tools more accessible to the community in order to accelerate research and drug discovery,” Nair says.
© 2025 Tianqiao and Chrissy Chen Institute