NS/ Molecular glue explains how memories last a lifetime

Paradigm
Published Jul 25, 2024 · 17 min read

Neuroscience biweekly vol. 112, 11th July — 25th July

TL;DR

  • The study in Science Advances identifies KIBRA as a crucial molecule that helps maintain long-term memories by acting as a “glue” for other memory-related molecules, particularly PKMzeta. This interaction forms persistent synaptic tags on strong synapses, ensuring memory stability despite the continual turnover of synaptic molecules. This mechanism, likened to Theseus’s Ship, explains how memories can last for years despite molecular changes.
  • In a Nature article, neuroscientists argue that language primarily serves as a tool for communication, not for internal thought. Studies using brain imaging show that language-processing areas are inactive during tasks like problem-solving, suggesting thinking can occur independently of language. Language’s optimization for efficient communication across diverse languages further supports this view, challenging the idea that language is necessary for shaping thought processes.
  • Researchers in a study published in Nature have identified how astrocytes, a type of brain cell, monitor and respond to the energy demands of neurons. They discovered that astrocytes use specific receptors to detect neuronal activity and activate pathways that increase glucose metabolism and energy supply, crucial for maintaining brain functions like learning, memory, and sleep. Disabling these receptors in mice led to impaired brain activity, memory deficits, and disrupted sleep patterns. This understanding could inform new therapies for conditions where brain energy metabolism declines, such as neurodegenerative diseases like Alzheimer’s.
  • Scientists from Aarhus University and the University of Oxford have discovered how the brain reacts when we recognize and predict musical sequences. Listening to music activates a complex chain of events in the brain involving areas responsible for sound processing, emotions, and memory. This process helps us quickly recognize songs and anticipate their progression, making music an enjoyable and familiar experience. Understanding these brain mechanisms could potentially aid in developing screening tools for conditions like dementia, by assessing how individuals’ brain activity responds to music. Future studies aim to explore these processes further and their implications for cognitive health and neurological conditions.
  • Researchers at Baylor College of Medicine, the University of Cambridge in the U.K. and collaborating institutions have shown that the serotonin 2C receptor in the brain regulates memory in people and animal models. The findings, published in the journal Science Advances, provide new insights not only into the factors involved in healthy memory but also into conditions associated with memory loss, such as Alzheimer’s disease, and suggest novel avenues for treatment.

Neuroscience market

The global neuroscience market size was valued at USD 28.4 billion in 2016 and is expected to reach USD 38.9 billion by 2027.

The latest news and research

KIBRA anchoring the action of PKMζ maintains the persistence of memory

by Tsokas P, Hsieh C, Flores-Obando RE, et al. in Science Advances

The study in Science Advances identifies KIBRA as a crucial molecule that helps maintain long-term memories by acting as a “glue” for other memory-related molecules, particularly PKMzeta. This interaction forms persistent synaptic tags on strong synapses, ensuring memory stability despite the continual turnover of synaptic molecules. This mechanism, likened to Theseus’s Ship, explains how memories can last for years despite molecular changes.

Whether it’s a first visit to the zoo or learning to ride a bicycle, we keep memories from our childhoods well into our adult years. But what explains how these memories last nearly an entire lifetime?

A new study in the journal Science Advances, conducted by a team of international researchers, has uncovered a biological explanation for long-term memories. It centers on the discovery of the role of a molecule, KIBRA, that serves as a “glue” to other molecules, thereby solidifying memory formation.

“Previous efforts to understand how molecules store long-term memory focused on the individual actions of single molecules,” explains André Fenton, a professor of neural science at New York University and one of the study’s principal investigators. “Our study shows how they work together to ensure perpetual memory storage.”

“A firmer understanding of how we keep our memories will help guide efforts to illuminate and address memory-related afflictions in the future,” adds Todd Sacktor, a professor at SUNY Downstate Health Sciences University and one of the study’s principal investigators.

It’s been long established that neurons store information in memory as a pattern of strong and weak synapses, which determines the connectivity and function of neural networks. However, the molecules in synapses are unstable: they continually move around in the neurons, wear out, and are replaced within hours to days. This raises the question: how can memories remain stable for years to decades?

In a study using laboratory mice, the scientists focused on the role of KIBRA, or kidney and brain expressed protein, human genetic variants of which are associated with both good and poor memory. They focused on KIBRA’s interactions with other molecules crucial to memory formation — in this case, protein kinase Mzeta (PKMzeta). This enzyme is the most crucial known molecule for strengthening normal mammalian synapses, but it degrades after a few days.

Their experiments reveal that KIBRA is the “missing link” in long-term memories, serving as a “persistent synaptic tag,” or glue, that sticks to strong synapses and to PKMzeta while also avoiding weak synapses.

“During memory formation the synapses involved in the formation are activated — and KIBRA is selectively positioned in these synapses,” explains Sacktor, a professor of physiology, pharmacology, anesthesiology, and neurology at SUNY Downstate. “PKMzeta then attaches to the KIBRA-synaptic-tag and keeps those synapses strong. This allows the synapses to stick to newly made KIBRA, attracting more newly made PKMzeta.”

More specifically, their experiments in the Science Advances paper show that breaking the KIBRA-PKMzeta bond erases old memory. Previous work had shown that increasing PKMzeta throughout the brain enhances weak or faded memories. This was puzzling, because an enzyme acting at random locations should have had the opposite effect. Persistent synaptic tagging by KIBRA explains why the additional PKMzeta was memory enhancing: it acts only at the KIBRA-tagged sites.

“The persistent synaptic tagging mechanism for the first time explains these results that are clinically relevant to neurological and psychiatric disorders of memory,” observes Fenton, who is also on the faculty at NYU Langone Medical Center’s Neuroscience Institute.

The paper’s authors note that the research affirms a concept introduced in 1984 by Francis Crick. Sacktor and Fenton point out that Crick’s hypothesis for how the brain stores memory despite constant cellular and molecular change amounts to a Theseus’s Ship mechanism — borrowed from the philosophical paradox, rooted in Greek mythology, in which new planks gradually replace old ones while the ship remains the same ship for years.

“The persistent synaptic tagging mechanism we found is analogous to how new planks replace old planks to maintain Theseus’s Ship for generations, and allows memories to last for years even as the proteins maintaining the memory are replaced,” says Sacktor. “Francis Crick intuited this Theseus’s Ship mechanism, even predicting the role for a protein kinase. But it took 40 years to discover that the components are KIBRA and PKMzeta and to work out the mechanism of their interaction.”

Language is primarily a tool for communication rather than thought

by Fedorenko E, Piantadosi ST, Gibson EAF in Nature

In a Nature article, neuroscientists argue that language primarily serves as a tool for communication, not for internal thought. Studies using brain imaging show that language-processing areas are inactive during tasks like problem-solving, suggesting thinking can occur independently of language. Language’s optimization for efficient communication across diverse languages further supports this view, challenging the idea that language is necessary for shaping thought processes.

Language is a defining feature of humanity, and for centuries, philosophers and scientists have contemplated its true purpose. We use language to share information and exchange ideas — but is it more than that? Do we use language not just to communicate, but to think?

In the June 19 issue of the journal Nature, McGovern Institute for Brain Research neuroscientist Evelina Fedorenko and colleagues argue that we do not. Language, they say, is primarily a tool for communication.

Fedorenko acknowledges that there is an intuitive link between language and thought. Many people experience an inner voice that seems to narrate their own thoughts. And it’s not unreasonable to expect that well-spoken, articulate individuals are also clear thinkers. But as compelling as these associations can be, they are not evidence that we actually use language to think.

“I think there are a few strands of intuition and confusions that have led people to believe very strongly that language is the medium of thought,” she says. “But when they are pulled apart thread by thread, they don’t really hold up to empirical scrutiny.”

For centuries, language’s potential role in facilitating thinking was nearly impossible to evaluate scientifically. But neuroscientists and cognitive scientists now have tools that enable a more rigorous consideration of the idea. Evidence from both fields, which Fedorenko, MIT brain and cognitive scientist and linguist Edward Gibson, and University of California at Berkeley cognitive scientist Steven Piantadosi review in their Nature Perspective, supports the idea that language is a tool for communication, not for thought.

“What we’ve learned by using methods that actually tell us about the engagement of the linguistic processing mechanisms is that those mechanisms are not really engaged when we think,” Fedorenko says. Also, she adds, “you can take those mechanisms away, and it seems that thinking can go on just fine.”

Over the past 20 years, Fedorenko and other neuroscientists have advanced our understanding of what happens in the brain as it generates and understands language. Now, using functional MRI to find parts of the brain that are specifically engaged when someone reads or listens to sentences or passages, they can reliably identify an individual’s language-processing network. Then they can monitor those brain regions while the person performs other tasks, from solving a sudoku puzzle to reasoning about other people’s beliefs.

“Pretty much everything we’ve tested so far, we don’t see any evidence of the engagement of the language mechanisms,” Fedorenko says. “Your language system is basically silent when you do all sorts of thinking.”

That’s consistent with observations from people who have lost the ability to process language due to an injury or stroke. Severely affected patients can be completely unable to process words, yet this does not interfere with their ability to solve math problems, play chess, or plan for future events.

“They can do all the things that they could do before their injury. They just can’t take those mental representations and convert them into a format which would allow them to talk about them with others,” Fedorenko says. “If language gives us the core representations that we use for reasoning, then … destroying the language system should lead to problems in thinking as well, and it really doesn’t.”

Conversely, intellectual impairment is not always associated with language impairment: people with intellectual disability or neuropsychiatric disorders that limit their ability to think and reason do not necessarily have problems with basic linguistic functions. Just as language does not appear to be necessary for thought, Fedorenko and colleagues conclude that it is also not sufficient to produce clear thinking.

In addition to arguing that language is unlikely to be used for thinking, the scientists considered its suitability as a communication tool, drawing on findings from linguistic analyses. Analyses across dozens of diverse languages, both spoken and signed, have found recurring features that make them easy to produce and understand.

“It turns out that pretty much any property you look at, you can find evidence that languages are optimized in a way that makes information transfer as efficient as possible,” Fedorenko says.

That’s not a new idea, but it has held up as linguists analyze larger corpora across more diverse sets of languages, which has become possible in recent years as the field has assembled corpora that are annotated for various linguistic features. Such studies find that across languages, sounds and words tend to be pieced together in ways that minimize effort for the language producer without muddling the message. For example, commonly used words tend to be short, while words whose meanings depend on one another tend to cluster close together in sentences. Likewise, linguists have noted features that help languages convey meaning despite potential “signal distortions,” whether due to attention lapses or ambient noise.
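The tendency described above — frequent words being short — is known as Zipf’s law of abbreviation, and it can be checked on any text sample. Here is a toy sketch (not from the paper; the miniature corpus is invented for illustration) comparing the average length of the most frequent words against the rest:

```python
from collections import Counter

# Toy corpus; real analyses use large, linguistically annotated corpora.
text = (
    "the cat sat on the mat and the dog sat on the rug "
    "while the extraordinarily patient photographer waited"
).split()

freq = Counter(text)
ranked = [word for word, _ in freq.most_common()]

def mean_length(words):
    """Average character length of a list of words."""
    return sum(len(w) for w in words) / len(words)

# Zipf's law of abbreviation: the most frequent words tend to be shorter.
top, rest = ranked[:3], ranked[3:]
print(mean_length(top), "<", mean_length(rest))
```

On this tiny sample the three most frequent words ("the", "sat", "on") average under three characters, while the remaining words average several more; large cross-linguistic corpora show the same pattern at scale.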

“All of these features seem to suggest that the forms of languages are optimized to make communication easier,” Fedorenko says, pointing out that such features would be irrelevant if language were primarily a tool for internal thought.

“Given that languages have all these properties, it’s likely that we use language for communication,” she says.

She and her coauthors conclude that as a powerful tool for transmitting knowledge, language reflects the sophistication of human cognition — but does not give rise to it.

Adenosine signalling to astrocytes coordinates brain metabolism and function

by Theparambil SM, Kopach O, Braga A, et al. in Nature

Researchers in a study published in Nature have identified how astrocytes, a type of brain cell, monitor and respond to the energy demands of neurons. They discovered that astrocytes use specific receptors to detect neuronal activity and activate pathways that increase glucose metabolism and energy supply, crucial for maintaining brain functions like learning, memory, and sleep. Disabling these receptors in mice led to impaired brain activity, memory deficits, and disrupted sleep patterns. This understanding could inform new therapies for conditions where brain energy metabolism declines, such as neurodegenerative diseases like Alzheimer’s.

The scientists say their findings, published in Nature, could inform new therapies to maintain brain health and longevity, as other studies have found that brain energy metabolism can become impaired late in life and contribute to cognitive decline and the development of neurodegenerative disease.

Lead author Professor Alexander Gourine (UCL Neuroscience, Physiology & Pharmacology) said: “Our brains are made up of billions of nerve cells, which work together coordinating numerous functions and performing complex tasks like control of movement, learning and forming memories. All of this computation is very energy-demanding and requires an uninterrupted supply of nutrients and oxygen.

“When our brain is more active, such as when we’re performing a mentally taxing task, our brain needs an immediate boost of energy, but the exact mechanisms that ensure on-demand local supply of metabolic energy to active brain regions are not fully understood.”

First and co-corresponding author Dr Shefeeq Theparambil, who began the study at UCL before moving to Lancaster University, said: “The normal activities of the brain require enormous amounts of energy, comparable to that of a human leg muscle running a marathon. This energy is primarily derived from blood glucose. Neurons in the brain consume more than 75% of this energy.”

Prior research has shown that brain cells called astrocytes appear to play a role in providing neurons with the energy they need. Astrocytes, shaped like stars, are a type of glial cell, which are non-neuronal cells found in the central nervous system. When neighbouring neurons need an increase in energy supply, astrocytes jump into action by rapidly activating their own glucose stores and metabolism, leading to the increased production and release of lactate. Lactate supplements the pool of energy that is readily available for use by neurons in the brain.

Professor Gourine explained: “In our study, we have figured out how exactly astrocytes are able to monitor the energy use by their neighbouring nerve cells, and kick-start this process that delivers additional chemical energy to busy brain regions.”

In a series of experiments using mouse models and cell samples, the researchers identified a set of specific receptors in astrocytes that can detect and monitor neuronal activity, and trigger a signalling pathway involving an essential molecule called adenosine. The researchers found that the metabolic signalling pathway activated by adenosine in astrocytes is exactly the same as the pathway that recruits energy stores in the muscle and the liver, for example when we exercise.

Adenosine activates astrocyte glucose metabolism and supply of energy to neurons to ensure that synaptic function (neurotransmitters passing communication signals between cells) continues apace under conditions of high energy demand or reduced energy supply.

The researchers found that when they deactivated the key astrocyte receptors in mice, the animals’ brain activity was impaired: global brain metabolism and memory showed significant deficits and sleep was disrupted, demonstrating that the signalling pathway they identified is vital for processes such as learning, memory and sleep.

Dr Theparambil said: “Identification of this mechanism may have broader implications as it could be a way of treating brain diseases where brain energetics are downregulated, such as neurodegeneration and dementia.”

Professor Gourine added: “We know that brain energy homeostasis is progressively impaired in ageing and this process is accelerated during the development of neurodegenerative diseases such as Alzheimer’s disease. Our study identifies an attractive readily druggable target and therapeutic opportunity for brain energy rescue for the purpose of protecting brain function, maintaining cognitive health, and promoting brain longevity.”

Spatiotemporal brain hierarchies of auditory memory recognition and predictive coding

by Bonetti L, Fernández-Rubio G, Carlomagno F, et al. in Nature Communications

Researchers from Aarhus University and the University of Oxford have discovered how the brain reacts when we recognize and predict musical sequences. Listening to music activates a complex chain of events in the brain involving areas responsible for sound processing, emotions, and memory. This process helps us quickly recognize songs and anticipate their progression, making music an enjoyable and familiar experience. Understanding these brain mechanisms could potentially aid in developing screening tools for conditions like dementia, by assessing how individuals’ brain activity responds to music. Future studies aim to explore these processes further and their implications for cognitive health and neurological conditions.

In a joint study, researchers from Aarhus University and the University of Oxford have uncovered how our brain reacts to and recognizes music. The research shows that listening to music sets off a complex chain reaction of events in the brain — a discovery that may one day be used to help screen for dementia.

Ever heard just a snippet of a song and instantly known what comes next? Or picked up the rhythm of a chorus after just a few notes? New research from the Center for Music in the Brain at Aarhus University and the Centre for Eudaimonia and Human Flourishing at the University of Oxford has uncovered what happens in our brain when we recognize and predict musical sequences.

When we turn on the radio and our favourite song starts playing, our brain reacts in a complex pattern, where areas that process sound, emotions, and memory are activated. In a feedforward and feedback loop, our auditory cortex first responds to the sounds and sends information to other brain areas, like the hippocampus, which is involved in memory, and the cingulate gyrus, which helps with attention and emotional processing. This process helps us recognise songs quickly and predict what comes next, making listening to music an enjoyable and familiar experience.

Knowing how our brain reacts to music can play a pivotal role in understanding our cognitive functions, explains one of the lead researchers behind the study, Associate Professor Leonardo Bonetti from the Center for Music in the Brain at Aarhus University:

“Our research provides detailed insights into the brain’s ability to process and predict music and contributes to our broader understanding of cognitive functions. This could make a difference for studying brain health, as it offers potential pathways to explore how ageing and diseases like dementia affect cognitive processing over time.”

In fact, understanding how our brain rocks along to Bohemian Rhapsody or reacts to a childhood classic may help researchers detect dementia in the future.

“In the long run, these findings could inform the development of screening tools for detecting the individual risk of developing dementia just using the brain activity of people while they listen to and recognise music.”

In the study, the researchers measured the brainwaves of 83 people as they listened to music, and they will follow up with additional studies, says Leonardo Bonetti.

“Future studies could explore how these brain mechanisms change with age or in individuals with cognitive impairments. Understanding these processes in more detail could lead to new interventions for improving cognitive function and quality of life for people with neurological conditions.”

Neural circuits expressing the serotonin 2C receptor regulate memory in mice and humans

by Liu H, He Y, Liu H, et al. in Science Advances

Researchers at Baylor College of Medicine, the University of Cambridge in the U.K. and collaborating institutions have shown that the serotonin 2C receptor in the brain regulates memory in people and animal models. The findings, published in the journal Science Advances, provide new insights not only into the factors involved in healthy memory but also into conditions associated with memory loss, such as Alzheimer’s disease, and suggest novel avenues for treatment.

“Serotonin, a compound produced by neurons in the midbrain, acts as a neurotransmitter, passing messages between brain cells,” said co-corresponding author Dr. Yong Xu, professor of pediatrics — nutrition and associate director for basic sciences at the USDA/ARS Children’s Nutrition Research Center at Baylor. “Serotonin-producing neurons reach out to multiple brain regions including the hippocampus, a region essential for short- and long-term memory.”

Serotonin communicates messages to brain cells by binding to receptors on the cell surface, which signal the receiving cell to carry out a certain activity. In this study, the Xu lab, with expertise in basic and genetic animal studies, and the human genetics lab of co-corresponding author Dr. I. Sadaf Farooqi, professor of metabolism and medicine at the University of Cambridge, focused on serotonin 2C receptors, which are abundant in the brain’s ventral hippocampal CA1 region (vCA1), and investigated the receptor’s role in memory in humans and animal models.

“We had previously identified five individuals carrying variants of the serotonin 2C receptor gene (HTR2C) that produce defective forms of the receptor,” Farooqi said. “People with these rare variants showed significant deficits on memory questionnaires. These findings led us to investigate the association between HTR2C variants and memory deficits in animal models.”

The team genetically engineered mice to mimic the human mutation. When the researchers ran behavioral tests on these mice to evaluate their memory, they found that both males and females with the non-functional gene showed reduced memory recall when compared with the unmodified animals.

“When we combined the human data and the mouse data, we found compelling evidence connecting non-functional mutations of the serotonin receptor 2C with memory deficits in humans,” Xu said.

The animal models also enabled the team to dig deeper into how the receptor mediates memory. They discovered a brain circuit that begins in the midbrain where serotonin-producing neurons are located. These neurons project to the vCA1 region, which has abundant serotonin 2C receptors.

“When neurons in the midbrain reaching out to neurons in the vCA1 region release serotonin, the neurotransmitter binds to its receptor, signaling these cells to make changes that help the brain consolidate memories,” Xu said.

Importantly, the researchers also found that this serotonin-associated neural circuit is damaged in a mouse model of Alzheimer’s disease.

“The neural circuit in the Alzheimer’s disease animal model cannot release sufficient serotonin into the vCA1 region that would need to bind to its receptor in the downstream neurons to signal the changes required to consolidate a memory,” Xu said.

However, it is possible to bypass this lack of serotonin and directly activate the downstream serotonin receptor by administering a serotonin analog, lorcaserin, a compound that selectively activates the serotonin 2C receptor in these cells.

“We tested this strategy in our animal model and were excited to find that the animals treated with the serotonin analog improved their memory,” Xu said. “We hope our findings encourage further studies to evaluate the value of serotonin analogs in the treatment of Alzheimer’s disease.”

Subscribe to Paradigm!

Medium, Twitter, Telegram, Telegram Chat, LinkedIn, and Reddit.

Main sources

Research articles

Nature Neuroscience

Science Daily

Technology Networks

Neuroscience News

Frontiers

Cell
