Mitochondria transported from satellite glial cells in dorsal root ganglia to peripheral sensory neurons through tunneling nanotube-like structures protect against peripheral neuropathy.
Semaglutide reduced HbA1c and body weight over 26 weeks in individuals with schizophrenia spectrum disorders and early glycemic abnormalities taking clozapine or olanzapine.
Question Can adjunctive semaglutide improve glycemic control and weight outcomes in individuals with early-stage prediabetes or diabetes and schizophrenia spectrum disorders who initiated clozapine or olanzapine within the past 5 years?
Findings In this randomized clinical trial including 73 participants, semaglutide significantly reduced hemoglobin A1c level and body weight over a 26-week period. Approximately one-half of individuals treated with semaglutide achieved low-risk hemoglobin A1c levels, a greater proportion than in the placebo group.
Meaning This study found that semaglutide can mitigate the early metabolic burden associated with second-generation antipsychotic use in schizophrenia and may support the prevention of long-term cardiometabolic complications when initiated during the early stages of metabolic dysregulation.
A study co-led by Richard Carson and colleagues at the Yale School of Medicine has identified a measurable molecular difference in the brains of autistic adults – finding reduced availability of a key glutamate receptor involved in brain signaling balance.
Published in The American Journal of Psychiatry, the research offers new insight into the biological mechanisms of autism and could inform future diagnostic tools and targeted supports.
Brains of autistic individuals have fewer of a specific kind of glutamate receptor, supporting the idea that autism is driven by a signaling imbalance.
This is the third in my annual series reviewing everything that happened in the LLM space over the past 12 months. For previous years see Stuff we figured out about AI in 2023 and Things we learned about LLMs in 2024.
It’s been a year filled with a lot of different trends.
2026 (Nature Neuroscience)
• AstroREG, a resource of enhancer–gene interactions in human primary astrocytes, generated by combining CRISPR inhibition (CRISPRi), single-cell RNA-seq and machine learning.
This study reveals how distal DNA ‘switches’ control gene activity in human astrocytes. Using CRISPRi screens and single-cell RNA-seq, we map enhancer–gene links, highlight Alzheimer’s disease-related targets and introduce a model that predicts additional regulatory interactions.
Goodrich admits that when his group first proposed the strategy, “everyone said we were crazy.” Researchers have been trying to develop vaccines that contain whole cancer cells for more than 50 years, and although some formulations made it to clinical trials, they produced a poor immune response. None of the vaccines has been approved for humans, although one is available for pets.
The harsh methods previously used to stop cancer cells from reproducing, such as radiation, also caused them to shed their neoantigens, Goodrich says. He argues that the UV-based approach should work better because it preserves these potential immune stimulants.
The new clinical trial, launching this month at City of Hope in California and sponsored by PhotonPharma, aims to recruit eight patients with relapsed ovarian cancer. They will first undergo surgery to remove their tumors. Researchers will then expose the tumor cells to riboflavin and UV light and combine them with an immune-boosting additive known as an adjuvant to produce a custom vaccine. Participants will receive three doses of the vaccine, and researchers will check for side effects and measure immune responses.
Through their work from the 1980s onward, John Hopfield and Geoffrey Hinton helped lay the foundation for the machine learning revolution that started around 2010.
The development we are now witnessing has been made possible through access to the vast amounts of data that can be used to train networks, and through the enormous increase in computing power. Today’s artificial neural networks are often enormous and constructed from many layers. These are called deep neural networks and the way they are trained is called deep learning.
A quick glance at Hopfield’s article on associative memory, from 1982, provides some perspective on this development. In it, he used a network with 30 nodes. If all the nodes are connected to each other, there are 435 connections. The nodes have their values, the connections have different strengths and, in total, there are fewer than 500 parameters to keep track of. He also tried a network with 100 nodes, but this was too complicated, given the computer he was using at the time. We can compare this to the large language models of today, which are built as networks that can contain more than one trillion parameters (one million millions).
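The scale of that 1982 network is small enough to run in a few lines. The following is a minimal sketch of a Hopfield-style associative memory at the same size, 30 fully connected nodes and 435 distinct connections; the pattern, the Hebbian weight rule, and the update loop here are illustrative choices, not a reconstruction of Hopfield's original code.

```python
# Minimal Hopfield-style associative memory over 30 binary (+1/-1) nodes,
# matching the scale of the 1982 example. All names are illustrative.
import random

N = 30  # nodes; fully connected gives N*(N-1)//2 = 435 distinct connections
assert N * (N - 1) // 2 == 435

random.seed(0)
pattern = [random.choice([-1, 1]) for _ in range(N)]  # one stored memory

# Hebbian weights: w[i][j] = pattern[i]*pattern[j], with no self-connections
w = [[(pattern[i] * pattern[j]) if i != j else 0 for j in range(N)]
     for i in range(N)]

# Corrupt 5 of the 30 node values, then let the network settle
state = pattern[:]
for i in random.sample(range(N), 5):
    state[i] = -state[i]

for _ in range(5):  # a few asynchronous update sweeps
    for i in range(N):
        field = sum(w[i][j] * state[j] for j in range(N))
        state[i] = 1 if field >= 0 else -1

print(state == pattern)  # → True: the stored pattern is recovered
```

With a single stored pattern, each node's input field has the same sign as its stored value whenever most other nodes are correct, so the five flipped bits are repaired on the first sweep. The parameter count the text mentions is visible here: 435 weights plus 30 node values, fewer than 500 numbers in total.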