
Remarkable Intelligence: Octopus and Human Brains Share the Same “Jumping Genes”

New research has identified an important molecular analogy that could explain the remarkable intelligence of these fascinating invertebrates.

An extremely complex brain and sophisticated cognitive abilities make the octopus unique among invertebrates, so much so that in several respects it resembles vertebrates more than other invertebrates. The neural and cognitive complexity of these animals may originate from a molecular analogy with the human brain, according to a research paper recently published in BMC Biology and coordinated by Remo Sanges from Scuola Internazionale Superiore di Studi Avanzati (SISSA) of Trieste and by Graziano Fiorito from Stazione Zoologica Anton Dohrn of Naples.

This research shows that the same ‘jumping genes’ are active both in the human brain and in the brains of two species: Octopus vulgaris, the common octopus, and Octopus bimaculoides, the California two-spot octopus. This discovery could help us understand the secret of the intelligence of these remarkable organisms.

Retina recordings could be the key to identifying Autism and ADHD in children

Early diagnosis of neurodevelopmental conditions, such as Attention Deficit Hyperactivity Disorder (ADHD) and Autism Spectrum Disorder (ASD), is critical for treatment and symptom management processes.

Now, a team of researchers from the University of South Australia has found that recordings from the retina may be used to distinguish unique signals for both ADHD and ASD, offering a possible biomarker for each disorder. According to a press release published by the institution, the team used the “electroretinogram” (ERG), a diagnostic test that measures the electrical activity of the retina in response to light, and found that children with ASD showed less ERG energy, while children with ADHD displayed more ERG energy.

The Goodness of the Universe — John Smart

Outer Space, Inner Space, and the Future of Networks.
Synopsis: Does the History, Dynamics, and Structure of our Universe give any evidence that it is inherently “Good”? Does it appear to be statistically protective of adapted complexity and intelligence? Which aspects of the big history of our universe appear to be random? Which are predictable? What drives universal and societal accelerating change, and why have they both been so stable? What has developed progressively in our universe, as opposed to merely evolving randomly? Will humanity’s future be to venture to the stars (outer space) or will we increasingly escape our physical universe, into physical and virtual inner space (the transcension hypothesis)? In Earth’s big history, what can we say about what has survived and improved? Do we see any progressive improvement in humanity’s thoughts or actions? When is anthropogenic risk existential or developmental (growing pains)? In either case, how can we minimize such risk? What values do well-built networks have? What can we learn about the nature of our most adaptive complex networks, to improve our personal, team, organizational, societal, global, and universal futures? I’ll touch on each of these vital questions, which I’ve been researching and writing about since 1999, and discussing with a community of scholars at Evo-Devo Universe (join us!) since 2008.

For fun background reading, see John’s Goodness of the Universe post on Centauri Dreams, and “Evolutionary Development: A Universal Perspective”, 2019.

John writes about Foresight Development (personal, team, organizational, societal, global, and universal), Accelerating Change, Evolutionary Development (Evo-Devo), Complex Adaptive Systems, Big History, Astrobiology, Outer and Inner Space, Human-Machine Merger, the Future of AI, Neuroscience, Mind Uploading, Cryonics and Brain Preservation, Postbiological Life, and the Values of Well-Built Networks.
He is CEO of Foresight University, founder of the Acceleration Studies Foundation, and co-founder of the Evo-Devo Universe research community and the Brain Preservation Foundation. He is editor of Evolution, Development, and Complexity (Springer 2019) and Introduction to Foresight: Personal, Team, and Organizational Adaptiveness (Foresight U Press 2022). He is also author of The Transcension Hypothesis (2011), the proposal that universal development guides leading adaptive networks increasingly into physical and virtual inner space.

A talk for the ‘Stepping into the Future’ conference (April 2022).

The Goodness of the Universe: Outer Space, Inner Space, and the Future of Networks with John Smart

Many thanks for tuning in!

Have any ideas about people to interview? Want to be notified about future events? Any comments about the STF series?

Silence for Thought: Special Interneuron Networks in the Human Brain

Summary: Human cortical networks have evolved a novel neural network that relies on abundant connections between inhibitory interneurons.

Source: Max Planck Institute.

The analysis of the human brain is a central goal of neuroscience. However, for methodological reasons, research has largely focused on model organisms, in particular the mouse.

Lyme Disease Is Even More Common Than Experts Realized, New Research Finds

It’s extremely important to check yourself for ticks this summer.


Up to 14.5% of the global population may have already had Lyme disease, according to a new meta-analysis published in BMJ Global Health. The researchers behind the report analyzed 89 previously published studies to calculate the figure, which sheds a harrowing light on the worldwide toll of the tick-borne illness.

From 1991 to 2018, the incidence of Lyme disease in the United States nearly doubled, according to data from the United States Environmental Protection Agency (EPA). In 1991, there were nearly four reported cases per 100,000 people; that number jumped to about seven cases per 100,000 people by 2018. The Centers for Disease Control and Prevention (CDC) estimates that about 470,000 Americans are diagnosed and treated for Lyme disease each year.

The bacterium that most commonly causes Lyme, Borrelia burgdorferi, is transmitted to humans via the bite of an infected black-legged tick, also known as a deer tick. These especially tiny ticks are often found in the Northeast, Mid-Atlantic, Upper Midwest, and Pacific Coast of the United States, per the U.S. National Library of Medicine (NLM). Once a person has been infected, they may develop short-term, flu-like symptoms including fever, headache, and fatigue, as well as a signature bull’s-eye-shaped rash that appears in up to 80% of Lyme disease cases, according to the CDC. In rare instances, when Lyme is left untreated, a person may experience long-term, potentially life-threatening complications, including joint pain, severe headaches and neck stiffness, heart issues, and inflammation of the brain and spinal cord, among others.

Researchers discover two important novel aspects of APOE4 gene in Alzheimer’s patients

Alzheimer’s disease (AD) is a progressive neurodegenerative disorder and the most common cause of dementia, affecting more than 5.8 million individuals in the U.S. Scientists have discovered some genetic variants that increase the risk of developing Alzheimer’s; the most well-known of these for people over the age of 65 is the APOE ε4 allele. Although the association between APOE4 and increased AD risk is well established, the mechanisms responsible for the underlying risk in human brain cell types have been unclear until now.

Researchers from Boston University School of Medicine (BUSM) have discovered two important novel aspects of the gene: 1) human genetic background inherited with APOE4 is unique to APOE4 patients and 2) the mechanistic defects due to APOE4 are unique to human cells.

“Our study demonstrated what the APOE4 gene does and which brain cells are most affected in humans by comparing human and mouse models. These are important findings because we can develop therapeutics if we understand how and where this risk gene damages the brain.”

Beyond longevity: The DIY quest to cheat death and stop aging

At 79, he’s already outlived the CDC’s official life expectancy by two years and he has no intention of dying — or even slowing down — anytime soon. An active man, Scott jets between his homes in upstate New York and Florida, flies to exotic locations such as Panama City for business and still finds time for the odd cruise. His secret? A DIY regime of self-experimentation and untested therapies he believes will keep him going well past the next century.

Self-experimenters litter the history of medical science. Dentist Horace Wells dosed himself with nitrous oxide in 1844 to see if it could kill pain, Nicholas Senn inflated his innards with hydrogen a few decades later to work out if it could diagnose a ruptured bowel, and more recently, Barry Marshall drank a solution containing H. pylori in 1985 to prove the bacterium caused ulcers.

These scientists risked their own health to make a medical breakthrough or prove a theory, but Scott is not a scientist. He’s an amateur enthusiast, also known as a biohacker. Biohackers engage in DIY biology, experimenting on themselves to enhance their brain and body. And many of them — like Scott — see longevity as the ultimate prize.

Discrete Wavelet Transform Analysis of the Electroretinogram in Autism Spectrum Disorder and Attention Deficit Hyperactivity Disorder

Background: To evaluate the electroretinogram waveform in autism spectrum disorder (ASD) and attention deficit hyperactivity disorder (ADHD) using a discrete wavelet transform (DWT) approach.

Methods: A total of 55 ASD, 15 ADHD and 156 control individuals took part in this study. Full field light-adapted electroretinograms (ERGs) were recorded using a Troland protocol, accounting for pupil size, with five flash strengths ranging from −0.12 to 1.20 log photopic cd.s.m–2. A DWT analysis was performed using the Haar wavelet on the waveforms to examine the energy within the time windows of the a- and b-waves and the oscillatory potentials (OPs), which yielded six DWT coefficients related to these parameters. The central frequency bands were from 20–160 Hz, relating to the a-wave, b-wave and OPs, represented by the coefficients a20, a40, b20, b40, op80, and op160, respectively. In addition, the b-wave amplitude and the percentage energy contribution of the OPs (%OPs) to the total ERG broadband energy were evaluated.
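For readers unfamiliar with the method, the core computation can be sketched in a few lines: a Haar DWT repeatedly splits a signal into coarse (approximation) and fine (detail) components, and the energy in each detail band is the sum of its squared coefficients. This is a minimal illustration, not the study's actual pipeline; the toy waveform, the number of levels, and the function names are invented for the example.

```python
# Minimal sketch of a Haar discrete wavelet transform (DWT) with
# per-band energy extraction. Orthonormal Haar steps preserve total
# signal energy across the decomposition.

def haar_dwt_step(signal):
    """One Haar DWT level: returns (approximation, detail) coefficients."""
    root2 = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / root2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / root2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def band_energies(signal, levels):
    """Decompose `signal` and return the energy (sum of squared detail
    coefficients) in each band, from finest to coarsest."""
    energies = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_dwt_step(approx)
        energies.append(sum(c * c for c in detail))
    return energies

# Toy "waveform" with a power-of-two number of samples (illustrative only).
waveform = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]
print(band_energies(waveform, 3))
```

In the study's setting, each band's coefficients correspond to a physiologically meaningful frequency range of the ERG (e.g. the b-wave or the oscillatory potentials), so comparing band energies across groups compares activity in those components.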

Results: There were significant group differences (p < 0.001) in the coefficients corresponding to energies in the b-wave (b20, b40) and OPs (op80 and op160), as well as in the b-wave amplitude. Notable differences between the ADHD and control groups were found in the b20 and b40 coefficients. In contrast, the greatest differences between the ASD and control groups were found in the op80 and op160 coefficients. The b-wave amplitude showed significant differences from controls for both the ASD and ADHD groups, for flash strengths greater than 0.4 log photopic cd.s.m–2 (p < 0.001).

Neocortex saves energy

Despite constituting less than 2% of the body’s mass, the human brain consumes approximately 20% of total caloric intake, with 50% of the energy being used by cortex (Herculano-Houzel, 2011). The majority of this energy is spent by neurons to reverse the ion fluxes associated with electrical signaling via Na+/K+ ATPase (Attwell and Laughlin, 2001; Harris et al., 2012). Excitatory synaptic currents and action potentials are particularly costly in this regard, accounting for approximately 57% and 23% of the energy budget for electrical signaling in gray matter, respectively (Harris et al., 2012; Sengupta et al., 2010). Given this cost, and the scarcity of resources, the brain is thought to have evolved an energy-efficient coding strategy that maximizes information transmission per unit energy (i.e., ATP) (Barlow, 2012; Levy and Baxter, 1996). This strategy accounts for a number of cellular features, including the low mean firing rate of neurons and the high failure rate of synaptic transmission, as well as higher order features, such as the structure of neuronal receptive fields (Albert et al., 2008; Attwell and Laughlin, 2001; Harris et al., 2015; Levy and Baxter, 1996; Olshausen and Field, 1997; Sterling and Laughlin, 2015). Scarcity of food, therefore, appears to have strongly sculpted information coding in the brain throughout evolution.

Energy intake is not fixed but can vary substantially across individuals, environments, and time (Hladik, 1988; Knott, 1998). Given that the brain is energy limited, one hypothesis is that in times of food scarcity, neuronal networks should save energy by reducing information processing. There is some evidence to suggest that this is the case in invertebrates (Kauffman et al., 2010; Longden et al., 2014; Plaçais et al., 2017; Plaçais and Preat, 2013). In Drosophila, food deprivation inactivates neural pathways required for long-term memory to preserve energy (Plaçais et al., 2017; Plaçais and Preat, 2013). Experimental re-activation of these pathways restores memory formation but significantly reduces survival rates (Plaçais and Preat, 2013). Similar memory impairments are seen with reduced food intake in C. elegans (Kauffman et al., 2010). Moreover, in the blowfly, food deprivation reduces visual interneuron responses during locomotion, consistent with energy savings (Longden et al., 2014). However, it remains unclear whether and how the mammalian brain, and cortical networks in particular, regulate information processing and energy use in times of food scarcity.

Here we used the mouse primary visual cortex (V1) as a model system to examine how food restriction affects information coding and energy consumption in cortical networks. We assessed neuronal activity and ATP consumption using whole-cell patch-clamp recordings and two-photon imaging of V1 layer 2/3 excitatory neurons in awake, male mice. We found that food restriction, resulting in a 15% reduction of body weight, led to a 29% reduction in ATP expenditure associated with excitatory postsynaptic currents, which was mediated by a decrease in single-channel AMPA receptor (AMPAR) conductance. Reductions in AMPAR current were compensated by an increase in input resistance and a depolarization of the resting membrane potential, which preserved neuronal excitability; neurons were therefore able to generate a comparable rate of spiking as controls, while spending less ATP on the underlying excitatory currents. This energy-saving strategy, however, had a cost to coding precision. Indeed, we found that an increase in input resistance and depolarization of the resting membrane potential also increased the subthreshold variability of visual responses, which increased the probability for small depolarizations to cross spike threshold, leading to a broadening of orientation tuning by 32%. Broadened tuning was associated with reduced coding precision of natural scenes and behavioral impairment in fine visual discrimination. We found that these deficits in visual coding under food restriction correlated with reduced circulating levels of leptin, a hormone secreted by adipocytes in proportion to fat mass (Baile et al., 2000), and were restored by exogenous leptin supplementation. Our findings reveal key metabolic state-dependent mechanisms by which the mammalian cortex regulates coding precision to preserve energy in times of food scarcity.
