It’s called cognitive reserve, and it’s the phenomenon of the mind’s resistance to damage to the brain. It’s also the subject not only of an upcoming data and biomedical sample resource, but also of a related request for information (RFI) from the National Institute on Aging (NIA) and a first-of-its-kind workshop in September.

The push to study cognitive reserve in more depth across the scientific disciplines was born out of recommendations from the Cognitive Aging Summit III. Some 300 researchers attended the summit in Bethesda, Maryland, in 2017. Coordinated by the NIA, part of the National Institutes of Health (NIH), and supported by the McKnight Brain Research Foundation, the summit centered on age-related brain and cognitive changes, with a particular focus on issues related to cognitive resilience and reserve. According to the NIA, investigators from around the world delivered presentations and engaged in discussion “about some of the most important scientific questions relating to the biological, physiological, social and behavioral aspects of reserve and resilience in aging individuals. Attendees also discussed strategies to preserve and bolster cognitive function during aging.”

Active-duty military members suffering from PTSD and traumatic brain injuries made these masks in an art therapy group at the National Intrepid Center of Excellence, located at the Walter Reed National Military Medical Center in Bethesda, Md.

This mask was created by an Airman who was exposed to multiple blast injuries in combat while working on a bomb disposal unit, called EOD — Explosive Ordnance Disposal. The mask incorporates the laurels, shield, and lightning bolts of the EOD insignia.

Credit: Chris Albert

Conscious “free will” is problematic for three reasons: (1) the brain mechanisms that cause consciousness are unknown; (2) measurable brain activity correlating with conscious perception apparently occurs too late for a real-time conscious response, so consciousness is often dismissed as an “epiphenomenal illusion”; and (3) determinism, i.e., our actions and the world around us seem algorithmic and inevitable. The Penrose–Hameroff theory of “orchestrated objective reduction” (Orch OR) identifies discrete conscious moments with quantum computations in microtubules inside brain neurons, e.g., 40 per second in concert with gamma-synchrony EEG. Microtubules organize neuronal interiors and regulate synapses. In Orch OR, microtubule quantum computations occur during integration phases in the dendrites and cell bodies of integrate-and-fire brain neurons, which are connected and synchronized by gap junctions, allowing entanglement of microtubules among many neurons. Quantum computations in entangled microtubules terminate by Penrose “objective reduction” (OR), a proposal for quantum state reduction and conscious moments linked to fundamental spacetime geometry. Each OR event selects microtubule states that can trigger axonal firings and control behavior. The quantum computations are “orchestrated” by synaptic inputs and memory (hence “Orch OR”). If correct, Orch OR can account for conscious causal agency, resolving problem 1. Regarding problem 2, Orch OR allows temporal non-locality, sending quantum information backward in classical time and enabling real-time conscious control of behavior; three lines of evidence for backward time effects in the brain are presented. Regarding problem 3, Penrose OR (and Orch OR) invokes non-computable influences from information embedded in spacetime geometry, potentially avoiding algorithmic determinism. In summary, Orch OR can account for real-time conscious causal agency, avoiding the need to regard consciousness as an epiphenomenal illusion. Orch OR can rescue conscious free will.
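The 40-per-second figure can be tied to the theory’s central formula. What follows is a back-of-envelope reading based on Penrose’s published OR criterion, not a calculation taken from this text: a quantum superposition with gravitational self-energy E_G reduces after a time of roughly

```latex
\tau \approx \frac{\hbar}{E_G}
```

Identifying one conscious moment with each gamma-synchrony cycle at about 40 per second gives τ ≈ 25 ms, which fixes the gravitational self-energy at E_G ≈ ℏ/τ ≈ 4 × 10⁻³³ J.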

Keywords: microtubules, free will, consciousness, Penrose–Hameroff Orch OR, volition, quantum computing, gap junctions, gamma synchrony.

We have the sense of conscious control of our voluntary behaviors, of free will, of our mental processes exerting causal actions in the physical world. But such control is difficult to explain scientifically for three reasons: the brain mechanisms that cause consciousness are unknown; the measurable brain activity correlating with conscious perception apparently occurs too late for a real-time conscious response; and determinism suggests that our actions, like the world around us, are algorithmic and inevitable.

For decades, chip-makers have been piggybacking on the renowned Moore’s Law to deliver successive generations of chips with more compute capability and lower power consumption. Now, these advances are slowly coming to a halt. Researchers around the world are proposing alternative architectures to continue producing systems that are faster and more energy efficient. This article discusses those alternatives and the reasons why one of them might have an edge over the others in keeping the chip-design industry from stalling.

Moore’s law, or to put it differently, the savior of chip-makers worldwide, was formulated by Dr. Gordon Moore, co-founder of Intel Corp., in 1965. In its commonly cited form, the law states that the number of transistors on a chip doubles roughly every two years. But why the savior of chip-makers? The law held so reliably during the semiconductor boom that “people would auto-buy the next latest and greatest computer chip, with full confidence that it would be better than what they’ve got,” said former Intel engineer Robert P. Colwell. Back in the day, writing a program with poor performance was not much of a problem, because the programmer knew that Moore’s law would ultimately save him.
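The doubling rule is easy to put into numbers. The sketch below is illustrative only: it assumes a clean two-year doubling period, and the baseline (the 1971 Intel 4004 at roughly 2,300 transistors) is chosen for the example rather than taken from the article.

```python
# Illustrative sketch: project transistor counts under an idealized Moore's law,
# N(t) = N0 * 2 ** ((t - t0) / T), with doubling period T = 2 years.

def moores_law(n0: float, t0: int, t: int, doubling_years: float = 2.0) -> float:
    """Project a transistor count from baseline year t0 to year t."""
    return n0 * 2 ** ((t - t0) / doubling_years)

if __name__ == "__main__":
    baseline_count, baseline_year = 2_300, 1971  # Intel 4004 (assumed baseline)
    for year in (1981, 1991, 2001, 2011, 2021):
        projected = moores_law(baseline_count, baseline_year, year)
        print(f"{year}: ~{projected:,.0f} transistors")
```

Real chips track this curve only loosely, which is precisely the point of the article: the idealized exponential no longer matches what fabs can deliver.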

The problem we are facing today is that the law is nearly dead! Or, so as not to offend Moore’s fans, it is aging, as Henry Samueli, chief technology officer of Broadcom, puts it.

A new mouse study highlights the proteins responsible for LC3-associated endocytosis (LANDO), an autophagy process that is involved in degrading β-amyloid, the principal substance associated with Alzheimer’s disease.

Proteostasis

Proteins in the human brain can form misfolded, non-functional, and toxic clumps known as aggregates. Preventing these aggregates from forming, and removing them when they do, is a natural function of the human body, and it is known as proteostasis. However, as we age, this function degrades, and loss of proteostasis is one of the hallmarks of aging. The resulting accumulation of aggregates leads to several deadly diseases, one of which is Alzheimer’s.

Scientists from Maastricht University have developed a method to look into a person’s brain and read out who has been speaking to them and what was said. With the help of neuroimaging and data-mining techniques, the researchers mapped the brain activity associated with the recognition of speech sounds and voices.

In their Science article “‘Who’ is Saying ‘What’? Brain-Based Decoding of Human Voice and Speech,” the four authors demonstrate that speech sounds and voices can be identified by means of a unique ‘neural fingerprint’ in the listener’s brain. In the future this new knowledge could be used to improve computer systems for automatic speech and speaker recognition.

Seven study subjects listened to three different speech sounds (the vowels /a/, /i/, and /u/), spoken by three different people, while their brain activity was mapped using neuroimaging (fMRI). With the help of data-mining methods, the researchers developed an algorithm that translates this brain activity into unique patterns identifying a particular speech sound or voice. These neural patterns were found to be determined by the acoustic characteristics of the sounds and voices, such as vocal cord vibrations.
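As a rough illustration of the decoding step, the sketch below trains a linear classifier to recover vowel identity from voxel activity patterns. Everything here is an assumption made for the example: the simulated data, the array sizes, and the scikit-learn classifier all stand in for the authors’ actual fMRI recordings and algorithm.

```python
# Illustrative sketch, not the Maastricht group's actual pipeline:
# decode which vowel (/a/, /i/, /u/) was heard from simulated fMRI voxel patterns.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials_per_vowel, n_voxels = 30, 200
vowels = ["a", "i", "u"]

# Simulate voxel patterns: each vowel gets its own mean activation pattern
# (its "neural fingerprint") plus trial-to-trial noise.
fingerprints = rng.normal(0, 1, (len(vowels), n_voxels))
X = np.vstack([
    fingerprints[i] + rng.normal(0, 2, (n_trials_per_vowel, n_voxels))
    for i in range(len(vowels))
])
y = np.repeat(vowels, n_trials_per_vowel)

# Cross-validated decoding accuracy; chance level is 1/3 for three vowels.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} (chance = 0.33)")
```

Accuracy well above chance is what licenses the claim that the patterns carry a readable “fingerprint”; the same logic applies whether the labels are vowels or speaker identities.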

The story behind the writing of Frankenstein is famous. In 1816, Mary Shelley and Percy Bysshe Shelley, summering near Lake Geneva in Switzerland, were challenged by Lord Byron to take part in a competition to write a frightening tale. Mary, only 18 years old, later had a waking dream of sorts where she imagined the premise of her book:

When I placed my head on my pillow, I did not sleep, nor could I be said to think. My imagination, unbidden, possessed and guided me, gifting the successive images that arose in my mind with a vividness far beyond the usual bounds of reverie. I saw — with shut eyes, but acute mental vision, — I saw the pale student of unhallowed arts kneeling beside the thing he had put together. I saw the hideous phantasm of a man stretched out, and then, on the working of some powerful engine, show signs of life, and stir with an uneasy, half vital motion.

This became the kernel of Frankenstein; or, The Modern Prometheus, the novel first published in London in 1818, with only 500 copies put in circulation.