
Vicarious body maps bridge vision and touch in the human brain

A central question in sensory neuroscience is how inputs from vision and touch are combined to generate cohesive representations of the external world. Here we reveal a widespread mode of brain organization in which aligned topographic maps bridge vision and somatosensation. We developed a computational model that revealed somatotopic structure in dorsolateral visual cortex. Somatotopic tuning in these regions was predictive of visual field locations more dorsally and visual body part selectivity more ventrally. These results suggest more extensive cross-modal overlap than traditionally assumed: the computational machinery classically attributed to the somatosensory system is also embedded within and aligned with that of the visual system. These aligned visual and bodily maps are a likely brain substrate for internalized somatosensory representations of visual signals, and are a candidate human homologue of findings in mice whereby somatomotor responses dominate visual cortex36.

Consistent with embodied perception theories, our model-based quantifications of somatotopic and retinotopic connectivity revealed that dorsolateral visual cortical responses to naturalistic stimuli are best explained by selectivities in both modalities, as opposed to visual selectivity alone. The necessity of incorporating body-referenced processing into models of dorsolateral visual cortex supports evidence that its role extends beyond passive visual analysis, encompassing perceptual, semantic and bodily functions optimized for behavioural interactions with the world25.

Consistent with visuospatial alignment of somatosensory tuning, we found that body part preferences in dorsolateral visual cortex predicted visual field tuning. Such alignment, previously reported at the terminus of the dorsal visual pathway around the postcentral sulcus28, therefore extends far into dorsal and lateral streams of the visual system. This alignment may be reinforced by shared developmental influences, as somatotopic and retinotopic maps are shaped trophically from birth: dorsal regions represent the upper body and visual field, and ventral regions the lower body and visual field22, providing a roughly aligned sensory periphery optimized for efficient environmental sampling and action. The explicit interweaving of touch and retinal coordinates may subserve efficient perception of environmental affordances and a cohesive sense of spatial self-representation.

Musicians drift less in blindfolded walk: Could musical training be utilized in cognitive rehabilitation?

A multi-institutional team of researchers led by Université de Montréal reports that extensive musical training can steady the body in space, both with and without guiding sounds, during a blindfolded stepping test.

Spatial cognition is at the heart of everyday movement, linking mental maps of the environment with the body’s position and orientation. Spatial abilities support tasks such as mental rotation, navigation, walking through space, and maintaining spatial information in working memory, all of which depend on a stable sense of where the body is located.

Body representation provides a solution to what some researchers describe as the computational “where” problem of the body, knitting together inputs from vision, touch, and the vestibular system. Auditory cues join this network as well, supplying information that can help stabilize posture and guide movement when other senses are limited or absent, as described in prior work on postural control and ambulation.

Breakthrough Simulation Maps Every Star in The Milky Way in Scientific First

The Milky Way contains more than 100 billion stars, each following its own evolutionary path through birth, life, and sometimes violent death.

For decades, astrophysicists have dreamed of creating a complete simulation of our galaxy, a digital twin that could test theories about how galaxies form and evolve. That dream has always crashed against an impossible computational wall.

Until now.

Airborne sensors map ammonia plumes in California’s Imperial Valley

A recent study led by scientists at NASA’s Jet Propulsion Laboratory in Southern California and the nonprofit Aerospace Corporation shows how high-resolution maps of ground-level ammonia plumes can be generated with airborne sensors, highlighting a way to better track the gas.

A key chemical ingredient of fine particulate matter—tiny particles in the air known to be harmful when inhaled—ammonia can be released through agricultural activities such as livestock farming, as well as through geothermal power generation and natural geothermal processes. Because it’s not systematically monitored, many sources of the pungent gas go undetected.

Published in Atmospheric Chemistry and Physics, the study focuses on a series of 2023 research flights that covered the Imperial Valley to the southeast of the Salton Sea in inland Southern California, as well as the Eastern Coachella Valley to its northwest. Prior satellite-based research has identified the Imperial Valley as a prolific source of gaseous ammonia.

Study maps the time and energy patterns of electron pairs in ultrafast pulses

The ability to precisely study and manipulate electrons in electron microscopes could open new possibilities for the development of both ultrafast imaging techniques and quantum technologies.

Over the past few years, physicists have developed new experimental tools for studying the behavior of electrons not bound to any material by utilizing so-called nanoscale field emitters, tiny metallic tips that release electrons when exposed to strong electric fields.

Researchers at the Max Planck Institute for Multidisciplinary Sciences recently carried out a study aimed at shedding new light on how pairs of emitted electrons relate to each other and how their behavior unfolds over time.

Quantum imaging settles 20-year debate on gold surface electron spin direction

Researchers at the Institute for Molecular Science (IMS) have definitively resolved a two-decade-long controversy regarding the direction of electron spin on the surface of gold.

Using a state-of-the-art Photoelectron Momentum Microscope (PMM) at the UVSOR synchrotron facility, the team captured complete two-dimensional snapshots of the Au(111) Shockley surface state, mapping both the electron’s spin (its intrinsic magnetic property) and its orbital shape in a projection-based measurement. The work is published in the Journal of the Physical Society of Japan.

The experiment unambiguously confirmed the Rashba effect—where an electron’s motion is coupled to its spin—by assigning a clockwise (cw) spin texture to the outer electron band and a counterclockwise (ccw) texture to the inner band when viewed from the vacuum side.
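As a sketch of the textbook physics behind this result (standard Rashba theory, not equations taken from the paper itself): spin–orbit coupling adds a term to the free-electron surface state that splits it into two concentric bands, with in-plane spins locked perpendicular to the momentum and winding in opposite senses on the two bands:

```latex
% Standard Rashba Hamiltonian for a 2D surface state
% (m^* = effective mass, \alpha_R = Rashba coupling strength)
H = \frac{\hbar^2 k^2}{2m^*}
  + \alpha_R \,(\boldsymbol{\sigma} \times \mathbf{k}) \cdot \hat{\mathbf{z}}

% The eigenvalues form two momentum-split bands:
E_\pm(k) = \frac{\hbar^2 k^2}{2m^*} \pm \alpha_R \lvert k \rvert
```

Because the spin expectation value on each band is perpendicular to k, it circulates around the Fermi contours in opposite senses, which is the clockwise/counterclockwise spin texture the measurement assigns to the outer and inner bands.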

Dark energy might be changing, and so is the Universe

Dark energy may be alive and changing, reshaping the cosmos in ways we’re only beginning to uncover. New supercomputer simulations hint that dark energy might be dynamic, not constant, subtly reshaping the Universe’s structure. The findings align with recent DESI observations, offering the strongest evidence yet for an evolving cosmic force.

Since the early 20th century, scientists have gathered convincing evidence that the Universe is expanding — and that this expansion is accelerating. The force responsible for this acceleration is called dark energy, a mysterious property of spacetime thought to push galaxies apart. For decades, the prevailing cosmological model, known as Lambda Cold Dark Matter (ΛCDM), has assumed that dark energy remains constant throughout cosmic history. This simple but powerful assumption has been the foundation of modern cosmology. Yet, it leaves one key question unresolved: what if dark energy changes over time instead of remaining fixed?

Recent observations have started to challenge this long-held view. Data from the Dark Energy Spectroscopic Instrument (DESI) — an advanced project that maps the distribution of galaxies across the Universe — suggests the possibility of a dynamic dark energy (DDE) component. Such a finding would mark a significant shift from the standard ΛCDM model. While this points to a more intricate and evolving cosmic story, it also exposes a major gap in understanding: how a time-dependent dark energy might shape the formation and growth of cosmic structures remains unclear.
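One common way to make "dynamic dark energy" concrete (a standard parameterization used in DESI-style analyses, not something specific to this study) is the Chevallier–Polarski–Linder equation of state, which lets the dark-energy pressure-to-density ratio w evolve with the cosmic scale factor a:

```latex
% CPL parameterization of the dark-energy equation of state
% a = cosmic scale factor (a = 1 today), w_0 and w_a are fit to data
w(a) = w_0 + w_a\,(1 - a)
```

In this framing, the standard ΛCDM model is the special case w_0 = −1, w_a = 0 (a constant w = −1); fitted values that deviate from that point are what "dynamic dark energy" refers to.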

Mapping AI’s brain reveals memory and reasoning are not located in the same place

Researchers studying how large AI models such as ChatGPT learn and remember information have discovered that their memory and reasoning skills occupy distinct parts of their internal architecture. Their insights could help make AI safer and more trustworthy.

AI models trained on massive datasets rely on at least two major processing features. The first is memory, which allows the system to retrieve and recite information. The second is reasoning, solving new problems by applying generalized principles and learned patterns. But until now, it wasn’t known whether a model’s memory and reasoning abilities are stored in the same place.

So researchers at the startup Goodfire.ai decided to investigate the internal structure of large language and vision models to understand how they work.

Mapping chromatin structure at base-pair resolution unveils a unified model of cis-regulatory element interactions

Now online! Li et al. apply base-pair resolution Micro Capture-C ultra to map chromatin contacts between individual motifs within cis-regulatory elements and reveal a unified model of biophysically mediated enhancer-promoter communication.
