Researchers at Pointcloud GmbH in Zürich, Switzerland, have packed advanced 4D sensing technology — once too bulky for everyday use — onto a single silicon chip.
It’s a 4D imaging sensor that maps the physical world while simultaneously clocking the speed of every object it sees. It offers a low-cost, high-speed vision solution for everything from autonomous drones to future smartphones.
“This result demonstrates the capabilities of FMCW LiDAR FPA sensors as enablers of ubiquitous, low-cost, compact coherent 4D imaging cameras,” the researchers wrote in the study paper.
Researchers have identified a distinct and reproducible gene expression program associated with neurotransmission in the living human brain, offering unprecedented insight into the molecular mechanisms that support human cognition, emotion, and behavior. The findings were published February 19 in Molecular Psychiatry.
Neurotransmission, the electrical and chemical signaling between neurons, is fundamental to all brain function. Until now, most gene expression studies of the human brain have relied on postmortem tissue, limiting scientists’ ability to understand which genes are actively involved in real-time neuronal communication.
In this study, investigators integrated gene expression profiling from the prefrontal cortex with direct intracranial measures of neurotransmission collected from the brains of more than 100 individuals as they underwent neurosurgical procedures. By combining molecular data with real-time physiological recordings, the team identified a coordinated set of genes whose activity tracks with neuronal signaling: a transcriptional program associated with neurotransmission.
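The paper’s own pipeline is not reproduced here, but the basic shape of the analysis it describes can be sketched: correlate each gene’s expression with a per-subject physiological measure, then control the false discovery rate to isolate a coordinated gene set. A minimal sketch with synthetic data; all shapes, names, and thresholds below are illustrative assumptions, not the study’s.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical inputs: expression for 100 subjects x 5,000 genes, plus one
# scalar neurotransmission measure per subject (e.g., an intracranial
# signaling metric). Real inputs would come from the study's recordings.
n_subjects, n_genes = 100, 5000
expression = rng.normal(size=(n_subjects, n_genes))
neurotransmission = rng.normal(size=n_subjects)

# Correlate each gene's expression with the physiological measure.
results = [stats.pearsonr(expression[:, g], neurotransmission)
           for g in range(n_genes)]
pvals = np.array([p for _, p in results])

# Benjamini-Hochberg step-up procedure to control the false discovery rate.
order = np.argsort(pvals)
ranks = np.arange(1, n_genes + 1)
passes = pvals[order] <= 0.05 * ranks / n_genes
cutoff = ranks[passes].max() if passes.any() else 0
significant_genes = order[:cutoff]
print(f"{len(significant_genes)} genes track the neurotransmission measure")
```

With the random data above, essentially nothing survives correction, which is the expected null result; the point is only the structure of the gene-by-physiology screen.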
Does the universe need observers to exist? Neil deGrasse Tyson and co-hosts Chuck Nice and Gary O’Reilly explore questions about entropy, spontaneous symmetry breaking, spectroscopy and more with astrophysicist Charles Liu.
Does the universe require observers for information to exist? From Niels Bohr and the Copenhagen interpretation to modern neuroscience and philosophy, the crew explores whether measurement creates reality or reveals it. How does the double-slit experiment fit into this? Are wave and particle behaviors determined by how we measure them?
The conversation turns to information itself. What do physicists mean by “information”? How is entropy connected to hidden information in a system? We discuss entropy through everyday examples like coin flips, burning wood, and boiling water. How does this relate to quantum computing? We explore how astronomers separate cosmic redshift from stellar motion using spectroscopy, how interstellar dust and extinction curves complicate observations, and why mapping that dust is both a challenge and a source of discovery.
We discuss why the Big Bang didn’t form a black hole, how spontaneous symmetry breaking may have split the fundamental forces, and whether science can meaningfully investigate the universe’s earliest moments. Wrapping up, the team looks ahead to multi-messenger astronomy, next-generation telescope technology, exotic ideas about the speed of light, and how information continues to reshape what we know about the cosmos.
Thanks to our Patrons Avery Ellis, Markus Riegler, Linda Tullberg, Gami Lannin, Arief Aziz, Ron Lawhon, Corie Prater, Patrick McNaught, FracturedEquality, Spengler, Peter Harbeson, Oddron86, Hudson Lowe, Drew Romaniak, V2022, Kyle Ferchen, Branko Denčić, Patrick Borgquist, DJ Sipe, Andy Blair, Alan Keizer, SR, Nihat Cubukcu, Greg Lance, Diwas Pandit, Anik Kasumi, Alexander Albert, Kodai, Dyonne Peters Lewoc AKA DPTaterTot, Adrian, Ben Goff, Jose Barreiro, Saurabh Chaudhari, Wimberley Children’s House, Jean Arthur Deda, Jerrel Thomas, Serkan Ergenc, Douglas Kennedy, Lee Browner, Manuel Palmer, Dans Jansons, Russell Harvey, BladiX, Lars-Ove Torstensson, Norman Weizer, Arian Farkhoy, S. Madge, Pavel Seraphimov, Amanda Wolfe, Heisenberg, Mattchew Phillips, Caleb Berumen, Sretooh, Gary Tabbert, Oscar Abreu Lamas, Kevin Attebury, Volker Haberlandt, SeaGolly, B. Shoemaker, Ruben Ferrer, Steven Adams, Daniel Hintz, Nathaniel Richardson, Nick Griffiths, Adam Schmidt, Scott Plummer, Northernlight, JoMama, Beth, Frank Cottone, Yinj, Betty Anderson, Paul Smith, John Little, Emad Uddin, Brian O’Brien, Jayden Moffatt, Kevin Mace, Zara DeBresoc, Rain Bresee, Mara (Farmstrong), Rose, Stiven, Demethius Jackson, Alejandro Rodriguez, J Davis, Chris Buhler, Nathan Davieau, Sourav Prakash Patra, Wayne Rasmussen, John from Bavaria, Stephanie Phillips, Yohojones, Josh Farrell, John, Oo-De-Lally, Millie Richter, Montague Films, Lawrey Goodrick, and John Giovannettone for supporting us this week.
A stunning new map of the Milky Way reveals a dramatic magnetic flip hiding in plain sight. Deep inside the Milky Way, an invisible force is quietly holding everything together — its magnetic field. Now, researchers have created one of the most detailed maps ever of this hidden structure, revealing surprising twists in how it flows through our galaxy.
For generations, scientists have studied the stars and planets to better understand how our galaxy works. Now, Dr. Jo-Anne Brown is focused on charting something we cannot see at all: the Milky Way’s magnetic field.
“Without a magnetic field, the galaxy would collapse in on itself due to gravity,” says Brown, a professor in the Department of Physics and Astronomy at the University of Calgary.
Baidu just released DuClaw, a new platform that lets anyone run OpenClaw AI agents instantly from a web browser without dealing with deployment, servers, or API keys. At the same time, researchers at Stanford introduced OpenJarvis, a framework that allows personal AI assistants to run entirely on your own computer instead of the cloud. Meanwhile, Google is using Gemini to build the largest flash flood dataset ever created, mapping millions of disaster events across the planet. And a new toolkit called gstack is turning AI coding into something far more autonomous, allowing AI systems to plan software, test applications, and review code automatically.
🧠 What You’ll See
Baidu launches DuClaw to run OpenClaw AI agents directly from a browser. SOURCE: https://pandaily.com/baidu-ai-cloud-l…
Stanford researchers introduce OpenJarvis for fully local AI assistants. SOURCE: https://www.marktechpost.com/2026/03/.…
Google uses Gemini to build the largest flash flood dataset ever created. SOURCE: https://www.wsj.com/articles/google-t…
gstack toolkit organizes AI into automated software development workflows. SOURCE: https://www.producthunt.com/products/.…
🚨 Why It Matters
These developments show how quickly artificial intelligence is moving toward more autonomous systems. From browser-based AI agents that run instantly, to personal assistants that operate entirely on local machines, the way people interact with AI is changing rapidly. At the same time, large-scale AI systems are being used to analyze global disasters and predict floods, while new developer tools are allowing AI to plan, test, and review software almost like an engineering team. #ai #artificialintelligence #ainews
Current vision systems for robots and drones rely on 3D sensors that, although powerful, do not always keep up with the fast-paced, unpredictable movement of the real world. These systems often struggle to measure speed instantly or are too bulky and expensive for everyday use. Now, in a paper published in the journal Nature, scientists report how they have developed a 4D imaging sensor on a chip that creates 3D maps of an environment while simultaneously tracking the speed of moving objects.
The researchers built a focal plane array (FPA), a physical grid of 61,952 stationary pixels etched onto a single silicon chip. Each one is a tiny sensor that emits laser light toward a scene and detects the reflected signal.
To “see” its surroundings, the chip is fed laser light from an external source. This light is routed across the chip through a network of optical switches that sequentially direct it to groups of pixels. Each pixel then uses a technique called FMCW LiDAR to measure the returning signal, which is later processed to determine distance and speed. In many LiDAR systems, one set of pixels sends the light and another receives it, but here all pixels both send and receive, making the system much more compact.
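FMCW (frequency-modulated continuous-wave) LiDAR gets distance and speed from the same measurement: the echo beats against the outgoing chirp, and with a triangular chirp the up-ramp and down-ramp beat tones separate range from Doppler. A minimal sketch of that arithmetic; the waveform parameters below are illustrative, not the paper’s.

```python
# Minimal FMCW LiDAR arithmetic with a triangular chirp. The up-ramp and
# down-ramp beat frequencies jointly encode range and radial velocity.
# Parameters are illustrative assumptions, not taken from the paper.
C = 299_792_458.0        # speed of light, m/s
F_CARRIER = 193.4e12     # optical carrier (~1550 nm), Hz
BANDWIDTH = 1e9          # chirp bandwidth, Hz
T_RAMP = 10e-6           # duration of one ramp, s

def range_and_velocity(f_beat_up: float, f_beat_down: float):
    """Recover distance (m) and radial velocity (m/s, positive means
    approaching) from the two measured beat frequencies (Hz)."""
    f_range = 0.5 * (f_beat_up + f_beat_down)    # range-induced component
    f_doppler = 0.5 * (f_beat_down - f_beat_up)  # Doppler shift
    distance = C * T_RAMP * f_range / (2 * BANDWIDTH)
    velocity = C * f_doppler / (2 * F_CARRIER)   # = lambda * f_doppler / 2
    return distance, velocity

# Example: beat tones of 6.6 MHz (up-ramp) and 6.8 MHz (down-ramp)
d, v = range_and_velocity(6.6e6, 6.8e6)
print(f"distance ~ {d:.1f} m, radial velocity ~ {v:.3f} m/s")
```

This is why a single pixel can report both quantities at once: two beat tones per chirp period are enough to solve for range and velocity separately.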
In a study published in Analytical Chemistry, researchers from the University of Amsterdam’s Van ‘t Hoff Institute for Molecular Sciences (HIMS) reveal a sobering reality about nontargeted chemical analysis. Although widely used to screen the environment for chemicals, the approach is not nearly as comprehensive as its name suggests, leaving massive blind spots in the data.
The cosmic microwave background (CMB, CMBR), or relic radiation, is microwave radiation that fills all space in the observable universe. With a standard optical telescope, the background space between stars and galaxies is almost completely dark. However, a sufficiently sensitive radio telescope detects a faint background glow that is almost uniform and is not associated with any star, galaxy, or other object. This glow is strongest in the microwave region of the electromagnetic spectrum. Its energy density exceeds that of all the photons emitted by all the stars in the history of the universe. The accidental discovery of the CMB in 1964 by American radio astronomers Arno Allan Penzias and Robert Woodrow Wilson was the culmination of work initiated in the 1940s.
The CMB is the key experimental evidence of the Big Bang theory for the origin of the universe. In the Big Bang cosmological models, during the earliest periods, the universe was filled with an opaque fog of dense, hot plasma of sub-atomic particles. As the universe expanded, this plasma cooled to the point where protons and electrons combined to form neutral atoms of mostly hydrogen. Unlike the plasma, these atoms could not scatter thermal radiation by Thomson scattering, and so the universe became transparent. Known as the recombination epoch, this decoupling event released photons to travel freely through space. However, the photons have grown less energetic due to the cosmological redshift associated with the expansion of the universe. The surface of last scattering refers to a shell at just the right distance from us that the photons we receive now were emitted at the time of decoupling.
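The redshift arithmetic behind that last point is short enough to check: the radiation temperature scales as T ∝ 1/(1 + z), so decoupling at roughly z ≈ 1100 and T ≈ 3000 K lands on today’s measured ~2.7 K. A quick check with round textbook values (not figures from this text):

```python
# CMB temperature falls with cosmic expansion as T_today = T_emitted / (1 + z).
# Round textbook values: decoupling at z ~ 1100, plasma temperature ~3000 K.
T_DECOUPLING_K = 3000.0
Z_DECOUPLING = 1100.0

t_today = T_DECOUPLING_K / (1.0 + Z_DECOUPLING)
print(f"Predicted CMB temperature today: {t_today:.2f} K")  # ~2.72 K
```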
The CMB is very smooth and uniform, but maps made with sensitive detectors reveal small but important temperature variations. Ground- and space-based experiments such as COBE, WMAP and Planck have been used to measure these temperature inhomogeneities. The anisotropy structure is influenced by various interactions of matter and photons up to the point of decoupling, which results in a characteristic pattern of tiny ripples that varies with angular scale. The distribution of the anisotropy across the sky has frequency components that can be represented by a power spectrum displaying a sequence of peaks and valleys. The peak values of this spectrum hold important information about the physical properties of the early universe: the first peak determines the overall curvature of the universe, while the second and third peaks detail the density of normal matter and so-called dark matter, respectively.
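Concretely, the angular power spectrum is just the variance of the map’s spherical-harmonic coefficients at each angular scale. A toy estimator with a made-up input spectrum rather than real Planck data:

```python
import numpy as np

# Toy angular power spectrum estimator: for each multipole l, C_l is the
# average of |a_lm|^2 over the 2l+1 coefficients m = -l..l. The input
# spectrum here is invented purely to exercise the estimator.
rng = np.random.default_rng(0)
lmax = 64
cl_true = 1.0 / np.arange(1, lmax + 1) ** 2   # made-up spectrum, not Planck's

cl_est = []
for l in range(1, lmax + 1):
    alm = rng.normal(scale=np.sqrt(cl_true[l - 1]), size=2 * l + 1)
    cl_est.append(np.mean(alm ** 2))          # estimated C_l = <|a_lm|^2>

# Low multipoles scatter most: only 2l+1 samples each ("cosmic variance").
print(f"l=2: true {cl_true[1]:.4f}, estimated {cl_est[1]:.4f}")
```

The peaks and valleys discussed above are features of exactly this C_l curve, measured across thousands of multipoles.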
Mass spectrometry imaging (MSI) is an advanced analytical technique that combines mass spectrometry with spatial mapping, enabling the direct, label-free detection and visualization of molecular distributions within biological tissues. This review comprehensively outlines the fundamental principles, major technological platforms, and recent applications of MSI in plant science. We detail key ionization techniques – matrix-assisted laser desorption/ionization (MALDI), desorption electrospray ionization (DESI), and secondary ion mass spectrometry (SIMS) – focusing on their ionization mechanisms and instrumental characteristics.
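Whatever the ionization source, the underlying data structure is the same: one mass spectrum per tissue pixel, from which an “ion image” is extracted by summing intensity in a narrow m/z window at every position. A schematic sketch with synthetic data; the array shapes, m/z grid, and example ion below are arbitrary assumptions.

```python
import numpy as np

# Schematic MSI data cube: one mass spectrum per tissue pixel.
# Axes: (y, x, m/z channel); values are ion intensities. Synthetic here.
rng = np.random.default_rng(1)
ny, nx = 64, 64
mz_axis = np.linspace(100.0, 1000.0, 4096)      # m/z grid, arbitrary
cube = rng.random((ny, nx, mz_axis.size))

def ion_image(cube, mz_axis, mz, tol=0.25):
    """Sum intensity within mz +/- tol at every pixel -> 2D ion image."""
    window = (mz_axis >= mz - tol) & (mz_axis <= mz + tol)
    return cube[:, :, window].sum(axis=2)

img = ion_image(cube, mz_axis, mz=496.3)  # e.g., a phospholipid-like ion
print(img.shape)  # (64, 64): spatial distribution of that one ion
```

This label-free readout is the point of MSI: the same acquisition yields an image like this for every detected molecular species.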
The reef is a home and feeding ground for dozens of species that depend on it the way a woodland creature depends on trees. It has survived ice ages – but whether it will survive increasing pressures from industrial fishing, deep-sea mining and climate change is, in part, a question about data. If we don’t know it exists, how can we protect it?
A new project called Deep Vision could fundamentally transform our understanding of the deep ocean by digging into pictures and videos that have sat largely unexamined in research archives around the world. Using AI, the project aims to analyse thousands of hours of seafloor footage and produce the first comprehensive maps of vulnerable marine ecosystems across the entire Atlantic basin.
Over the past two decades, robotic and autonomous underwater vehicles have collected vast quantities of footage from the deep sea. This represents an extraordinary resource – a record of ecosystems that most humans will never see.
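Deep Vision’s pipeline is not public, but the basic shape of such an archive-mining workflow is straightforward to sketch: step through archived frames, score each with a classifier, and keep georeferenced detections for mapping. Everything below (the stand-in classifier, labels, and file name) is hypothetical.

```python
import cv2  # OpenCV, for reading archival video files

# Hypothetical stand-in for a trained ecosystem detector; a real system
# would load a model trained on expert-labeled seafloor imagery.
def classify_frame(frame) -> str:
    return "cold-water coral" if frame.mean() > 90 else "background"

def mine_archive(video_path: str, lat: float, lon: float, stride: int = 250):
    """Sample every `stride`-th frame and record ecosystem detections
    as (lat, lon, frame index, label) tuples for later mapping."""
    detections = []
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % stride == 0:
            label = classify_frame(frame)
            if label != "background":
                detections.append((lat, lon, idx, label))
        idx += 1
    cap.release()
    return detections

# e.g. mine_archive("dive_0042.mp4", lat=52.1, lon=-14.8)
```

Run over an entire archive, detections like these are what would be aggregated into basin-scale ecosystem maps.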