
Data Science and Machine Learning: Mathematical and Statistical Methods

D.P. Kroese, Z.I. Botev, T. Taimre, R. Vaisman. Data Science and Machine Learning: Mathematical and Statistical Methods, Chapman and Hall/CRC, Boca Raton, 2019.

The purpose of this book is to provide an accessible, yet comprehensive textbook intended for students interested in gaining a better understanding of the mathematics and statistics that underpin the rich variety of ideas and machine learning algorithms in data science.

Identification of proliferating neural progenitors in the adult human hippocampus

Continuous adult hippocampal neurogenesis is involved in memory formation and mood regulation but is challenging to study in humans. Difficulties finding proliferating progenitor cells called into question whether and how new neurons may be generated. We analyzed the human hippocampus from birth through adulthood by single-nucleus RNA sequencing. We identified all neural progenitor cell stages in early childhood. In adults, using antibodies against the proliferation marker Ki67 and machine learning algorithms, we found proliferating neural progenitor cells. Furthermore, transcriptomic data showed that neural progenitors were localized within the dentate gyrus. The results contribute to understanding neurogenesis in adult humans.
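To illustrate the kind of step described above, here is a minimal sketch of marker-based cell classification. It is illustrative only, not the study's actual pipeline: the feature values are hypothetical, and the study used single-nucleus RNA sequencing with far richer models. The sketch classifies nuclei as proliferating or quiescent from a Ki67-like intensity plus a second feature, using a nearest-centroid classifier.

```python
# Toy sketch (illustrative only, NOT the study's pipeline): classify nuclei
# as proliferating vs. quiescent from two features, one of them a
# Ki67-like marker intensity, using a nearest-centroid classifier.
from statistics import mean

# Hypothetical labeled training data: (ki67_intensity, nuclear_size), label
train = [
    ((0.9, 1.2), "proliferating"),
    ((0.8, 1.1), "proliferating"),
    ((0.1, 0.9), "quiescent"),
    ((0.2, 1.0), "quiescent"),
]

def centroids(data):
    """Average feature vector per class label."""
    by_label = {}
    for features, label in data:
        by_label.setdefault(label, []).append(features)
    return {label: tuple(mean(v[i] for v in xs) for i in range(2))
            for label, xs in by_label.items()}

def classify(x, cents):
    """Assign x to the class with the nearest centroid (squared distance)."""
    return min(cents, key=lambda lbl: sum((a - b) ** 2
                                          for a, b in zip(x, cents[lbl])))

cents = centroids(train)
label = classify((0.85, 1.15), cents)  # high Ki67 intensity
```

A high-Ki67 nucleus lands near the proliferating centroid and is labeled accordingly; real pipelines replace the two hand-picked features with thousands of transcript counts per nucleus.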

Improving randomness may be the key to more powerful quantum computers

Understanding randomness is crucial in many fields. From computer science and engineering to cryptography and weather forecasting, studying and interpreting randomness helps us simulate real-world phenomena, design algorithms and predict outcomes in uncertain situations.

Randomness is also important in quantum computing, but generating it typically involves a large number of operations. However, Thomas Schuster and colleagues at the California Institute of Technology have demonstrated that quantum computers can produce randomness much more easily than previously thought.

And that’s good news because the research could pave the way for faster and more efficient quantum computers.

New imaging technique reconstructs the shapes of hidden objects

A new imaging technique developed by MIT researchers could enable quality-control robots in a warehouse to peer through a cardboard shipping box and see that the handle of a mug buried under packing peanuts is broken.

Their approach leverages millimeter wave (mmWave) signals, the same type of signals used in Wi-Fi, to create accurate 3D reconstructions of objects that are blocked from view.

The waves can travel through common obstacles like plastic containers or interior walls, and reflect off hidden objects. The system, called mmNorm, collects those reflections and feeds them into an algorithm that estimates the shape of the object’s surface.
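As a rough intuition for surface estimation from reflections, here is a toy sketch. It is not MIT's mmNorm algorithm; the measurements are hypothetical. It relies on the specularity of mmWave reflections: returns are strongest when the antenna's viewing direction aligns with the surface normal, so a power-weighted average of viewing directions gives a crude normal estimate.

```python
# Toy sketch (illustrative, NOT the mmNorm algorithm): mmWave reflections
# off a smooth surface are strongest when the viewing direction aligns with
# the surface normal (specular reflection). Estimate the normal as the
# power-weighted average of the antenna viewing directions.
import math

# Hypothetical measurements: (unit viewing direction in 2D, reflected power)
measurements = [
    ((1.0, 0.0), 0.1),
    ((0.7071, 0.7071), 0.9),   # strongest return near the true normal
    ((0.0, 1.0), 0.2),
]

def estimate_normal(meas):
    """Power-weighted average direction, renormalized to unit length."""
    sx = sum(d[0] * p for d, p in meas)
    sy = sum(d[1] * p for d, p in meas)
    length = math.hypot(sx, sy)
    return (sx / length, sy / length)

n = estimate_normal(measurements)
```

Repeating such an estimate over many surface points, and integrating the resulting field of normals, is one way a reconstruction pipeline can recover the full 3D shape of a hidden object.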

DNA as a perfect quantum computer based on the quantum physics principles

I believe that DNA will be able to answer nearly all of our genetic-coding questions, leading to even greater breakthroughs in the future while using hardly any energy. I also believe that the master algorithm could eventually be derived from DNA, since DNA already functions as a near-perfect master algorithm for human beings, and human beings are the key to all future progress. I say this because quantum computing is still unstable, whereas DNA computers, and even organoids of the human brain, already look like masterpieces. The quantum realm itself is unstable, but DNA computers operating on quantum principles could stabilize it.


Riera Aroche, R., Ortiz García, Y.M., Martínez Arellano, M.A. et al. DNA as a perfect quantum computer based on the quantum physics principles. Sci Rep 14, 11636 (2024). https://doi.org/10.1038/s41598-024-62539-5


Faster topology optimization: An emerging industrial design technique gets a speed boost

With the rise of 3D printing and other advanced manufacturing methods, engineers can now build structures that were once impossible to fabricate. An emerging design strategy that takes full advantage of these new capabilities is topology optimization—a computer-driven technique that determines the most effective way to distribute material, leading to an optimized design.

Now, a research team including mathematicians from Brown University has developed a new approach that dramatically improves the speed and stability of topology optimization algorithms. The team, a collaboration between researchers at Brown, Lawrence Livermore National Laboratory and Simula Research Laboratory in Norway, detailed their work in two recently published papers, one in the SIAM Journal on Optimization and one in Structural and Multidisciplinary Optimization.

“Our method beats some existing methods by four or five times in terms of efficiency,” said Brendan Keith, an assistant professor of applied mathematics at Brown. “That’s a huge computational savings that could enable people to make designs more quickly and inexpensively, or to develop more complex designs with higher resolution.”
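To make "distributing material optimally" concrete, here is a minimal sketch of a classic building block of topology optimization: an optimality-criteria density update. This is a generic textbook-style step, not the Brown/LLNL/Simula method, and the sensitivity values are hypothetical. Each cell's material density is scaled up where added material reduces compliance most, and a multiplier is bisected so the total volume constraint holds.

```python
# Toy sketch (a generic optimality-criteria update, a classic ingredient of
# topology optimization -- NOT the new method described above): given
# sensitivities dc (compliance change per unit of material in each cell),
# scale densities toward helpful cells, bisecting a Lagrange multiplier
# until the average density matches the volume fraction.
def oc_update(x, dc, volfrac, move=0.2):
    lo, hi = 1e-9, 1e9
    while (hi - lo) / (hi + lo) > 1e-6:
        lam = 0.5 * (lo + hi)
        # Scale each density by sqrt(-dc/lam), limited to a move of +/-0.2
        # and clamped to the physical range [0, 1].
        xnew = [min(1.0, min(xi + move,
                    max(0.0, max(xi - move, xi * (-di / lam) ** 0.5))))
                for xi, di in zip(x, dc)]
        if sum(xnew) / len(xnew) > volfrac:
            lo = lam   # too much material: raise the multiplier
        else:
            hi = lam
    return xnew

x = [0.5] * 4                      # uniform starting densities
dc = [-4.0, -2.0, -1.0, -0.5]      # hypothetical sensitivities
xnew = oc_update(x, dc, volfrac=0.5)
```

In a real solver this update alternates with a finite-element analysis that recomputes the sensitivities; speeding up and stabilizing that outer loop is where the new research makes its gains.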

Ultrafast 12-minute MRI maps brain chemistry to spot disease before symptoms

Illinois engineers fused ultrafast imaging with smart algorithms to peek at living brain chemistry, turning routine MRIs into metabolic microscopes. The system distinguishes healthy regions, grades tumors, and forecasts MS flare-ups long before structural MRI can. Precision-medicine neurology just moved closer to reality.

Navier–Stokes existence and smoothness

The problem concerns the mathematical properties of solutions to the Navier–Stokes equations, a system of partial differential equations that describe the motion of a fluid in space. Solutions to the Navier–Stokes equations are used in many practical applications. However, theoretical understanding of the solutions to these equations is incomplete. In particular, solutions of the Navier–Stokes equations often include turbulence, which remains one of the greatest unsolved problems in physics, despite its immense importance in science and engineering.
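For reference, the incompressible Navier–Stokes equations discussed above can be written as:

$$
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u} \cdot \nabla)\mathbf{u}
= -\frac{1}{\rho}\nabla p + \nu \nabla^{2}\mathbf{u} + \mathbf{f},
\qquad
\nabla \cdot \mathbf{u} = 0,
$$

where $\mathbf{u}$ is the fluid velocity field, $p$ the pressure, $\rho$ the density, $\nu$ the kinematic viscosity, and $\mathbf{f}$ an external body force. The open problem asks whether, in three dimensions, smooth solutions always exist for all time given smooth initial data.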

New hybrid quantum–classical computing approach used to study chemical systems

Caltech professor of chemistry Sandeep Sharma and colleagues from IBM and the RIKEN Center for Computational Science in Japan are giving us a glimpse of the future of computing. The team has used quantum computing in combination with classical distributed computing to attack a notably challenging problem in quantum chemistry: determining the electronic energy levels of a relatively complex molecule.

The work demonstrates the promise of such a quantum–classical hybrid approach for advancing not only quantum chemistry but also fields such as nanotechnology and drug discovery, where insight into the electronic fingerprint of materials can reveal how they will behave.

“We have shown that you can take classical algorithms that run on high-performance classical computers and combine them with quantum algorithms that run on quantum computers to get useful chemical results,” says Sharma, a new member of the Caltech faculty whose work focuses on developing algorithms to study quantum systems. “We call this quantum-centric supercomputing.”
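To ground what "determining electronic energy levels" means computationally, here is a minimal sketch of the classical half of such a workflow. It is illustrative only, not the IBM/Caltech/RIKEN method, and the matrix elements are hypothetical: energy levels are the eigenvalues of a Hamiltonian matrix, which for a toy 2x2 model can be diagonalized in closed form.

```python
# Toy sketch (illustrative of the classical side of hybrid workflows, NOT
# the actual IBM/Caltech/RIKEN method): electronic energy levels are the
# eigenvalues of a Hamiltonian matrix. For a hypothetical 2x2 symmetric
# model Hamiltonian [[h11, h12], [h12, h22]], diagonalize in closed form.
import math

def levels_2x2(h11, h22, h12):
    """Return (ground, excited) eigenvalues of the 2x2 symmetric matrix."""
    avg = 0.5 * (h11 + h22)
    half_gap = math.hypot(0.5 * (h11 - h22), h12)
    return avg - half_gap, avg + half_gap

# Hypothetical matrix elements (in arbitrary energy units)
ground, excited = levels_2x2(-1.0, 1.0, 0.5)
```

Real molecules produce Hamiltonians far too large for closed-form or even brute-force diagonalization, which is exactly where distributing the classical work and delegating hard subproblems to a quantum processor becomes attractive.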

Reports in Advances of Physical Sciences

In this paper, the authors propose a three-dimensional time model, arguing that nature itself hints at the need for three temporal dimensions. Why three? Because at three different scales—the quantum world of tiny particles, the realm of everyday physical interactions, and the grand sweep of cosmological evolution—we see patterns that suggest distinct kinds of “temporal flow.” These time layers correspond, intriguingly, to the three generations of fundamental particles in the Standard Model: electrons and their heavier cousins, muons and taus. The model doesn’t just assume these generations—it explains why there are exactly three and even predicts their mass differences using mathematics derived from a “temporal metric.”


This paper introduces a theoretical framework based on three-dimensional time, where the three temporal dimensions emerge from fundamental symmetry requirements. The necessity for exactly three temporal dimensions arises from observed quantum-classical-cosmological transitions that manifest at three distinct scales: Planck-scale quantum phenomena, interaction-scale processes, and cosmological evolution. These temporal scales directly generate three particle generations through eigenvalue equations of the temporal metric, naturally explaining both the number of generations and their mass hierarchy. The framework introduces a metric structure with three temporal and three spatial dimensions, preserving causality and unitarity while extending standard quantum mechanics and field theory.