Quobly Toolbox Explores Quantum Phase Estimation Pipeline With Tensor Networks

An international collaboration between a French quantum startup and a major Taiwanese electronics manufacturer has yielded a new open-source tool for exploring a critical area of quantum computing. Quobly and Taiwan’s Hon Hai Research Institute, the R&D arm of Foxconn, jointly released a numerical toolbox dedicated to the Quantum Phase Estimation (QPE) algorithm, described as a cornerstone of fault-tolerant quantum computing with major applications in quantum chemistry and materials science. While QPE’s theoretical benefits are understood, simulating its practical resource needs has proven difficult; the toolbox aims to bridge this gap by allowing researchers to explore implementations and their implications. The tool focuses on practical, interpretable numerical experiments, enabling full circuit executions for up to 20 qubits and circuits ranging from 1,000 to 100,000 gates on standard laptops.
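The core of QPE can be reproduced in a few lines of linear algebra. The sketch below is plain NumPy, not the toolbox's API: it estimates the eigenphase φ of the single-qubit unitary U = diag(1, e^{2πiφ}) with an n-qubit ancilla register, assuming the eigenstate is already prepared.

```python
import numpy as np

def qpe_phase(phi: float, n_ancilla: int = 6) -> float:
    """Textbook QPE for U = diag(1, e^{2*pi*i*phi}) acting on its
    eigenstate |1>, simulated directly on the ancilla register."""
    N = 2 ** n_ancilla
    j = np.arange(N)
    # After Hadamards and the controlled-U^(2^k) ladder, the ancilla
    # register is a uniform superposition with phases e^{2*pi*i*phi*j}.
    state = np.exp(2j * np.pi * phi * j) / np.sqrt(N)
    # Inverse QFT: amplitude_k = (1/sqrt(N)) * sum_j state_j * e^{-2*pi*i*j*k/N},
    # which is exactly what np.fft.fft computes (up to normalization).
    amps = np.fft.fft(state) / np.sqrt(N)
    # Measuring returns k with probability |amps_k|^2; the most likely
    # outcome encodes phi to n_ancilla bits of precision.
    return int(np.argmax(np.abs(amps) ** 2)) / N

print(qpe_phase(0.375))  # 0.375 is exactly representable in 6 bits
```

Phases that fit exactly in the register are recovered exactly; other phases are recovered to within one part in 2^n, which is the resolution-versus-depth trade-off the toolbox lets researchers explore at molecular scale.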

Quantum Phase Estimation Toolbox for Molecular Systems

While the theoretical underpinnings of QPE are well established, simulating its practical demands has been a significant hurdle, limiting exploration beyond simplified models. The toolbox addresses this gap by offering a platform for practical, interpretable numerical experiments, allowing scientists to investigate QPE implementations without access to full-scale fault-tolerant quantum hardware, which does not yet exist. Built on advanced tensor network techniques and the open-source quimb library, the toolbox supports preparation of initial states using DMRG and matrix product states, and encodes molecular Hamiltonians into quantum circuits through methods such as Trotterization and qubitization. Researchers can directly compare standard QPE with the single-ancilla Robust Phase Estimation (RPE) method, analyzing circuit depth, gate counts, and potential error sources.
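Trotterization, one of the Hamiltonian-encoding routes mentioned above, approximates e^{-iHt} for H = A + B by alternating short evolutions under A and B. A minimal NumPy illustration on a toy single-qubit Hamiltonian (not the toolbox's implementation; the Pauli terms and step count are arbitrary choices):

```python
import numpy as np

def expm_h(H: np.ndarray, t: float) -> np.ndarray:
    """exp(-i * t * H) for Hermitian H, via eigendecomposition (NumPy only)."""
    w, V = np.linalg.eigh(H)
    return (V * np.exp(-1j * t * w)) @ V.conj().T

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = X + Z  # toy Hamiltonian with non-commuting terms

t, n = 1.0, 100
exact = expm_h(H, t)
# First-order Trotter: n alternating short evolutions under X and Z.
step = expm_h(X, t / n) @ expm_h(Z, t / n)
approx = np.linalg.matrix_power(step, n)
error = np.linalg.norm(approx - exact)  # shrinks roughly as O(t^2 / n)
```

Tighter error means more steps, hence deeper circuits; tabulating exactly this trade-off (and the analogous one for qubitization) is the kind of resource analysis the toolbox is built for.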

String theory is uniquely derived from basic assumptions about the universe, physicists show

If you could take an apple and break it into smaller and smaller parts, you would find molecules, then atoms, followed by subatomic particles like protons and the quarks and gluons that make them up. You might think you hit the bottom, but, according to string theorists, if you keep going to even smaller scales—about a billion billion times smaller than a proton—you will find more: tiny vibrating strings.

Developed in the 1960s, string theory proposes that everything in the universe is made from invisible strings. The theory arose as a possible solution to the problem of “quantum gravity,” the quest to align quantum mechanics, which describes our world at the smallest scales, with the general theory of relativity, which explains how our universe works on the largest scales (and includes gravity). Researchers have tried to reconcile the two theories—asking, for example, how gravity behaves in the quantum realm—but their equations go berserk, or in mathematical terms, go to infinity.

String theory is a mathematical solution that tames the unruly infinities. It purports that all particles, including the graviton—the hypothetical particle believed to convey the force of gravity—are generated by very small vibrating strings. The math behind string theory requires the strings to vibrate in at least 10 dimensions, rather than the four we live in (three for space and one for time), which is one of the reasons some scientists are not convinced that string theory is correct. But perhaps the biggest challenge for the theory is the ultrahigh energies required for testing it: Such an experiment would require a particle collider the size of a galaxy.

Engineered proteins store digital files at 30 times the density and one-tenth the cost

Massive volumes of digital data are generated every day from AI training, big data analytics and smart devices. As conventional hard drives and cloud storage are increasingly constrained by high costs, limited capacity, high power consumption and short lifespans, molecular data storage has emerged as a breakthrough storage alternative.

Researchers at The Hong Kong Polytechnic University (PolyU) have pioneered a method that uses engineered proteins to store digital data and, for the first time, completed the full process from data storage to data retrieval in de novo designed unnatural proteins.
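The article does not describe PolyU's encoding scheme, but the basic idea of mapping bits onto a residue alphabet can be sketched as follows. This is a toy 4-bits-per-residue code, not the paper's method; the 16-letter amino-acid alphabet is an arbitrary choice for illustration.

```python
# Toy illustration: pack each byte into two residues drawn from a
# 16-letter amino-acid alphabet (4 bits per residue).
ALPHABET = "ACDEFGHIKLMNPQRS"  # 16 of the 20 standard amino acids

def encode(data: bytes) -> str:
    """Map bytes to an amino-acid sequence, high nibble first."""
    out = []
    for b in data:
        out.append(ALPHABET[b >> 4])
        out.append(ALPHABET[b & 0xF])
    return "".join(out)

def decode(seq: str) -> bytes:
    """Invert encode(): pair up residues and rebuild each byte."""
    idx = {a: i for i, a in enumerate(ALPHABET)}
    it = iter(seq)
    return bytes((idx[hi] << 4) | idx[lo] for hi, lo in zip(it, it))
```

A real protein-storage codec must additionally respect synthesis constraints and add error correction, which is where the de novo design work comes in.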

This demonstrates the potential of establishing a protein-based storage framework with sustainability, high storage capacity and high stability, offering a promising solution to the explosive AI-generated growth in data globally.

String Theory Emerges from “Almost Nothing”

What is a physicist to do? One way they can probe the theory is to turn to a “bootstrap” approach, in which researchers start with certain assumptions they believe to be true about the universe, and then see what laws emerge out of those assumptions. In a new paper titled “Strings from Almost Nothing,” accepted for publication in Physical Review Letters, Caltech researchers, and their colleagues at New York University and Institut de Fisica d’Altes Energies in Barcelona, have done just that. From a couple of basic assumptions about how particles should scatter off one another at very high energies, they derived the elements of string theory.

Universal Bridge Theorem

The claim: our universe was generated by an AI-style algorithm.

What if spacetime itself is the result of a gigantic self-learning quantum neural network? 🤯🌌

A new framework called the Universal Bridge Theorem (UBT) proposes a deep equivalence between:

🧠 Neural network training
and
🌌 the evolution of spacetime geometry.

The proposal combines:

Signal-folding design helps neuromorphic chip slash AI energy use

Artificial intelligence systems, such as large language models (LLMs) and convolutional neural networks (CNNs), can analyze large amounts of data and rapidly generate desired content or identify meaningful patterns. However, when running on existing hardware, such as smartphones, laptops and tablets, these systems typically consume a large amount of energy.

Over the past decade or so, electronics engineers have been increasingly working on alternative hardware systems that could run AI models more energy efficiently. Many of these systems are neuromorphic, meaning that they are inspired by the structure and functioning of the human brain.

Researchers at Huazhong University of Science and Technology and the Chinese University of Hong Kong recently introduced a new approach for designing neuromorphic computing hardware based on two-dimensional materials. Their proposed strategy, introduced in a paper published in Nature Electronics, was used to develop a chip based on the 2D semiconductor molybdenum disulfide (MoS2) that can reliably run AI algorithms while consuming less power.

3D atomic rearrangement creates 40,000 quantum defects in 40 minutes

It’s been 37 years since scientists first demonstrated the ability to move single atoms, suggesting the possibility of designing materials atom by atom to customize their properties. Today there are several techniques that allow researchers to move individual atoms in order to give materials exotic quantum properties and improve our understanding of quantum behavior.

But existing techniques can only move atoms across the surface of materials in two dimensions. Most also require painstakingly slow processes and high-vacuum, ultracold lab conditions.

Now a team of researchers at MIT, the Department of Energy’s Oak Ridge National Laboratory, and other institutions has created a way to precisely move tens of thousands of individual atoms within a material in minutes at room temperature. The approach uses a set of algorithms to carefully position an electron beam at specific locations of a material, then scan the beam to drive atomic motions.

Open-source ‘digital twin’ enables end-to-end testing of applications over wireless

Researchers at the University of California San Diego have developed an open-source “digital twin” of a wireless network, giving graduate students, startups and other innovators a free, easy-to-use way to test new technologies and get fast, realistic feedback. The platform could help accelerate the pace of wireless innovation.

“We are building a software replica of everything that happens when you use your phone, from the wireless signals traveling through the environment to the cellular network and apps that deliver data and services like video and Instagram,” said Dinesh Bharadia, associate professor in the Department of Electrical and Computer Engineering at the UC San Diego Jacobs School of Engineering, an affiliate of the UC San Diego Qualcomm Institute and senior author of the paper.

“This will help industry and academia build new protocols and algorithms faster using software and AI, with less need for real-world experiments.”

New Linux ‘Dirty Frag’ zero-day gives root on all major distros

A new Linux zero-day exploit, named Dirty Frag, allows local attackers to gain root privileges on most major Linux distributions with a single command.

Security researcher Hyunwoo Kim, who disclosed it earlier today and published a proof-of-concept (PoC) exploit, says this local privilege escalation was introduced roughly nine years ago in the Linux kernel’s algif_aead cryptographic algorithm interface.

Dirty Frag works by chaining two separate kernel flaws, the xfrm-ESP Page-Cache Write vulnerability and the RxRPC Page-Cache Write vulnerability, to modify protected system files in memory without authorization and achieve privilege escalation.

Cellular and subcellular specialization enables biology-constrained deep learning

Galloni et al. introduce “dendritic target propagation”: a Dale’s law-compliant learning algorithm for cortical microcircuits with soma- and dendrite-targeting inhibition and realistic connectivity constraints. By combining experimentally derived BTSP and Hebbian rules, dendrites compute local error proxies via E/I mismatch, supporting gradient-based deep learning during simultaneous bottom-up and top-down signaling.
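As a loose illustration of the core idea that an excitation/inhibition mismatch can act as a local error signal, consider the toy sketch below. This is not Galloni et al.'s model: the single-layer setup, learning rate, and non-negativity clipping are all simplifications chosen only to show a sign-constrained (Dale's law-style) local update driven by an E/I mismatch.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 8, 4

W = np.abs(rng.normal(size=(n_out, n_in)))  # excitatory weights, sign-fixed
x = rng.random(n_in)                        # bottom-up input
target = rng.random(n_out)                  # top-down teaching signal

eta = 0.1
for _ in range(200):
    soma = W @ x                       # somatic (excitatory) drive
    inhibition = soma                  # inhibition tracking local excitation
    dendritic_error = target - inhibition      # E/I mismatch as error proxy
    W += eta * np.outer(dendritic_error, x)    # Hebbian-style local update
    W = np.maximum(W, 0.0)             # weights stay non-negative (Dale's law)
```

The mismatch shrinks over iterations even though each update uses only locally available quantities, which is the property the paper develops into a full deep-learning-capable microcircuit model.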
