
RIKEN-Led Project Seeks to Combine the Powers of Quantum Computers and Supercomputers

While supercomputers excel at general-purpose tasks and large-scale simulations, quantum computers specialize in problems involving exponential combinations (e.g., materials science, drug discovery, AI optimization). However, quantum systems currently require conventional computers to operate—a dependency that will intensify as they scale from today’s 100+ qubits to thousands or millions. The project envisions supercomputers acting as the “pianists” that play the quantum “piano.”

Twelve user groups are currently testing both systems. The project’s primary objective is to provide concrete answers to “What can quantum computers do *now*?” rather than speculating about future capabilities, while demonstrating practical advantages of tightly integrated hybrid computing for real-world scientific and industrial applications.


A RIKEN-led project is developing system software to tightly integrate quantum computers with supercomputers.
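To make the "pianist and piano" division of labor concrete, here is a minimal sketch of the hybrid pattern in Python: a classical optimizer repeatedly calls a quantum subroutine, mocked below by a one-qubit statevector simulation. The function names and the variational example are illustrative assumptions, not the RIKEN project's actual software interface.

```python
# Hybrid quantum-classical loop: the classical side proposes parameters,
# the "quantum" side (mocked here) evaluates them, and the classical side
# updates. Real deployments would swap quantum_expectation() for hardware calls.
import numpy as np

def quantum_expectation(theta: float) -> float:
    """Mock quantum call: prepare RY(theta)|0> and return <Z>."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return float(state @ z @ state)

def classical_loop(steps: int = 50, lr: float = 0.2) -> float:
    """Classical optimizer driving the quantum device, variational-style."""
    theta = 0.1
    for _ in range(steps):
        # Parameter-shift rule: an exact gradient from two quantum calls.
        grad = 0.5 * (quantum_expectation(theta + np.pi / 2)
                      - quantum_expectation(theta - np.pi / 2))
        theta -= lr * grad  # classical update between quantum calls
    return theta

theta = classical_loop()
print(f"theta ~ {theta:.3f}, <Z> ~ {quantum_expectation(theta):.3f}")  # -> ~pi, ~-1
```

Every pass through such a loop is one round trip between the two machines, which is why the tight, low-latency integration the project is building matters.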

Quantum mechanics works, but it doesn’t describe reality



Physicists like Sean Carroll argue not only that quantum mechanics is a valuable way of interpreting the world, but that it actually describes reality, and that the theory's central mathematical object, the wave function, corresponds to a real object in the world. But philosophers Raoni Arroyo and Jonas R. Becker Arenhart warn that the arguments for wavefunction realism are deeply confused. At best, they show only that the wave function is a useful element inside the theoretical framework of quantum mechanics. But this goes no way toward showing that the framework should be interpreted as true or that its elements are real. The wavefunction realists are confusing two different levels of debate and lack any justification for their realism. The real question is: does a theory need to be true to be useful?

1. Wavefunction realism

Quantum mechanics is probably our most successful scientific theory. So, if one wants to know what the world is made of, or how the world looks at the fundamental level, one is well advised to search for the answers in this theory. What does it say about these problems? Well, that is a difficult question, with no single answer. Many interpretative options arise, and one quickly ends up in a dispute about the pros and cons of the different views. Wavefunction realists attempt to overcome those difficulties by looking directly at the formalism of the theory: the theory is a description of the behavior of a mathematical entity, the wavefunction, so why not think that quantum mechanics is, fundamentally, about wavefunctions? The view that emerges is what Alyssa Ney calls wavefunction realism.
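For reference, the formalism under discussion: the wavefunction ψ is the mathematical object whose behavior the theory prescribes through the Schrödinger equation. This is a standard textbook statement, included here for context rather than taken from the article:

```latex
i\hbar\,\frac{\partial}{\partial t}\,\psi(x,t) \;=\; \hat{H}\,\psi(x,t),
\qquad
\hat{H} \;=\; -\frac{\hbar^{2}}{2m}\,\frac{\partial^{2}}{\partial x^{2}} \;+\; V(x)
```

The realist dispute is over whether this ψ is merely a calculational device or a physically real field.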

The AI & Quantum Revolution Reshaping Innovation

The AI & quantum revolution: redefining research & development, manufacturing & technological exploration

By Chuck Brooks, president of Brooks Consulting International

We are at a crucial juncture in technological history. Throughout decades of writing, lecturing, teaching and consulting on emerging technologies, I have observed cycles of invention transform companies, governments and society. The current frontier—a synthesis of artificial intelligence and quantum technologies—is propelling that shift more rapidly and deeply than ever before. These technologies are transforming research methodologies and changing the architecture of production and discovery, presenting remarkable potential alongside significant constraints.

Research & development reconceived: accelerated, intelligent & solution-oriented

Direct 3D printing of nanolasers can boost optical computing and quantum security

Nanolasers, which process information using light, are attracting significant attention as core components for next-generation semiconductors in future high-tech industries such as high-speed optical computing for large-scale AI, quantum cryptographic communication, and ultra-high-resolution augmented reality (AR) displays.

A research team has proposed a new manufacturing technology for placing nanolasers at high density on semiconductor chips, where they process information within spaces thinner than a human hair.

A joint research team led by Professor Ji Tae Kim from the Department of Mechanical Engineering and Professor Junsuk Rho from POSTECH has developed an ultra-fine 3D printing technology capable of creating “vertical nanolasers,” a key component for ultra-high-density optical integrated circuits.

Advanced quantum detectors are reinventing the search for dark matter

When it comes to understanding the universe, what we know is only a sliver of the whole picture.

Dark matter and dark energy make up about 95% of the universe, leaving only 5% “ordinary matter,” or what we can see. Dr. Rupak Mahapatra, an experimental particle physicist at Texas A&M University, designs highly advanced semiconductor detectors with cryogenic quantum sensors, powering experiments worldwide and pushing the boundaries to explore this most profound mystery.

Mahapatra likens our understanding of the universe—or lack thereof—to an old parable: “It’s like trying to describe an elephant by only touching its tail. We sense something massive and complex, but we’re only grasping a tiny part of it.”

Solving quantum computing’s longstanding ‘no cloning’ problem with an encryption workaround

A team of researchers at the University of Waterloo has made a breakthrough in quantum computing that elegantly bypasses the fundamental “no cloning” problem. The research, “Encrypted Qubits can be Cloned,” appears in Physical Review Letters.
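For context, the “no cloning” obstruction itself follows from the linearity of quantum mechanics in a few lines. The standard textbook argument, not taken from the Waterloo paper, runs as follows:

```latex
% Assume a universal cloner: U|\psi\rangle|0\rangle = |\psi\rangle|\psi\rangle for every |\psi\rangle.
% For |\psi\rangle = \alpha|0\rangle + \beta|1\rangle, linearity forces
U\big(\alpha|0\rangle + \beta|1\rangle\big)|0\rangle = \alpha|00\rangle + \beta|11\rangle,
% while cloning demands
|\psi\rangle|\psi\rangle = \alpha^{2}|00\rangle + \alpha\beta|01\rangle + \alpha\beta|10\rangle + \beta^{2}|11\rangle.
% These agree only when \alpha\beta = 0, so no unitary can copy unknown superpositions.
```

The Waterloo result bypasses rather than violates this theorem: as its title says, it is encrypted qubits that can be cloned.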

Quantum computing is an exciting technological frontier, where information is stored and processed in tiny units—called qubits. Qubits can be stored, for example, in individual electrons, photons (particles of light), atoms, ions or tiny currents.

Universities, industry, and governments around the world are spending billions of dollars to perfect the technology for controlling these qubits so that they can be combined into large, reliable quantum computers. This technology will have powerful applications, including in cybersecurity, materials science, medical research and optimization.

New framework unifies space and time in quantum systems

Quantum mechanics and relativity are the two pillars of modern physics. However, for over a century, their treatment of space and time has remained fundamentally disconnected. Relativity unifies space and time into a single seamless fabric called spacetime. In contrast, traditional quantum theory employs different languages: quantum states (density matrices) for systems in space and quantum channels for evolution in time.

A recent breakthrough by Assistant Professor Seok Hyung Lie from the Department of Physics at UNIST offers a way to describe quantum correlations across both space and time within a single, unified framework. Lie is the study’s first author, with Professor James Fullwood from Hainan University serving as the corresponding author. Their collaboration creates new tools that could significantly impact future studies in quantum science and beyond. The study has been published in Physical Review Letters.

In this study, the team developed a new theoretical approach that treats an entire timeline as one quantum state, introducing what they call multipartite quantum states over time. In essence, the approach allows quantum processes at different points in time to be described as parts of a single, larger quantum state. This means that both spatially separated systems and systems separated in time can be analyzed using the same mathematical language.
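A flavor of what such a construction can look like, sketched in Python: earlier two-time “state over time” proposals (for instance, the symmetrized Jordan product of the initial state with the channel’s Jamiolkowski operator, studied in prior work including Fullwood’s) package a state at one time and its evolution to a later time into a single matrix. This is an assumed illustration of the two-time idea only; the new paper’s multipartite construction differs in its details.

```python
# Two-time "state over time" sketch: one matrix carries both the state at t1
# and its evolution to t2. Illustrative only; not the paper's exact formalism.
import numpy as np

def jamiolkowski(channel, d=2):
    """J(E) = sum_{i,j} |i><j| (x) E(|j><i|): the channel as a matrix."""
    J = np.zeros((d * d, d * d), dtype=complex)
    for i in range(d):
        for j in range(d):
            ket_bra = np.zeros((d, d), dtype=complex)
            ket_bra[i, j] = 1.0                          # |i><j|
            J += np.kron(ket_bra, channel(ket_bra.conj().T))
    return J

def state_over_time(rho, channel, d=2):
    """Symmetrized Jordan product (1/2){rho (x) I, J(E)}."""
    A = np.kron(rho, np.eye(d))
    J = jamiolkowski(channel, d)
    return 0.5 * (A @ J + J @ A)

# A qubit in |+><+| that simply persists (identity channel) between two times.
plus = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)
R = state_over_time(plus, lambda X: X)
print(np.round(np.linalg.eigvalsh(R), 3))
# A negative eigenvalue is the telltale mark of correlation across time:
# an ordinary two-party spatial state is always positive semidefinite.
```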

Electrons that lag behind nuclei in 2D materials could pave way for novel electronics

One of the great successes of 20th-century physics was the quantum mechanical description of solids. This allowed scientists to understand for the first time how and why certain materials conduct electric current and how these properties could be purposefully modified. For instance, semiconductors such as silicon could be used to produce transistors, which revolutionized electronics and made modern computers possible.

To be able to mathematically capture the complex interplay between electrons and atomic nuclei and their motions in a solid, physicists had to make some simplifications. They assumed, for example, that the light electrons in an atom follow the motion of the much heavier atomic nuclei in a crystal lattice without any delay. For several decades, this Born-Oppenheimer approximation worked well.
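In textbook form, the Born-Oppenheimer ansatz factorizes the full wavefunction so that the electronic part depends only parametrically on the slow nuclear coordinates; the “lagging” electrons reported here are physics beyond this factorization:

```latex
\Psi(\mathbf{r}, \mathbf{R}) \;\approx\; \psi_{\mathrm{el}}(\mathbf{r};\, \mathbf{R})\;\chi_{\mathrm{nuc}}(\mathbf{R})
```

Here r denotes the electronic coordinates, and the semicolon marks that the nuclear positions R enter the electronic wavefunction only as fixed parameters, as if the electrons adjusted instantaneously.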

Error-correction technology to turn quantum computing into real-world power

The image of ripples spreading across a calm lake after raindrops fall, with ripples from different drops overlapping and traveling outward, is one way to picture how a quantum computer handles information.

Unlike conventional computers, which process digital data as “0 or 1,” quantum computers can process information in an in-between state where it is “both 0 and 1.” These quantum states behave like waves: they can overlap, reinforcing one another or canceling one another out. In computations that exploit this property, states that lead to the correct answer are amplified, while states that lead to wrong answers are suppressed.

Thanks to this interference between waves, a quantum computer can sift through many candidate answers at once. Our everyday computers take time because they evaluate each candidate one by one. Quantum computers, by contrast, can narrow down the answer in a single sweep, earning them the reputation of “dream machines” that could solve in an instant problems that might take hundreds of years on today’s computers.
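That amplify-the-right-answer mechanism can be run as a toy simulation. The sketch below is a generic Grover-style amplitude amplification in NumPy, a standard illustration rather than code from the work described here:

```python
# Interference in action: the oracle flips the sign of the correct answer's
# amplitude, and the diffusion step (reflection about the mean) then grows
# it while the wrong answers partially cancel.
import numpy as np

n_items, marked = 16, 7                        # search space; index of the answer
amps = np.full(n_items, 1 / np.sqrt(n_items))  # uniform superposition

steps = int(np.pi / 4 * np.sqrt(n_items))      # ~optimal iteration count (3 here)
for _ in range(steps):
    amps[marked] *= -1                         # oracle: phase-flip the answer
    amps = 2 * amps.mean() - amps              # diffusion: reflect about the mean

print(f"P(correct) = {amps[marked]**2:.3f}")   # ~0.96, vs 1/16 for a blind guess
```

For search problems of this type, a classical scan needs on the order of N checks while the interference loop needs only about the square root of N, which is the “single sweep” advantage in quantitative form.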

Making sense of quantum gravity in five dimensions

Quantum theory and Einstein’s theory of general relativity are two of the greatest successes in modern physics. Each works extremely well in its own domain: Quantum theory explains how atoms and particles behave, while general relativity describes gravity and the structure of spacetime. However, despite many decades of effort, scientists still do not have a satisfying theory that combines both into one clear picture of reality.

Most common approaches assume that gravity must also be described using quantum ideas. As physicist Richard Feynman once said, “We’re in trouble if we believe in quantum mechanics but don’t quantize gravity.” Yet quantum theory itself has deep unresolved problems. It does not clearly explain how measurements lead to definite outcomes, and it relies on strange ideas that clash with everyday experience, such as objects seemingly behaving like both waves and particles, and apparent nonlocal connections between distant systems.

These puzzles become even sharper because of Bell’s theorem. This theorem shows that no theory based on ordinary ideas—such as locality, an objective reality, and freely chosen measurements—can fully match the predictions of quantum theory within our usual four-dimensional view of space and time. These quantum predictions have been repeatedly confirmed in tests of entanglement, first discussed by Einstein, Podolsky, and Rosen (EPR). As a result, simple classical explanations limited to ordinary four-dimensional spacetime cannot fully account for what we observe.
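Quantitatively, Bell’s theorem is usually cast as the CHSH inequality, where E(a,b) is the correlation between measurement settings a and b on the two sides of an EPR pair. The bounds below are standard results, included for context:

```latex
S \;=\; E(a,b) + E(a,b') + E(a',b) - E(a',b'),
\qquad
|S| \,\le\, 2 \ \ \text{(local realism)},
\qquad
|S| \,\le\, 2\sqrt{2} \ \ \text{(quantum mechanics)}
```

Experiments reporting S above 2 are the repeatedly confirmed entanglement tests mentioned above, and they are what rule out simple classical accounts confined to ordinary four-dimensional spacetime.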
