
How most of the universe’s visible mass is generated: Experiments explore emergence of hadron mass

Deep in the heart of the matter, some numbers don’t add up. For example, while protons and neutrons are made of quarks, nature’s fundamental building blocks bound together by gluons, their masses are much larger than the combined masses of the individual quarks from which they are formed.

This leads to a central puzzle: why? In the theory of the strong interaction, known as quantum chromodynamics or QCD, quarks acquire their bare mass through the Higgs mechanism. The long-hypothesized process was confirmed by experiments at the CERN Large Hadron Collider in Switzerland and led to the 2013 Nobel Prize in Physics for Peter Higgs and François Englert.

Yet the inescapable issue remains that “this mechanism contributes to the measured proton and neutron masses at the level of less than 2%,” said Victor Mokeev, a staff scientist and phenomenologist at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility.
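
To see the size of that gap, here is a back-of-envelope comparison in Python. The quark mass values are approximate Particle Data Group figures, not numbers from the article:

```python
# Rough illustration: compare the Higgs-generated "bare" quark masses
# with the measured proton mass. Approximate current-quark masses in MeV/c^2.
m_up, m_down = 2.2, 4.7       # approximate bare masses of up and down quarks
m_proton = 938.3              # measured proton mass

bare_sum = 2 * m_up + m_down  # a proton is two up quarks and one down quark (uud)
fraction = bare_sum / m_proton

print(f"Sum of bare quark masses: {bare_sum:.1f} MeV/c^2")
print(f"Fraction of proton mass:  {fraction:.1%}")  # roughly 1%, consistent with the <2% figure
```

The remaining ~99% of the proton's mass emerges from the strong-interaction dynamics of quarks and gluons rather than from the Higgs mechanism.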

Efficient quantum process tomography for enabling scalable optical quantum computing

Optical quantum computers are gaining attention as a next-generation computing technology with high speed and scalability. However, accurately characterizing complex optical processes, where multiple optical modes interact to generate quantum entanglement, has been considered an extremely challenging task.

A KAIST research team has overcome this limitation, developing a highly efficient technique that enables complete experimental characterization of complex multimode quantum processes. This technology, which can analyze large-scale operations with less data, represents an important step toward scalable optical quantum computing and quantum communication technologies.

A research team led by Professor Young-Sik Ra from the Department of Physics has developed a Multimode Quantum Process Tomography technique capable of efficiently identifying the characteristics of second-order nonlinear optical quantum processes that are essential for optical quantum computing.
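
As a rough illustration of what process tomography does, here is a minimal single-qubit sketch in Python. The KAIST technique targets multimode nonlinear optical processes, which this toy example does not model; it only shows the general idea of reconstructing a matrix that fully characterizes an unknown channel:

```python
import numpy as np

# Single-qubit process tomography sketch: reconstruct the Pauli transfer matrix
#   R[i, j] = Tr(P_i * E(P_j)) / 2
# of an "unknown" channel E. In a real experiment one probes the channel with
# prepared input states and measurements; in simulation, linearity lets us
# apply the channel to the Pauli basis directly.

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [I, X, Y, Z]

def channel(rho, gamma=0.3):
    """The 'unknown' process under test: amplitude damping with decay gamma."""
    K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]], dtype=complex)
    K1 = np.array([[0, np.sqrt(gamma)], [0, 0]], dtype=complex)
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

# Each column of R records the channel's action on one Pauli operator.
R = np.zeros((4, 4))
for j, Pj in enumerate(paulis):
    out = channel(Pj)
    for i, Pi in enumerate(paulis):
        R[i, j] = np.real(np.trace(Pi @ out)) / 2

print(np.round(R, 3))  # R determines the channel's action on any input state
```

The difficulty the KAIST team addresses is that the amount of data such reconstructions need normally explodes as the number of interacting modes grows; their method keeps the required data manageable.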

Q‑CTRL integrates Fire Opal with RIKEN’s IBM Quantum System Two to unlock maximum performance for hybrid quantum-classical computing

Performance management software is now available through RIKEN’s HPC environment, accelerating quantum-HPC hybrid application research.

New Quantum Algorithm Could Explain Why Matter Exists at All

Researchers used IBM’s quantum computers to create scalable quantum circuits that simulate matter under extreme conditions, offering new insight into fundamental forces and the origins of the universe.
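
For a flavor of how such simulation circuits are built, here is a NumPy sketch of first-order Trotterized time evolution for a small transverse-field Ising chain, the textbook pattern behind quantum simulation of interacting matter. This is an illustrative stand-in, not the paper's algorithm or model:

```python
import numpy as np
from functools import reduce

def embed(op, site, n):
    """Place a single-site operator at `site` in an n-qubit chain."""
    return reduce(np.kron, [op if k == site else np.eye(2) for k in range(n)])

n, J, h = 4, 1.0, 0.8  # chain length and couplings (arbitrary demo values)
X = np.array([[0., 1.], [1., 0.]])
Z = np.diag([1., -1.])

# H = -J * sum Z_k Z_{k+1} - h * sum X_k  (transverse-field Ising model)
H_zz = -J * sum(embed(Z, k, n) @ embed(Z, k + 1, n) for k in range(n - 1))
H_x  = -h * sum(embed(X, k, n) for k in range(n))

def evolve(A, t):
    """exp(-i*A*t) for Hermitian A via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T

t, steps = 1.0, 50
dt = t / steps
U_exact = evolve(H_zz + H_x, t)
U_trot = np.linalg.matrix_power(evolve(H_zz, dt) @ evolve(H_x, dt), steps)

# First-order Trotter error shrinks as the number of steps grows; on hardware,
# each evolve() factor becomes a layer of one- and two-qubit gates.
print("||U_exact - U_trotter|| =", np.linalg.norm(U_exact - U_trot, 2))
```

On a quantum computer the classical matrix exponentials are replaced by gate layers, which is what lets the circuit depth scale with system size instead of the exponentially large state space.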

Scientists Make “Dark” Light States Shine, Unlocking New Quantum Tech

A breakthrough in manipulating dark excitons could pave the way for next-generation quantum communication systems and ultra-compact photonic devices. A research group from the City University of New York and the University of Texas at Austin has developed a method to make these normally "dark" light states shine.

World’s Leading Scientific Supercomputing Centers Adopt NVIDIA NVQLink to Integrate Grace Blackwell Platform With Quantum Processors

NVIDIA today announced that the world’s leading scientific computing centers are adopting NVIDIA® NVQLink™, a first-of-its-kind, universal interconnect for linking quantum processors with state-of-the-art accelerated computing.

Supercomputer simulates quantum chip in unprecedented detail

Researchers from across Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California, Berkeley have collaborated to perform an unprecedented simulation of a quantum microchip, a key step toward perfecting the chips required for this next-generation technology. The simulation used more than 7,000 NVIDIA GPUs on the Perlmutter supercomputer at the National Energy Research Scientific Computing Center (NERSC), a U.S. Department of Energy (DOE) user facility.

Modeling quantum chips allows researchers to understand their function and performance before they’re fabricated, ensuring that they work as intended and spotting any problems that might come up. Quantum Systems Accelerator (QSA) researchers Zhi Jackie Yao and Andy Nonaka of the Applied Mathematics and Computational Research (AMCR) Division at Berkeley Lab develop electromagnetic models to simulate these chips, a key step in the process of producing better quantum hardware.

“The model predicts how design decisions affect electromagnetic wave propagation in the chip,” said Nonaka, “to make sure proper signal coupling occurs and to avoid unwanted crosstalk.”

Unprecedented Perlmutter Simulation Details Quantum Chip

Designing quantum chips draws on traditional microwave engineering alongside advanced low-temperature physics. This makes a classical electromagnetic modeling tool like ARTEMIS, which was developed as part of the DOE’s Exascale Computing Project initiative, a natural choice for this type of modeling.
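
To make the idea concrete, here is a minimal one-dimensional finite-difference time-domain (FDTD) update in Python, the basic method behind classical electromagnetic solvers of this kind. ARTEMIS itself is a large-scale 3-D GPU code; this is only a sketch of the underlying scheme, with demo parameters chosen here:

```python
import numpy as np

nz, nt = 400, 1000   # grid cells and time steps (demo values)
ez = np.zeros(nz)    # electric field samples
hy = np.zeros(nz - 1)  # magnetic field, staggered half a cell (Yee grid)
c = 0.5              # Courant number (<= 1 for stability in 1-D)

for t in range(nt):
    hy += c * np.diff(ez)        # update H from the spatial difference of E
    ez[1:-1] += c * np.diff(hy)  # update E from the spatial difference of H
    ez[nz // 4] += np.exp(-((t - 30) / 10) ** 2)  # inject a Gaussian pulse

print("peak |Ez| after propagation:", np.abs(ez).max())
```

Extending this leapfrog update to three dimensions, complex multi-layer geometry, and material boundaries is what turns a short script like this into a supercomputer-scale problem.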

A large simulation for a tiny chip

Not every quantum chip simulation calls for so much computing capacity, but modeling the minuscule details of this tiny, extremely complex chip required nearly all of Perlmutter’s power. The researchers used almost all of its 7,168 NVIDIA GPUs over a period of 24 hours to capture the structure and function of a multi-layered chip measuring just 10 millimeters square and 0.3 millimeters thick, with etchings just one micron wide.
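
A quick back-of-envelope estimate (our arithmetic, not the article's) shows why resolving micron-scale features across a centimeter-scale chip is so demanding:

```python
# Resolving 1-micron etchings over a 10 mm x 10 mm x 0.3 mm chip implies an
# enormous structured grid, even before time stepping and multiple layers.
feature = 1e-6                      # finest feature size, meters
lx, ly, lz = 10e-3, 10e-3, 0.3e-3   # chip dimensions, meters

cells = (lx / feature) * (ly / feature) * (lz / feature)
print(f"grid cells at 1-micron resolution: {cells:.1e}")  # ~3e10 cells
print(f"cells per GPU on 7,168 GPUs:       {cells / 7168:.1e}")
```

Tens of billions of grid cells, updated every time step, is the scale at which nearly all of Perlmutter's GPUs become necessary.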
