Attosecond time-resolved experiments have revealed the increasing importance of electronic correlations in the collective plasmon response as the size of the system decreases to sub-nm scales.

The study, published in the journal Science Advances, was led by the University of Hamburg and DESY as part of a collaboration with Stanford, SLAC National Accelerator Laboratory, Ludwig-Maximilians-Universität München, Northwest Missouri State University, Politecnico di Milano and the Max Planck Institute for the Structure and Dynamics of Matter.

Plasmons are collective electronic excitations that give rise to unique effects in matter. They provide a means of achieving extreme light confinement, enabling groundbreaking applications such as efficient solar energy harvesting, ultrafine sensor technology, and enhanced photocatalysis.

To test this new system, the team executed Grover’s search algorithm, first described by Indian-American computer scientist Lov Grover in 1996. The algorithm searches a large, unstructured dataset for a particular item, using superposition and entanglement to probe many entries at once. It offers a quadratic speedup: a quantum computer needs a number of steps proportional only to the square root of the dataset size, where a classical search scales linearly. The authors report that the system achieved a 71 percent success rate.
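A rough sense of how the algorithm behaves can be had from a plain statevector simulation. The Python sketch below runs Grover’s search over a toy eight-item space; it illustrates the quadratic iteration count in general and is not a model of the distributed, teleportation-linked hardware reported in the study.

```python
import numpy as np

# Toy statevector simulation of Grover's search over N = 2**n items.
# Illustrates the quadratic speedup only; it does not model the
# distributed, teleportation-linked hardware used in the study.
n = 3                       # 3 qubits -> N = 8 database entries
N = 2 ** n
marked = 5                  # index of the item being searched for

state = np.full(N, 1 / np.sqrt(N))        # uniform superposition

oracle = np.eye(N)
oracle[marked, marked] = -1               # flip the marked item's sign

diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)   # reflect about the mean

# Optimal iteration count scales as (pi/4) * sqrt(N): the quadratic speedup.
iterations = round(np.pi / 4 * np.sqrt(N))
for _ in range(iterations):
    state = diffusion @ (oracle @ state)

print(f"{iterations} iterations, P(marked) = {state[marked] ** 2:.3f}")
```

With N = 8, two iterations already concentrate roughly 95 percent of the measurement probability on the marked entry.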

While operating a successful distributed system is a big step forward for quantum computing, the team reiterates that the engineering challenges remain daunting. However, networking together quantum processors into a distributed network using quantum teleportation provides a small glimmer of light at the end of a long, dark quantum computing development tunnel.

“Scaling up quantum computers remains a formidable technical challenge that will likely require new physics insights as well as intensive engineering effort over the coming years,” David Lucas, principal investigator of the study from Oxford University, said in a press statement. “Our experiment demonstrates that network-distributed quantum information processing is feasible with current technology.”

From punch card-operated looms in the 1800s to modern cellphones, if an object has an “on” and an “off” state, it can be used to store information.

In a laptop computer, the binary ones and zeroes are transistors running at either low or high voltage. On a compact disc, a one is a spot where a tiny indented “pit” changes to a flat “land” or vice versa, while a zero means no change.
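That transition rule is easy to state in code. The sketch below is a simplified illustration of pit-to-land encoding with hypothetical helper names; real discs layer run-length coding (EFM) on top of it, which is omitted here.

```python
# Simplified sketch of transition-based storage, as on a CD: a one is a
# change between pit and land, a zero is no change. Real discs add
# run-length coding (EFM) on top; that is omitted here.

def encode(bits, start_level=0):
    """Map bits to a pit/land track: flip the level on 1, hold on 0."""
    track, level = [], start_level
    for bit in bits:
        if bit == 1:
            level ^= 1            # transition: pit <-> land
        track.append(level)
    return track

def decode(track, start_level=0):
    """Recover bits: 1 wherever the level changed, 0 where it held."""
    bits, prev = [], start_level
    for level in track:
        bits.append(1 if level != prev else 0)
        prev = level
    return bits

data = [1, 0, 1, 1, 0, 0, 1]
assert decode(encode(data)) == data    # the encoding round-trips losslessly
```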

Historically, the size of the object making the “ones” and “zeroes” has put a limit on the size of the storage device. But now, University of Chicago Pritzker School of Molecular Engineering (UChicago PME) researchers have explored a technique for classical computer memory that makes ones and zeroes out of crystal defects, each the size of an individual atom.

Quantum sensors can be significantly more precise than conventional sensors and are used for Earth observation, navigation, material testing, and chemical or biomedical analysis, for example. TU Darmstadt researchers have now developed and tested a technique that makes quantum sensors even more precise.

What is behind this technology? Quantum sensors based on the wave nature of atoms use quantum interference to measure accelerations and rotations with extremely high precision. The technique requires optimized beam splitters and mirrors for atoms; however, atoms that are reflected in unintended ways can significantly impair the measurements.

The scientists therefore use specially designed velocity-selective atom mirrors, which reflect the desired atoms and let the parasitic ones pass through. This approach reduces the noise in the signal, making the measurements much more precise. The research is published in the journal Physical Review Research.
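The logic of the filter can be illustrated with a toy model. The Python sketch below is not the team’s optical implementation, and the velocity window and units are invented for illustration; it shows how discarding atoms with stray velocities restores the contrast of an interference signal.

```python
import numpy as np

# Toy model of velocity-selective filtering (not the team's optical
# implementation; the numbers are invented for illustration).
rng = np.random.default_rng(0)

v = rng.normal(0.0, 1.0, 100_000)    # atom velocities, arbitrary units

# Each atom contributes a fringe whose phase scales with its velocity,
# so a spread of velocities washes out the ensemble's fringe contrast.
k = 2.0                              # phase accumulated per unit velocity
def contrast(velocities):
    return np.abs(np.mean(np.exp(1j * k * velocities)))

# Velocity-selective mirror: keep atoms in a narrow window, drop the rest.
selected = v[np.abs(v) < 0.2]

print(f"contrast, all atoms:      {contrast(v):.3f}")         # ~0.14
print(f"contrast, filtered atoms: {contrast(selected):.3f}")  # ~0.97
```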

Researchers are breaking new ground with halide perovskites by exploring these materials at the nanoscale, promising a revolution in energy-efficient technologies.

The term “nanoscale” refers to dimensions that are measured in nanometers (nm), with one nanometer equaling one-billionth of a meter. This scale encompasses sizes from approximately 1 to 100 nanometers, where unique physical, chemical, and biological properties emerge that are not present in bulk materials. At the nanoscale, materials exhibit phenomena such as quantum effects and increased surface area to volume ratios, which can significantly alter their optical, electrical, and magnetic behaviors. These characteristics make nanoscale materials highly valuable for a wide range of applications, including electronics, medicine, and materials science.
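A quick worked example makes the surface-to-volume point concrete. For a sphere, the ratio of surface area to volume is 3/r, so it grows a hundred-thousand-fold as the radius shrinks from a millimeter to ten nanometers; the short Python snippet below simply evaluates that formula.

```python
# For a sphere, surface area / volume = (4*pi*r**2) / ((4/3)*pi*r**3) = 3/r,
# so the ratio grows as the particle shrinks -- one reason nanoscale
# materials behave so differently from their bulk counterparts.
for radius_nm in (1_000_000, 1_000, 10):   # 1 mm, 1 um, 10 nm
    print(f"radius {radius_nm:>9,} nm -> SA:V = {3 / radius_nm:.1e} per nm")
```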

As quantum computers threaten traditional encryption, researchers are developing quantum networks to enable ultra-secure communication.

Scientists at Leibniz University Hannover have pioneered a new method using light frequencies to enhance quantum key distribution. This breakthrough reduces complexity, cuts costs, and paves the way for scalable, tap-proof quantum internet infrastructure.

Physicists have performed a groundbreaking simulation they say sheds new light on an elusive phenomenon that could determine the ultimate fate of the Universe.

Pioneering research in quantum field theory around 50 years ago proposed that the Universe may be trapped in a false vacuum – meaning it appears stable but could in fact be on the verge of transitioning to an even more stable, true vacuum state. While this process could trigger a catastrophic change in the Universe’s structure, experts agree that predicting a timeline is challenging; any transition is likely to occur over an astronomically long period, potentially spanning millions of years.

In an international collaboration between three research institutions, the team report gaining valuable insights into false vacuum decay – a process linked to the origins of the cosmos and the behaviour of particles at the smallest scales. The collaboration was led by Professor Zlatko Papic, from the University of Leeds, and Dr Jaka Vodeb, from the Jülich Supercomputing Centre (JSC) at Forschungszentrum Jülich, Germany.


Imagine a world where thoughts aren’t confined to the brain, but instantly shared across a vast network of neurons, transcending the limits of space and time. This isn’t science fiction, but a possibility hinted at by one of the most puzzling aspects of quantum physics: entanglement.

Quantum entanglement, famously dubbed “spooky action at a distance” by Einstein, describes a phenomenon where two or more particles become intrinsically linked. They share a quantum state, no matter how far apart they are. Change one entangled particle, and its partner instantly reacts, even across vast distances.
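The correlation itself is simple to reproduce in simulation. The NumPy sketch below samples measurements of a Bell pair, the textbook two-qubit entangled state: the outcomes of the two qubits always agree, although, despite the word “instantly,” no usable signal passes between them.

```python
import numpy as np

# Measure both halves of a Bell pair, (|00> + |11>) / sqrt(2): the two
# outcomes always agree, although no usable signal passes between them.
rng = np.random.default_rng(1)
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # amplitudes of |00>,|01>,|10>,|11>

probs = np.abs(bell) ** 2                    # Born rule: P = |amplitude|^2
for _ in range(5):
    outcome = rng.choice(4, p=probs)         # sample a joint result
    a, b = outcome >> 1, outcome & 1         # bit of qubit A, bit of qubit B
    print(f"A: {a}  B: {b}")                 # always equal
```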

This property, which troubled Einstein, has been repeatedly confirmed in experiments, notably those of John Clauser, Alain Aspect, and Anton Zeilinger, who shared the 2022 Nobel Prize in Physics for their groundbreaking work on quantum entanglement.

To address this challenge, the researchers propose two alternative QV tests that sidestep classical simulation entirely. Their primary modification involves using parity-preserving quantum gates — gates that maintain the parity (even or odd sum) of qubits throughout the computation. This allows the heavy output subspace to be known in advance, eliminating the need for classical verification.
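The property the test relies on is easy to verify numerically. The NumPy sketch below builds a generic parity-preserving two-qubit gate, block-diagonal in the even- and odd-parity subspaces, and checks that it commutes with the parity operator Z⊗Z; this is a check of the defining property, not a reproduction of the paper’s circuits.

```python
import numpy as np

def random_unitary(dim, rng):
    """A random unitary from the QR decomposition of a complex Gaussian."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, _ = np.linalg.qr(z)
    return q

# A generic parity-preserving two-qubit gate is block-diagonal in the
# even-parity subspace {|00>, |11>} and the odd-parity one {|01>, |10>}.
# (This checks the defining property, not the paper's specific circuits.)
rng = np.random.default_rng(2)
G = np.zeros((4, 4), dtype=complex)
even, odd = [0, 3], [1, 2]                  # basis order |00>,|01>,|10>,|11>
G[np.ix_(even, even)] = random_unitary(2, rng)
G[np.ix_(odd, odd)] = random_unitary(2, rng)

# Parity operator Z (x) Z: +1 on even-parity basis states, -1 on odd ones.
ZZ = np.diag([1, -1, -1, 1])

# Commuting with Z (x) Z means the gate never changes a state's parity,
# which is why the heavy output subspace can be known in advance.
assert np.allclose(G @ ZZ, ZZ @ G)
print("G preserves parity: it commutes with Z(x)Z")
```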

The first approach, the parity-preserving benchmark, modifies the structure of the quantum circuits while keeping the number of two-qubit interactions the same. The researchers argue that this change has minimal impact on experimental implementation but significantly reduces computational costs.

“Since the interaction part is unaffected, the number of fundamental two-qubit gates, 3 in case of CNOTs, remains unchanged,” they write in the paper.