
Lab-in-the-loop framework enables rapid evolution of complex multi-mutant proteins

The search space for protein engineering grows exponentially with complexity. A protein of just 100 amino acids has 20^100 possible variants—more combinations than atoms in the observable universe. Traditional engineering methods might test hundreds of variants but limit exploration to narrow regions of the sequence space. Recent machine learning approaches enable broader searches through computational screening. However, these approaches still require tens of thousands of measurements, or 5–10 iterative rounds.
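The scale of that claim is easy to check with integer arithmetic. The sketch below (not from the article) assumes the 20 standard amino acids, a 100-residue protein, and the commonly cited estimate of roughly 10^80 atoms in the observable universe:

```python
# Combinatorial explosion for a 100-residue protein:
# each position can be any of the 20 standard amino acids.
sequence_length = 100
amino_acids = 20

variants = amino_acids ** sequence_length      # 20^100
atoms_in_universe = 10 ** 80                   # common order-of-magnitude estimate

# 20^100 has 131 digits, i.e. it is on the order of 10^130.
print(f"~10^{len(str(variants)) - 1} possible variants")
print(variants > atoms_in_universe)            # True
```

Even screening millions of variants per round would sample a vanishing fraction of this space, which is why methods like MULTI-evolve focus measurements on a small, informative subset.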

With the advent of foundational protein models, the bottleneck for protein engineering swings back to the lab. For a single protein engineering campaign, researchers can only efficiently build and test hundreds of variants. What is the best way to choose those hundreds to most effectively uncover an evolved protein with substantially increased function? To address this problem, researchers have developed MULTI-evolve, a framework for efficient protein evolution that applies machine learning models trained on datasets of ~200 variants focused specifically on pairs of function-enhancing mutations.

Published in Science, this work represents Arc Institute’s first lab-in-the-loop framework for biological design, where computational prediction and experimental design are tightly integrated from the outset, reflecting a broader investment in AI-guided research.

Particles don’t always go with the flow (and why that matters)

It is commonly assumed that tiny particles just go with the flow as they make their way through soil, biological tissue, and other complex materials. But a team of Yale researchers led by Professor Amir Pahlavan shows that even gentle chemical gradients, such as a small change in salt concentration, can dramatically reshape how particles move through porous materials. Their results are published in Science Advances.

How colloids (small particles such as fine clays, microbes, or engineered particles) move through porous materials such as soil, filters, and biological tissue can have significant and wide-ranging effects on everything from environmental cleanup to agriculture.

It’s long been known that chemical gradients—that is, gradual changes in the concentration of salt or other chemicals—can drive colloids to migrate directionally, a phenomenon known as diffusiophoresis. But it was often assumed that this effect would matter only when there was little or no flow, because phoretic speeds are typically orders of magnitude smaller than average flow speeds in porous media. Experiments set up in Pahlavan’s lab demonstrated a very different outcome.

Microscopic mirrors for future quantum networks: A new way to make high-performance optical resonators

Researchers in the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and the Faculty of Arts and Sciences have devised a new way to make some of the smallest, smoothest mirrors ever created for controlling single particles of light, known as photons. These mirrors could play key roles in future quantum computers, quantum networks, integrated lasers, environmental sensing equipment, and more.

A team from the labs of Marko Lončar, the Tiantsai Lin Professor of Electrical Engineering at SEAS; Mikhail Lukin, the Joshua and Beth Friedman University Professor in the Department of Physics; and Kiyoul Yang, assistant professor of electrical engineering at SEAS, has described its new method for making high-performance, curved optical mirrors in a study published in Optica.

Using two such mirrors to trap light between them, the team demonstrated state-of-the-art optical resonators that can control light at near-infrared wavelengths, which is important for manipulating single atoms in quantum computing applications.

Machine learning algorithm fully reconstructs LHC particle collisions

The CMS Collaboration has shown, for the first time, that machine learning can be used to fully reconstruct particle collisions at the LHC. This new approach can reconstruct collisions more quickly and precisely than traditional methods, helping physicists better understand LHC data. The paper has been submitted to the European Physical Journal C and is currently available on the arXiv preprint server.

Each proton–proton collision at the LHC sprays out a complex pattern of particles that must be carefully reconstructed to allow physicists to study what really happened. For more than a decade, CMS has used a particle-flow (PF) algorithm, which combines information from the experiment’s different detectors, to identify each particle produced in a collision. Although this method works remarkably well, it relies on a long chain of hand-crafted rules designed by physicists.

The new CMS machine-learning-based particle-flow (MLPF) algorithm approaches the task fundamentally differently, replacing much of the rigid hand-crafted logic with a single model trained directly on simulated collisions. Instead of being told how to reconstruct particles, the algorithm learns how particles look in the detectors, like how humans learn to recognize faces without memorizing explicit rules.

Measuring chaos: Researchers quantify the quantum butterfly effect

For the first time, researchers in China have accurately quantified how chaos increases in a quantum many-body system as it evolves over time. Combining experiments and theory, a team led by Yu-Chen Li at the University of Science and Technology of China showed that the level of chaos grows exponentially when time reversal is applied to these systems—matching predictions of their extreme sensitivity to errors. The research has been published in Physical Review Letters.

The butterfly effect is a well-known expression of chaos theory. It describes how a complex system can quickly become unpredictable as it evolves: make just a few small errors when specifying the system’s starting conditions, and it may look completely different from your calculations a short time later.

This effect is especially relevant in many-body quantum systems, where entanglement creates intricate webs of interconnection between particles—even in relatively small systems. As the system evolves, information about its initial state becomes increasingly dispersed across these connections.

Record-breaking photons at telecom wavelengths

A team of researchers from the University of Stuttgart and the Julius-Maximilians-Universität Würzburg led by Prof. Stefanie Barz (University of Stuttgart) has demonstrated a source of single photons that combines on-demand operation with record-high photon quality in the telecommunications C-band—a key step toward scalable photonic quantum computation and quantum communication. “The lack of a high-quality on-demand C-band photon source has been a major problem in quantum optics laboratories for over a decade—our new technology now removes this obstacle,” says Prof. Stefanie Barz.

The key: Identical photons on demand

In everyday life, distinguishing features may often be desirable. Few want to be exactly like everyone else. When it comes to quantum technologies, however, complete indistinguishability is the name of the game. Quantum particles such as photons that are identical in all their properties can interfere with each other—much as in noise-canceling headphones, where sound waves that are precisely inverted copies of the incoming noise cancel out the background.

When identical photons are made to act in synchrony, then the probability that certain measurement outcomes occur can be either boosted or decreased. Such quantum effects give rise to powerful new phenomena that lie at the heart of emerging technologies such as quantum computing and quantum networking. For these technologies to become feasible, high-quality interference between photons is essential.
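The boosting and suppression of measurement outcomes described above is captured by the textbook Hong-Ou-Mandel effect, a standard benchmark for photon indistinguishability (the article does not name it, so this is an illustrative sketch, not the team's analysis). For two single photons meeting at a 50:50 beam splitter, the probability of detecting one photon at each output port falls from 1/2 for fully distinguishable photons to 0 for perfectly identical ones:

```python
def coincidence_probability(indistinguishability: float) -> float:
    """Probability that two single photons entering a 50:50 beam splitter
    exit at different ports, in the standard textbook model.

    indistinguishability: wavefunction overlap V, from 0 (fully
    distinguishable) to 1 (perfectly identical photons).
    """
    return 0.5 * (1.0 - indistinguishability)

print(coincidence_probability(0.0))  # distinguishable photons: 0.5
print(coincidence_probability(1.0))  # identical photons bunch: 0.0
```

Measured coincidence rates close to zero are what "record-high photon quality" refers to in practice: the closer V is to 1, the better the photons interfere in quantum computing and networking protocols.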

Silicon quantum processor detects single-qubit errors while preserving entanglement

Quantum computers are alternative computing devices that process information by leveraging quantum mechanical effects, such as entanglement between different particles. Entanglement establishes a link between particles that allows them to share states in such a way that measuring one particle instantly affects the others, irrespective of the distance between them.

Quantum computers could, in principle, outperform classical computers in some optimization and computational tasks. However, they are also known to be highly sensitive to environmental disturbances (i.e., noise), which can cause quantum errors and adversely affect computations.

Researchers at the International Quantum Academy, Southern University of Science and Technology, and Hefei National Laboratory have developed a new approach to detect these errors in a silicon-based quantum processor. This error detection strategy, presented in a paper published in Nature Electronics, was found to successfully detect quantum errors in silicon qubits, while also preserving entanglement after their detection.
