
Hypercharge breaking scenarios could explain the baryon asymmetry of the universe

The Standard Model (SM), the main physics framework describing elementary particles and the forces acting between them, outlines key patterns in physical interactions referred to as gauge symmetries. One of the symmetries it describes is the so-called U(1)Y hypercharge: a gauge symmetry that contributes to the electric charge of particles before the electromagnetic and weak forces become distinct (i.e., before the electroweak phase transition).
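For reference, hypercharge combines with the third component of weak isospin to give the electric charge via the Gell-Mann–Nishijima relation. In one common normalization (conventions differ by a factor of two in how Y is defined):

Q = T_3 + Y

where Q is the electric charge, T_3 the third component of weak isospin, and Y the U(1)Y hypercharge. For example, the left-handed electron carries T_3 = -1/2 and Y = -1/2, giving Q = -1.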

Researchers at Universidad Autónoma de Madrid’s Theoretical Physics Department (DFT) and Instituto de Física Teórica (IFT) recently carried out a study investigating how the conditions present in the early universe could prompt the spontaneous breaking of this gauge symmetry, linking this phenomenon to certain models of neutrino mass generation known as radiative neutrino mass models. Their paper, published in Physical Review Letters, specifically builds on a theoretical framework called the Zee-Babu model, an extension of the SM explaining neutrino mass generation.

“In the SM, the spontaneously broken electroweak gauge symmetry, which governs the electromagnetic and weak interactions of nature, was restored in the universe’s first instants, when the universe’s temperature was higher than the electroweak energy scale,” Prof. Jose Miguel No, Luca Merlo, Alvaro Lozano-Onrubia and Sergio López-Zurdo told Phys.org.

New calculation links disparate pion reactions in nuclear physics

An early-career physicist mathematically connects timelike and spacelike form factors, opening the door to further insights into the inner workings of the strong force. A new lattice QCD calculation connects two seemingly disparate reactions involving the pion, the lightest particle governed by the strong interaction.

As an undergraduate student at Tecnológico de Monterrey in Mexico, Felipe Ortega-Gama worked at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility as part of the Science Undergraduate Laboratory Internships program. There, Ortega-Gama worked with Raúl Briceño, who was a jointly appointed staff scientist in the lab’s Center for Theoretical and Computational Physics (Theory Center) and professor at Old Dominion University.

Briceño introduced him to quantum chromodynamics (QCD), the theory that describes the strong interaction. This is the force that binds quarks and gluons together to form protons, neutrons and other particles generically called hadrons. Theorists use lattice QCD, a computational method for solving QCD, to make predictions based on this theory. These predictions are then used to help interpret the results of experiments involving hadrons.

AI designs an ultralight carbon nanomaterial that’s as strong as steel

Using machine learning, a team of researchers in Canada has created ultrahigh-strength carbon nanolattices, resulting in a material that’s as strong as carbon steel, but only as dense as Styrofoam.

The team noted last month that it was the first time machine learning had been used to optimize nano-architected materials. University of Toronto’s Peter Serles, one of the authors of the paper describing this work in Advanced Materials, praised the approach, saying, “It didn’t just replicate successful geometries from the training data; it learned from what changes to the shapes worked and what didn’t, enabling it to predict entirely new lattice geometries.”
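The paper's exact pipeline is not reproduced here, but the pattern Serles describes (a model that learns which geometry changes help and proposes new candidates) resembles surrogate-model optimization. A minimal, hypothetical sketch in that spirit, with a toy objective standing in for the real strength-to-density evaluation (which used simulation and experiment):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

def evaluate(params):
    """Toy stand-in for scoring a lattice geometry's
    strength-to-density ratio; the real objective came from
    simulations and fabricated samples."""
    strut_ratio, node_angle = params
    return -(strut_ratio - 0.3) ** 2 - (node_angle - 0.6) ** 2

# Seed the surrogate with a few random geometries.
X = rng.uniform(0, 1, size=(8, 2))
y = np.array([evaluate(p) for p in X])

gp = GaussianProcessRegressor()
for _ in range(20):
    # Fit the surrogate, then pick the candidate it scores best,
    # with a small bonus for uncertain regions (a simple UCB rule).
    gp.fit(X, y)
    candidates = rng.uniform(0, 1, size=(256, 2))
    mean, std = gp.predict(candidates, return_std=True)
    best = candidates[np.argmax(mean + 0.5 * std)]
    X = np.vstack([X, best])
    y = np.append(y, evaluate(best))

print("best geometry found:", X[np.argmax(y)], "score:", y.max())
```

The parameter names (strut_ratio, node_angle) are illustrative placeholders, not quantities from the paper.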

To quickly recap, nanomaterials are engineered by arranging atoms or molecules in precise patterns, much like constructing structures with extremely tiny LEGO blocks. These materials often exhibit unique properties due to their nanoscale dimensions.

Online test-time adaptation for better generalization of interatomic potentials to out-of-distribution data

Molecular dynamics (MD) simulation serves as a crucial technique across various disciplines, including biology, chemistry, and materials science [1–4]. MD simulations are typically based on interatomic potential functions that characterize the potential energy surface of the system, with atomic forces derived as the negative gradients of the potential energies. Newton’s laws of motion are then applied to simulate the dynamic trajectories of the atoms. In ab initio MD simulations [5], the energies and forces are accurately determined by solving the equations of quantum mechanics. However, the computational demands of ab initio MD limit its practicality in many scenarios. By learning from ab initio calculations, machine learning interatomic potentials (MLIPs) have been developed to achieve much more efficient MD simulations with ab initio-level accuracy [6–8].
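As a concrete picture of that loop, here is a minimal sketch (not the paper's code) of velocity-Verlet MD with a Lennard-Jones pair potential standing in for a learned interatomic potential; the forces are the negative gradient of the potential energy, exactly as described above:

```python
import numpy as np

def lj_energy_forces(pos, eps=1.0, sigma=1.0):
    """Lennard-Jones potential; forces are the negative gradient
    of the total potential energy with respect to atom positions."""
    n = len(pos)
    energy = 0.0
    forces = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            rij = pos[i] - pos[j]
            r = np.linalg.norm(rij)
            sr6 = (sigma / r) ** 6
            energy += 4 * eps * (sr6 ** 2 - sr6)
            # F_i = -dU/dr * rij/r = 24*eps*(2*sr12 - sr6)/r^2 * rij
            f = 24 * eps * (2 * sr6 ** 2 - sr6) / r ** 2 * rij
            forces[i] += f
            forces[j] -= f
    return energy, forces

def velocity_verlet(pos, vel, forces, dt, mass=1.0):
    """One step of Newton's equations of motion."""
    vel_half = vel + 0.5 * dt * forces / mass
    pos_new = pos + dt * vel_half
    _, forces_new = lj_energy_forces(pos_new)
    vel_new = vel_half + 0.5 * dt * forces_new / mass
    return pos_new, vel_new, forces_new

# Two atoms released from rest near the potential minimum.
pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
vel = np.zeros_like(pos)
_, forces = lj_energy_forces(pos)
for step in range(100):
    pos, vel, forces = velocity_verlet(pos, vel, forces, dt=0.005)
```

An MLIP replaces lj_energy_forces with a neural network trained on ab initio data; the integration loop is unchanged.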

Despite their successes, a crucial challenge in deploying MLIPs is the distribution shift between training and test data. When MLIPs are used for MD simulations, the inference data are atomic structures continuously generated during the simulation from the predicted forces, so the training set should encompass a wide range of atomic structures to guarantee accurate predictions. However, in fields such as phase transitions [9,10], catalysis [11,12], and crystal growth [13,14], the configurational space that needs to be explored is highly complex. This complexity makes it difficult to sample sufficient training data, and the resulting potential may not be smooth enough to extrapolate to every relevant configuration. Consequently, a distribution shift between training and test data often occurs, degrading test performance and producing unrealistic atomic structures, until the MD simulation ultimately collapses [15].
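One common way to catch this shift during a running simulation (a standard MLIP practice, not necessarily the authors' method) is ensemble disagreement: several models trained on the same data will agree on in-distribution structures and diverge on unfamiliar ones. A toy sketch of that signal:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "ensemble" of force models: same functional form, perturbed
# weights. Real MLIP ensembles differ by training seed/architecture.
def make_model(w):
    return lambda x: np.tanh(x @ w)

ensemble = [make_model(rng.normal(size=(4, 3))) for _ in range(5)]

def forces_with_uncertainty(descriptor):
    """Average the ensemble's force predictions; the spread between
    members is a common proxy for distribution shift."""
    preds = np.stack([m(descriptor) for m in ensemble])
    return preds.mean(axis=0), preds.std(axis=0).max()

descriptor = rng.normal(size=4)  # stand-in for an atomic-environment descriptor
forces, disagreement = forces_with_uncertainty(descriptor)
if disagreement > 0.5:  # threshold would be calibrated on training data
    print("possible out-of-distribution structure; flag or adapt the model")
```

Online test-time adaptation goes a step further by updating the model on such flagged structures during the simulation itself, rather than only flagging them.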

Quantum Breakthrough: Artificial Atoms Store and Control Light Like Never Before

Imagine being able to see quantum objects with your own eyes — no microscopes needed. That’s exactly what researchers at TU Wien and ISTA have achieved with superconducting circuits, artificial atoms that are massive by quantum standards.

Unlike natural atoms, these structures can be engineered to have customizable properties, allowing scientists to control energy levels and interactions in ways never before possible. By coupling them, they’ve developed a method to store and retrieve light, laying the groundwork for revolutionary quantum technologies. These engineered systems also enable precise quantum pulses and act as a kind of quantum memory, offering an unprecedented level of control over light at the quantum level.

Gigantic Quantum Objects – Visible to the Naked Eye

World’s Most Accurate Clocks Could Redefine Time

A strontium optical clock produces about 50,000 times more oscillations per second than a cesium clock, the basis for the current definition of a second.
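That ratio follows directly from the two clock frequencies: the cesium hyperfine transition that defines the SI second ticks at 9,192,631,770 Hz, while the strontium optical transition used in these clocks sits near 429 THz. A quick check:

```python
cs_hz = 9_192_631_770          # Cs-133 hyperfine transition (defines the SI second)
sr_hz = 429_228_004_229_873    # Sr-87 optical clock transition, ~429 THz
print(sr_hz / cs_hz)           # ~46,700, i.e. roughly 50,000x more oscillations
```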

Advances in atomic clocks may lead to a redefinition of the second, replacing the cesium standard. (Recent work on thorium nuclear transitions, while promising, is still a long way from taking on that role.)

Also, NIST uses egg incubators(!) to control temperature and humidity.


New atomic clocks are more accurate than those used to define the second, suggesting the definition might need to change.

By Jay Bennett, edited by Clara Moskowitz

Inside a laboratory nestled in the foothills of the Rocky Mountains, amid a labyrinth of lenses, mirrors, and other optical machinery bolted to a vibration-resistant table, an apparatus resembling a chimney pipe rises toward the ceiling. On a recent visit, the silvery pipe held a cloud of thousands of supercooled cesium atoms launched upward by lasers and then left to float back down. With each cycle, a maser—like a laser that produces microwaves—hit the atoms to send their outer electrons jumping to a different energy state.

New state of matter powers Microsoft quantum computing chip

Traditional computers have long relied on semiconductor chips that use binary “bits” of information, represented as strings of 1s and 0s. While these chips have become increasingly powerful and simultaneously smaller, there is a physical limit to the amount of information that can be stored on this hardware. Quantum computers, by comparison, utilize “qubits” (quantum bits) to exploit the strange properties exhibited by subatomic particles, often at extremely cold temperatures.

Two qubits can represent a superposition of four states at once, and each added qubit doubles the size of the state space, an exponential increase in calculating capability. This allows a quantum computer to process certain problems at speeds and scales that make today’s supercomputers seem almost antiquated. Last December, for example, Google unveiled an experimental quantum computer system that researchers say takes just five minutes to finish a calculation that would take most supercomputers over 10 septillion years to complete, longer than the age of the universe as we understand it.
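The exponential growth is concrete: an n-qubit state is described by 2^n complex amplitudes. A minimal numpy illustration (a classical simulation of the state vector, not quantum hardware):

```python
import numpy as np

def uniform_superposition(n_qubits):
    """State vector over all 2**n basis states with equal amplitudes."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim))

print(uniform_superposition(2).shape)   # (4,): two qubits span four basis states
print(uniform_superposition(10).shape)  # (1024,): growth is exponential in qubit count
```

This is also why classical simulation of quantum computers breaks down quickly: the memory needed doubles with every qubit added.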

But Google’s Quantum Processing Unit (QPU) is based on different technology than Microsoft’s Majorana 1 design, detailed in a paper published on February 19 in the journal Nature. The result of over 17 years of design and research, Majorana 1 relies on what the company calls “topological qubits,” enabled by topological superconductivity, a state of matter that had long been theorized but never previously documented.