
An innovative algorithm for detecting collisions of high-speed particles within nuclear fusion reactors has been developed, inspired by technologies used to determine whether bullets hit targets in video games. This advancement enables rapid predictions of collisions, significantly enhancing the stability and design efficiency of future fusion reactors.

Professor Eisung Yoon and his research team in the Department of Nuclear Engineering at UNIST announced that they have successfully developed a collision detection algorithm capable of quickly identifying collision points of high-speed particles within virtual devices. The research is published in the journal Computer Physics Communications.

When applied to the Virtual KSTAR (V-KSTAR), this algorithm demonstrated a detection speed up to 15 times faster than previous methods. The V-KSTAR is a digital twin that replicates the Korean Superconducting Tokamak Advanced Research (KSTAR) fusion experiment in a three-dimensional virtual environment.
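
The article does not describe the algorithm's internals, but the video-game analogy points to the kind of ray-casting hit test game engines use to decide whether a bullet's path crosses a target. The sketch below is a minimal Python illustration of that idea, treating one time step of a fast particle as a "bullet" ray tested against a single triangle of a wall mesh with the standard Möller-Trumbore intersection; the coordinates, time step, and choice of method are assumptions made for the example, not details taken from the UNIST paper.

```python
import numpy as np

def ray_triangle_hit(origin, direction, v0, v1, v2, eps=1e-9):
    """Möller-Trumbore ray-triangle test: returns the fraction t along
    `direction` at which the ray hits the triangle, or None for a miss."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:                      # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    s = origin - v0
    u = np.dot(s, p) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, q) * inv_det
    return t if t >= eps else None

# One integration step of a fast particle, treated like a "bullet" ray
# (all numbers are made up for the example).
pos = np.array([1.80, 0.00, 0.20])          # current position (m)
vel = np.array([4.0e6, 2.0e5, -6.0e5])      # velocity (m/s)
dt = 1e-8                                   # time step (s)
step = vel * dt                             # displacement during this step

# One triangular facet of the wall mesh (made-up coordinates).
v0 = np.array([1.82, -0.01, 0.15])
v1 = np.array([1.82,  0.02, 0.15])
v2 = np.array([1.82,  0.00, 0.21])

t = ray_triangle_hit(pos, step, v0, v1, v2)
if t is not None and t <= 1.0:              # the hit happens within this time step
    print("particle strikes the wall at", pos + t * step)
```

At reactor scale, a per-triangle test like this would normally be paired with a spatial acceleration structure (a grid or bounding-volume hierarchy) so that each particle is checked only against nearby wall facets, which is where game engines get most of their speed.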

MIT researchers have created a periodic table that shows how more than 20 classical machine-learning algorithms are connected. The new framework sheds light on how scientists could fuse strategies from different methods to improve existing AI models or come up with new ones.

For instance, the researchers used their framework to combine elements of two different algorithms to create a new image-classification algorithm that performed 8% better than current state-of-the-art approaches.

The periodic table stems from one key idea: All these algorithms learn a specific kind of relationship between data points. While each algorithm may accomplish that in a slightly different way, the core mathematics behind each approach is the same.
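
As a rough illustration of what "learning a relationship between data points" can look like in code, the toy below builds two conditional neighbor distributions over a small dataset, one from the raw inputs and one from learned embeddings, and measures their mismatch with a KL divergence. It is a generic sketch in the spirit of the framework described above, not the authors' formulation or code; the affinity choices (Gaussian on the inputs, dot products on the embeddings) are assumptions made for the example.

```python
import numpy as np

def neighbor_dist(scores):
    """Row-wise softmax: turn pairwise scores into a conditional
    distribution q(j | i) = "how strongly point i is related to point j"."""
    scores = scores - scores.max(axis=1, keepdims=True)   # numerical stability
    e = np.exp(scores)
    np.fill_diagonal(e, 0.0)                               # a point is not its own neighbor
    return e / e.sum(axis=1, keepdims=True)

def relation_mismatch(p, q, eps=1e-12):
    """Average KL(p || q): how badly the learned relationships q
    reproduce the target relationships p."""
    return np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=1))

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 5))     # toy input features
z = rng.normal(size=(8, 2))     # toy learned embeddings

# Target relationships from the inputs, learned relationships from the embeddings.
p = neighbor_dist(-np.square(x[:, None] - x[None, :]).sum(-1))  # Gaussian affinities (SNE-style)
q = neighbor_dist(z @ z.T)                                      # dot-product affinities (contrastive-style)

print("relationship mismatch:", relation_mismatch(p, q))
```

In this picture, different classical algorithms roughly correspond to different choices of the target relationship p and the learned relationship q, which is one way to read the claim that their core mathematics coincides.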

ICLR 2025

Shaden Alshammari, John Hershey, Axel Feldmann, William T. Freeman, Mark Hamilton.

MIT, Microsoft, Google.

https://mhamilton.net/icon

[https://openreview.net/forum?id=WfaQrKCr4X](https://openreview.net/forum?id=WfaQrKCr4X)

[https://github.com/mhamilton723/STEGO](https://github.com/mhamilton723/STEGO)

The juridical metaphor in physics has ancient roots. Anaximander, in the 6th century BCE, was perhaps the first to invoke the concept of cosmic justice, speaking of natural entities paying “penalty and retribution to each other for their injustice according to the assessment of Time” (Kirk et al., 2010, p. 118). This anthropomorphizing tendency persisted through history, finding its formal expression in Newton’s Principia Mathematica, where he articulated his famous “laws” of motion. Newton, deeply influenced by his theological views, conceived of these laws as divine edicts — mathematical expressions of God’s will imposed upon a compliant universe (Cohen & Smith, 2002, p. 47).

This legal metaphor has served science admirably for centuries, providing a framework for conceptualizing the universe’s apparent obedience to mathematical principles. Yet it carries implicit assumptions worth examining. Laws suggest a lawgiver, hinting at external agency. They imply prescription rather than description — a subtle distinction with profound philosophical implications. As physicist Paul Davies (2010) observes, “The very notion of physical law is a theological one in the first place, a fact that makes many scientists squirm” (p. 74).

Enter the computational metaphor — a framework more resonant with our digital age. The universe, in this conceptualization, executes algorithms rather than obeying laws. Space, time, energy, and matter constitute the data structure upon which these algorithms operate. This shift is more than semantic; it reflects a fundamental reconceptualization of physical reality that aligns remarkably well with emerging theories in theoretical physics and information science.

Genome editing has advanced at a rapid pace with promising results for treating genetic conditions, but there is always room for improvement. A new paper by investigators from Mass General Brigham, published in Nature, showcases the power of scalable protein engineering combined with machine learning to boost progress in the field of gene and cell therapy. In their study, the authors developed a machine learning algorithm, known as PAMmla, that can predict the properties of about 64 million genome editing enzymes. The work could help reduce off-target effects and improve editing safety, enhance editing efficiency, and enable researchers to predict customized enzymes for new therapeutic targets.

“Our study is a first step in dramatically expanding our repertoire of effective and safe CRISPR-Cas9 enzymes. In our manuscript we demonstrate the utility of these PAMmla-predicted enzymes to precisely edit disease-causing sequences in primary human cells and in mice,” said corresponding author Ben Kleinstiver, PhD, Kayden-Lambert MGH Research Scholar associate investigator at Massachusetts General Hospital (MGH), a founding member of the Mass General Brigham healthcare system. “Building on these findings, we are excited to have these tools utilized by the community and also apply this framework to other properties and enzymes in the genome editing repertoire.”

CRISPR-Cas9 enzymes can be used to edit genes at locations throughout the genome, but there are limitations to this technology. Traditional CRISPR-Cas9 enzymes can have off-target effects, cleaving or otherwise modifying DNA at unintended sites in the genome. The newly published study aims to improve on this by using machine learning to better predict and tailor enzymes to hit their targets with greater specificity. The approach also offers a scalable solution: other attempts at engineering enzymes have had lower throughput and typically yielded orders of magnitude fewer enzymes.
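
The study's actual model, features, and training data are not described in this article, so the following is only a schematic of how a sequence-to-property predictor of this kind can reach tens of millions of variants: train a supervised model on a modest set of experimentally characterized enzymes, then score every combination of amino acids at the engineered positions. All names, features, and numbers below are hypothetical stand-ins.

```python
import numpy as np
from itertools import product
from sklearn.ensemble import RandomForestRegressor

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
N_POSITIONS = 3   # varied PAM-interacting positions; 3 here for speed.
                  # Six positions would give 20**6 = 64 million combinations,
                  # the scale of prediction the article mentions.

def one_hot(variant):
    """Encode the residue identity at each varied position as a 0/1 vector."""
    vec = np.zeros(len(variant) * len(AMINO_ACIDS))
    for i, aa in enumerate(variant):
        vec[i * len(AMINO_ACIDS) + AMINO_ACIDS.index(aa)] = 1.0
    return vec

# Hypothetical training data: a few hundred experimentally characterized variants
# with a measured activity/specificity score (random numbers as stand-ins).
rng = np.random.default_rng(1)
train_variants = ["".join(rng.choice(list(AMINO_ACIDS), N_POSITIONS)) for _ in range(500)]
train_activity = rng.random(500)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(np.array([one_hot(v) for v in train_variants]), train_activity)

# Score every possible combination at the varied positions, including variants
# that were never assayed, then rank them.
candidates = ["".join(c) for c in product(AMINO_ACIDS, repeat=N_POSITIONS)]
scores = model.predict(np.array([one_hot(v) for v in candidates]))
top = sorted(zip(candidates, scores), key=lambda t: -t[1])[:5]
print("top predicted variants:", top)
```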

“Welcome back to our channel! Today, we’re diving into an extraordinary and futuristic topic: Neural Enhancement: Human 2.0. Imagine a future where AI-driven technologies can enhance human brain functions, creating a new version of humanity with unparalleled cognitive and physical abilities. Let’s explore this revolutionary concept! 🧬🧠 #Science #Tech”

Segment 1: The concept of neural enhancement.

“Imagine a world where humans can enhance their natural abilities through advanced technology. 🧠✨ Neural enhancement uses AI and neural interfaces to boost cognitive functions, improve memory, and enhance physical capabilities, creating ‘Human 2.0.’ 🌟 #NeuralEnhancement #TechInnovation”

Segment 2: How neural enhancement works.

“So, how does neural enhancement work? 🤖🧬 Using brain-computer interfaces (BCIs), neural implants, and AI algorithms, scientists can directly interact with the brain’s neural networks. These technologies can stimulate and enhance brain functions, improving everything from memory and learning speed to physical coordination and strength. 🌐✨ #AI #NeuroTech”

PRRDetect is a new algorithm that identifies tumors with faulty DNA repair, helping doctors tailor cancer treatments more effectively. It marks a major step in using genomics for personalized cancer therapy.
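
The summary above gives the goal rather than the method, so the snippet below is purely a schematic of the general approach such classifiers take: summarize a tumor genome as counts of a few mutation classes and output the probability of a repair defect. The feature names, data, and model choice are hypothetical stand-ins, not PRRDetect's actual inputs or architecture.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-tumor features: counts of a few indel/substitution mutation
# classes extracted from whole-genome sequencing (names are illustrative only).
feature_names = ["1bp_ins_repeat", "1bp_del_repeat", "del_microhomology", "sbs_total"]

rng = np.random.default_rng(2)
n = 200
X = rng.poisson(lam=[40, 35, 10, 3000], size=(n, 4)).astype(float)
y = rng.integers(0, 2, size=n)               # stand-in labels: 1 = repair-deficient

X_log = np.log1p(X)                           # counts span orders of magnitude
clf = LogisticRegression(max_iter=1000).fit(X_log, y)

new_tumor = np.log1p(np.array([[120, 90, 15, 3500]]))
print("P(faulty DNA repair) =", clf.predict_proba(new_tumor)[0, 1])
```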

Researchers have achieved a major leap in quantum computing by simulating Google’s 53-qubit Sycamore circuit using over 1,400 GPUs and groundbreaking algorithmic techniques. Their efficient tensor network methods and clever “top-k” sampling approach drastically reduce the memory and computational cost of the simulation.
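
For intuition about the tensor-network side (the “top-k” sampling step is not detailed here), the toy below computes one output amplitude of a two-qubit circuit by contracting its gate tensors in a single einsum call. Sycamore-scale simulations perform the same kind of contraction over vastly larger networks, where the contraction order and slicing decide the memory footprint; this is a textbook miniature, not the researchers' method or code.

```python
import numpy as np

# Gate tensors: a quantum circuit is a network of small tensors, and any output
# amplitude <bitstring|C|00> is one big contraction of that network.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]]).reshape(2, 2, 2, 2)   # (out_a, out_b, in_a, in_b)

zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# Circuit: H on qubit a, then CNOT(a -> b), starting from |00>.
# Contract the inputs, the gates, and the <11| projectors in one einsum.
amp = np.einsum("a,b,ca,decb,d,e->", zero, zero, H, CNOT, one, one)
print("amplitude(11) =", amp, "| probability =", abs(amp) ** 2)   # expect 0.5
```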

Neutron star mergers are collisions between neutron stars, the collapsed cores of what were once massive supergiant stars. These mergers are known to generate gravitational waves, energy-carrying waves propagating through a gravitational field, which emerge from the acceleration or disturbance of a massive body.

Collisions between neutron stars have been the topic of many theoretical physics studies, as a deeper understanding of these events could yield interesting insights into how matter behaves at extreme densities. The behavior of matter at extremely high densities is currently described by a relation known as the equation of state (EoS).
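
Concretely, an equation of state ties the pressure of neutron-star matter to its density. One simple parameterization often used in merger modeling, shown here only as an illustration rather than the form used in any particular study, is a polytrope:

```latex
P(\rho) = K\,\rho^{\Gamma},
\qquad
\varepsilon(\rho) = \rho c^{2} + \frac{P(\rho)}{\Gamma - 1}
```

where ρ is the rest-mass density, K sets the stiffness, Γ is the adiabatic index, and ε is the total energy density. Stiffer choices push up the pressure at a given density, which changes how a merger remnant oscillates and hence what the post-merger gravitational-wave spectrum looks like.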

Recent astrophysics research has explored the possibility that EoS features, such as a phase transition or a quark-hadron crossover, could be inferred from the gravitational wave spectrum observed after neutron stars have merged. However, most of these theoretical works did not consider the effects of magnetic fields on this spectrum.