Breaking oxygen out of a water molecule is a relatively simple process, at least chemically. Even so, it does require components, one of the most important of which is a catalyst. Catalysts speed up reactions, and throughput scales roughly linearly with the amount of catalyst, so if you want more reactions quickly, you need a bigger catalyst. In space exploration, bigger means heavier, which translates into more expensive. So, when humanity is looking for a catalyst to split water into oxygen and hydrogen on Mars, creating one from local Martian materials would be worthwhile. That is precisely what a team from Hefei, China, did by using what they called an “AI Chemist.”

Unfortunately, the name “AIChemist” didn’t stick, though the joke lands differently depending on the font you read it in: in many typefaces the capital “I” is indistinguishable from a lowercase “l,” turning it into “AlChemist.” Whatever its name, the team’s work was some serious science. Specifically, it applied the machine learning algorithms that have become all the rage lately to selecting an effective catalyst for an “oxygen evolution reaction” using materials native to Mars.
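
For reference, splitting water pairs two textbook half-reactions, and the oxygen evolution reaction (OER) the catalyst targets is the kinetically sluggish one; this is standard electrochemistry, not anything specific to the team’s formula:

$$\text{Anode (OER):}\quad 2\,\mathrm{H_2O} \rightarrow \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^-$$
$$\text{Cathode (HER):}\quad 4\,\mathrm{H^+} + 4\,e^- \rightarrow 2\,\mathrm{H_2}$$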

To say it only chose the catalyst isn’t giving the system the full credit it’s due, though. It accomplished a series of steps, including developing a catalyst formula, pretreating the ore to create the catalyst, synthesizing it, and testing it once it was complete. The authors estimate that the automated process saved over 2,000 years of human labor by completing all of these tasks and point to the exceptional results of the testing to prove it.

Vorticity, a measure of the local rotation or swirling motion in a fluid, has long been studied by physicists and mathematicians. The dynamics of vorticity are governed by the famed Navier-Stokes equations, which tell us that vorticity is produced where fluid flows past walls. Moreover, because viscous fluids internally resist being sheared, they diffuse the vorticity within them, so any persistent swirling motion requires a constant resupply of vorticity.
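
For incompressible flow, both statements are compactly expressed by the vorticity transport equation, where $\boldsymbol{\omega} = \nabla \times \mathbf{u}$ is the vorticity and $\nu$ the kinematic viscosity:

$$\frac{\partial \boldsymbol{\omega}}{\partial t} + (\mathbf{u}\cdot\nabla)\,\boldsymbol{\omega} = (\boldsymbol{\omega}\cdot\nabla)\,\mathbf{u} + \nu\,\nabla^{2}\boldsymbol{\omega}$$

The stretching term $(\boldsymbol{\omega}\cdot\nabla)\,\mathbf{u}$ can reorient and amplify existing vorticity but creates none in the bulk, while $\nu\,\nabla^{2}\boldsymbol{\omega}$ diffuses it away; sustained swirling therefore needs a source, classically a wall, or, as in the study below, suspended spinning particles.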

Physicists at the University of Chicago and applied mathematicians at the Flatiron Institute recently carried out a study exploring the behavior of viscous fluids in which tiny rotating particles were suspended, acting as local, mobile sources of vorticity. Their paper, published in Nature Physics, outlines fluid behaviors that were never observed before, characterized by self-propulsion, flocking and the emergence of chiral active phases.

“This experiment was a confluence of three curiosities,” William T.M. Irvine, a corresponding author of the paper, told Phys.org. “We had been studying and engineering parity-breaking meta-fluids with fundamentally new properties in 2D and were interested to see how a three-dimensional analog would behave.”

Quantum computers operate using quantum gates, but the complexity and large number of these gates can diminish their efficiency. A new “hybrid” approach reduces this complexity by utilizing natural system interactions, making quantum algorithms easier to execute.

This innovation helps manage the inherent “noise” issues of current quantum systems, enhancing their practical use. The approach has been effectively demonstrated with Grover’s algorithm, enabling efficient searches of large datasets without extensive error correction.
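
The study’s hybrid gate construction is not reproduced here, but the algorithm it was demonstrated on is easy to sketch. Below is a minimal classical statevector simulation of textbook Grover search in Python (function names and parameters are illustrative choices, assuming a single marked item among N = 2^n entries):

```python
import numpy as np

def grover_search(n_qubits, marked, n_iters=None):
    """Textbook Grover search simulated on a classical statevector."""
    N = 2 ** n_qubits
    state = np.full(N, 1.0 / np.sqrt(N))       # uniform superposition
    if n_iters is None:
        n_iters = int(np.pi / 4 * np.sqrt(N))  # ~optimal iteration count
    for _ in range(n_iters):
        state[marked] *= -1.0                  # oracle: phase-flip the marked item
        state = 2.0 * state.mean() - state     # diffusion: reflect about the mean
    return state

final = grover_search(n_qubits=10, marked=123)
print(f"P(marked) = {final[123] ** 2:.3f}")    # close to 1 after ~25 rounds
```

The quadratic speedup shows up in the iteration count: roughly sqrt(N) oracle calls instead of the ~N/2 lookups an average classical search would need.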

Every day, researchers at the Department of Energy’s SLAC National Accelerator Laboratory tackle some of the biggest questions in science and technology—from laying the foundations for new drugs to developing new battery materials and solving big data challenges associated with particle physics and cosmology.

For help with that work, they are increasingly turning to artificial intelligence. “AI will help accelerate our science and technology further,” said Ryan Coffee, a SLAC senior scientist. “I am really excited about that.”

My name is Artem; I’m a graduate student at the NYU Center for Neural Science and a researcher at the Flatiron Institute (Center for Computational Neuroscience).

In this video, we explore the Nobel Prize-winning Hodgkin-Huxley model, the foundational equation of computational neuroscience that reveals how neurons generate electrical signals. We break down the biophysical principles of neural computation, from membrane voltage to ion channels, showing how mathematical equations capture the elegant dance of charged particles that enables information processing.
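
As a companion sketch (not code from the video), here is a minimal Python implementation of the Hodgkin-Huxley equations with the standard squid-axon parameters, shifted so the resting potential sits near -65 mV, integrated with simple forward Euler:

```python
import numpy as np

# Classic Hodgkin-Huxley parameters: capacitance (uF/cm^2), maximal
# conductances (mS/cm^2), and reversal potentials (mV).
C_m = 1.0
g_Na, g_K, g_L = 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.387

# Voltage-dependent opening/closing rates for the gating variables m, h, n.
def alpha_m(V): return 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
def beta_m(V):  return 4.0 * np.exp(-(V + 65) / 18)
def alpha_h(V): return 0.07 * np.exp(-(V + 65) / 20)
def beta_h(V):  return 1.0 / (1 + np.exp(-(V + 35) / 10))
def alpha_n(V): return 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
def beta_n(V):  return 0.125 * np.exp(-(V + 65) / 80)

dt, T = 0.01, 50.0            # time step and duration (ms)
V = -65.0                     # membrane voltage at rest (mV)
m, h, n = 0.05, 0.6, 0.32     # approximate resting gate values
I_ext = 10.0                  # injected current (uA/cm^2)
trace = []

for _ in np.arange(0, T, dt):
    # Ionic currents: conductance x gating x driving force (V - E_ion).
    I_Na = g_Na * m**3 * h * (V - E_Na)
    I_K  = g_K * n**4 * (V - E_K)
    I_L  = g_L * (V - E_L)
    # Membrane equation and first-order gate kinetics (forward Euler).
    V += dt * (I_ext - I_Na - I_K - I_L) / C_m
    m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
    h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
    n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
    trace.append(V)

print(f"peak voltage: {max(trace):.1f} mV")  # spikes overshoot toward ~+40 mV
```

Each spike in the trace is the action potential discussed in the video: the sodium gates (m, h) drive the fast upstroke, and the potassium gate (n) handles repolarization.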

Outline:
00:00 Introduction.
01:28 Membrane Voltage.
04:56 Action Potential Overview.
06:24 Equilibrium potential and driving force.
10:11 Voltage-dependent conductance.
16:50 Review.
20:09 Limitations & Outlook.
21:21 Sponsor: Brilliant.org.
22:44 Outro.

A novel device consisting of metal, dielectric, and metal layers remembers the history of electrical signals sent through it. This device, called a memristor, could serve as the basis for neuromorphic computers: computers that work in ways similar to human brains. Unlike traditional digital memory, which stores information as 0s and 1s, this device exhibits so-called “analog” behavior. This means the device can store information between 0 and 1, and it can emulate how synapses function in the brain. Researchers found that the interface between metal and dielectric in the novel device is critical for stable switching and enhanced performance. Simulations indicate that circuits built on this device exhibit improved image recognition.
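
The study’s device specifics aren’t given here, but the flavor of “analog” resistive memory can be illustrated with the classic linear ion-drift memristor model (Strukov et al., 2008). The Python sketch below uses that generic model with illustrative parameter values; it is not the metal-dielectric-metal stack from the study:

```python
# Linear ion-drift memristor model (Strukov et al., 2008): a generic
# illustration, not the specific device characterized in the study.
R_on, R_off = 100.0, 16e3   # fully-on / fully-off resistance (ohms)
D = 10e-9                   # dielectric thickness (m)
mu = 1e-14                  # dopant mobility (m^2 / (V*s))

def apply_pulse(x, volts, seconds, dt=1e-4):
    """Drive the device and evolve its internal state x in [0, 1]."""
    for _ in range(int(seconds / dt)):
        R = R_on * x + R_off * (1 - x)   # resistance interpolates on/off
        i = volts / R                    # instantaneous current
        x += dt * mu * R_on / D**2 * i   # state drifts with charge flow
        x = min(max(x, 0.0), 1.0)        # clamp to physical bounds
    return x

x = 0.1                                        # initial analog state
x = apply_pulse(x, +1.0, 0.2)
print(f"after +1 V write pulse: x = {x:.3f}")  # state moved up
x = apply_pulse(x, -1.0, 0.1)
print(f"after -1 V erase pulse: x = {x:.3f}")  # partially reversed
```

Because x can sit anywhere between 0 and 1 and depends on the device’s entire current history, a single element both stores and processes information, the synapse-like property the artificial-synapse work builds on.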

The Impact

Today’s computers are not energy efficient for big data and machine learning tasks. By 2030, experts predict that data centers could consume about 8% of the world’s electricity. To address this challenge, researchers are working to create computers inspired by the human brain, so-called neuromorphic computers. Artificial synapses created with memristor devices are the building blocks of these computers. These artificial synapses can store and process information in the same location, similar to how neurons and synapses work in the brain. Integrating these emergent devices with conventional computer components will reduce power needs and improve performance for tasks such as artificial intelligence and machine learning.

Summary: A new AI algorithm inspired by the genome’s ability to compress vast information offers insights into brain function and potential tech applications. Researchers found that this algorithm performs tasks like image recognition and video games almost as effectively as fully trained AI networks.

By mimicking how genomes encode complex behaviors with limited data, the model highlights the evolutionary advantage of efficient information compression. The findings suggest new pathways for developing advanced, lightweight AI systems capable of running on smaller devices like smartphones.

We may not be the only beings in the universe who use artificial intelligence. That’s according to some astronomers who say that an intelligent civilization anywhere in the cosmos would develop this tool naturally over the course of its cultural evolution.

After 13.8 billion years of cosmic history, life has likely sprung up countless times throughout the cosmos. According to the Drake Equation, which estimates how many communicating civilizations are out there, there are currently an estimated 12,500 intelligent alien societies in the Milky Way Galaxy alone. And if there are aliens who think the way we do and have built cultures that developed technology as ours did, then they probably invented a form of artificial intelligence, too, scientists say.
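
For readers unfamiliar with it, the Drake Equation in its standard form multiplies seven factors to estimate N, the number of detectable civilizations in the galaxy:

$$N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L$$

Here $R_{*}$ is the rate of star formation, $f_{p}$ the fraction of stars with planets, $n_{e}$ the number of habitable worlds per planetary system, $f_{l}$, $f_{i}$, and $f_{c}$ the fractions of those that develop life, intelligence, and detectable technology, and $L$ the lifetime of a communicating civilization. Figures like the 12,500 above follow from one particular set of guesses for these factors.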

Assuming AI has been an integral part of intelligent societies for thousands or even millions of years, experts are increasingly considering the possibility that artificial intelligence may have grown to proportions we can scarcely imagine on Earth. Life in the universe may not only be biological, they say. AI machine-based life may dominate many extraterrestrial civilizations, according to a burgeoning theory among astrobiologists.

When laser energy is deposited in a target material, numerous complex processes take place at length and time scales that are too small to visually observe. To study and ultimately fine-tune such processes, researchers look to computer modeling. However, these simulations rely on accurate equation of state (EOS) models to describe the thermodynamic properties—such as pressure, density and temperature—of a target material under the extreme conditions generated by the intense heat of a laser pulse.
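
At its simplest, an EOS is just a relation $P = P(\rho, T)$ that closes the fluid equations; the ideal-gas law is the textbook example, though laser-driven targets demand far more elaborate models:

$$P = \frac{\rho\,k_{B}T}{m}, \qquad m = \text{average particle mass}$$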

One process that is insufficiently addressed in current EOS models is ablation, in which irradiation from the laser beam removes solid material from the target either by vaporization or by plasma formation (plasma being the fourth state of matter). It is this mechanism that launches a shock into the material, ultimately producing the high densities required for high-pressure experiments such as inertial confinement fusion (ICF).

To better understand laser–matter interactions with regard to ablation, researchers from Lawrence Livermore National Laboratory (LLNL), the University of California, San Diego (UCSD), SLAC National Accelerator Laboratory and other collaborating institutions conducted a study that represents the first example of using X-ray diffraction to make direct time-resolved measurements of an aluminum sample’s ablation depth. The research appears in Applied Physics Letters.

The V-score benchmarks classical and quantum algorithms in solving the many-body problem. The study highlights quantum computing’s potential for tackling complex material systems while providing an open-access framework for future research innovations.

Scientists aspire to use quantum computing to explore complex phenomena that have been difficult for current computers to analyze, such as the characteristics of novel and exotic materials. However, despite the excitement surrounding each announcement of “quantum supremacy,” it remains challenging to pinpoint when quantum computers and algorithms will offer a clear, practical advantage over classical systems.

A large collaboration led by Giuseppe Carleo, a physicist at the Swiss Federal Institute of Technology in Lausanne (EPFL) and a member of the National Center of Competence in Research NCCR MARVEL, has now introduced a method to compare the performance of different algorithms, both classical and quantum, when simulating complex phenomena in condensed matter physics. The new benchmark, called the V-score, is described in an article just published in Science.
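
Schematically, and as a reading of the general construction rather than the paper’s exact normalization, the V-score is built from the energy variance of a simulated ground state:

$$\text{V-score} \;\propto\; \frac{N\,\operatorname{Var}(\hat{H})}{(E - E_{\infty})^{2}}, \qquad \operatorname{Var}(\hat{H}) = \langle \hat{H}^{2}\rangle - \langle \hat{H}\rangle^{2}$$

where $N$ is the system size, $E$ the variational energy, and $E_{\infty}$ a reference fixing the zero of energy. An exact eigenstate has vanishing energy variance, so lower scores signal more trustworthy simulations, giving classical and quantum methods a common yardstick.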