
Quantum computers operate using quantum gates, but the complexity and large number of these gates can diminish their efficiency. A new “hybrid” approach reduces this complexity by utilizing natural system interactions, making quantum algorithms easier to execute.

This innovation helps manage the inherent “noise” issues of current quantum systems, enhancing their practical use. The approach has been effectively demonstrated with Grover’s algorithm, enabling efficient searches of large datasets without extensive error correction.
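For a concrete picture of the search primitive involved, the sketch below simulates a standard, textbook Grover iteration on a small statevector in Python. It is not the hybrid, reduced-gate construction described above; the qubit count and marked index are arbitrary choices for illustration.

```python
# Minimal statevector sketch of a standard Grover search (textbook version,
# not the hybrid gate-reduction approach described above).
import numpy as np

def grover_search(n_qubits, marked_index):
    N = 2 ** n_qubits
    # Start in the uniform superposition over all N basis states.
    state = np.full(N, 1 / np.sqrt(N))

    # Oracle: flip the sign of the marked item's amplitude.
    oracle = np.eye(N)
    oracle[marked_index, marked_index] = -1

    # Diffusion operator: reflect all amplitudes about their mean.
    diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)

    # Roughly (pi/4) * sqrt(N) iterations maximize the success probability.
    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        state = diffusion @ (oracle @ state)

    return int(np.argmax(np.abs(state) ** 2)), float(np.abs(state[marked_index]) ** 2)

best_guess, prob = grover_search(n_qubits=6, marked_index=42)
print(best_guess, round(prob, 3))   # finds index 42 with probability ~0.997
```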

Challenges of Quantum Computing

Every day, researchers at the Department of Energy’s SLAC National Accelerator Laboratory tackle some of the biggest questions in science and technology—from laying the foundations for new drugs to developing new battery materials and solving big data challenges associated with particle physics and cosmology.

To get a hand with that work, they are increasingly turning to artificial intelligence. “AI will help accelerate our science and technology further,” said Ryan Coffee, a SLAC senior scientist. “I am really excited about that.”

My name is Artem. I’m a graduate student at the NYU Center for Neural Science and a researcher at the Flatiron Institute (Center for Computational Neuroscience).

In this video, we explore the Nobel Prize-winning Hodgkin-Huxley model, the foundational equation of computational neuroscience that reveals how neurons generate electrical signals. We break down the biophysical principles of neural computation, from membrane voltage to ion channels, showing how mathematical equations capture the elegant dance of charged particles that enables information processing.
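For a concrete sense of what the model computes, here is a minimal Python sketch that integrates the Hodgkin-Huxley equations with the classic squid-axon parameters using a simple forward-Euler step; the stimulus, step size, and run time are arbitrary illustrative choices.

```python
# Minimal forward-Euler sketch of the Hodgkin-Huxley equations with the
# classic squid-axon parameters. Stimulus and step size are arbitrary choices.
import numpy as np

# Maximal conductances (mS/cm^2), reversal potentials (mV), capacitance (uF/cm^2)
g_Na, g_K, g_L = 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.4
C_m = 1.0

# Voltage-dependent opening/closing rates of the gating variables m, h, n
alpha_m = lambda V: 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
beta_m  = lambda V: 4.0 * np.exp(-(V + 65) / 18)
alpha_h = lambda V: 0.07 * np.exp(-(V + 65) / 20)
beta_h  = lambda V: 1.0 / (1 + np.exp(-(V + 35) / 10))
alpha_n = lambda V: 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
beta_n  = lambda V: 0.125 * np.exp(-(V + 65) / 80)

dt, T = 0.01, 50.0                      # time step and duration (ms)
V, m, h, n = -65.0, 0.05, 0.6, 0.32     # resting membrane state
I_ext = 10.0                            # injected current (uA/cm^2)

for _ in range(int(T / dt)):
    # Ionic currents through sodium, potassium, and leak channels
    I_Na = g_Na * m**3 * h * (V - E_Na)
    I_K  = g_K * n**4 * (V - E_K)
    I_L  = g_L * (V - E_L)
    # Membrane equation: C_m dV/dt = I_ext - I_Na - I_K - I_L
    V += dt * (I_ext - I_Na - I_K - I_L) / C_m
    # Gating kinetics: dx/dt = alpha_x(V) * (1 - x) - beta_x(V) * x
    m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
    h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
    n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)

print(f"membrane potential after {T} ms: {V:.1f} mV")
```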

A novel device consisting of metal, dielectric, and metal layers remembers the history of electrical signals sent through it. This device, called a memristor, could serve as the basis for neuromorphic computers: computers that work in ways similar to human brains. Unlike traditional digital memory, which stores information as 0s and 1s, this device exhibits so-called “analog” behavior. This means the device can store information between 0 and 1, and it can emulate how synapses function in the brain. Researchers found that the interface between metal and dielectric in the novel device is critical for stable switching and enhanced performance. Simulations indicate that circuits built on this device exhibit improved image recognition.
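To make the “store and process in the same place” idea concrete, the toy Python sketch below treats each crosspoint of a small crossbar as an analog conductance between 0 and 1 and reads out weighted sums as column currents. The array size and values are made-up illustrations, not the device characterized in the study.

```python
# Illustrative sketch: analog memristor conductances (values between 0 and 1)
# arranged in a crossbar compute a vector-matrix product in place, the core
# operation of neuromorphic inference. All numbers are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Each crosspoint stores an analog conductance G in [0, 1] rather than a 0/1 bit.
G = rng.uniform(0.0, 1.0, size=(4, 3))   # 4 input rows x 3 output columns

# Input voltages applied to the rows (one value per input feature).
v_in = np.array([0.2, 0.8, 0.5, 0.1])

# Ohm's law + Kirchhoff's current law: the current collected on each column is
# the weighted sum sum_i v_i * G_ij, so the multiply-accumulate happens right
# where the weights are stored, with no separate memory fetch.
i_out = v_in @ G
print(i_out)
```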

The Impact

Today’s computers are not energy efficient for big data and machine learning tasks. By 2030, experts predict that data centers could consume about 8% of the world’s electricity. To address this challenge, researchers are working to create computers inspired by the human brain, so-called neuromorphic computers. Artificial synapses created with memristor devices are the building blocks of these computers. These artificial synapses can store and process information in the same location, similar to how neurons and synapses work in the brain. Integrating these emergent devices with conventional computer components will reduce power needs and improve performance for tasks such as artificial intelligence and machine learning.

Summary: A new AI algorithm inspired by the genome’s ability to compress vast information offers insights into brain function and potential tech applications. Researchers found that this algorithm performs tasks like image recognition and video games almost as effectively as fully trained AI networks.

By mimicking how genomes encode complex behaviors with limited data, the model highlights the evolutionary advantage of efficient information compression. The findings suggest new pathways for developing advanced, lightweight AI systems capable of running on smaller devices like smartphones.

We may not be the only beings in the universe who use artificial intelligence. That’s according to some astronomers who say that an intelligent civilization anywhere in the cosmos would develop this tool naturally over the course of their cultural evolution.

After 13.8 billion years of existence, life has likely sprung up countless times throughout the cosmos. According to the Drake Equation, which estimates the number of existing, communicating civilizations, there are currently an estimated 12,500 intelligent alien societies in the Milky Way Galaxy alone. And if there are aliens who think the way we do and have built cultures that developed technology as we have, then they probably invented a form of artificial intelligence, too, scientists say.
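For reference, the Drake Equation is simply a product of factors, and plugging in a set of assumed values yields an estimate of the kind quoted above. The parameter choices in the sketch below are purely hypothetical placeholders, not the inputs behind the 12,500 figure.

```python
# Drake equation: N = R* * f_p * n_e * f_l * f_i * f_c * L
# All parameter values below are hypothetical placeholders for illustration.
R_star = 1.5    # average star formation rate in the galaxy (stars/year)
f_p    = 1.0    # fraction of stars with planetary systems
n_e    = 0.2    # habitable planets per system that has planets
f_l    = 0.5    # fraction of habitable planets where life arises
f_i    = 0.1    # fraction of those where intelligence develops
f_c    = 0.1    # fraction that develop detectable communication
L      = 1e6    # average lifetime of a communicating civilization (years)

N = R_star * f_p * n_e * f_l * f_i * f_c * L
print(f"Estimated communicating civilizations in the galaxy: {N:.0f}")
```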

Assuming AI has been an integral part of intelligent societies for thousands or even millions of years, experts are increasingly considering the possibility that artificial intelligence may have grown to proportions we can scarcely imagine on Earth. Life in the universe may not only be biological, they say. AI machine-based life may dominate many extraterrestrial civilizations, according to a burgeoning theory among astrobiologists.

When laser energy is deposited in a target material, numerous complex processes take place at length and time scales that are too small to visually observe. To study and ultimately fine-tune such processes, researchers look to computer modeling. However, these simulations rely on accurate equation of state (EOS) models to describe the thermodynamic properties—such as pressure, density and temperature—of a target material under the extreme conditions generated by the intense heat of a laser pulse.

One process that is insufficiently addressed in current EOS models is ablation, in which irradiation from the laser beam removes solid material from the target either by vaporization or by plasma formation (the fourth state of matter). It is this mechanism that launches a shock into the material, ultimately producing the high densities required for high-pressure experiments such as inertial confinement fusion (ICF).

To better understand laser–matter interactions with regard to ablation, researchers from Lawrence Livermore National Laboratory (LLNL), the University of California, San Diego (UCSD), SLAC National Accelerator Laboratory and other collaborating institutions conducted a study that represents the first example of using X-ray diffraction to make direct time-resolved measurements of an aluminum sample’s ablation depth. The research appears in Applied Physics Letters.

The V-score benchmarks classical and quantum algorithms on solving the quantum many-body problem. The study highlights quantum computing’s potential for tackling complex material systems while providing an open-access framework for future research innovations.

Scientists aspire to use quantum computing to explore complex phenomena that have been difficult for current computers to analyze, such as the characteristics of novel and exotic materials. However, despite the excitement surrounding each announcement of “quantum supremacy,” it remains challenging to pinpoint when quantum computers and algorithms will offer a clear, practical advantage over classical systems.

A large collaboration led by Giuseppe Carleo, a physicist at the Swiss Federal Institute of Technology in Lausanne (EPFL) and a member of the National Center of Competence in Research NCCR MARVEL, has now introduced a method to compare the performance of different algorithms, both classical and quantum, when simulating complex phenomena in condensed matter physics. The new benchmark, called the V-score, is described in an article just published in Science.
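The V-score builds on the energy variance of a trial state, which vanishes exactly when the state is a true eigenstate of the Hamiltonian. The toy Python sketch below computes a variance-based figure of merit of that general form for a two-site model; the precise normalization and reference energy used by the published V-score are not reproduced here and should be taken from the paper itself.

```python
# Toy illustration of a variance-based accuracy metric in the spirit of the
# V-score: for a trial state |psi> and Hamiltonian H, the energy variance
# <H^2> - <H>^2 vanishes exactly when |psi> is an eigenstate. The precise
# normalization used by the published V-score is an assumption here.
import numpy as np

# Two-site transverse-field Ising Hamiltonian, H = -Z1 Z2 - g (X1 + X2)
X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)
I = np.eye(2)
g = 1.0
H = -np.kron(Z, Z) - g * (np.kron(X, I) + np.kron(I, X))

def variance_score(psi, H, e_ref=0.0):
    """Energy variance divided by (E - e_ref)^2, a dimensionless accuracy proxy."""
    psi = psi / np.linalg.norm(psi)
    E = psi @ H @ psi
    E2 = psi @ (H @ (H @ psi))
    return (E2 - E**2) / (E - e_ref) ** 2

# A crude trial state vs. the exact ground state from diagonalization
trial = np.array([1.0, 0.5, 0.5, 1.0])
exact = np.linalg.eigh(H)[1][:, 0]
print(variance_score(trial, H), variance_score(exact, H))  # exact state scores ~0
```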

Dr. Seung-Woo Lee and his team at the Quantum Technology Research Center at the Korea Institute of Science and Technology (KIST) have developed a world-class quantum error correction technology and designed a fault-tolerant quantum computing architecture based on it.


- Quantum error correction is a key technology for the implementation and practical realization of quantum computing.

- The groundbreaking quantum error correction technology contributes to the development and deployment of K-quantum computing.

Solving the problem of errors is essential for the practical application of quantum computing technologies that surpass the performance of digital computers. Information encoded in a qubit, the smallest unit of quantum computation, is quickly lost and error-prone. No matter how much we mitigate errors and improve the accuracy of qubit control, errors accumulate as the system size and computation scale grow, until algorithms become impossible to execute. Quantum error correction is a way to solve this problem. As the race for global supremacy in quantum technology intensifies, most major companies and research groups leading the development of quantum computing are now focusing on developing quantum error correction technology.
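As a minimal illustration of why redundancy suppresses errors (and not of KIST’s fault-tolerant architecture), the sketch below classically simulates the three-bit repetition code that underlies the simplest bit-flip quantum code: one logical bit is stored in three physical bits and recovered by majority vote.

```python
# Minimal sketch of repetition-code error correction (a classical simulation of
# the 3-bit bit-flip code, not the fault-tolerant architecture described above).
import random

def encode(bit):
    # One logical bit -> three physical copies.
    return [bit, bit, bit]

def apply_noise(codeword, p_flip):
    # Each physical bit flips independently with probability p_flip.
    return [b ^ 1 if random.random() < p_flip else b for b in codeword]

def decode(codeword):
    # Majority vote corrects any single bit-flip error.
    return int(sum(codeword) >= 2)

random.seed(0)
p_flip, trials = 0.05, 100_000
raw_errors = sum(random.random() < p_flip for _ in range(trials))
coded_errors = sum(decode(apply_noise(encode(0), p_flip)) != 0 for _ in range(trials))

print(f"unprotected error rate: {raw_errors / trials:.4f}")   # ~0.05
print(f"protected error rate:   {coded_errors / trials:.4f}") # ~3p^2, roughly 0.007
```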

Oil and gas extraction in places like Texas’ Permian Basin leads to several waste products, including significant amounts of wastewater and flares firing into the sky. Texas Engineer Vaibhav Bahadur is researching how those byproducts, which are harmful to the environment, could be repurposed to serve as key elements in the creation of “green” hydrogen.

Bahadur, an associate professor in the Walker Department of Mechanical Engineering, recently published a new paper in the journal Desalination about a new way to potentially produce green hydrogen. It involves using the energy otherwise wasted in gas flaring to power reverse osmosis, a common, low-energy technique used for municipal water treatment. Hydrogen production requires pristine water, and this process satisfies that need by removing salts and other impurities from the wastewater.
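As a rough, purely illustrative calculation of why this pairing makes sense: electrolysis consumes about nine kilograms of water per kilogram of hydrogen (from the stoichiometry of splitting water), while reverse osmosis needs only a few kilowatt-hours per cubic meter of water, a tiny fraction of what the electrolyzer itself uses. The numbers in the sketch below are assumptions for illustration, not figures from Bahadur’s paper.

```python
# Back-of-envelope sketch of the water/energy budget for "green" hydrogen from
# treated wastewater. Every number below is an illustrative assumption, not a
# figure from the study.
WATER_PER_KG_H2 = 9.0            # kg water per kg H2 (stoichiometric: 2H2O -> 2H2 + O2)
RO_ENERGY_KWH_PER_M3 = 3.0       # assumed reverse-osmosis energy per m^3 of treated water
ELECTROLYZER_KWH_PER_KG = 52.0   # assumed electrolyzer energy per kg H2

h2_target_kg = 1000.0                                  # produce one tonne of hydrogen
water_m3 = h2_target_kg * WATER_PER_KG_H2 / 1000.0     # m^3 of purified water needed
ro_energy = water_m3 * RO_ENERGY_KWH_PER_M3            # kWh spent on water treatment
electrolysis_energy = h2_target_kg * ELECTROLYZER_KWH_PER_KG

print(f"purified water needed: {water_m3:.1f} m^3")
print(f"reverse-osmosis energy: {ro_energy:.0f} kWh")
print(f"electrolysis energy: {electrolysis_energy:.0f} kWh")
# The water-treatment step is a small slice of the total energy, which is why
# otherwise wasted flare energy can plausibly cover it.
```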

Learn more about green hydrogen in the Q&A with Bahadur below, covering his research, next steps, and its broader implications.