A game of chess requires its players to think several moves ahead, a skill that computer programs have mastered over the years. In 1997, IBM's Deep Blue supercomputer famously beat the then world chess champion Garry Kasparov. Later, in 2017, an artificial intelligence (AI) program developed by Google DeepMind, called AlphaZero, triumphed over the best computerized chess engines of the time after training itself to play the game in a matter of hours.

More recently, some mathematicians have begun to actively pursue the question of whether AI programs can also help in cracking some of the world’s toughest problems. But, whereas an average game of chess lasts about 30 to 40 moves, these research-level math problems require solutions that take a million or more steps, or moves.

In a paper appearing on the arXiv preprint server, a team led by Caltech's Sergei Gukov, the John D. MacArthur Professor of Theoretical Physics and Mathematics, describes developing a new type of machine-learning algorithm that can solve math problems requiring extremely long sequences of steps. The team used their algorithm to solve families of problems related to an overarching decades-old math problem called the Andrews–Curtis conjecture. In essence, the algorithm can think farther ahead than even advanced programs like AlphaZero.
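To make the chess analogy concrete, the "moves" in this setting act on the relators of a group presentation. The following Python sketch is purely illustrative: it shows the Andrews–Curtis move types, not the team's algorithm, and the starting presentation at the end is a hypothetical example.

```python
# Illustrative only: Andrews-Curtis "moves" act on relators, i.e. words
# over generators ('a', 'b') and their inverses ('A', 'B'). This is not
# the Caltech team's code, just the move set a solver must search over.

def free_reduce(word: str) -> str:
    """Cancel adjacent inverse pairs, e.g. 'abBA' -> ''."""
    out = []
    for letter in word:
        if out and out[-1] == letter.swapcase():
            out.pop()                     # 'a' next to 'A' cancels
        else:
            out.append(letter)
    return "".join(out)

def invert(word: str) -> str:
    """Inverse of a word: reversed, with each letter inverted."""
    return word[::-1].swapcase()

def concat_move(rels, i, j):
    """AC move 1: replace relator r_i by r_i * r_j (i != j)."""
    rels = list(rels)
    rels[i] = free_reduce(rels[i] + rels[j])
    return tuple(rels)

def invert_move(rels, i):
    """AC move 2: replace relator r_i by its inverse."""
    rels = list(rels)
    rels[i] = invert(rels[i])
    return tuple(rels)

def conjugate_move(rels, i, g):
    """AC move 3: replace r_i by g * r_i * g^(-1) for a generator g."""
    rels = list(rels)
    rels[i] = free_reduce(g + rels[i] + invert(g))
    return tuple(rels)

# One move on a hypothetical presentation; a proof may need millions of these.
state = ("abAB", "b")
print(concat_move(state, 0, 1))  # -> ('abA', 'b')
```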

Quantum computing is an alternative computing paradigm that exploits the principles of quantum mechanics to enable intrinsic and massive parallelism in computation. This potential quantum advantage could have significant implications for the design of future computational intelligence systems, where the increasing availability of data will necessitate ever-increasing computational power. However, in the current NISQ (Noisy Intermediate-Scale Quantum) era, quantum computers face limitations in qubit quality, coherence, and gate fidelity. Computational intelligence can play a crucial role in optimizing and mitigating these limitations by enhancing error correction, guiding quantum circuit design, and developing hybrid classical-quantum algorithms that maximize the performance of NISQ devices. This webinar aims to explore the intersection of quantum computing and computational intelligence, focusing on efficient strategies for using NISQ-era devices in the design of quantum-based computational intelligence systems.
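As a concrete illustration of the hybrid classical-quantum pattern described above, here is a minimal sketch using the open-source PennyLane library; the library choice and the toy cost function are assumptions for illustration, not something prescribed by the webinar. A classical optimizer tunes the parameters of a short quantum circuit, the kind of shallow ansatz suited to noisy NISQ hardware.

```python
# Hedged sketch of a hybrid classical-quantum loop (illustrative choices:
# PennyLane, a two-qubit toy circuit, and a ZZ expectation as the cost).
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)  # simulator standing in for a NISQ backend

@qml.qnode(dev)
def circuit(params):
    # Shallow ansatz: short circuits accumulate less noise on real hardware
    qml.RY(params[0], wires=0)
    qml.RY(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))

opt = qml.GradientDescentOptimizer(stepsize=0.3)
params = np.array([0.1, 0.2], requires_grad=True)
for _ in range(50):                     # classical outer loop...
    params = opt.step(circuit, params)  # ...quantum evaluation inside
print("minimized cost:", circuit(params))
```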

Speaker Biography:
Prof. Giovanni Acampora is a Professor of Artificial Intelligence and Quantum Computing at the Department of Physics “Ettore Pancini,” University of Naples Federico II, Italy. He earned his M.Sc. (cum laude) and Ph.D. in Computer Science from the University of Salerno. His research focuses on computational intelligence and quantum computing. He is Chair of the IEEE-SA 1855 Working Group and Founder and Editor-in-Chief of Quantum Machine Intelligence. Acampora has received multiple awards, including the IEEE-SA Emerging Technology Award, the IBM Quantum Experience Award, and the Fujitsu Quantum Challenge Award, for his contributions to computational intelligence and quantum AI.

David Furman, an immunologist and data scientist at the Buck Institute for Research on Aging and Stanford University, uses artificial intelligence to parse big data to identify interventions for healthy aging.

David Furman uses computational power, collaborations, and cosmic inspiration to tease apart the role of the immune system in aging.

KAIST researchers have discovered a molecular switch that can revert cancer cells back to normal by capturing the critical transition state before full cancer development, using a computational gene network model based on single-cell RNA sequencing data.

Ribonucleic acid (RNA) is a polymeric molecule similar to DNA that is essential in various biological roles in coding, decoding, regulation and expression of genes. Both are nucleic acids, but unlike DNA, RNA is single-stranded. An RNA strand has a backbone made of alternating sugar (ribose) and phosphate groups. Attached to each sugar is one of four bases—adenine (A), uracil (U), cytosine (C), or guanine (G). Different types of RNA exist in the cell: messenger RNA (mRNA), ribosomal RNA (rRNA), and transfer RNA (tRNA).
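As a small worked example of the transcription step implied above (a generic textbook illustration, not taken from the article), the sketch below builds mRNA from a DNA template strand, with uracil (U) standing in where DNA would use thymine (T):

```python
# Generic illustration of DNA -> mRNA transcription base pairing:
# A-U, T-A, G-C, C-G (RNA uses uracil in place of thymine).
COMPLEMENT = {"A": "U", "T": "A", "G": "C", "C": "G"}

def transcribe(template_dna: str) -> str:
    """Return the single-stranded mRNA transcribed from a DNA template strand."""
    return "".join(COMPLEMENT[base] for base in template_dna)

print(transcribe("TACGGT"))  # -> 'AUGCCA'
```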

In a milestone that brings quantum computing tangibly closer to large-scale practical use, scientists at Oxford University’s Department of Physics have demonstrated the first instance of distributed quantum computing. Using a photonic network interface, they successfully linked two separate quantum processors to form a single, fully connected quantum computer, paving the way to tackling computational challenges previously out of reach. The results have been published in Nature.

Scattering takes place across the universe at large and minuscule scales. Billiard balls clank off each other in bars, the nuclei of atoms collide to power the stars and create heavy elements, and even sound waves deviate from their original trajectory when they hit particles in the air.

Understanding such scattering can lead to discoveries about the forces that govern the universe. In a recent publication in Physical Review C, researchers from Lawrence Livermore National Laboratory (LLNL), the InQubator for Quantum Simulations and the University of Trento developed an algorithm for a quantum computer that accurately simulates scattering.

“Scattering experiments help us probe [the building blocks of matter] and their interactions,” said LLNL scientist Sofia Quaglioni. “The scattering of particles in matter [materials, atoms, molecules, nuclei] helps us understand how that matter is organized at a [microscopic level].”

To identify signs of particles like the Higgs boson, CERN researchers work with mountains of data generated by LHC collisions.

Hunting for evidence of an object whose behavior is predicted by existing theories is one thing. But now that the elusive boson has been observed, identifying new and unexpected particles and interactions is an entirely different matter.

To speed up their analysis, physicists feed data from the billions of collisions that occur in LHC experiments into machine learning algorithms. These models are then trained to identify anomalous patterns.
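As a simplified stand-in for such a pipeline (the real LHC analyses use far larger models and real detector data; the feature names here are invented), the following sketch trains scikit-learn's IsolationForest on simulated background events and flags outliers:

```python
# Toy anomaly detection: fit on "ordinary" events, flag the unusual ones.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Pretend each row is one collision event: [total energy, missing momentum]
background = rng.normal(loc=[100.0, 5.0], scale=[10.0, 2.0], size=(10_000, 2))
oddballs = rng.normal(loc=[160.0, 30.0], scale=[5.0, 5.0], size=(10, 2))
events = np.vstack([background, oddballs])

model = IsolationForest(contamination=0.001, random_state=0).fit(background)
labels = model.predict(events)            # -1 marks an anomalous event
print("flagged events:", int((labels == -1).sum()))
```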

Human brain organoids (“mini-brains”) are being grown in labs around the world. They’re being fed neurotransmitters, competing with AI to solve non-linear equations, and going to space to study the effects of microgravity. This video reviews three preprints, preliminary reports of new scientific studies. (My AI voice caught a cold this week.)

Our memristor is inspired and supported by a comprehensive theory derived directly from the underlying physical equations of diffusive and electric continuum ion transport. We have experimentally and quantitatively verified the predictions of this theory on multiple occasions, including the specific and surprising prediction that the memory retention time of the channel depends on the channel diffusion time, despite the channel being constantly voltage-driven. The theory relies exclusively on physical parameters, such as channel dimensions and ion concentrations, and enabled streamlined experimentation by pinpointing the relevant signal timescales, signal voltages, and a suitable reservoir-computing protocol. Additionally, we identify an inhomogeneous charge density as the key ingredient for iontronic channels to exhibit current rectification (provided they are well described by slab-averaged Poisson–Nernst–Planck (PNP) equations). Consequently, our theory paves the way for targeted advancements in iontronic circuits and facilitates efficient exploration of their diverse applications.
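For context, the underlying physical equations referenced here are, in their generic textbook form, the Poisson–Nernst–Planck equations coupling ionic diffusion to electromigration; the paper itself works with a slab-averaged variant, so the following is background rather than the authors' exact model:

```latex
% Generic Poisson-Nernst-Planck (PNP) equations for continuum ion transport
% (textbook form; the paper uses a slab-averaged variant of these).
\begin{align}
  \partial_t c_{\pm} &= -\nabla \cdot \mathbf{j}_{\pm},
  \qquad
  \mathbf{j}_{\pm} = -D \left( \nabla c_{\pm}
    \pm \frac{e\, c_{\pm}}{k_{\mathrm{B}} T} \, \nabla \psi \right),
  \\
  \nabla^{2} \psi &= -\frac{e \, (c_{+} - c_{-})}{\varepsilon},
\end{align}
```

where $c_\pm$ are the ion concentrations, $D$ the diffusion coefficient, $\psi$ the electric potential, and $\varepsilon$ the permittivity; the channel diffusion time mentioned above is then of order $L^2/D$ for a channel of length $L$.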

For future prospects, a next step is the integration of multiple devices; the flexible fabrication methods offer a clear path toward circuits that couple multiple channels. Additionally, optimizing the device to exhibit strong conductance modulation at lower voltages would be of interest, both to bring electric potentials found in nature within the scope of possible inputs and to reduce the energy consumption of conductance modulation. From a theoretical perspective, our understanding of the origin of the inhomogeneous space charge and of the surface conductance is still limited. These descriptions contain physical parameters that are currently chosen from a physically reasonable regime to yield good agreement, rather than following directly from the underlying physical equations. We also assume that the inhomogeneous ionic space-charge distribution is constant, while it might well be voltage-dependent.

A Canadian startup called Xanadu has built a new quantum computer it says can be easily scaled up to achieve the computational power needed to tackle scientific challenges ranging from drug discovery to more energy-efficient machine learning.

Aurora is a “photonic” quantum computer, which means it crunches numbers using photonic qubits—information encoded in light. In practice, this means combining and recombining laser beams on multiple chips using lenses, fibers, and other optics according to an algorithm. Xanadu’s computer is designed in such a way that the answer to the algorithm it executes corresponds to the final number of photons in each laser beam. This approach differs from the one used by Google and IBM, which encodes information in the properties of superconducting circuits.
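As an illustration of this photonic paradigm, here is a minimal sketch using Xanadu's open-source Strawberry Fields simulator; it is a toy two-mode circuit for illustration, not Aurora's actual programming interface:

```python
# Toy photonic circuit: squeeze light, mix it on a beamsplitter, then
# count photons per mode -- the photon numbers are the "answer".
import strawberryfields as sf
from strawberryfields.ops import Sgate, BSgate, MeasureFock

prog = sf.Program(2)                      # two optical modes ("laser beams")
with prog.context as q:
    Sgate(0.6) | q[0]                     # squeeze light into mode 0
    BSgate(0.7854, 0.0) | (q[0], q[1])    # 50/50 beamsplitter mixes the modes
    MeasureFock() | q                     # count photons in each mode

eng = sf.Engine("fock", backend_options={"cutoff_dim": 6})
result = eng.run(prog)
print("photon counts per mode:", result.samples)
```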