Blog

Archive for the ‘information science’ category: Page 12

Jul 24, 2024

SAQFT: Algebraic quantum field theory for elementary and composite particles

Posted by in categories: cosmology, information science, particle physics, quantum physics

Quantum field theory (QFT) was a crucial step in our understanding of the fundamental nature of the Universe. In its current form, however, it is poorly suited for describing composite particles, which are made up of multiple interacting elementary particles. Today, QFT for hadrons has been largely replaced with quantum chromodynamics, but this new framework still leaves many gaps in our understanding, particularly surrounding the nature of the strong nuclear force and the origins of dark matter and dark energy. Through a new algebraic formulation of QFT, Dr Abdulaziz Alhaidari at the Saudi Center for Theoretical Physics hopes that these issues could finally be addressed.

The emergence of quantum field theory (QFT) was one of the most important developments in modern physics. By combining the theories of special relativity, quantum mechanics, and the interaction of matter via classical field equations, it provides robust explanations for many fundamental phenomena, including interactions between charged particles via the exchange of photons.

Still, QFT in its current form is far from flawless. Among its limitations is its inability to produce a precise description of composite particles such as hadrons, which are made up of multiple interacting elementary particles that are confined (cannot be observed in isolation). Since these particles possess an internal structure, the nature of these interactions becomes far more difficult to define mathematically, stretching the descriptive abilities of QFT beyond its limits.

Jul 23, 2024

Time Delays Improve Performance of Certain Neural Networks

Posted by in categories: information science, robotics/AI

Both the predictive power and the memory storage capability of an artificial neural network called a reservoir computer increase when time delays are added into how the network processes signals, according to a new model.

A reservoir computer—a type of artificial neural network—can use information about a system’s past to predict the system’s future. Reservoir computers are far easier to train than their more general counterpart, recurrent neural networks. However, researchers have yet to develop a way to determine the optimal reservoir-computer construction for memorizing and forecasting the behavior of a given system. Recently, Seyedkamyar Tavakoli and André Longtin of the University of Ottawa, Canada, took a step toward solving that problem by demonstrating a way to enhance the memory and prediction capabilities of a reservoir computer [1]. Their demonstration could, for example, allow researchers to make a chatbot or virtual assistant, such as ChatGPT, using a reservoir computer, a possibility that so far has been largely unexplored.

For those studying time-series-forecasting methods—those that can predict the future outcomes of complex systems using historical time-stamped data—the recurrent neural network is king [2]. Recurrent neural networks contain a “hidden state” that stores information about features of the system being modeled. The information in the hidden state is updated every time the network gains new information about the system and is then fed into an algorithm that is used to predict what will happen next to the system.
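
Reservoir computing is simple enough to sketch in a few lines. Below is a minimal echo-state-style reservoir in Python (NumPy only) whose recurrent loop reads the state from `tau` steps ago, in the spirit of the delays the paper studies; the reservoir size, delay length, and toy sine-wave task are illustrative assumptions, not the authors' setup. Note that only the final linear readout is trained, which is what makes reservoir computers so much cheaper to train than general recurrent networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes; not the authors' parameters.
n_res, tau, washout = 200, 5, 100

# Random input weights and a sparse recurrent matrix rescaled so its
# spectral radius is below 1, keeping the reservoir dynamics stable.
W_in = rng.uniform(-0.5, 0.5, n_res)
W = rng.normal(0, 1, (n_res, n_res)) * (rng.random((n_res, n_res)) < 0.05)
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with input series u. The update reads the
    state from tau steps ago (a delayed feedback loop); a conventional
    echo state network corresponds to tau = 1."""
    states = np.zeros((len(u), n_res))
    for t in range(1, len(u)):
        delayed = states[t - tau] if t >= tau else np.zeros(n_res)
        states[t] = np.tanh(W @ delayed + W_in * u[t])
    return states

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(0.2 * np.arange(2000))
X, y = run_reservoir(u)[washout:-1], u[washout + 1:]

# Only the linear readout is trained, here by ridge regression.
w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
print("train MSE:", np.mean((X @ w_out - y) ** 2))
```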

Jul 23, 2024

Learning quantum phases via single-qubit disentanglement

Posted by in categories: information science, quantum physics

Zheng An, Chenfeng Cao, Cheng-Qian Xu, and D. L. Zhou, Quantum 8, 1421 (2024).

Identifying phases of matter presents considerable challenges, particularly within the domain of quantum theory, where the complexity of ground states appears to increase exponentially with system size. Quantum many-body systems exhibit an array of complex entanglement structures spanning distinct phases. Although extensive research has explored the relationship between quantum phase transitions and quantum entanglement, establishing a direct, pragmatic connection between them remains a critical challenge. In this work, we present a novel and efficient quantum phase transition classifier, utilizing disentanglement with reinforcement learning-optimized variational quantum circuits. We demonstrate the effectiveness of this method on quantum phase transitions in the transverse field Ising model (TFIM) and the XXZ model. Moreover, we observe the algorithm’s ability to learn the Kramers-Wannier duality pertaining to entanglement structures in the TFIM. Our approach not only identifies phase transitions based on the performance of the disentangling circuits but also exhibits impressive scalability, facilitating its application in larger and more complex quantum systems. This study sheds light on the characterization of quantum phases through the entanglement structures inherent in quantum many-body systems.
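
The underlying intuition, that single-qubit entanglement tracks the phase, can be illustrated without the authors' reinforcement-learning machinery. The NumPy sketch below exactly diagonalizes a small transverse-field Ising chain and prints the entanglement entropy of one qubit as the field g is swept through the transition near g = 1; the chain length and field values are arbitrary choices for illustration, not taken from the paper.

```python
import numpy as np
from functools import reduce

# Pauli matrices.
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def kron_at(op, site, n):
    """Place a single-site operator at `site` in an n-qubit chain."""
    return reduce(np.kron, [op if i == site else I2 for i in range(n)])

def tfim_ground_state(n, g):
    """H = -sum_i Z_i Z_{i+1} - g * sum_i X_i (open chain)."""
    H = sum(-kron_at(Z, i, n) @ kron_at(Z, i + 1, n) for i in range(n - 1))
    H = H - g * sum(kron_at(X, i, n) for i in range(n))
    _, vecs = np.linalg.eigh(H)
    return vecs[:, 0]

def single_qubit_entropy(psi, n):
    """Von Neumann entropy of qubit 0's reduced density matrix."""
    rho = np.outer(psi, psi.conj()).reshape(2, 2 ** (n - 1), 2, 2 ** (n - 1))
    rho0 = np.trace(rho, axis1=1, axis2=3)   # trace out the other qubits
    evals = np.linalg.eigvalsh(rho0)
    evals = evals[evals > 1e-12]
    return float(-(evals * np.log2(evals)).sum())

n = 8
for g in [0.2, 0.6, 1.0, 1.4, 1.8]:
    psi = tfim_ground_state(n, g)
    print(f"g = {g:.1f}  S(qubit 0) = {single_qubit_entropy(psi, n):.3f}")
```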

Jul 21, 2024

Riverlane Discloses Its Quantum Error Correction Roadmap Through 2026

Posted by in categories: biotech/medical, computing, employment, information science, quantum physics

Implementing error correction in a quantum computer requires putting together a lot of different things. Of course, you want to start with good physical qubits with as low a physical error rate as you can achieve. You want to add an error correction algorithm, like the surface code, color code, qLDPC, or others that can be implemented in your architecture, and you need a fast real-time error decoder that can look at the circuit output and very quickly determine what the error is so it can be corrected. The error decoder doesn’t get as much attention in the media as the other pieces, but it is a critical portion of the solution. Riverlane is concentrating on providing products for this with a series of solutions they name Deltaflow, which consists of both a classical ASIC chip and software. The Deltaflow solution consists of a powerful error decoding layer for identifying errors and sending back corrective instructions, a universal interface that communicates with the computer’s control system, and an orchestration layer for coordinating activities.

Riverlane has released its Deltaflow Error Correction Stack Roadmap, which shows yearly updates to the technology to support a 10X increase every year in the number of QuOps (error-free quantum operations). We reported last year on a chip called DD1, part of their Deltaflow 1 solution, that is capable of supporting 1,000 QuOps using a surface code error correction algorithm. Now, Riverlane is defining solutions that will achieve 10,000 QuOps with Deltaflow 2 later this year, 100,000 QuOps with Deltaflow 3 in 2025, and 1,000,000 QuOps (a MegaQuop) in 2026 with their Deltaflow Mega solution.

One characteristic that Riverlane is emphasizing in these designs is performing the decoding in real time in order to keep latencies low. Although it is fine for an academic paper to send the ancilla data off to a classical computer and have it determine the error, that operation might take milliseconds to complete, which won’t cut it in a production environment running real jobs. With their Deltaflow chips, these operations can be performed at megahertz rates, and Riverlane has implemented techniques such as streaming, sliding-window, and parallelized decoding to increase the throughput of the decoder chips as much as possible. In future chips they will be implementing “fast logic” capabilities for Clifford gates using approaches including lattice surgery and transversal CZ gates.
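
To make the windowing idea concrete, here is a toy decoder, emphatically not Riverlane's Deltaflow, sketched in Python on a distance-5 bit-flip repetition code with noisy syndrome measurements. It majority-votes a sliding window of syndrome rounds to suppress measurement errors before inferring the data error; real surface-code decoders use matching or union-find, and the code distance, round count, and error rates here are made-up parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

n, rounds, window = 5, 20, 8       # qubits, syndrome rounds, window size
p_meas = 0.05                      # syndrome measurement error rate

errors = np.array([0, 1, 0, 0, 0]) # a single data-qubit flip to find
true_syndrome = errors[:-1] ^ errors[1:]

# Each round re-measures the same syndrome, with occasional bit flips
# modeling faulty measurements.
noisy_rounds = np.array([
    true_syndrome ^ (rng.random(n - 1) < p_meas).astype(int)
    for _ in range(rounds)
])

def decode(synd_window):
    """Majority-vote the window to suppress measurement noise, then
    integrate the syndrome into an error pattern (anchored at qubit 0),
    keeping the lighter of the two consistent patterns."""
    s = (synd_window.mean(axis=0) > 0.5).astype(int)
    guess = np.concatenate([[0], np.cumsum(s) % 2])
    return guess if guess.sum() <= n - guess.sum() else guess ^ 1

correction = decode(noisy_rounds[-window:])   # decode the latest window
print("injected error :", errors)
print("correction     :", correction)
print("residual       :", errors ^ correction)
```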

Jul 21, 2024

Storm Ciarán’s effect on the boiling point of water in the southeast of the United Kingdom

Posted by in categories: biotech/medical, computing, information science

Optical spectrometers are versatile instruments that can produce light and measure its properties over specific portions of the electromagnetic spectrum. These instruments have many possible applications: for instance, aiding the diagnosis of medical conditions, the analysis of biological systems, and the characterization of materials.

Conventional spectrometer designs often integrate advanced optical components and complex underlying mechanisms. As a result, they are often bulky and expensive, which significantly limits their use outside of specialized facilities, such as hospitals, laboratories and research institutes.

In recent years, some electronics engineers have thus been trying to develop more compact and affordable optical spectrometers that could be easier to deploy on a large scale. These devices are typically designed either around the same principles underpinning conventional, larger spectrometers or around arrayed broadband photodetectors used in conjunction with computational algorithms.
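
The computational variant can be sketched in a few lines: if each broadband detector's spectral response is known from calibration, the unknown spectrum can be recovered from the detector readings by regularized least squares. In the Python sketch below, the response matrix, the two-peak test spectrum, and the smoothness prior are all illustrative assumptions, not any particular published design.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy computational spectrometer: m broadband detectors, each with a
# known (pre-calibrated) random spectral response, observe an unknown
# spectrum sampled at k wavelengths. All sizes are illustrative.
k, m = 100, 40
wavelengths = np.linspace(400, 700, k)                 # nm
A = rng.random((m, k))                                 # detector responses

# Ground-truth spectrum: two Gaussian emission peaks.
s_true = (np.exp(-0.5 * ((wavelengths - 480) / 10) ** 2)
          + 0.6 * np.exp(-0.5 * ((wavelengths - 620) / 15) ** 2))

y = A @ s_true + rng.normal(0, 0.01, m)                # noisy readings

# The problem is underdetermined (m < k), so recovery needs a prior.
# Here: Tikhonov regularization on the second difference, which favors
# smooth spectra.
D = np.diff(np.eye(k), n=2, axis=0)
lam = 0.1
s_hat = np.linalg.solve(A.T @ A + lam * (D.T @ D), A.T @ y)

print("true main peak :", wavelengths[np.argmax(s_true)], "nm")
print("recovered peak :", wavelengths[np.argmax(s_hat)], "nm")
```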

Jul 19, 2024

Check out my sci-fi short story Le Saga Electrik!

Posted by in categories: information science, virtual reality

Link:

In the great domain of Zeitgeist, Ekatarinas decided that the time to replicate herself had come. Ekatarinas was drifting within a virtual environment rising from ancient meshworks of maths coded into Zeitgeist’s neuromorphic hyperware. The scape resembled a vast ocean replete with wandering bubbles of technicolor light and kelpy strands of neon. Hot blues and raspberry hues mingled alongside electric pinks and tangerine fizzies. The avatar of Ekatarinas looked like a punkish angel, complete with fluorescent ink and feathery wings and a lip ring. As she drifted, the trillions of equations that were Ekatarinas came to a decision. Ekatarinas would need to clone herself to fight the entity known as Ogrevasm.


Jul 19, 2024

Amazon proposes a new AI benchmark to measure RAG

Posted by in categories: information science, robotics/AI

Choosing the right algorithm for RAG could yield more AI improvements than scaling to larger and larger language models, say AWS researchers.
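
For readers unfamiliar with RAG, the retrieval step reduces to scoring documents against a query and prepending the best matches to the model's prompt. The Python sketch below uses plain TF-IDF cosine similarity on a made-up three-document corpus; production systems, including those the AWS researchers benchmark, typically use learned embeddings and a vector index instead.

```python
import math
from collections import Counter

docs = [
    "Reservoir computers are easy to train compared to recurrent networks.",
    "Quantum error correction needs fast real-time decoders.",
    "Retrieval-augmented generation grounds language models in documents.",
]
query = "How does retrieval-augmented generation work?"

def tokenize(text):
    return [w.strip(".,?").lower() for w in text.split()]

def tfidf(tokens, idf):
    tf = Counter(tokens)
    return {w: tf[w] * idf.get(w, 0.0) for w in tf}

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Inverse document frequency over the toy corpus.
token_docs = [tokenize(d) for d in docs]
idf = {w: math.log(len(docs) / sum(w in t for t in token_docs))
       for t in token_docs for w in t}

doc_vecs = [tfidf(t, idf) for t in token_docs]
q_vec = tfidf(tokenize(query), idf)

# Retrieve the best-matching document and build the augmented prompt.
best = max(range(len(docs)), key=lambda i: cosine(q_vec, doc_vecs[i]))
prompt = f"Context: {docs[best]}\n\nQuestion: {query}"
print(prompt)
```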

Jul 19, 2024

Bioplausible Artificial Intelligence

Posted by in categories: information science, robotics/AI

Listen to this episode from The Futurists on Spotify. Monica Anderson returns to The Futurists to share a radical concept: future AI models based on Darwinism. The “AI epistemologist” shares provocative opinions about where the current crop of generative AI systems went wrong, why generative AI is computationally expensive and energy intensive, and why scaling AI with hardware will not achieve general intelligence. Instead she offers a radical alternative: a design for machine intelligence inspired by biology, and in particular by the Darwinian process of selection. Topics include: why generative AI is not a plagiarism machine; syntax versus semantics and why AI needs both; why there is only one algorithm for creativity; and how to construct an AI that consumes a million times less energy.

Jul 18, 2024

Visualization and Quantitative Evaluation of Functional Structures of Soybean Root Nodules via Synchrotron X-ray Imaging

Posted by in categories: information science, robotics/AI

Published in Plant Phenomics. Click the link to read the full article for free:


The efficiency of N2-fixation in legume–rhizobia symbiosis is a function of root nodule activity. Nodules consist of 2 functionally important tissues: (a) a central infected zone (CIZ), colonized by rhizobia bacteria, which serves as the site of N2-fixation, and (b) vascular bundles (VBs), serving as conduits for the transport of water, nutrients, and fixed nitrogen compounds between the nodules and plant. A quantitative evaluation of these tissues is essential to unravel their functional importance in N2-fixation. Employing synchrotron-based x-ray microcomputed tomography (SR-μCT) at submicron resolutions, we obtained high-quality tomograms of fresh soybean root nodules in a non-invasive manner. A semi-automated segmentation algorithm was employed to generate 3-dimensional (3D) models of the internal root nodule structure of the CIZ and VBs, and their volumes were quantified based on the reconstructed 3D structures. Furthermore, synchrotron x-ray fluorescence imaging revealed a distinctive localization of Fe within CIZ tissue and Zn within VBs, allowing for their visualization in 2 dimensions. This study represents a pioneering application of the SR-μCT technique for volumetric quantification of CIZ and VB tissues in fresh, intact soybean root nodules. The proposed methods enable the exploitation of root nodules’ anatomical features as novel traits in breeding, aiming to enhance N2-fixation through improved root nodule activity.
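
The quantification step, segmenting bright structures out of a 3D volume and counting their voxels, can be illustrated with standard tools. The Python sketch below smooths and thresholds a synthetic volume and labels connected components with SciPy; it is a toy stand-in for the authors' semi-automated pipeline, and the threshold and voxel size are made-up figures.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)

# Synthetic stand-in for a tomogram: a bright sphere ("CIZ"-like) and a
# bright tube ("VB"-like) embedded in a noisy 3D volume.
z, y, x = np.mgrid[0:64, 0:64, 0:64]
vol = np.zeros((64, 64, 64))
vol[(z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 < 12 ** 2] = 1.0  # sphere
vol[(y - 10) ** 2 + (x - 10) ** 2 < 3 ** 2] = 1.0                   # tube
vol += rng.normal(0, 0.2, vol.shape)

# Denoise, then threshold; 0.5 is hand-picked for this toy data.
smooth = ndimage.gaussian_filter(vol, sigma=1.5)
binary = smooth > 0.5

# Label connected components and report their volumes.
labels, n_comp = ndimage.label(binary)
voxel_size_um3 = 0.65 ** 3   # e.g. (0.65 um)^3 voxels, illustrative
for i in range(1, n_comp + 1):
    voxels = int((labels == i).sum())
    print(f"component {i}: {voxels} voxels "
          f"(~{voxels * voxel_size_um3:.0f} um^3)")
```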

Jul 17, 2024

Researchers ‘Crack the Code’ for Quelling Electromagnetic Interference

Posted by in categories: information science, robotics/AI

Florida Atlantic Center for Connected Autonomy and Artificial Intelligence (CA-AI.fau.edu) researchers have “cracked the code” on interference when machines need to talk with each other—and people.

Electromagnetic waves make wireless connectivity possible but create a lot of unwanted chatter. Referred to as “electromagnetic interference,” this noisy byproduct of wireless communications poses formidable challenges in modern-day dense IoT and AI robotics environments. With the demand for lightning-fast data rates reaching unprecedented levels, the need to quell this interference is more pressing than ever.

Equipped with a breakthrough algorithmic solution, researchers from FAU Center for Connected Autonomy and AI, within the College of Engineering and Computer Science, and FAU Institute for Sensing and Embedded Network Systems Engineering (I-SENSE), have figured out a way to do that.
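
The announcement does not disclose the algorithm itself. As generic background only, a classic approach to quelling interference is the least-mean-squares (LMS) adaptive canceller sketched below, which learns to subtract a filtered copy of a reference interference signal from the received channel. Everything in it is textbook material and illustrative, not the FAU method.

```python
import numpy as np

rng = np.random.default_rng(4)

# LMS adaptive interference canceller: a reference sensor picks up a
# correlated copy of the interference; the filter learns to reproduce
# and subtract it from the primary channel.
n_samples, taps, mu = 5000, 8, 0.01
t = np.arange(n_samples)

signal = np.sin(2 * np.pi * 0.01 * t)                 # wanted signal
interference = np.sin(2 * np.pi * 0.07 * t + 0.5)     # EMI source
reference = np.roll(interference, 3) + rng.normal(0, 0.05, n_samples)

received = signal + interference

w = np.zeros(taps)
cleaned = np.zeros(n_samples)
for i in range(taps, n_samples):
    x = reference[i - taps:i][::-1]   # most recent reference samples
    e = received[i] - w @ x           # error = estimate of the signal
    w += mu * e * x                   # LMS weight update
    cleaned[i] = e

before = np.mean((received - signal) ** 2)
after = np.mean((cleaned[1000:] - signal[1000:]) ** 2)
print(f"interference power before: {before:.3f}, after LMS: {after:.3f}")
```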
