Blog

Archive for the ‘supercomputing’ category: Page 64

Oct 14, 2019

New approach for the simulation of quantum chemistry—modelling the molecular architecture

Posted by in categories: chemistry, particle physics, quantum physics, supercomputing

Searching for new substances and developing new techniques in the chemical industry: tasks that are often accelerated using computer simulations of molecules or reactions. But even supercomputers quickly reach their limits. Now researchers at the Max Planck Institute of Quantum Optics in Garching (MPQ) have developed an alternative, analogue approach. An international team led by Javier Argüello-Luengo, Ph.D. candidate at the Institute of Photonic Sciences (ICFO), Ignacio Cirac, Director and Head of the Theory Department at the MPQ, and Peter Zoller, Director at the Institute of Quantum Optics and Quantum Information in Innsbruck (IQOQI), together with colleagues, has designed the first blueprint for a quantum simulator that mimics the quantum chemistry of molecules. Just as an architectural model can be used to test the statics of a future building, a molecule simulator can support the investigation of molecular properties. The results are now published in the scientific journal Nature.

Using hydrogen, the simplest of all molecules, as an example, the global team of physicists from Garching, Barcelona, Madrid, Beijing and Innsbruck theoretically demonstrates that the quantum simulator can reproduce the behaviour of a real molecule's electron shell. In their work, they also show how experimental physicists can build such a simulator step by step. “Our results offer a new approach to the investigation of phenomena appearing in quantum chemistry,” says Javier Argüello-Luengo. This is highly interesting for chemists because classical computers notoriously struggle to simulate chemical compounds, as molecules obey the laws of quantum physics. An electron in its shell, for example, can rotate to the left and right simultaneously. In a compound of many particles, such as a molecule, the number of these parallel possibilities multiplies. Because every electron interacts with every other, the complexity quickly becomes impossible to handle.
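To make that scaling concrete, here is a minimal Python sketch (an illustration of the counting argument only, not anything from the paper) that treats each electron as a simple two-level quantum system and counts the complex amplitudes a classical computer would have to track:

```python
# Illustrative counting only: treat each electron as a two-level quantum system.
# The number of complex amplitudes needed to describe the joint state grows
# exponentially with the particle count, which is why brute-force classical
# simulation of molecules becomes intractable so quickly.

def amplitudes_needed(n_particles: int, levels: int = 2) -> int:
    """Dimension of the joint state space of n identical d-level systems."""
    return levels ** n_particles

for n in (2, 10, 30, 50):
    print(f"{n:3d} particles -> {amplitudes_needed(n):.3e} amplitudes")
```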

As a way out, in 1982, the American physicist Richard Feynman suggested the following: quantum systems should be simulated by reconstructing them as simplified models in the laboratory from building blocks that are themselves quantum, and that therefore exhibit this parallelism of possibilities by default. Today, quantum simulators are already in use, for example to imitate crystals. A crystal's regular, three-dimensional atomic lattice is imitated by several intersecting laser beams, the “optical lattice.” The intersection points form something like the wells of an egg carton into which the atoms are filled. The interaction between the atoms can be controlled by amplifying or attenuating the laser beams. This way researchers gain a variable model in which they can study atomic behavior very precisely.
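For readers who want the textbook picture behind such egg-carton simulators: the lattice of wells is usually described by a Hubbard-type Hamiltonian. The form below is a generic, hedged illustration (the chemistry simulator proposed in the Nature paper adds further ingredients to capture molecular physics); the hopping amplitude t is set by the lattice depth and the on-site interaction U by the strength of the laser beams and the atomic interactions.

```latex
% Generic Fermi-Hubbard Hamiltonian realized with cold atoms in an optical lattice
% (illustrative; not the full model of the proposed chemistry simulator).
H = -t \sum_{\langle i,j \rangle,\sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \text{h.c.} \right)
    + U \sum_{i} n_{i\uparrow}\, n_{i\downarrow}
```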

Sep 27, 2019

DARPA aims to make networks 100 times speedier with FastNIC

Posted by in categories: internet, supercomputing

Having a slow connection is always frustrating, but just imagine how supercomputers feel. All those cores doing all kinds of processing at lightning speed, but in the end they’re all waiting on an outdated network interface to stay in sync. DARPA doesn’t like it. So DARPA wants to change it — specifically by making a new network interface a hundred times faster.

The problem is this. As DARPA estimates it, processors and memory on a computer or server can in a general sense work at a speed of roughly 10¹⁴ bits per second — that’s comfortably into the terabit region — and networking hardware like switches and fiber are capable of about the same.
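As a rough sanity check on that framing (a hedged back-of-the-envelope; the current interface speed is inferred from the program's stated 100x goal rather than quoted by DARPA):

```python
# Hedged back-of-the-envelope using DARPA's ballpark figures.
processor_and_fabric_bps = 1e14   # ~10^14 bits/s for processors, memory and network fabric (DARPA's estimate)
current_interface_bps = 1e12      # assumed: inferred from FastNIC's goal of a 100x faster interface

gap = processor_and_fabric_bps / current_interface_bps
print(f"The network interface lags the rest of the stack by roughly {gap:.0f}x")
```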

“The true bottleneck for processor throughput is the network interface used to connect a machine to an external network, such as an Ethernet, therefore severely limiting a processor’s data ingest capability,” explained DARPA’s Jonathan Smith in a news post by the agency about the project. (Emphasis mine.)

Sep 21, 2019

Google researchers have reportedly achieved “quantum supremacy”

Posted by in categories: quantum physics, supercomputing

The news: According to a report in the Financial Times, a team of researchers from Google led by John Martinis has demonstrated quantum supremacy for the first time. This is the point at which a quantum computer is shown to be capable of performing a task that’s beyond the reach of even the most powerful conventional supercomputer. The claim appeared in a paper that was posted on a NASA website, but the publication was then taken down. Google did not respond to a request for comment from MIT Technology Review.

Why NASA? Google struck an agreement last year to use supercomputers available to NASA as benchmarks for its supremacy experiments. According to the Financial Times report, the paper said that Google’s quantum processor was able to perform a calculation in three minutes and 20 seconds that would take today’s most advanced supercomputer, known as Summit, around 10,000 years. In the paper, the researchers said that, to their knowledge, the experiment “marks the first computation that can only be performed on a quantum processor.”
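Taking the quoted figures at face value, the implied speedup is on the order of a billion; a quick check:

```python
# Ratio of the two runtimes quoted in the report.
quantum_seconds = 3 * 60 + 20                       # 3 minutes 20 seconds
classical_seconds = 10_000 * 365.25 * 24 * 3600     # ~10,000 years, in seconds

speedup = classical_seconds / quantum_seconds
print(f"Implied speedup: about {speedup:.1e}x")     # on the order of 10^9
```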

Quantum speed up: Quantum machines are so powerful because they harness quantum bits, or qubits. Unlike classical bits, which are either a 1 or a 0, qubits can be in a kind of combination of both at the same time. Thanks to other quantum phenomena, which are described in our explainer here, quantum computers can crunch large amounts of data in parallel that conventional machines have to work through sequentially. Scientists have been working for years to demonstrate that the machines can definitively outperform conventional ones.
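One hedged way to see why conventional machines fall behind: merely storing the combined state of n qubits as complex amplitudes (assuming 16 bytes per amplitude in double precision, an illustrative choice) outgrows any realistic memory long before n reaches 100.

```python
# Memory needed just to hold an n-qubit state vector on a classical machine,
# assuming one double-precision complex amplitude (16 bytes) per basis state.
def state_vector_bytes(n_qubits: int) -> int:
    return 16 * (2 ** n_qubits)

for n in (30, 40, 50):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.0f} GiB")
```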

Sep 21, 2019

Ghost post! Google creates world’s most powerful computer, NASA ‘accidentally reveals’ …and then publication vanishes

Posted by in categories: quantum physics, supercomputing

Google’s new quantum computer reportedly spends mere minutes on the tasks the world’s top supercomputers would need several millennia to perform. The media found out about this after NASA “accidentally” shared the firm’s research.

The software engineers at Google have built the world’s most powerful computer, the Financial Times and Fortune magazine reported on Friday, citing the company’s now-removed research paper. The paper is said to have been posted on a website hosted by NASA, which partners with Google, but later quietly taken down, without explanation.

Google and NASA have refused to comment on the matter. A source within the IT giant, however, told Fortune that NASA had “accidentally” published the paper before its team could verify its findings.

Sep 20, 2019

HPE to acquire supercomputer manufacturer Cray for $1.3 billion

Posted by in category: supercomputing

Hewlett Packard Enterprise has reached an agreement to acquire Cray, the manufacturer of supercomputing systems.

HPE says the acquisition will cost $35 per share, in a transaction valued at approximately $1.3 billion, net of cash.

Antonio Neri, president and CEO, HPE, says: “Answers to some of society’s most pressing challenges are buried in massive amounts of data.”

Sep 13, 2019

Brain-inspired computing could tackle big problems in a small way

Posted by in categories: neuroscience, supercomputing

While computers have become smaller and more powerful and supercomputers and parallel computing have become the standard, we are about to hit a wall in energy and miniaturization. Now, Penn State researchers have designed a 2-D device that can provide more than yes-or-no answers and could be more brainlike than current computing architectures.

“Complexity scaling is also in decline owing to the non-scalability of traditional von Neumann computing architecture and the impending ‘Dark Silicon’ era that presents a severe threat to multi-core processor technology,” the researchers note in today’s (Sept 13) online issue of Nature Communications.

The Dark Silicon era is already upon us to some extent and refers to the inability to power up all, or even most, of the devices on a computer chip at once. This happens because of the excessive heat that would be generated on a single chip. Von Neumann architecture is the standard structure of most modern computers and relies on a digital approach—“yes” or “no” answers—where program instructions and data are stored in the same memory and share the same communications channel.
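As a purely illustrative sketch of the dark-silicon constraint (all numbers below are assumptions invented for the example, not figures from the study), a chip's thermal budget can only keep a fraction of its cores running at full power:

```python
# Illustrative dark-silicon arithmetic with assumed numbers (not from the study):
# if every core drew full power at once, the chip would exceed its thermal budget,
# so only a fraction of the silicon can be "lit up" at any moment.
cores = 64
watts_per_active_core = 4.0     # assumed per-core power draw
chip_power_budget_w = 120.0     # assumed package/thermal limit

max_active = int(chip_power_budget_w // watts_per_active_core)
dark_fraction = 1 - min(max_active, cores) / cores
print(f"At most {max_active} of {cores} cores can run at full power "
      f"({dark_fraction:.0%} of the chip stays dark).")
```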

Sep 7, 2019

Scientists develop a deep learning method to solve a fundamental problem in statistical physics

Posted by in categories: biotech/medical, robotics/AI, supercomputing

A team of scientists at Freie Universität Berlin has developed an Artificial Intelligence (AI) method that provides a fundamentally new solution to the “sampling problem” in statistical physics. The sampling problem is that important properties of materials and molecules cannot, in practice, be computed by directly simulating the motion of atoms in a computer, because the required computational capacity is too vast even for supercomputers. The team developed a deep learning method that speeds up these calculations massively, making them feasible for previously intractable applications. “AI is changing all areas of our life, including the way we do science,” explains Dr. Frank Noé, professor at Freie Universität Berlin and main author of the study. Several years ago, so-called deep learning methods bested human experts in pattern recognition—be it the reading of handwritten texts or the recognition of cancer cells from medical images. “Since these breakthroughs, AI research has skyrocketed. Every day, we see new developments in application areas where traditional methods have left us stuck for years. We believe our approach could be such an advance for the field of statistical physics.” The results were published in Science.

Statistical physics aims to calculate the properties of materials or molecules based on the interactions of their constituent components—be it a metal’s melting temperature, or whether an antibiotic can bind to the molecules of a bacterium and thereby disable it. With statistical methods, such properties can be calculated in the computer, and the properties of the material or the efficiency of a specific medication can be improved. One of the main problems in doing this calculation is the vast computational cost, explains Simon Olsson, a coauthor of the study: “In principle we would have to consider every single structure, that means every way to position all the atoms in space, compute its probability, and then take their average. But this is impossible because the number of possible structures is astronomically large even for small molecules.”
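In formula terms, such a property is a Boltzmann-weighted average over all configurations. The toy Python sketch below, with a made-up energy function and observable over a handful of grid points (not the deep-learning method of the paper), spells out the brute-force average whose cost explodes as the number of degrees of freedom grows:

```python
import itertools
import math

# Toy version of the sampling problem: a Boltzmann-weighted average computed by
# brute-force enumeration. Real molecules have continuous, high-dimensional
# configuration spaces, so exactly this enumeration becomes impossible.
kT = 1.0

def energy(config):
    # Assumed toy energy: quadratic terms plus a weak nearest-neighbour coupling.
    return sum(x * x for x in config) + 0.5 * sum(a * b for a, b in zip(config, config[1:]))

def observable(config):
    # Assumed toy observable: mean squared displacement of the configuration.
    return sum(x * x for x in config) / len(config)

grid = [-1.0, -0.5, 0.0, 0.5, 1.0]   # 5 positions per degree of freedom
n_dof = 6                            # already 5**6 = 15,625 configurations

weights, values = [], []
for config in itertools.product(grid, repeat=n_dof):
    w = math.exp(-energy(config) / kT)
    weights.append(w)
    values.append(observable(config))

average = sum(w * v for w, v in zip(weights, values)) / sum(weights)
print(f"Boltzmann average of the toy observable: {average:.4f}")
```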

Sep 6, 2019

Secretary Perry Stands Up Office for Artificial Intelligence and Technology

Posted by in categories: biotech/medical, cybercrime/malcode, robotics/AI, supercomputing, sustainability

WASHINGTON, D.C. – Today, U.S. Secretary of Energy Rick Perry announced the establishment of the DOE Artificial Intelligence and Technology Office (AITO). The Secretary has established the office to serve as the coordinating hub for the work being done across the DOE enterprise in Artificial Intelligence. This action has been taken as part of the President’s call for a national AI strategy to ensure AI technologies are developed to positively impact the lives of Americans.

DOE-fueled AI is already being used to strengthen our national security and cybersecurity, improve grid resilience, increase environmental sustainability, enable smarter cities, improve water resource management, as well as speed the discovery of new materials and compounds, and further the understanding, prediction, and treatment of disease. DOE’s National Labs are home to four of the top ten fastest supercomputers in the world, and we’re currently building three next-generation, exascale machines, which will be even faster and more AI-capable computers.

“The world is in the midst of the Golden Age of AI, and DOE’s world class scientific and computing capabilities will be critical to securing America’s dominance in this field,” said Secretary Perry. “This new office housed within the Department of Energy will concentrate our existing efforts while also facilitating partnerships and access to federal data, models and high performance computing resources for America’s AI researchers. Its mission will be to elevate, accelerate and expand DOE’s transformative work to accelerate America’s progress in AI for years to come.”

Aug 28, 2019

AI learns to model our Universe

Posted by in categories: particle physics, robotics/AI, space, supercomputing

Researchers have successfully created a model of the Universe using artificial intelligence, reports a new study.

Researchers seek to understand our Universe by making models that match observations. Historically, they have been able to model simple or highly simplified physical systems, jokingly dubbed the “spherical cows,” with pencil and paper. Later, the arrival of computers enabled them to model complex phenomena with numerical simulations. For example, researchers have programmed supercomputers to simulate the motion of billions of particles through billions of years of cosmic time, a procedure known as N-body simulation, in order to study how the Universe evolved to what we observe today.
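As a hedged illustration of what an N-body code does at its core (a toy direct-summation gravity step for a few particles, nothing like the optimized cosmological codes actually used), the cost grows roughly with the square of the particle number because every particle pulls on every other:

```python
import numpy as np

# Toy direct-summation N-body step: every particle feels the gravity of every
# other, so the cost scales roughly as N^2, which is why real runs with billions
# of particles need supercomputers and clever approximations.
G = 1.0          # gravitational constant in arbitrary units
dt = 1e-3        # time step
rng = np.random.default_rng(0)

n = 8
pos = rng.standard_normal((n, 3))
vel = np.zeros((n, 3))
mass = np.ones(n)

def accelerations(pos, mass, softening=1e-2):
    diff = pos[None, :, :] - pos[:, None, :]            # vector from particle i to j
    dist3 = (np.sum(diff**2, axis=-1) + softening**2) ** 1.5
    np.fill_diagonal(dist3, np.inf)                     # no self-force
    return G * np.sum(mass[None, :, None] * diff / dist3[:, :, None], axis=1)

# One leapfrog-style update.
vel += 0.5 * dt * accelerations(pos, mass)
pos += dt * vel
vel += 0.5 * dt * accelerations(pos, mass)
print(pos[:3])
```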

“Now with machine learning, we have developed the first neural network model of the Universe, and demonstrated there’s a third route to making predictions, one that combines the merits of both analytic calculation and numerical simulation,” said Yin Li, a postdoctoral researcher at the Kavli Institute for the Physics and Mathematics of the Universe, University of Tokyo, and jointly at the University of California, Berkeley.

Aug 25, 2019

Researchers observe spontaneous occurrence of skyrmions in atomically thin cobalt films

Posted by in categories: particle physics, quantum physics, supercomputing

Since their experimental discovery, magnetic skyrmions—tiny magnetic knots—have moved into the focus of research. Scientists from Hamburg and Kiel have now been able to show that individual magnetic skyrmions with a diameter of only a few nanometers can be stabilized in magnetic metal films even without an external magnetic field. They report on their discovery in the journal Nature Communications.

The existence of magnetic skyrmions as particle-like objects was predicted 30 years ago by theoretical physicists, but could only be proven experimentally in 2013. Skyrmions with diameters from micrometers down to a few nanometers have since been discovered in different magnetic material systems. Although they can be generated on a surface of only a few atoms and manipulated with electric currents, they show a high stability against external influences. This makes them promising candidates for future data storage or logic devices. In order to be competitive for technological applications, however, skyrmions must not only be very small, but also stable without an applied magnetic field.

Researchers at the universities of Hamburg and Kiel have now taken an important step in this direction. On the basis of quantum mechanical numerical calculations carried out on the supercomputers of the North-German Supercomputing Alliance (HLRN), the physicists from Kiel were able to predict that individual skyrmions with a diameter of only a few nanometers would appear in an atomically thin, ferromagnetic cobalt film. “The stability of the magnetic knots in these films is due to an unusual competition between different magnetic interactions,” says Sebastian Meyer, a Ph.D. student in Prof. Stefan Heinze’s research group at Kiel University.
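The “competition between different magnetic interactions” is typically analysed with an atomistic spin Hamiltonian of roughly the following generic form (shown here as a hedged illustration rather than the exact model of the study; in practice the coefficients come from the quantum mechanical calculations mentioned above), in which Heisenberg exchange, the Dzyaloshinskii–Moriya interaction and magnetocrystalline anisotropy pull the spin texture in different directions:

```latex
% Generic atomistic spin Hamiltonian often used in skyrmion stability studies
% (illustrative; J_ij, D_ij and K are material-specific parameters).
H = -\sum_{ij} J_{ij}\, \mathbf{S}_i \cdot \mathbf{S}_j
    - \sum_{ij} \mathbf{D}_{ij} \cdot \left( \mathbf{S}_i \times \mathbf{S}_j \right)
    - K \sum_{i} \left( S_i^{z} \right)^2
```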
