Blog

Archive for the ‘supercomputing’ category: Page 10

Sep 19, 2020

A “Supercomputer” is Deciding the Politics of Australians

Posted by in categories: economics, supercomputing

By Taleed Brown

By decree of an anonymous university “supercomputer,” Victoria’s Dan Andrews has opted to extend stage 4 lockdowns. This is once again stalling the economic recovery of the region and plundering the wealth and liberty of millions across the state.

Continue reading “A ‘Supercomputer’ is Deciding the Politics of Australians” »

Sep 17, 2020

Physicists make electrical nanolasers even smaller

Posted by in categories: mobile phones, physics, supercomputing

Researchers from the Moscow Institute of Physics and Technology and King’s College London cleared the obstacle that had prevented the creation of electrically driven nanolasers for integrated circuits. The approach, reported in a recent paper in Nanophotonics, enables the design of coherent light sources on a scale not only hundreds of times smaller than the thickness of a human hair but even smaller than the wavelength of the light emitted by the laser. This lays the foundation for ultrafast optical data transfer in the manycore microprocessors expected to emerge in the near future.

Light signals revolutionized information technologies in the 1980s, when optical fibers started to replace copper wires, making data transmission orders of magnitude faster. Since optical communication relies on light — with a frequency of several hundred terahertz — it allows transferring terabytes of data every second through a single fiber, vastly outperforming electrical interconnects.

Fiber optics underlies the modern internet, but light could do much more for us. It could be put into action even inside the microprocessors of supercomputers, workstations, smartphones, and other devices. This requires using optical communication lines to interconnect the purely electronic components, such as processor cores. As a result, vast amounts of information could be transferred across the chip nearly instantaneously.
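The "several hundred terahertz" figure follows directly from the wavelengths used in fiber optics. A minimal sketch of the arithmetic (the wavelengths below are common telecom bands chosen for illustration, not values from the article):

```python
# Frequency of light at common fiber-optic wavelengths.
C = 3.0e8  # speed of light, m/s

def frequency_thz(wavelength_nm):
    """Convert an optical wavelength in nanometers to frequency in terahertz."""
    return C / (wavelength_nm * 1e-9) / 1e12

for wl in (1550, 1310, 850):  # widely used optical communication wavelengths
    print(f"{wl} nm -> {frequency_thz(wl):.0f} THz")
```

All three land in the hundreds-of-terahertz range, roughly five orders of magnitude above the gigahertz clocks of electrical interconnects.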

Sep 2, 2020

A Supercomputer Analyzed Covid-19 — and an Interesting New Theory Has Emerged

Posted by in categories: biotech/medical, supercomputing

Aug 28, 2020

Scientists use reinforcement learning to train quantum algorithm

Posted by in categories: chemistry, information science, quantum physics, robotics/AI, supercomputing

Recent advancements in quantum computing have driven the scientific community’s quest to solve a certain class of complex problems for which quantum computers would be better suited than traditional supercomputers. To improve the efficiency with which quantum computers can solve these problems, scientists are investigating the use of artificial intelligence approaches.

In a new study, scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory have developed an algorithm based on reinforcement learning to find the optimal parameters for the Quantum Approximate Optimization Algorithm (QAOA), which allows a quantum computer to solve certain combinatorial problems such as those that arise in materials design, chemistry and wireless communications.

“Combinatorial optimization problems are those for which the solution space gets exponentially larger as you expand the number of decision variables,” said Argonne scientist Prasanna Balaprakash. “In one traditional example, you can find the shortest route for a salesman who needs to visit a few cities once by enumerating all possible routes, but given a couple thousand cities, the number of possible routes far exceeds the number of stars in the universe; even the fastest supercomputers cannot find the shortest route in a reasonable time.”
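The explosion Balaprakash describes is easy to make concrete: a closed tour visiting n cities once has (n−1)!/2 distinct routes. A minimal sketch (the star-count constant is a rough common estimate, used only for scale):

```python
import math

STARS_IN_UNIVERSE = 10 ** 24  # rough order-of-magnitude estimate, for scale only

def route_count(n_cities):
    """Number of distinct closed tours through n_cities: (n-1)!/2."""
    return math.factorial(n_cities - 1) // 2

for n in (5, 10, 20, 60):
    print(f"{n} cities -> {route_count(n)} routes")
```

Already at around 60 cities the route count exceeds the estimated number of stars in the observable universe, so "a couple thousand cities" is far beyond any brute-force enumeration.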

Aug 17, 2020

Elon Musk hints at Tesla’s not-so-secret Dojo AI-training supercomputer capacity

Posted by in categories: Elon Musk, robotics/AI, supercomputing

Elon Musk has made a rare new comment about Tesla’s now not-so-secret ‘Dojo’ program to create an AI-training supercomputer and gave a hint of its capacity.

Aug 17, 2020

The quantum state of play — cloud-based QCaaS and Covid-19

Posted by in categories: biotech/medical, business, quantum physics, supercomputing

Quantum computing requires meticulously prepared hardware and big budgets, but cloud-based solutions could make the technology available to broader business audiences. Several tech giants are racing to achieve “quantum supremacy”, but reliability and consistency in quantum output is no simple trick. Covid-19 has prompted some researchers to look at how quantum computing could mitigate future pandemics with scientific precision and speed.

Quantum computing (QC) has been theorized for decades and has evolved rapidly over the last few years. An escalation in spend and development has seen powerhouses IBM, Microsoft, and Google race for ‘quantum supremacy’ — whereby quantum hardware reliably and consistently outperforms existing computers. But do quantum computers remain a sort of elitist vision of the future, or are we on course for more financially and infrastructurally viable applications across industries?

Getting to grips with qubits

How much do you know? Ordinary computers (even supercomputers) use bits, which carry traditional binary code: computer processes are made up of countless combinations of 0s and 1s. Quantum computers, by contrast, are built on qubits. Qubits are capable of ‘superposition’: effectively adopting both 1 and 0 simultaneously, or any point on the spectrum between these two formerly binary values. The key to a powerful, robust, and reliable quantum computer is more qubits: every qubit added exponentially increases the processing capacity of the machine.
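The "exponential capacity" claim can be made concrete: describing the state of n qubits classically requires tracking 2^n complex amplitudes, so each added qubit doubles the state space. A minimal sketch in plain Python (no quantum library needed):

```python
# Each added qubit doubles the number of amplitudes a classical
# simulator must track: n qubits -> 2**n complex amplitudes.
def state_space(n_qubits):
    """Size of the classical description of an n-qubit state."""
    return 2 ** n_qubits

for n in (1, 2, 10, 50):
    print(f"{n} qubits -> {state_space(n):,} amplitudes")
```

At 50 qubits the state space already exceeds a quadrillion amplitudes, which is why even modest qubit counts strain classical simulation.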

Qubits and the effects of superposition give quantum computers the ability to process large datasets within seconds, doing what it would take humans decades to do. They can decode and deconstruct, hypothesize and validate, tackling problems of absurd complexity and dizzying magnitude — and can do so across many different industries.

Continue reading “The quantum state of play — cloud-based QCaaS and Covid-19” »

Aug 6, 2020

A Quintillion Calculations a Second: DOE Calculating the Benefits of Exascale and Quantum Computers

Posted by in categories: information science, quantum physics, supercomputing

A quintillion calculations a second. That’s one with 18 zeros after it. It’s the speed at which an exascale supercomputer will process information. The Department of Energy (DOE) is preparing for the first exascale computer to be deployed in 2021. Two more will follow soon after. Yet quantum computers may be able to complete more complex calculations even faster than these up-and-coming exascale computers. But these technologies complement each other much more than they compete.
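The scale claims here are easy to check with a little arithmetic. Summit's roughly 200-petaflop peak is public TOP500 data, not stated in this excerpt:

```python
# "A quintillion calculations a second" = 10**18 operations/second,
# and "five times faster than Summit" (Summit: ~200 petaflops peak).
EXA = 10 ** 18           # one quintillion operations per second
SUMMIT_FLOPS = 2e17      # roughly 200 petaflops

print(f"an exaflop has {len(str(EXA)) - 1} zeros")      # one with 18 zeros
print(f"speedup over Summit: {EXA / SUMMIT_FLOPS:.0f}x")
```

The ratio comes out to exactly the factor of five the article quotes.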

It’s going to be a while before quantum computers are ready to tackle major scientific research questions. While quantum researchers and scientists in other areas are collaborating to design quantum computers to be as effective as possible once they’re ready, that’s still a long way off. Scientists are figuring out how to build qubits for quantum computers, the very foundation of the technology. They’re establishing the most fundamental quantum algorithms that they need to do simple calculations. The hardware and algorithms need to be far enough along for coders to develop operating systems and software to do scientific research. Currently, we’re at the same point in quantum computing that scientists in the 1950s were with computers that ran on vacuum tubes. Most of us regularly carry computers in our pockets now, but it took decades to get to this level of accessibility.

In contrast, exascale computers will be ready next year. When they launch, they’ll already be five times faster than our fastest computer – Summit, at Oak Ridge National Laboratory’s Leadership Computing Facility, a DOE Office of Science user facility. Right away, they’ll be able to tackle major challenges in modeling Earth systems, analyzing genes, tracking barriers to fusion, and more. These powerful machines will allow scientists to include more variables in their equations and improve models’ accuracy. As long as we can find new ways to improve conventional computers, we’ll do it.

Continue reading “A Quintillion Calculations a Second: DOE Calculating the Benefits of Exascale and Quantum Computers” »

Aug 5, 2020

Supercomputer to scan ‘entire sky’ for signs of aliens

Posted by in categories: alien life, chemistry, supercomputing

Scientists are ramping up their efforts in the search for signs of alien life.

Experts at the SETI Institute, an organization dedicated to tracking extraterrestrial intelligence, are developing state-of-the-art techniques to detect signatures from space that indicate the possibility of extraterrestrial existence.

These so-called “technosignatures” can range from the chemical composition of a planet’s atmosphere, to laser emissions, to structures orbiting other stars, among others, they said.

Aug 4, 2020

Calculating the benefits of exascale and quantum computers

Posted by in categories: information science, quantum physics, supercomputing

A quintillion calculations a second. That’s one with 18 zeros after it. It’s the speed at which an exascale supercomputer will process information. The Department of Energy (DOE) is preparing for the first exascale computer to be deployed in 2021. Two more will follow soon after. Yet quantum computers may be able to complete more complex calculations even faster than these up-and-coming exascale computers. But these technologies complement each other much more than they compete.

It’s going to be a while before quantum computers are ready to tackle major scientific research questions. While quantum researchers and scientists in other areas are collaborating to design quantum computers to be as effective as possible once they’re ready, that’s still a long way off. Scientists are figuring out how to build qubits for quantum computers, the very foundation of the technology. They’re establishing the most fundamental quantum algorithms that they need to do simple calculations. The hardware and algorithms need to be far enough along for coders to develop operating systems and software to do scientific research. Currently, we’re at the same point in quantum computing that scientists in the 1950s were with computers that ran on vacuum tubes. Most of us regularly carry computers in our pockets now, but it took decades to get to this level of accessibility.

In contrast, exascale computers will be ready next year. When they launch, they’ll already be five times faster than our fastest computer — Summit, at Oak Ridge National Laboratory’s Leadership Computing Facility, a DOE Office of Science user facility. Right away, they’ll be able to tackle major challenges in modeling Earth systems, analyzing genes, tracking barriers to fusion, and more. These powerful machines will allow scientists to include more variables in their equations and improve models’ accuracy. As long as we can find new ways to improve conventional computers, we’ll do it.

Aug 1, 2020

D-Wave’s Path to 5000 Qubits; Google’s Quantum Supremacy Claim

Posted by in categories: quantum physics, supercomputing

On the heels of IBM’s quantum news last week come two more quantum items. D-Wave Systems today announced the name of its forthcoming 5000-qubit system, Advantage (yes, the name choice isn’t serendipity), at its user conference being held this week in Newport, RI. Last week a Google draft paper, discovered by the Financial Times, claimed attaining quantum supremacy using a 53-qubit superconducting processor. The paper, found on NASA’s website, was later withdrawn. Conversation around it has been bubbling in the QC community since.

More on D-Wave’s announcements later – the Advantage system isn’t expected to be broadly available until mid-2020 which is roughly in keeping with its stated plans. The Google work on quantum supremacy is fascinating. Google has declined to comment on the paper. How FT became aware of the paper isn’t clear. A few observers suggest it looks like an early draft.

Quantum supremacy, of course, is the notion of a quantum computer doing something that classical computers simply can’t reasonably do. In this instance, the reported Google paper claimed it was able to perform a task (a particular random number generation) on its QC in 200 seconds versus what would take on the order of 10,000 years on a supercomputer. In an archived copy of the draft that HPCwire was able to find, the authors say they “estimated the classical computational cost” of running supremacy circuits on Summit and on a large Google cluster. (For an excellent discussion of quantum supremacy see Scott Aaronson’s (University of Texas) blog yesterday, Scott’s Supreme Quantum Supremacy FAQ.)
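The headline numbers imply a staggering speedup ratio, which a few lines of arithmetic make explicit:

```python
# Claimed supremacy result: 200 seconds on the quantum processor
# versus an estimated 10,000 years on a classical supercomputer.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

classical_s = 10_000 * SECONDS_PER_YEAR
quantum_s = 200

speedup = classical_s / quantum_s
print(f"claimed speedup: ~{speedup:.1e}x")  # roughly 1.6 billion
```

Note this ratio rests entirely on the paper's *estimate* of the classical cost, which later commentary (including Aaronson's FAQ) treats as the contested part of the claim.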

Continue reading “D-Wave’s Path to 5000 Qubits; Google’s Quantum Supremacy Claim” »

Page 10 of 48