Blog

Archive for the ‘supercomputing’ category: Page 40

May 11, 2022

IBM wants its quantum supercomputers running at 4,000-plus qubits by 2025

Posted by in categories: military, quantum physics, supercomputing

Forty years after it first began to dabble in quantum computing, IBM is ready to expand the technology out of the lab and into more practical applications — like supercomputing! The company has already hit a number of development milestones since it released its previous quantum roadmap in 2020, including the 127-qubit Eagle processor that uses quantum circuits and the Qiskit Runtime API. IBM announced on Wednesday that it plans to further scale its quantum ambitions and has revised the 2020 roadmap with an even loftier goal of operating a 4,000-qubit system by 2025.

Before it sets about building the biggest quantum computer to date, IBM plans to release its 433-qubit Osprey chip later this year and migrate the Qiskit Runtime to the cloud in 2023, “bringing a serverless approach into the core quantum software stack,” per Wednesday’s release. Those products will be followed later that year by Condor, a quantum chip IBM is billing as “the world’s first universal quantum processor with over 1,000 qubits.”

This rapid, nearly four-fold jump in qubit count (from Eagle’s 127 qubits to Osprey’s 433) will enable users to run increasingly long quantum circuits, while increasing processing speed, measured in CLOPS (circuit layer operations per second), from a maximum of 2,900 to over 10,000. Then it’s just a simple matter of quadrupling that capacity in the span of less than 24 months.
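For readers curious what a “quantum circuit” and its layers actually look like in code, here is a minimal sketch using the Qiskit library mentioned above (not IBM’s CLOPS benchmark itself); it assumes the `qiskit` package is installed and simply builds a small entangling circuit and reports its depth, the layer count whose execution rate CLOPS measures.

```python
# A minimal sketch (not IBM's CLOPS benchmark) of building a quantum circuit
# with Qiskit and inspecting its depth, i.e. the number of circuit layers
# that CLOPS counts executions of per second. Assumes `qiskit` is installed.
from qiskit import QuantumCircuit

qc = QuantumCircuit(3, 3)
qc.h(0)            # put qubit 0 into superposition
qc.cx(0, 1)        # entangle qubits 0 and 1
qc.cx(1, 2)        # extend the entanglement to qubit 2
qc.measure(range(3), range(3))

print(qc.draw(output="text"))
print("circuit layers (depth):", qc.depth())
```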

May 11, 2022

Expanding the IBM Quantum Roadmap to anticipate the future of quantum-centric supercomputing

Posted by in categories: quantum physics, supercomputing

We’re excited to present an update to the IBM Quantum roadmap, and our plan to weave quantum processors, CPUs, and GPUs into a compute fabric capable of solving problems beyond the scope of classical resources.


Two years ago, we issued the first draft of that map: our ambitious three-year plan to develop quantum computing technology, called our development roadmap. Since then, our exploration has revealed new discoveries, gaining us insights that have allowed us to refine that map and travel even further than we’d planned. Today, we’re excited to present to you an update to that map: our plan to weave quantum processors, CPUs, and GPUs into a compute fabric capable of solving problems beyond the scope of classical resources alone.

Our goal is to build quantum-centric supercomputers. The quantum-centric supercomputer will incorporate quantum processors, classical processors, quantum communication networks, and classical networks, all working together to completely transform how we compute. In order to do so, we need to solve the challenge of scaling quantum processors, develop a runtime environment for providing quantum calculations with increased speed and quality, and introduce a serverless programming model to allow quantum and classical processors to work together frictionlessly.
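As a purely illustrative sketch of the orchestration pattern described above (a classical processor repeatedly dispatching work to a quantum resource and folding the results back into a classical optimization loop), the snippet below emulates the pattern in plain Python; the `evaluate_on_qpu` function is a hypothetical stand-in for a real runtime call, not IBM’s API.

```python
import numpy as np

# Hypothetical stand-in for dispatching a parametrized circuit to a quantum
# processor and receiving an expectation value back. On real hardware this
# would be a call into a runtime service; here it is simulated classically.
def evaluate_on_qpu(theta: np.ndarray) -> float:
    return float(np.sum(np.sin(theta) ** 2))  # toy cost landscape

# Classical outer loop: finite-difference gradient descent over the circuit
# parameters, the basic pattern behind variational quantum-classical algorithms.
theta = np.array([0.8, -1.3, 2.1])
lr, eps = 0.2, 1e-3
for step in range(50):
    grad = np.array([
        (evaluate_on_qpu(theta + eps * np.eye(3)[i]) -
         evaluate_on_qpu(theta - eps * np.eye(3)[i])) / (2 * eps)
        for i in range(3)
    ])
    theta -= lr * grad

print("optimized parameters:", np.round(theta, 3))
print("final cost:", round(evaluate_on_qpu(theta), 6))
```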


May 11, 2022

IBM’s massive ‘Kookaburra’ quantum processor might land in 2025

Posted by in categories: quantum physics, supercomputing

Today’s classical supercomputers can do a lot. But because their calculations are limited to binary states of 0 or 1, they can struggle with enormously complex problems such as natural science simulations. This is where quantum computers, which can represent information as 0, 1, or a quantum superposition of both at once, might have an advantage.
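To make that “superposition of both” concrete, here is a small worked example using only NumPy: a single-qubit state is a normalized pair of complex amplitudes, and the equal superposition of 0 and 1 yields a 50/50 chance of measuring either outcome.

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit state is a complex vector
# (a, b) with |a|^2 + |b|^2 = 1; measuring it yields 0 with probability
# |a|^2 and 1 with probability |b|^2.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

psi = (ket0 + ket1) / np.sqrt(2)   # equal superposition of 0 and 1
probs = np.abs(psi) ** 2

print("P(measure 0) =", probs[0])   # 0.5
print("P(measure 1) =", probs[1])   # 0.5
```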

Last year, IBM debuted a 127-qubit computing chip and a structure called the IBM Quantum System Two, intended to house components like the chandelier cryostat, wiring, and electronics for these bigger chips down the line. These developments edged IBM ahead of other big tech companies like Google and Microsoft in the race to build the most powerful quantum computer. Today, the company is laying out its three-year plan to reach beyond 4,000 qubits by 2025 with a processor it is calling “Kookaburra.” Here’s how it is planning to get there.


To get to its 2025 goal of a 4,000-plus-qubit chip, IBM has micro-milestones it wants to hit on both the hardware and software sides.

May 8, 2022

Tesla Sues Engineer Over ‘Dojo’ Supercomputer Technology Theft

Posted by in category: supercomputing

May 4, 2022

Cutting the carbon footprint of supercomputing in scientific research

Posted by in categories: information science, supercomputing

Simon Portegies Zwart, an astrophysicist at Leiden University in the Netherlands, says more efficient coding is vital for making computing greener. For mathematician and physicist Loïc Lannelongue, the first step is for computer modellers to become more aware of their environmental impacts, which vary significantly depending on the energy mix of the country hosting the supercomputer. Lannelongue, who is based at the University of Cambridge, UK, has developed Green Algorithms, an online tool that enables researchers to estimate the carbon footprint of their computing projects.
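The basic accounting behind such an estimate is straightforward; the sketch below shows the general shape of the calculation (runtime times hardware power draw, scaled by data-centre overhead and the local grid’s carbon intensity). The power and intensity constants here are illustrative assumptions, not Green Algorithms’ exact defaults.

```python
# Rough sketch of a Green Algorithms-style estimate (illustrative constants,
# not the tool's exact defaults): energy = runtime x (CPU + memory power) x PUE,
# carbon = energy x grid carbon intensity of the hosting country.
def carbon_footprint_gco2e(runtime_h: float,
                           n_cores: int,
                           power_per_core_w: float = 12.0,    # assumed per-core draw
                           usage: float = 1.0,                # core utilisation
                           memory_gb: float = 64.0,
                           power_per_gb_w: float = 0.4,       # assumed DRAM draw
                           pue: float = 1.6,                  # data-centre overhead
                           carbon_intensity: float = 475.0):  # gCO2e per kWh
    power_w = n_cores * power_per_core_w * usage + memory_gb * power_per_gb_w
    energy_kwh = runtime_h * power_w * pue / 1000.0
    return energy_kwh * carbon_intensity

# Example: a 48-hour simulation on 128 cores.
print(round(carbon_footprint_gco2e(runtime_h=48, n_cores=128), 1), "gCO2e")
```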

Apr 27, 2022

Physicists Developed a Superconductor Circuit Long Thought to Be Impossible

Posted by in categories: quantum physics, supercomputing

By exchanging a classical material for one with unique quantum properties, scientists have made a superconducting circuit that’s capable of feats long thought to be impossible.

The discovery, made by researchers from Germany, the Netherlands, and the US, overturns a century of thought on the nature of superconducting circuits, and how their currents can be tamed and put to practical use.

Low-waste, high-speed circuits based on the physics of superconductivity present a golden opportunity to take supercomputing technology to a whole new level.

Apr 24, 2022

Atomic Layer Etching Could Lead to Ever-More Powerful Microchips and Supercomputers

Posted by in categories: mobile phones, particle physics, supercomputing

Over the course of almost 60 years, the information age has given the world the internet, smartphones, and lightning-fast computers. This has been made possible by roughly doubling the number of transistors that can be packed onto a computer chip every two years, resulting in billions of atomic-scale transistors fitting on a fingernail-sized device. At such atomic scales, individual atoms can be observed and counted.
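The “billions of transistors” figure follows directly from that doubling cadence; a one-liner sketch of the arithmetic:

```python
# Doubling transistor counts every two years for ~60 years gives a factor
# of 2**30, which is how a chip goes from a handful of transistors to billions.
years = 60
doublings = years // 2                 # one doubling per two years
growth_factor = 2 ** doublings
print(f"{doublings} doublings -> x{growth_factor:,}")  # 30 doublings -> x1,073,741,824
```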

Physical limit

With this doubling reaching its physical limit, the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) has joined industry efforts to prolong the process and find new techniques for making ever-more powerful, efficient, and cost-effective chips. In the first PPPL research conducted under a Cooperative Research and Development Agreement (CRADA) with Lam Research Corp., a global producer of chip-making equipment, laboratory scientists accurately predicted a key step in atomic-scale chip fabrication using computer modeling.

Apr 18, 2022

Tachyum Prodigy Processor — Small can be Amazing!

Posted by in categories: robotics/AI, supercomputing

The world’s first universal processor. See the benefits of the fastest-running processor for hyperscale data centers, supercomputers, and AI.

Apr 12, 2022

How to build brain-inspired neural networks based on light

Posted by in categories: biotech/medical, robotics/AI, supercomputing

Supercomputers are extremely fast, but also use a lot of power. Neuromorphic computing, which takes our brain as a model to build fast and energy-efficient computers, can offer a viable and much-needed alternative. The technology has a wealth of opportunities, for example in autonomous driving, interpreting medical images, edge AI, or long-haul optical communications. Electrical engineer Patty Stabile is a pioneer when it comes to exploring new brain- and biology-inspired computing paradigms. “TU/e combines all it takes to demonstrate the possibilities of photon-based neuromorphic computing for AI applications.”

Patty Stabile, an associate professor in the department of Electrical Engineering, was among the first to enter the emerging field of photonic neuromorphic computing.

“I had been working on a proposal to build photonic digital artificial neurons when in 2017 researchers from MIT published an article describing how they developed a small chip for carrying out the same algebraic operations, but in an analog way. That is when I realized that synapses based on analog technology were the way to go for running artificial intelligence, and I have been hooked on the subject ever since.”
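The “algebraic operations” Stabile refers to are essentially weighted sums: a layer of artificial neurons multiplies its inputs by synaptic weights and applies a nonlinearity. The NumPy sketch below shows that operation digitally, purely as a conceptual illustration (it is not the MIT chip’s implementation); analog photonic hardware performs the same weighted summation with light rather than digital logic.

```python
import numpy as np

# The core "algebraic operation" of a neural-network layer: inputs are
# weighted by synaptic strengths (a matrix-vector product) and passed
# through a nonlinearity. Photonic neuromorphic hardware performs the
# weighted summation in the analog optical domain; here it is emulated
# digitally with NumPy purely for illustration.
rng = np.random.default_rng(seed=0)
inputs = rng.random(4)              # e.g. optical input intensities
weights = rng.random((3, 4))        # synaptic weights, e.g. tunable attenuations

weighted_sums = weights @ inputs    # the analog accumulate step
activations = np.maximum(weighted_sums, 0.0)  # simple ReLU nonlinearity

print("neuron activations:", np.round(activations, 3))
```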

Apr 10, 2022

Swiss researchers make spin ice supercomputing breakthrough

Posted by in categories: energy, supercomputing

The smallest artificial spin ice ever created could be part of novel low-power HPC.

Page 40 of 94