
Tianhe Xingyi: China unveils the ‘fastest’ homegrown supercomputer

No specifications have been revealed, but officials have claimed that it surpasses the capabilities of the famous Tianhe-2 supercomputer.


The National Supercomputing Center (NSC) in Guangzhou, China, has unveiled the Tianhe Xingyi, a homegrown supercomputer, at an industry event in Guangdong Province, according to several media reports. The NSC is the parent organization under whose guidance the Tianhe-2 supercomputer was also developed.

Supercomputers are a crucial component of a nation’s progress as they aid in solving the most complex and technical problems. The US has conventionally led the world in hosting the fastest supercomputers, as captured by the TOP500 listings, while also leading in the absolute number of supercomputers available to its researchers.

The high computing prowess of supercomputers can be used to run simulations for understanding climate change, conducting materials research, exploring space, and finding cures for various diseases. Of late, supercomputers have assumed importance for developing AI models, and access to advanced supercomputers could be critical in determining who leads the next frontier of information technology.

DARPA-Funded Research Leads to Quantum Computing Breakthrough

Some new concepts for me, but interesting and a good step forward.


A team of researchers working on DARPA’s Optimization with Noisy Intermediate-Scale Quantum devices (ONISQ) program has created the first-ever quantum circuit with logical quantum bits (qubits), a key discovery that could accelerate fault-tolerant quantum computing and revolutionize concepts for designing quantum computer processors.

The ONISQ program began in 2020, seeking to demonstrate a quantitative advantage of quantum information processing by leapfrogging the performance of classical-only supercomputers on a particularly challenging class of problems known as combinatorial optimization. The program pursued a hybrid concept that combines intermediate-sized "noisy" (error-prone) quantum processors with classical systems, focused specifically on solving optimization problems of interest to defense and commercial industry. Teams were selected to explore various types of physical, non-logical qubits, including superconducting qubits, ion qubits, and Rydberg atomic qubits.
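To see why combinatorial optimization is a natural target for quantum speedups, consider Max-Cut, a standard benchmark in this problem class: the classical brute-force search below (a toy graph of my own, not from the program) must examine 2^n partitions, so the work doubles with every node added.

```python
from itertools import product

def max_cut_brute_force(n_nodes, edges):
    """Exhaustively search all 2^n node partitions for the maximum cut.

    Each assignment places every node on side 0 or 1; an edge counts as
    'cut' when its endpoints land on different sides. The exponential
    loop over assignments is what makes large instances intractable for
    classical machines.
    """
    best_cut, best_assignment = -1, None
    for assignment in product((0, 1), repeat=n_nodes):
        cut = sum(1 for u, v in edges if assignment[u] != assignment[v])
        if cut > best_cut:
            best_cut, best_assignment = cut, assignment
    return best_cut, best_assignment

# Small example: a 4-node cycle, whose maximum cut severs all 4 edges.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
best, partition = max_cut_brute_force(4, edges)
print(best, partition)  # best cut = 4
```

At 4 nodes the search is instant; at 60 nodes it would require over 10^18 evaluations, which is the scaling wall hybrid quantum-classical approaches aim to sidestep.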

The Harvard research team, supported by MIT, QuEra Computing, Caltech, and Princeton, focused on exploring the potential of Rydberg qubits, and in the course of their research made a major breakthrough: The team developed techniques to create error-correcting logical qubits using arrays of “noisy” physical Rydberg qubits. Logical qubits are a critical missing piece in the puzzle to realize fault-tolerant quantum computing. In contrast to error-prone physical qubits, logical qubits are error-corrected to maintain their quantum state, making them useful for solving a diverse set of complex problems.
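The intuition behind a logical qubit can be illustrated with a classical analogue, the repetition code: encode one logical bit in several noisy physical bits and recover it by majority vote. The sketch below (classical bits standing in for physical qubits; all names hypothetical, and it assumes independent bit-flip noise) shows how redundancy suppresses the error rate.

```python
import random

def encode(bit, n=3):
    """Encode one logical bit as n redundant physical bits."""
    return [bit] * n

def apply_noise(bits, p_flip, rng):
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ 1 if rng.random() < p_flip else b for b in bits]

def decode(bits):
    """Majority vote: correct as long as fewer than half the bits flipped."""
    return int(sum(bits) > len(bits) / 2)

rng = random.Random(0)
p, trials = 0.1, 10_000

# Error rate of a single unprotected bit vs. a 3-bit encoded logical bit.
raw_errors = sum(rng.random() < p for _ in range(trials))
logical_errors = sum(
    decode(apply_noise(encode(0), p, rng)) != 0 for _ in range(trials)
)
print(raw_errors / trials, logical_errors / trials)
```

With a 10% physical error rate, the 3-bit code fails only when two or more bits flip (probability about 3p^2, roughly 3%), so the logical error rate comes out well below the raw rate. Real quantum error correction is far subtler, since qubits cannot simply be copied, but the redundancy-plus-correction principle is the same.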

IBM finally unveils quantum powerhouse, a 1,000+ qubit processor

With a processor that has fewer qubits, IBM has improved error correction, paving the way for the use of these processors in real life.


IBM has unveiled its much-awaited 1,000+ qubit quantum processor, Condor, alongside a utility-scale processor dubbed IBM Quantum Heron at its Quantum Summit in New York. The latter is the first in a series of utility-scale quantum processors that IBM took four years to build, the company said in a press release.

Quantum computers, considered the next frontier of computing, have locked companies big and small in a race to build the platform that everybody would want to use to solve complex problems in medicine, physics, mathematics, and many more.

Even the fastest supercomputers of today are years behind the potential of quantum computers, whose capabilities keep improving with the addition of quantum bits or qubits in the processor. So, a 1,000+ qubit processor is a big deal, and even though a startup may have beaten IBM to this milestone, the latter’s announcement is still significant for what else IBM brings to the table.

Quantum computers could solve problems in minutes that would take today’s supercomputers millions of years

“We’re looking at a race, a race between China, between IBM, Google, Microsoft, Honeywell,” Kaku said. “All the big boys are in this race to create a workable, operationally efficient quantum computer. Because the nation or company that does this will rule the world economy.”

It’s not just the economy quantum computing could impact. A quantum computer is set up at Cleveland Clinic, where Chief Research Officer Dr. Serpil Erzurum believes the technology could revolutionize the world of health care.

Quantum computers can potentially model the behavior of proteins, the molecules that regulate all life, Erzurum said. Proteins change shape to alter their function in ways too complex to track today, but quantum computing could change that understanding.

How one national lab is getting its supercomputers ready for the AI age

OAK RIDGE, Tenn. — At Oak Ridge National Laboratory, the government-funded science research facility nestled between Tennessee’s Great Smoky Mountains and Cumberland Plateau that is perhaps best known for its role in the Manhattan Project, two supercomputers are currently rattling away, speedily making calculations meant to help tackle some of the biggest problems facing humanity.

You wouldn’t be able to tell from looking at them. A supercomputer called Summit mostly comprises hundreds of black cabinets filled with cords, flashing lights and powerful graphics processing units, or GPUs. The sound of tens of thousands of spinning disks on the computer’s file systems, and air cooling technology for ancillary equipment, make the device sound somewhat like a wind turbine — and, at least to the naked eye, the contraption doesn’t look much different from any other corporate data center. Its next-door neighbor, Frontier, is set up in a similar manner across the hall, though it’s a little quieter and the cabinets have a different design.

Yet inside those arrays of cabinets are powerful specialty chips and components capable of, collectively, training some of the largest AI models known. Frontier is currently the world’s fastest supercomputer, and Summit is the world’s seventh-fastest supercomputer, according to rankings published earlier this month. Now, as the Biden administration boosts its focus on artificial intelligence and touts a new executive order for the technology, there’s growing interest in using these supercomputers to their full AI potential.

Paradox of ultramassive black hole formation solved by supercomputer

With a gravitational field so strong that not even light can escape its grip, black holes are probably the most interesting and bizarre objects in the universe.

Due to their extreme properties, a theoretical description of these celestial bodies is impossible within the framework of Newton’s classical theory of gravity. It requires the use of general relativity, the theory proposed by Einstein in 1915, which treats gravitational fields as deformations in the fabric of space-time.

Black holes are usually formed from the collapse of massive stars during their final stage of evolution. Therefore, when a black hole is born, its mass does not exceed a few dozen solar masses.
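The claim that not even light escapes follows from the Schwarzschild radius, r_s = 2GM/c^2, the boundary inside which escape velocity exceeds the speed of light. A short worked calculation (standard physical constants; the 30-solar-mass example is my own choice for a typical stellar-mass black hole) makes the scale concrete:

```python
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8         # speed of light, m/s
M_SUN = 1.989e30    # solar mass, kg

def schwarzschild_radius(mass_kg):
    """Radius of the event horizon for a non-rotating mass: r_s = 2GM/c^2."""
    return 2 * G * mass_kg / C**2

# One solar mass collapses to a horizon of about 3 km;
# a 30-solar-mass stellar black hole to roughly 90 km.
print(schwarzschild_radius(M_SUN) / 1000)       # ~2.95 km
print(schwarzschild_radius(30 * M_SUN) / 1000)  # ~88.6 km
```

The linearity of r_s in mass is why supermassive and ultramassive black holes, billions of solar masses, have horizons comparable to the size of the solar system.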