A technology that feels like it’s ‘always five years away’ may suddenly be two years away—but businesses are a little preoccupied.
A team of researchers has developed a “gut-on-chip” (a miniature model of the human intestine on a chip-sized device) capable of reproducing the main features of intestinal inflammation and of predicting the response of melanoma patients to immunotherapy treatment. The results have just been published in Nature Biomedical Engineering.
The interaction between microbiota and immunotherapy has long been known. It is the result of both systemic effects, i.e., the immune response elicited in the entire body by immunotherapy, and local processes, especially in the gut, where most of the bacteria that populate our body live. Until now, however, the latter could be studied only in animal models, with all their limitations.
Indeed, there is no clinical reason to subject a patient receiving immunotherapy for melanoma to colonoscopy and colon biopsy. Yet intestinal inflammation is one of the main side effects of this treatment, often forcing the therapy to be discontinued.
To test this new system, the team executed what is known as Grover's search algorithm, first described by Indian-American computer scientist Lov Grover in 1996. The algorithm finds a particular item in a large, unstructured dataset by exploiting superposition and entanglement. It also offers a quadratic speedup: the number of steps a quantum computer needs grows with the square root of the number of entries, rather than linearly as in a classical search. The authors report that the system achieved a 71 percent success rate.
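For intuition, here is a minimal statevector simulation of Grover's search in plain Python/NumPy. It illustrates the quadratic speedup on a toy eight-item search space; it is an idealized sketch, not the Oxford team's distributed implementation.

```python
# Minimal, noiseless simulation of Grover's search (illustrative sketch only).
import numpy as np

n_qubits = 3                 # search space of N = 2**n_qubits items
N = 2 ** n_qubits
marked = 5                   # index of the "marked" item (arbitrary choice)

# Start in the uniform superposition over all N basis states.
state = np.full(N, 1 / np.sqrt(N))

# Grover iteration: oracle (flip the marked amplitude) + diffusion
# (inversion about the mean). Roughly (pi/4) * sqrt(N) iterations are optimal.
iterations = int(round(np.pi / 4 * np.sqrt(N)))
for _ in range(iterations):
    state[marked] *= -1                      # oracle
    state = 2 * state.mean() - state         # diffusion operator

print(f"P(marked) after {iterations} iterations: {abs(state[marked])**2:.3f}")
# With N = 8 this already exceeds 0.94, versus 1/8 for a single classical guess.
# Real hardware, with noise, lands lower -- hence figures like the 71 percent above.
```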
While operating a successful distributed system is a big step forward for quantum computing, the team reiterates that the engineering challenges remain daunting. Still, linking quantum processors into a distributed network via quantum teleportation offers a glimmer of light at the end of a long, dark quantum computing development tunnel.
“Scaling up quantum computers remains a formidable technical challenge that will likely require new physics insights as well as intensive engineering effort over the coming years,” David Lucas, principal investigator of the study from Oxford University, said in a press statement. “Our experiment demonstrates that network-distributed quantum information processing is feasible with current technology.”
From punch-card-operated looms in the 1800s to modern cellphones, if an object has an “on” and an “off” state, it can be used to store information.
In a laptop computer, the binary ones and zeroes are transistors running at either low or high voltage. On a compact disc, a one is read wherever a tiny indented “pit” changes to a flat “land” or vice versa, and a zero wherever there is no change.
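That transition-based readout is easy to sketch in code. The function below is purely illustrative (its names are invented for this example, and real CD decoding involves further coding layers): it turns a run of pit/land levels into bits, with a change reading as one and no change as zero.

```python
# Sketch of "change = 1, no change = 0" readout, as described above.
def read_transitions(surface):
    """Decode a sequence of pit/land levels into bits: transition -> 1, none -> 0."""
    return [1 if surface[i] != surface[i - 1] else 0 for i in range(1, len(surface))]

surface = [0, 0, 1, 1, 1, 0, 1]     # land, land, pit, pit, pit, land, pit
print(read_transitions(surface))    # [0, 1, 0, 0, 1, 1]
```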
Historically, the size of the object encoding the ones and zeroes has put a limit on the size of the storage device. But now, University of Chicago Pritzker School of Molecular Engineering (UChicago PME) researchers have explored a technique to make ones and zeroes out of crystal defects, each the size of a single atom, for classical computer memory applications.
Our brain and eyes can play tricks on us—not least when it comes to the expanding hole illusion. A new computational model developed by Flinders University experts helps to explain how cells in the human retina make us “see” the dark central region of a black hole graphic expand outwards.
In a new article posted to the arXiv preprint server, the Flinders University experts highlight the role of the eye’s retinal ganglion cells in processing contrast and motion, and how the signals relayed on to the cerebral cortex then give the beholder the impression of a moving or “expanding” hole.
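Retinal ganglion cells' contrast responses are classically modeled with a difference-of-Gaussians (center-surround) receptive field. The sketch below illustrates only that textbook model; it is not the Flinders group's code, and the parameter values are arbitrary.

```python
# Generic difference-of-Gaussians (DoG) receptive field, the classical model of
# retinal ganglion cell contrast processing. Illustration only, not the paper's model.
import numpy as np

def dog_kernel(size=21, sigma_center=1.5, sigma_surround=4.0):
    """Center-surround kernel: narrow excitatory Gaussian minus wide inhibitory one."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx**2 + yy**2
    center = np.exp(-r2 / (2 * sigma_center**2)) / (2 * np.pi * sigma_center**2)
    surround = np.exp(-r2 / (2 * sigma_surround**2)) / (2 * np.pi * sigma_surround**2)
    return center - surround

kernel = dog_kernel()
print(kernel.sum())  # near zero: uniform regions evoke little response, edges a lot
```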
“Visual illusions provide valuable insights into the mechanisms of human vision, revealing how the brain interprets complex stimuli,” says Dr. Nasim Nematzadeh, from the College of Science and Engineering at Flinders University.
As quantum computers threaten traditional encryption, researchers are developing quantum networks to enable ultra-secure communication.
Scientists at Leibniz University Hannover have pioneered a new method using light frequencies to enhance quantum key distribution. This breakthrough reduces complexity, cuts costs, and paves the way for scalable, tap-proof quantum internet infrastructure.
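For context on what quantum key distribution actually does, here is a toy simulation of the basis-sifting step of BB84, the canonical QKD protocol. It is illustration only: a classical stand-in with no eavesdropper or physical channel, and it does not model the frequency-based multiplexing in the Hannover work.

```python
# Toy BB84 round: Alice sends bits in random bases, Bob measures in random bases,
# and the two keep only positions where their bases happened to agree.
import random

n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]   # '+' rectilinear, 'x' diagonal
bob_bases   = [random.choice("+x") for _ in range(n)]

# Bob's measurement: correct bit when bases match, random outcome otherwise.
bob_bits = [b if ab == bb else random.randint(0, 1)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: keep only positions where the bases agreed (about half of them).
key = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
print("shared key bits:", key)
```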
A new report from TechInsights breaks down the leading-edge process nodes from TSMC, Intel, and Samsung, suggesting we could be in for a closely matched competition.
When it comes to transistor density, TSMC’s N2 appears to take the lead. The publication’s data estimates N2’s high-density standard cell transistor density at an impressive 313 million transistors per square millimeter, outpacing Intel’s 18A at 238 million and Samsung’s SF3 at 231 million. Of course, density isn’t everything; chip designers use a mix of high-, standard-, and low-power cells. However, TSMC’s advantage in density could provide an edge for certain workloads.
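A quick back-of-the-envelope calculation puts those estimates in proportion (the figures are the TechInsights numbers quoted above):

```python
# Relative high-density cell transistor density, in millions of transistors per mm^2.
density = {"TSMC N2": 313, "Intel 18A": 238, "Samsung SF3": 231}
baseline = density["TSMC N2"]
for node, mtr in density.items():
    print(f"{node}: {mtr} MTr/mm^2 ({mtr / baseline:.0%} of N2)")
# Intel 18A and Samsung SF3 land at roughly 76% and 74% of N2's estimated density.
```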
The picture is murkier for performance projections. Intel’s 18A may hold an edge over TSMC’s N2 and Samsung’s SF3, but these figures are still estimates extrapolated from previous node improvements.
Thanks to their excellent sense of smell, dogs have been used for hundreds of years to hunt down wild game and search for criminals. At airports, they help identify explosives and illicit drugs. In disaster situations, they can rescue survivors and find human remains.
But each dog can be trained to detect only one class of odor compounds, which limits the range of smells it can flag. Training costs tens of thousands of dollars and takes several months. For Florida startup Canaery, the solution is merging canines with neurotechnology to let them detect everything from bombs and other contraband to human diseases and environmental toxins, with no specialized training needed.
A microchip-sized particle accelerator has been successfully tested, marking a breakthrough in miniaturized high-energy physics.
To address this challenge, the researchers propose two alternative QV tests that sidestep classical simulation entirely. Their primary modification involves using parity-preserving quantum gates, i.e., gates that maintain the parity (the even or odd number of ones) of the computational basis states throughout the computation. This allows the heavy output subspace to be known in advance, eliminating the need for classical verification.
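A two-qubit gate preserves parity precisely when it commutes with the global parity operator Z⊗Z. The quick NumPy check below (an illustration, not code from the paper) makes the distinction concrete: CZ qualifies, while CNOT does not.

```python
# Check whether a two-qubit gate preserves parity: it must commute with the
# global parity operator Z (x) Z.
import numpy as np

Z = np.diag([1, -1])
parity = np.kron(Z, Z)                # +1 on even-parity basis states, -1 on odd

CZ   = np.diag([1, 1, 1, -1])         # controlled-Z: acts diagonally, parity-preserving
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])       # CNOT maps |10> -> |11>: parity flips

for name, U in [("CZ", CZ), ("CNOT", CNOT)]:
    commutes = np.allclose(U @ parity, parity @ U)
    print(f"{name} preserves parity: {commutes}")   # CZ: True, CNOT: False
```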
The first approach, the parity-preserving benchmark, modifies the structure of the quantum circuits while keeping the number of two-qubit interactions the same. The researchers argue that this change has minimal impact on experimental implementation but significantly reduces computational costs.
“Since the interaction part is unaffected, the number of fundamental two-qubit gates, 3 in case of CNOTs, remains unchanged,” they write in the paper, referring to the standard result that an arbitrary two-qubit gate can be compiled into at most three CNOTs.