Researchers Demonstrate QuantumShield-BC Blockchain Framework

Researchers have developed QuantumShield-BC, a blockchain framework designed to resist attacks from quantum computers. It integrates post-quantum cryptography (PQC) using algorithms such as Dilithium and SPHINCS+, quantum key distribution (QKD), and quantum Byzantine fault tolerance (Q-BFT), which leverages quantum random number generation (QRNG) for unbiased leader selection. The framework was tested on a controlled testbed with up to 100 nodes, demonstrating resistance to simulated quantum attacks and achieving fairness through QRNG-based consensus. An ablation study confirmed the contribution of each quantum component to overall security, although the QKD implementation was simulated and scalability to larger networks requires further investigation.
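The QRNG-based leader selection can be sketched in a few lines. This is a minimal illustration, assuming every node receives the same quantum-random beacon value each round; the function and variable names are hypothetical, not taken from the paper:

```python
import hashlib

def select_leader(nodes, qrng_beacon, round_id):
    """Pick a consensus leader using shared quantum randomness.

    `qrng_beacon` stands in for a QRNG output delivered to all nodes;
    hashing it with the round number yields a uniform, bias-resistant
    index, so every honest node derives the same leader independently.
    """
    digest = hashlib.sha256(qrng_beacon + round_id.to_bytes(8, "big")).digest()
    return nodes[int.from_bytes(digest, "big") % len(nodes)]

nodes = [f"node-{i}" for i in range(100)]
leader = select_leader(nodes, qrng_beacon=b"beacon-sample", round_id=42)
```

Because the beacon is unpredictable before the round, no node can bias which peer is chosen, which is the fairness property attributed to the QRNG component.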

Thermodynamic computing system for AI applications

Recent breakthroughs in artificial intelligence (AI) algorithms have highlighted the need for alternative computing hardware to truly unlock AI's potential. Physics-based hardware, such as thermodynamic computing, has the potential to provide a fast, low-power means of accelerating AI primitives, especially generative and probabilistic AI. In this work, we present a small-scale thermodynamic computer, which we call the stochastic processing unit. The device is composed of RLC circuits as unit cells on a printed circuit board, with 8 unit cells that are all-to-all coupled via switched capacitances. It can be used for either sampling or linear algebra primitives, and we demonstrate Gaussian sampling and matrix inversion on our hardware; the latter represents a thermodynamic linear algebra experiment. We envision that this hardware, when scaled up in size, will have a significant impact on accelerating various probabilistic AI applications.
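The matrix-inversion primitive rests on a standard fact about thermal Gaussian states: a system in equilibrium under a quadratic potential defined by a positive-definite matrix A fluctuates with covariance proportional to A⁻¹. The sketch below is a digital emulation of that idea with made-up values; on the actual hardware the physics supplies the samples, whereas here NumPy must generate them:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 8x8 symmetric positive-definite coupling matrix, standing in
# for the 8 all-to-all coupled unit cells (values are made up).
M = rng.standard_normal((8, 8))
A = M @ M.T + 8 * np.eye(8)

# Thermal equilibrium under the quadratic potential x^T A x / 2 is a
# zero-mean Gaussian with covariance A^{-1} (in units where kT = 1).
# The hardware would produce these samples physically; here we simulate.
samples = rng.multivariate_normal(np.zeros(8), np.linalg.inv(A), size=200_000)

# The sample covariance is then an estimate of the matrix inverse.
A_inv_est = np.cov(samples.T, bias=True)
```

The estimate converges as more samples are collected, which is why sampling and linear algebra are two faces of the same primitive on such a device.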

#Repost Nature Publishing


Current digital hardware struggles with high computational demands in applications such as probabilistic AI. Here, authors present a small-scale thermodynamic computer composed of eight RLC circuits, demonstrating Gaussian sampling and matrix inversion, suggesting potential speed and energy efficiency advantages over digital GPUs.

China data link could offer faster coordination during hypersonic attacks


Chinese researchers explain that traditional tactical data links rely on round-trip time (RTT) measurements for synchronization, which works well for low-speed aircraft. Systems like NATO’s Link-16 achieve roughly 100-nanosecond accuracy under these conditions.

However, in hypersonic cooperative strike systems operating above Mach 5, the rapid relative motion between widely dispersed platforms creates asymmetric transmission paths, severely reducing the precision of conventional RTT algorithms. This highlights the need for new communication technologies capable of maintaining ultra-precise timing at extreme speeds.
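The precision loss is easy to see in the two-way timestamp exchange that RTT synchronization relies on. The sketch below is a toy model, not the Link-16 protocol: the standard estimator recovers the clock offset exactly when the forward and return delays match, and is off by half the path asymmetry otherwise.

```python
def rtt_offset_estimate(t0, t1, t2, t3):
    """Two-way time-transfer estimate of the remote clock's offset.

    Assumes the forward and return propagation delays are equal, which is
    exactly the assumption that breaks down between fast-moving platforms.
    """
    return ((t1 - t0) + (t2 - t3)) / 2.0

TRUE_OFFSET = 50e-9  # remote clock runs 50 ns ahead (toy value)

def exchange(d_fwd, d_rev):
    """Simulate one timestamp exchange with the given one-way delays."""
    t0 = 0.0                          # local send time (local clock)
    t1 = t0 + d_fwd + TRUE_OFFSET     # remote receive time (remote clock)
    t2 = t1 + 1e-6                    # remote reply time (remote clock)
    t3 = t2 + d_rev - TRUE_OFFSET     # local receive time (local clock)
    return rtt_offset_estimate(t0, t1, t2, t3)

sym = exchange(1e-3, 1e-3)            # symmetric paths: estimate is exact
asym = exchange(1e-3, 1e-3 + 10e-6)   # 10 us asymmetry: estimate off by 5 us
```

With nanosecond targets, even microseconds of asymmetry between the legs of the exchange swamps the error budget, which is the problem the researchers describe.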

What came before the Big Bang? Supercomputers may hold the answer

Scientists are rethinking the universe’s deepest mysteries using numerical relativity, complex computer simulations of Einstein’s equations in extreme conditions. This method could help explore what happened before the Big Bang, test theories of cosmic inflation, investigate multiverse collisions, and even model cyclic universes that endlessly bounce through creation and destruction.

A new perspective on how cosmological correlations change based on kinematic parameters

To study the origin and evolution of the universe, physicists rely on theories that describe the statistical relationships between different events or fields in spacetime, broadly referred to as cosmological correlations. Kinematic parameters are essentially the data that specify a cosmological correlation—the positions of particles, or the wavenumbers of cosmological fluctuations.

Changes in cosmological correlations driven by variations in these parameters can be described using so-called differential equations: mathematical equations that connect a function (i.e., a relationship between an input and an output) to its rate of change. In physics, these equations are used extensively, as they are well suited to capturing the universe’s highly dynamic nature.
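As a generic illustration (not the actual equations of the paper), the sketch below integrates a single differential equation in a kinematic parameter k and checks the numerical solution against the known analytic one:

```python
def rk4(f, y0, k0, k1, n=1000):
    """Integrate dy/dk = f(k, y) from k0 to k1 with classical Runge-Kutta."""
    h = (k1 - k0) / n
    k, y = k0, y0
    for _ in range(n):
        s1 = f(k, y)
        s2 = f(k + h / 2, y + h * s1 / 2)
        s3 = f(k + h / 2, y + h * s2 / 2)
        s4 = f(k + h, y + h * s3)
        y += h * (s1 + 2 * s2 + 2 * s3 + s4) / 6
        k += h
    return y

# dC/dk = -2 C with C(0) = 1: the rate of change of the "correlation" C
# in the kinematic parameter k is tied to C's own value.
C_end = rk4(lambda k, C: -2 * C, 1.0, 0.0, 2.0)
# analytic solution is C(k) = exp(-2 k), so C(2) = exp(-4)
```

Solving such equations, here numerically and in the paper analytically and in many variables, is what lets physicists track how correlations respond to smooth changes in the kinematic data.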

Researchers at Princeton’s Institute for Advanced Study, the Leung Center for Cosmology and Particle Astrophysics in Taipei, Caltech’s Walter Burke Institute for Theoretical Physics, the University of Chicago, and the Scuola Normale Superiore in Pisa recently introduced a new perspective to approach equations describing how cosmological correlations are affected by smooth changes in kinematic parameters.

Relativistic Motion Boosts Engine Efficiency Beyond Limits

The pursuit of more efficient engines continually pushes the boundaries of thermodynamics, and recent work demonstrates that relativistic effects may offer a surprising pathway to surpass conventional limits. Tanmoy Pandit, affiliated with the Leibniz Institute of Hannover and TU Berlin, along with Pritam Chattopadhyay from the Weizmann Institute of Science and colleagues, investigates a novel thermal machine that harnesses the principles of relativity to achieve efficiencies beyond those dictated by the Carnot cycle. Their research reveals that by incorporating relativistic motion into the system, specifically through the reshaping of energy spectra via the Doppler effect, it becomes possible to extract useful work even without a temperature difference, effectively establishing relativistic motion as a valuable resource for energy conversion. This discovery not only challenges established thermodynamic boundaries but also opens exciting possibilities for designing future technologies that leverage the fundamental principles of relativity to enhance performance.


The appendices detail the Lindblad superoperator used to describe the system’s dynamics and the transformation to a rotating frame that simplifies the analysis. They show how relativistic motion affects the average number of quanta in the reservoir and the superoperators, and present the detailed derivation of the steady-state density matrix elements for the three-level heat engine, including equations for power output and efficiency. They also describe the Monte Carlo method used to estimate the generalized Carnot-like efficiency bound in relativistic quantum thermal machines, providing pseudocode for the implementation and explaining how the bound is extracted from pairs of efficiency and power values.
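The extraction step can be sketched generically: draw many parameter settings, record the (efficiency, power) pair for each, and take the largest efficiency among draws that still deliver positive power. The model below is a toy stand-in, not the paper's three-level maser:

```python
import random

def sample_engine(rng):
    """One Monte Carlo draw of engine parameters (toy stand-in).

    Returns an (efficiency, power) pair; the real calculation would solve
    the three-level maser's steady state for each draw.
    """
    eta = rng.uniform(0.0, 1.0)
    power = eta * (0.8 - eta)   # toy curve: power > 0 only below eta = 0.8
    return eta, power

def efficiency_bound(n_samples=100_000, seed=0):
    rng = random.Random(seed)
    pairs = [sample_engine(rng) for _ in range(n_samples)]
    # the bound is the largest sampled efficiency still yielding positive power
    return max(eta for eta, power in pairs if power > 0)

bound = efficiency_bound()   # approaches 0.8 as n_samples grows
```

For the toy power curve the estimate converges to the built-in bound of 0.8; in the paper the same procedure traces out the generalized Carnot-like bound of the relativistic machine.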

Relativistic Motion Boosts Heat Engine Efficiency

Researchers have demonstrated that relativistic motion can function as a genuine thermodynamic resource, enabling a heat engine to surpass the conventional limits of efficiency. The team investigated a three-level maser, where thermal reservoirs are in constant relativistic motion relative to the working medium, using a model that accurately captures the effects of relativistic motion on energy transfer. The results reveal that the engine’s performance is not solely dictated by temperature differences, but is significantly influenced by the velocity of the thermal reservoirs. Specifically, the engine can operate with greater efficiency than predicted by the Carnot limit, due to the reshaping of the energy spectrum caused by relativistic motion.
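A toy calculation conveys the intuition. If the radiation from an approaching hot reservoir is blueshifted by the longitudinal Doppler factor, the working medium effectively sees a hotter bath, so the naive Carnot bound computed from the static temperatures is exceeded. This is an illustrative assumption, not the paper's full Lindblad treatment:

```python
import math

def doppler_factor(beta):
    """Longitudinal relativistic Doppler factor for a source approaching at v = beta * c."""
    return math.sqrt((1 + beta) / (1 - beta))

def carnot(t_cold, t_hot):
    """Carnot efficiency for the given reservoir temperatures."""
    return 1.0 - t_cold / t_hot

t_c, t_h = 300.0, 600.0
static_limit = carnot(t_c, t_h)       # 0.5, from the static temperatures alone

# Toy assumption: the approaching hot reservoir looks blueshifted, so the
# working medium sees an effective temperature scaled by the Doppler factor.
t_h_eff = doppler_factor(0.5) * t_h
boosted_limit = carnot(t_c, t_h_eff)  # exceeds the static 0.5
```

The point is not the specific numbers but the mechanism: motion reshapes the spectrum the working medium couples to, acting as an additional thermodynamic resource alongside the temperature difference.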

Grok answers my questions about what Elon meant when he said Tesla FSD v14 will seem sentient

Questions to inspire discussion.

Advanced Navigation and Obstacle Recognition.

🛣️ Q: How will FSD v14 handle unique driveway features? A: The improved neural net and higher resolution video processing will help FSD v14 better recognize and navigate features like speed bumps and humps, adjusting speed and steering smoothly based on their shape and height.

🚧 Q: What improvements are expected in distinguishing real obstacles? A: Enhanced object detection, driven by improved algorithms and higher-resolution video inputs, will make FSD v14 better at distinguishing real obstacles from false positives like tire marks, avoiding abrupt braking and overreaction.

Edge case handling and smooth operation.

🧩 Q: How will FSD v14 handle complex edge cases? A: The massive jump in parameter count and better video compression will help the AI better understand edge cases, allowing it to reason that non-threatening objects like a stationary hatch in the road aren’t obstacles, maintaining smooth cruising.

What happened before the Big Bang? Computational method may provide answers

We’re often told it is “unscientific” or “meaningless” to ask what happened before the Big Bang. But a new paper by FQxI cosmologist Eugene Lim, of King’s College London, UK, and astrophysicists Katy Clough, of Queen Mary University of London, UK, and Josu Aurrekoetxea, at Oxford University, UK, published in Living Reviews in Relativity, proposes a way forward: using complex computer simulations to numerically (rather than exactly) solve Einstein’s equations for gravity in extreme situations.

Ultrathin metasurface enables high-efficiency vectorial holography

Holography—the science of recording and reconstructing light fields—has long been central to imaging, data storage, and encryption. Traditional holographic systems, however, rely on bulky optical setups and interference experiments, making them impractical for compact or integrated devices. Computational methods such as the Gerchberg–Saxton (GS) algorithm have simplified hologram design by eliminating the need for physical interference patterns, but these approaches typically produce scalar holograms with uniform polarization, limiting the amount of information that can be encoded.
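For reference, the scalar GS loop mentioned above can be written in a few lines. This is a generic textbook sketch of phase retrieval, not the authors' implementation:

```python
import numpy as np

def gerchberg_saxton(target_amp, n_iter=200, seed=0):
    """Phase-only scalar GS loop: find a hologram-plane phase whose far
    field (Fourier transform) approximates the target amplitude."""
    rng = np.random.default_rng(seed)
    field = np.exp(1j * rng.uniform(0, 2 * np.pi, target_amp.shape))
    for _ in range(n_iter):
        far = np.fft.fft2(field)
        far = target_amp * np.exp(1j * np.angle(far))  # impose target amplitude
        field = np.fft.ifft2(far)
        field = np.exp(1j * np.angle(field))           # enforce phase-only hologram
    return np.angle(field)

# Target far field: a bright square on a dark background.
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0

phase = gerchberg_saxton(target)
recon = np.abs(np.fft.fft2(np.exp(1j * phase)))  # reconstructed amplitude
```

Because the loop only ever produces a single phase pattern with uniform polarization, it yields a scalar hologram, which is exactly the limitation the metasurface work addresses with vectorial (polarization-resolved) holography.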

A Wearable Robot That Learns

Having lived with an ALS diagnosis since 2018, Kate Nycz can tell you firsthand what it’s like to slowly lose motor function for basic tasks. “My arm can get to maybe 90 degrees, but then it fatigues and falls,” the 39-year-old said. “To eat or do a repetitive motion with my right hand, which was my dominant hand, is difficult. I’ve mainly become left-handed.”

People like Nycz who live with a neurodegenerative disease like ALS, or who have had a stroke, often suffer from impaired movement of the shoulder, arm or hands, preventing them from performing daily tasks like tooth-brushing, hair-combing or eating.

For the last several years, Harvard bioengineers have been developing a soft, wearable robot that not only provides movement assistance for such individuals but could even augment therapies to help them regain mobility.

But no two people move exactly the same way. Physical motions are highly individualized, especially for the mobility-impaired, making it difficult to design a device that works for many different people.

It turns out advances in machine learning can create a more personal touch. Researchers in the John A. Paulson School of Engineering and Applied Sciences (SEAS), together with physician-scientists at Massachusetts General Hospital and Harvard Medical School, have upgraded their wearable robot to be responsive to an individual user’s exact movements, endowing the device with more personalized assistance that could give users better, more controlled support for daily tasks.

