Quantum computers will not kill blockchain, but they might trigger fundamental changes in underlying cryptography.

The field has narrowed in the race to protect sensitive electronic information from the threat of quantum computers, which one day could render many of our current encryption methods obsolete.
As the latest step in its program to develop effective defenses, the National Institute of Standards and Technology (NIST) has winnowed the group of potential encryption tools—known as cryptographic algorithms—down to a bracket of 26. These algorithms are the ones NIST mathematicians and computer scientists consider to be the strongest candidates submitted to its Post-Quantum Cryptography Standardization project, whose goal is to create a set of standards for protecting electronic information from attack by the computers of both tomorrow and today.
“These 26 algorithms are the ones we are considering for potential standardization, and for the next 12 months we are requesting that the cryptography community focus on analyzing their performance,” said NIST mathematician Dustin Moody. “We want to get better data on how they will perform in the real world.”
In their book “Era of Exponential Encryption — Beyond Cryptographic Routing,” the authors describe a vision of rapidly multiplying options for encryption and decryption. Like the grain of rice that doubles on every square of a chessboard, newer concepts and implementations in cryptography multiply these options: both encryption and decryption increasingly rely on multiple, session-related keys, so that numerous configurations exist even for hybrid encryption — different keys and algorithms, symmetric and asymmetric methods, or modern multiple encryption, in which ciphertext is converted again and again into ciphertext. The book analyzes how a handful of newer applications, such as Spot-On, the GoldBug E-Mail Client & Crypto Chat Messenger and other open-source programs, implement these encryption mechanisms.

Renewing a key several times within a session — via “cryptographic calling” — has extended the notion of “perfect forward secrecy” to “instant perfect forward secrecy” (IPFS). Going further, if a bundle of keys is sent in advance, decoding a message must account not for a single current session key but for dozens of keys transmitted before the message arrives. This paradigm has since evolved into the concept of Fiasco Keys: keys that provide over a dozen possible ephemeral keys within one session and define Fiasco Forwarding, the approach that complements and follows IPFS. Finally, by adding routing and graph theory to the encryption process — a constant part of the so-called Echo Protocol — an encrypted packet may take different graphs and routes through the network. In the authors’ description, this shifts the current state of the art into a new age: the Era of Exponential Encryption. If routing does not require destination information but is replaced by cryptographic in.
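The idea of multiple encryption — ciphertext converted again and again into ciphertext with independent session keys — can be sketched in a few lines. This is a deliberately simplified illustration, not the algorithms Spot-On or GoldBug actually use: the XOR keystream cipher and the key values here are hypothetical stand-ins chosen only to show the layering.

```python
# Toy sketch of multiple encryption: each layer re-encrypts the previous
# ciphertext with its own session key. The SHAKE-256-derived XOR keystream
# is a simple stand-in for a real cipher (illustration only, NOT secure).
import hashlib

def keystream_xor(data: bytes, key: bytes) -> bytes:
    # Derive a keystream of the right length from the key and XOR it in.
    stream = hashlib.shake_256(key).digest(len(data))
    return bytes(a ^ b for a, b in zip(data, stream))

def multi_encrypt(plaintext: bytes, keys: list[bytes]) -> bytes:
    ct = plaintext
    for k in keys:               # each layer turns ciphertext into ciphertext again
        ct = keystream_xor(ct, k)
    return ct

def multi_decrypt(ciphertext: bytes, keys: list[bytes]) -> bytes:
    pt = ciphertext
    for k in reversed(keys):     # peel the layers off in reverse order
        pt = keystream_xor(pt, k)
    return pt

session_keys = [b"session-key-1", b"session-key-2", b"session-key-3"]  # hypothetical
ct = multi_encrypt(b"hello", session_keys)
assert multi_decrypt(ct, session_keys) == b"hello"
```

An attacker who lacks even one of the layer keys cannot remove that layer, which is the intuition behind combining several session-related keys in one transmission.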
RSA encryption is an essential safeguard for our online communications. Yet it was destined to fail even before the Internet made RSA ubiquitous, thanks to the work of Peter Shor, whose 1994 algorithm proved that a quantum computer could factor large integers efficiently — a task believed intractable for classical computers, and the one on which RSA's security rests.
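A toy example makes the dependence on factoring concrete. The primes below are deliberately tiny textbook values (real RSA uses primes of a thousand or more bits); the point is that recovering p and q from the public modulus n is all an attacker needs to rebuild the private key — exactly the problem Shor's algorithm solves efficiently on a quantum computer.

```python
# Toy RSA with tiny textbook primes (illustration only).
p, q = 61, 53                    # secret primes
n = p * q                        # public modulus: 3233
phi = (p - 1) * (q - 1)          # 3120
e = 17                           # public exponent, coprime with phi
d = pow(e, -1, phi)              # private exponent (modular inverse, Python 3.8+)

msg = 65
cipher = pow(msg, e, n)          # encrypt: msg^e mod n
assert pow(cipher, d, n) == msg  # decrypt with the private key

# "Attack": factor n by trial division — feasible only because n is tiny,
# but this is precisely what Shor's algorithm does fast for huge n.
attacker_p = next(i for i in range(2, n) if n % i == 0)
attacker_q = n // attacker_p
attacker_d = pow(e, -1, (attacker_p - 1) * (attacker_q - 1))
assert pow(cipher, attacker_d, n) == msg  # full decryption from factoring alone
```

For a 2048-bit modulus the trial-division step would take longer than the age of the universe on classical hardware; a sufficiently large quantum computer running Shor's algorithm would not face that barrier.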
In conventional holography, a photographic film records the interference pattern formed when monochromatic light scattered from the object to be imaged meets a reference beam of unscattered light. Illuminating the developed film with a replica of the reference beam then recreates a virtual image of the original object. In other words, a hologram captures both the phase and amplitude distribution of a signal by superimposing it with a known reference. Holography was originally proposed by the physicist Dennis Gabor in 1948 to improve the resolution of the electron microscope, and was first demonstrated with light optics. The original concept was later followed by holography with electrons, and after the invention of the laser, optical holography became a popular technique for 3D imaging of macroscopic objects, information encryption and microscopy.
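The reason a film can store phase at all follows from textbook wave optics (this is the general principle, not the specific setup of the study discussed below): the recorded intensity |S + R|² of signal plus reference contains a cross term proportional to the cosine of their phase difference, so phase is encoded in the brightness of the fringes.

```python
# Minimal numeric sketch: the intensity of a signal beam S superposed with
# a reference beam R is |S + R|^2 = |S|^2 + |R|^2 + 2|S||R|cos(dphase).
# The cross term is how a hologram records the signal's phase.
import cmath
import math

R = 1.0 + 0j                           # reference beam: amplitude 1, phase 0
for phase in (0.0, math.pi / 2, math.pi):
    S = 0.5 * cmath.exp(1j * phase)    # scattered signal with varying phase
    intensity = abs(S + R) ** 2        # what the photographic film records
    cross_term = intensity - abs(S) ** 2 - abs(R) ** 2
    # Cross term equals 2 * |S| * |R| * cos(phase difference).
    assert math.isclose(cross_term, 2 * 0.5 * 1.0 * math.cos(phase), abs_tol=1e-9)
```

Two beams in phase brighten the fringe, two beams out of phase darken it; scanning that modulation across the film reconstructs the full complex wavefront.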
However, extending holograms to the ultrafast domain currently remains a challenge with electrons, although developing the technique would allow the highest possible combined spatiotemporal resolution for advanced imaging applications in condensed matter physics. In a recent study now published in Science Advances, Ivan Madan and an interdisciplinary research team in the departments of Ultrafast Microscopy and Electron Scattering, Physics, Science and Technology in Switzerland, the U.K. and Spain, detailed the development of a hologram using local electromagnetic fields. The scientists obtained the electromagnetic holograms with combined attosecond/nanometer resolution in an ultrafast transmission electron microscope (UEM).
In the new method, the scientists relied on electromagnetic fields to split an electron wave function into a quantum-coherent superposition of different energy states. The technique deviates from the conventional approach, in which the signal of interest and the reference are spatially separated and then recombined to reconstruct the amplitude and phase of the signal and form a hologram. The principle can be extended to any detection configuration involving a periodic signal capable of undergoing interference, including sound waves, X-rays or femtosecond pulse waveforms.