The use of artificial intelligence (AI) scares many people: neural networks, modeled after the human brain, are so complex that even experts do not fully understand them. However, the risk that opaque algorithms pose to society varies depending on the application.
While AI can cause great damage in democratic elections through the manipulation of social media, in astrophysics it at worst leads to an incorrect view of the cosmos, says Dr. Jonas Glombitza from the Erlangen Center for Astroparticle Physics (ECAP) at Friedrich-Alexander Universität Erlangen-Nürnberg (FAU).
The astrophysicist uses AI to accelerate the analysis of data from an observatory that researches cosmic radiation.
The race toward scalable quantum computing has reached a pivotal moment, with major players like Microsoft, Google, and IBM pushing forward with breakthroughs. Microsoft’s recent announcement of its Majorana 1 chip marks a significant milestone, while Google’s Willow chip and IBM’s long-term quantum roadmap illustrate the industry’s diverse approaches to achieving fault-tolerant quantum systems. As the quantum computing industry debates the timeline for practical implementation, breakthroughs like Majorana 1 and Willow suggest that major advancements may be closer than previously thought. At the same time, skepticism remains, with industry leaders such as Nvidia CEO Jensen Huang cautioning that meaningful commercial quantum applications could still be decades away.
Microsoft is redefining quantum computing with its new Majorana 1 chip, a significant breakthrough in the pursuit of scalable and fault-tolerant quantum systems. This quantum processor is built on a novel topological architecture that integrates Majorana particles, exotic quantum states that enhance qubit stability and reduce errors. Unlike conventional qubit technologies, which require extensive error correction, Microsoft’s approach aims to build fault tolerance directly into the hardware, significantly improving the feasibility of large-scale quantum computing. Satya Nadella, Microsoft’s CEO, highlighted the significance of this milestone in his LinkedIn post: “We’ve created an entirely new state of matter, powered by a new class of materials, topoconductors. This fundamental leap in computing enables the first quantum processing unit built on a topological core.”
Sabine Hossenfelder (@SabineHossenfelder) argues that superdeterminism eliminates free will, challenging the idea of causal choice and possibly undermining science if the laws of physics govern all phenomena. However, inspired by daily life experiences in Southern California, I present a defense of indeterminism, countering the claim that everything is predetermined, while also exploring the ideas of cosmologists Raphael Bousso and Alan Guth.
Sabine Hossenfelder, a theoretical physicist, has argued in favor of superdeterminism, a theory that suggests the universe is deterministic and that our choices are predetermined.
According to her, the apparent randomness in quantum mechanics is an illusion, and the universe is actually a predetermined, clockwork-like system. She claims that if we knew enough about the initial conditions of the universe, we could predict every event, including human decisions.
Hossenfelder’s argument relies on the idea that the randomness in quantum mechanics is not fundamental, but rather a result of our lack of knowledge about the underlying variables. She suggests that if we could access these “hidden variables,” we would find that the universe is deterministic. However, this argument is flawed.
For example, consider the double-slit experiment, where particles passing through two slits create an interference pattern on a screen. Hossenfelder would argue that the particles’ behavior is predetermined, and that the apparent randomness is due to our lack of knowledge about the initial conditions. However, this ignores the fact that the act of observation itself can change the outcome of the experiment, a phenomenon known as wave function collapse.
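The interference pattern mentioned above can be illustrated numerically. The following is a minimal sketch of the idealized far-field two-slit intensity formula, I(x) ∝ cos²(π·d·x / (λ·L)); the function name and all parameter values are illustrative, not drawn from any experiment discussed here:

```python
import math

def two_slit_intensity(x, wavelength, slit_sep, screen_dist):
    """Idealized far-field intensity at screen position x,
    normalized to 1 at the central maximum (small-angle approximation)."""
    path_diff = slit_sep * x / screen_dist   # path difference between the two slits
    phase = math.pi * path_diff / wavelength
    return math.cos(phase) ** 2

# 500 nm light, slits 0.1 mm apart, screen 1 m away:
# bright fringes are spaced lambda*L/d = 5 mm apart
for x_mm in (0.0, 2.5, 5.0):
    intensity = two_slit_intensity(x_mm * 1e-3, 500e-9, 1e-4, 1.0)
    print(f"x = {x_mm} mm: I = {intensity:.2f}")
```

With these numbers, the intensity is maximal at 0 mm and 5 mm and vanishes at 2.5 mm, the dark fringe halfway between adjacent bright ones.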
Special thanks to Chuankun Zhang, Tian Ooi, Jacob S. Higgins, and Jack F. Doyle from Prof. Jun Ye’s lab at JILA/NIST/University of Colorado, as well as Prof. Victor Flambaum from UNSW’s Department of Theoretical Physics, for their valuable assistance and consultation on this video.
A Kansas State University engineer recently published results from an observational study in support of a century-old theory that directly challenges the validity of the Big Bang theory.
Lior Shamir, associate professor of computer science, used imaging from a trio of telescopes and more than 30,000 galaxies to measure the redshift of galaxies based on their distance from Earth. Redshift is the shift of a galaxy's emitted light toward lower frequencies, which astronomers use to gauge how fast the galaxy is receding.
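As an illustration of the quantity being measured, redshift is computed from the observed and rest-frame wavelengths of a known spectral line, and at low redshift translates to a recession velocity of roughly c·z. This is a generic sketch of those standard formulas, not code from Shamir's analysis; the function names and example values are illustrative:

```python
def redshift(lambda_observed, lambda_emitted):
    """Dimensionless redshift z = (observed - emitted) / emitted wavelength."""
    return (lambda_observed - lambda_emitted) / lambda_emitted

def recession_velocity(z, c=299_792.458):
    """Low-redshift approximation v = c * z, in km/s."""
    return c * z

# Example: the H-alpha line, emitted at 656.28 nm, observed at 662.84 nm
z = redshift(662.84, 656.28)   # about 0.01
v = recession_velocity(z)      # about 3000 km/s
```

The linear v = c·z relation holds only for small z; at larger redshifts the relativistic formula is needed, but the small-z form is what underlies simple distance-velocity comparisons.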
Shamir’s findings lend support to the century-old “tired light” theory instead of the Big Bang. The findings are published in the journal Particles.
Light-emitting diodes (LEDs) are widely used electroluminescent devices that emit light in response to an applied electric voltage. These devices are central components of various electronic and optoelectronic technologies, including displays, sensors and communication systems.
Over the past decades, some engineers have been developing alternative LEDs known as quantum LEDs (QLEDs), which utilize quantum dots (i.e., nm-size semiconducting particles) as light-emitting components instead of conventional semiconductors. Compared to traditional LEDs, these quantum dot-based devices could achieve better energy efficiency and operational stability.
Despite their potential, most QLEDs developed so far have been found to have significantly slower response speeds than typical LEDs using inorganic III-V semiconductors. In other words, they are known to take a longer time to emit light in response to an applied electrical voltage.
A team of researchers from the University of Ottawa has made significant strides in understanding the ionization of atoms and molecules, a fundamental process in physics that has implications for various fields including X-ray generation and plasma physics.
The research, titled “Orbital angular momentum control of strong-field ionization in atoms and molecules,” is published in Nature Communications.
Think about atoms—the building blocks of everything around us. Sometimes, they lose their electrons and become charged particles (that’s ionization). It happens in lightning, in plasma TVs, and even in the northern lights. Until now, scientists thought they could only control this process in limited ways.
Magnetic materials have become indispensable to various technologies that support our modern society, such as data storage devices, electric motors, and magnetic sensors.
High-magnetization ferromagnets are especially important for the development of next-generation spintronics, sensors, and high-density data storage technologies. Among these materials, the iron-cobalt (Fe-Co) alloy is widely used due to its strong magnetic properties. However, there is a limit to how much its performance can be improved, necessitating a new approach.
Some earlier studies have shown that epitaxially grown films made up of Fe-Co alloys doped with heavier elements exhibit remarkably high magnetization. Moreover, recent advances in computational techniques, such as the integration of machine learning with ab initio calculations, have significantly accelerated the search for new material compositions.