
Archive for the ‘information science’ category: Page 179

May 22, 2021

Scientists Just Made A Quantum Computing Breakthrough!!

Posted by in categories: information science, particle physics, quantum physics, supercomputing

Keep watching for a look at three of the most remarkable quantum computing breakthroughs in science today! Subscribe to Futurity for more videos.

#quantum #quantumcomputing #google

May 21, 2021

Researchers see atoms at record resolution

Posted by in categories: information science, particle physics

In 2018, Cornell researchers built a high-powered detector that, in combination with an algorithm-driven process called ptychography, set a world record by tripling the resolution of a state-of-the-art electron microscope.

As successful as it was, that approach had a weakness. It worked only with ultrathin samples that were a few atoms thick. Anything thicker would cause the electrons to scatter in ways that could not be disentangled.

Now a team, again led by David Muller, the Samuel B. Eckert Professor of Engineering, has bested its own record by a factor of two with an electron microscope pixel array detector (EMPAD) that incorporates even more sophisticated 3D reconstruction algorithms.
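
To make the idea of an "algorithm-driven process" concrete, here is a toy Python sketch of the forward model that ptychography inverts. It is an illustration only, not Cornell's EMPAD pipeline: every array size and scan step below is invented for the example.

```python
import numpy as np

# Toy forward model behind ptychography: each recorded diffraction
# pattern is the far-field intensity of (probe x shifted object).
rng = np.random.default_rng(0)
obj = np.exp(1j * rng.uniform(0, 0.5, (64, 64)))       # unknown specimen (phase object)
probe = np.zeros((64, 64)); probe[24:40, 24:40] = 1.0  # localized electron beam

def diffraction_pattern(shift):
    # Illuminate a shifted region and record |FFT|^2: all a far-field
    # detector can measure, since the phase of the wave is lost.
    exit_wave = probe * np.roll(obj, shift, axis=(0, 1))
    return np.abs(np.fft.fft2(exit_wave)) ** 2

# Overlapping scan positions provide the redundancy that lets iterative
# reconstruction algorithms recover both the lost phase and the object.
patterns = [diffraction_pattern((dy, dx))
            for dy in range(0, 32, 8) for dx in range(0, 32, 8)]
print(len(patterns), "patterns of shape", patterns[0].shape)
```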

May 19, 2021

Unknown Physics on the Cosmic Scale? 1000 Supernova Explosions Chart the Expansion History of the Universe

Posted by in categories: cosmology, information science, physics

An international research team analyzed a database of more than 1000 supernova explosions and found that models for the expansion of the Universe best match the data when a new time-dependent variation is introduced. If proven correct with future, higher-quality data from the Subaru Telescope and other observatories, these results could indicate still-unknown physics at work on the cosmic scale.

Edwin Hubble’s observations over 90 years ago showing the expansion of the Universe remain a cornerstone of modern astrophysics. But when you get into the details of calculating how fast the Universe was expanding at different times in its history, scientists have difficulty getting theoretical models to match observations.

To solve this problem, a team led by Maria Dainotti (Assistant Professor at the National Astronomical Observatory of Japan and the Graduate University for Advanced Studies, SOKENDAI in Japan and an affiliated scientist at the Space Science Institute in the U.S.A.) analyzed a catalog of 1048 supernovae which exploded at different times in the history of the Universe. The team found that the theoretical models can be made to match the observations if one of the constants used in the equations, appropriately called the Hubble constant, is allowed to vary with time.
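
For illustration, a minimal Python sketch of that kind of fit follows. The power-law form H0(z) = H0 / (1 + z)^alpha is an assumption chosen for the example because it smoothly reduces to a constant at alpha = 0; it may not match the paper's exact parameterization, and the "catalog" here is synthetic.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import curve_fit

C = 299792.458  # speed of light, km/s

def distance_modulus(z, H0, alpha, Om=0.3):
    # Hubble rate with a redshift-dependent "constant": H0(z) = H0 / (1+z)**alpha.
    def inv_H(zp):
        return 1.0 / ((H0 / (1 + zp) ** alpha) * np.sqrt(Om * (1 + zp) ** 3 + 1 - Om))
    z = np.atleast_1d(z)
    d_c = C * np.array([quad(inv_H, 0.0, zi)[0] for zi in z])  # comoving distance, Mpc
    d_L = (1 + z) * d_c                                        # luminosity distance
    return 5 * np.log10(d_L) + 25                              # distance modulus

# Synthetic stand-in for a supernova catalog, generated with alpha = 0.1.
rng = np.random.default_rng(1)
z_obs = np.linspace(0.05, 1.5, 30)
mu_obs = distance_modulus(z_obs, 70.0, 0.1) + rng.normal(0.0, 0.05, 30)

popt, _ = curve_fit(distance_modulus, z_obs, mu_obs, p0=[70.0, 0.0])
print("fitted H0, alpha:", popt.round(3))  # alpha near 0.1 recovers the input
```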

May 18, 2021

Brain-Computer Interface Translates Brain Signals Associated with Handwriting into Text

Posted by in categories: computing, information science, neuroscience

Researchers with the BrainGate Collaboration have deciphered the brain activity associated with handwriting. Working with a participant with paralysis (65 years old at the time of the study) who has sensors implanted in his brain, they used an algorithm to identify letters as he attempted to write them, and the system then displayed the text on a screen. By attempting handwriting, the participant typed 90 characters per minute, more than double the previous record for typing with a brain-computer interface.
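
As a loose illustration of that decoding step, here is a Python sketch that maps windows of multi-electrode activity to letters. BrainGate's actual decoder is a recurrent neural network working on real neural recordings; everything below (channel counts, features, data) is synthetic and invented for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_channels, n_letters, trials = 192, 26, 40

# Fake "neural" data: each letter gets its own mean activity pattern,
# and each writing attempt is that pattern plus noise.
templates = rng.normal(0, 1, (n_letters, n_channels))
X = np.vstack([t + rng.normal(0, 0.8, (trials, n_channels)) for t in templates])
y = np.repeat(np.arange(n_letters), trials)

# Train a classifier, then decode a fresh attempt at the letter "h".
decoder = LogisticRegression(max_iter=1000).fit(X, y)
attempt = templates[7] + rng.normal(0, 0.8, n_channels)
print(chr(ord("a") + decoder.predict([attempt])[0]))  # most likely: 'h'
```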

So far, a major focus of brain-computer interface research has been on restoring gross motor skills, such as reaching and grasping or point-and-click typing with a computer cursor.

May 18, 2021

Understanding dimensionality reduction in machine learning models

Posted by in categories: information science, robotics/AI

Machine learning algorithms have gained fame for being able to ferret out relevant information from datasets with many features, such as tables with dozens of columns and images with millions of pixels. Thanks to advances in cloud computing, you can often run very large machine learning models without noticing how much computational power is working behind the scenes.

But every new feature that you add to your problem adds to its complexity, making it harder to solve with machine learning algorithms. To cope with this, data scientists use dimensionality reduction, a set of techniques that remove redundant and irrelevant features from their machine learning models.
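
As a minimal concrete example, principal component analysis (PCA) is the most common dimensionality reduction technique; the sketch below projects scikit-learn's bundled 64-pixel digit images onto just ten directions of maximum variance.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

X = load_digits().data                 # 1797 samples x 64 features
pca = PCA(n_components=10).fit(X)
X_reduced = pca.transform(X)           # same samples, 10 features each
print(X.shape, "->", X_reduced.shape)
print("variance retained:", pca.explained_variance_ratio_.sum().round(3))
```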

May 13, 2021

Israel wary of Iran allies spreading war beyond Gaza

Posted by in category: information science

By Tom O’Connor.

“Of course, we are supporters,” a Hezbollah spokesperson told Newsweek. “But I don’t think they’re in need of our people. The numbers are available. All the rockets and capabilities are in the hands of the resistance fighters in Palestine.”

Hezbollah leadership also felt there was more to come.

May 10, 2021

A New Method Simulates the Universe 1000 Times Faster

Posted by in categories: cosmology, information science, robotics/AI

Cosmologists love universe simulations. Even models covering hundreds of millions of light years can be useful for understanding fundamental aspects of cosmology and the early universe. There’s just one problem – they’re extremely computationally intensive. A 500 million light year swath of the universe could take more than 3 weeks to simulate… Now, scientists led by Yin Li at the Flatiron Institute have developed a way to run these cosmically huge models 1000 times faster. That 500 million light year swath could then be simulated in 36 minutes.

Older algorithms took such a long time in part because of a tradeoff. Existing models could either simulate a very detailed, very small slice of the cosmos or a vaguely detailed larger slice of it. They could provide either high resolution or a large area to study, not both.

To overcome this dichotomy, Dr. Li turned to an AI technique called a generative adversarial network (GAN). A GAN pits two neural networks against each other: a generator that produces candidate outputs (here, high-resolution simulation patches) and a discriminator that tries to tell them apart from the real thing. As training iterates, the generator gets progressively better at fooling the discriminator, until its outputs are hard to distinguish from genuine high-resolution simulations.
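
A minimal GAN skeleton makes the setup concrete. This is a generic toy on one-dimensional data using PyTorch, not the Flatiron team's super-resolution model; architectures and numbers are invented for the example.

```python
import torch
from torch import nn

# Generator maps random noise to samples; discriminator scores realness.
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 2.0  # stand-in "real" data (mean 2.0)
    fake = G(torch.randn(64, 8))
    # Discriminator: push real samples toward 1, generated ones toward 0.
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator: try to make the discriminator output 1 on fakes.
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(G(torch.randn(1000, 8)).mean().item())  # should drift toward ~2.0
```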

May 7, 2021

Latest Neural Nets Solve World’s Hardest Equations Faster Than Ever Before

Posted by in categories: information science, robotics/AI

Two new approaches allow deep neural networks to solve entire families of partial differential equations, making it easier to model complicated systems and to do so orders of magnitude faster.

May 6, 2021

New device can measure glucose in sweat with the touch of a fingertip

Posted by in categories: biotech/medical, chemistry, information science

Many people with diabetes endure multiple, painful finger pricks each day to measure their blood glucose. Now, researchers reporting in ACS Sensors have developed a device that can measure glucose in sweat with the touch of a fingertip, and then a personalized algorithm provides an accurate estimate of blood glucose levels.

According to the American Diabetes Association, more than 34 million children and adults in the U.S. have diabetes. Although self-monitoring of blood glucose is a critical part of diabetes management, the pain and inconvenience caused by finger-stick blood sampling can keep people from testing as often as they should.

The researchers made a touch-based sweat glucose sensor with a polyvinyl alcohol hydrogel on top of an electrochemical sensor, which was screen-printed onto a flexible plastic strip. When a volunteer placed their fingertip on the sensor surface for 1 minute, the hydrogel absorbed tiny amounts of sweat. Inside the sensor, glucose in the sweat underwent an enzymatic reaction that resulted in a small electrical current that was detected by a hand-held device.
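
The paper describes the calibration only at a high level, but the idea of a personalized algorithm can be sketched as a simple per-user fit: pair a few sensor readings with reference finger-stick values once, then estimate blood glucose from current alone. The linear model and all numbers below are assumptions for illustration, not the authors' published algorithm.

```python
import numpy as np

# One-time personal calibration: sensor current vs. finger-stick glucose.
current_nA = np.array([12.0, 18.5, 25.1, 33.4])  # hypothetical readings
fingerstick = np.array([82, 110, 141, 178])      # reference values, mg/dL
slope, intercept = np.polyfit(current_nA, fingerstick, 1)

def estimate_glucose(i_nA):
    # Later, blood glucose is estimated from a sweat reading alone.
    return slope * i_nA + intercept

print(round(estimate_glucose(21.0)), "mg/dL (estimated)")
```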

May 6, 2021

New algorithm uses a hologram to control trapped ions

Posted by in categories: computing, engineering, holograms, information science, quantum physics

Researchers have discovered the most precise way to control individual ions using holographic optical engineering technology.

The new technology uses the first known holographic optical engineering device to control trapped ion qubits. This technology promises to help create more precise controls of qubits that will aid the development of quantum industry-specific hardware to further new quantum simulation experiments and potentially quantum error correction processes for trapped ion qubits.

“Our algorithm calculates the hologram’s profile and removes any aberrations from the light, which lets us develop a highly precise technique for programming ions,” says lead author Chung-You Shih, a Ph.D. student at the University of Waterloo’s Institute for Quantum Computing (IQC).
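
The article does not spell out the algorithm itself, but the textbook starting point for computing such a phase hologram is the Gerchberg-Saxton algorithm; here is a minimal numpy sketch, with all sizes and targets invented for illustration. The IQC team's method additionally characterizes and removes optical aberrations, which this sketch does not attempt.

```python
import numpy as np

def gerchberg_saxton(source_amp, target_amp, iters=200):
    # Find a phase profile that steers a known source beam into a
    # desired target intensity pattern (textbook Gerchberg-Saxton).
    phase = np.random.default_rng(0).uniform(0, 2 * np.pi, source_amp.shape)
    for _ in range(iters):
        # Hologram plane -> image plane (far field, modeled by an FFT).
        far = np.fft.fft2(source_amp * np.exp(1j * phase))
        # Keep the propagated phase but enforce the desired amplitude.
        far = target_amp * np.exp(1j * np.angle(far))
        # Propagate back and keep only the phase at the hologram plane.
        phase = np.angle(np.fft.ifft2(far))
    return phase  # phase pattern to program onto the light modulator

# Toy usage: Gaussian illumination, target = one bright spot ("ion site").
n = 64
y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
source = np.exp(-(x**2 + y**2) / 0.2)
target = np.zeros((n, n)); target[10, 20] = 1.0
hologram_phase = gerchberg_saxton(source, target)
```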