Archive for the ‘information science’ category: Page 193

Nov 29, 2020

Sorting Out Viruses With Machine Learning: AI-Powered Nanotechnology May Lead to New Rapid COVID-19 Tests

Posted in categories: biotech/medical, information science, nanotechnology, particle physics, robotics/AI

Scientists at Osaka University develop a label-free method for identifying respiratory viruses based on changes in electrical current when they pass through silicon nanopores, which may lead to new rapid COVID-19 tests.

The ongoing global pandemic has created an urgent need for rapid tests that can diagnose the presence of the SARS-CoV-2 virus, the pathogen that causes COVID-19, and distinguish it from other respiratory viruses. Now, researchers from Japan have demonstrated a new system for single-virion identification of common respiratory pathogens using a machine learning algorithm trained on changes in current across silicon nanopores. This work may lead to fast and accurate screening tests for diseases like COVID-19 and influenza.

In a study published this month in ACS Sensors, scientists at Osaka University introduced a new system using silicon nanopores sensitive enough to detect even a single virus particle when coupled with a machine learning algorithm.
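The paper's actual pipeline pairs nanopore current measurements with a trained classifier. As a purely illustrative sketch (the pulse shapes, virus names, and nearest-centroid classifier here are invented stand-ins, not the authors' method), the idea can be shown with simulated current traces: a virus passing through the pore briefly blocks the current, and simple features of that blockade distinguish particle types.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_pulse(depth, width, n=200, noise=0.02):
    """Simulate a nanopore current trace: flat baseline with one resistive pulse."""
    trace = np.ones(n)
    start = n // 2 - width // 2
    trace[start:start + width] -= depth
    return trace + rng.normal(0, noise, n)

def features(trace, threshold=0.9):
    """Extract mean blockade depth and dwell time (fraction of samples below threshold)."""
    below = trace < threshold
    dwell = below.sum()
    depth = 1.0 - trace[below].mean() if dwell else 0.0
    return np.array([depth, dwell / len(trace)])

# Hypothetical virus "signatures": (blockade depth, pulse width in samples)
classes = {"virus_A": (0.30, 40), "virus_B": (0.55, 25)}

# "Train": average the features of a few simulated pulses per class
centroids = {
    name: np.mean([features(simulate_pulse(d, w)) for _ in range(20)], axis=0)
    for name, (d, w) in classes.items()
}

def classify(trace):
    """Assign a trace to the nearest class centroid in feature space."""
    f = features(trace)
    return min(centroids, key=lambda name: np.linalg.norm(f - centroids[name]))

print(classify(simulate_pulse(0.30, 40)))
```

The real system learns far subtler waveform differences than this two-feature toy, but the structure is the same: single-particle events in, per-virion label out.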

Nov 28, 2020

Gut microbes: The key to normal sleep

Posted in categories: biotech/medical, food, information science, neuroscience

With fall and winter holidays coming up, many will be pondering the relationship between food and sleep. Researchers led by Professor Masashi Yanagisawa at the University of Tsukuba in Japan hope they can focus people on the important middlemen in the equation: bacterial microbes in the gut. Their detailed study in mice revealed the extent to which bacteria can change the environment and contents of the intestines, which ultimately impacts behaviors like sleep.

The experiment itself was fairly simple. The researchers gave a group of mice a powerful cocktail of antibiotics for four weeks, which depleted them of intestinal microorganisms. Then, they compared intestinal contents between these mice and control mice that had the same diet. Digestion breaks food down into bits and pieces called metabolites. The research team found significant differences between metabolites in the microbiota-depleted mice and the control mice. As Professor Yanagisawa explains, “we found more than 200 differences between mouse groups. About 60 normal metabolites were missing in the microbiota-depleted mice, and the others differed in the amount, some more and some less than in the control mice.”

The team next set out to determine what these metabolites normally do. Using metabolome set enrichment analysis, they found that the biological pathways most affected by the antibiotic treatment were those involved in making neurotransmitters, the molecules that cells in the brain use to communicate with each other. For example, the tryptophan–serotonin pathway was almost totally shut down; the microbiota-depleted mice had more tryptophan than controls, but almost zero serotonin. This shows that without important gut microbes, the mice could not make any serotonin from the tryptophan they were eating. The team also found that the mice were deficient in vitamin B6 metabolites, which accelerate production of the neurotransmitters serotonin and dopamine.

Nov 26, 2020

AI trained on the Bible spits out bleak religious prophecies

Posted in categories: existential risks, information science, robotics/AI

Code Unto Caesar

Durendal’s algorithm wrote scripture about three topics: “the plague,” “Caesar,” and “the end of days.” So it’s not surprising that things took a grim turn. The full text is riddled with glitches characteristic of AI-written texts, like excerpts where over half of the nouns are “Lord.” But some passages are more coherent and read like bizarre doomsday prophecies.

For example, from the plague section: “O LORD of hosts, the God of Israel; When they saw the angel of the Lord above all the brethren which were in the wilderness, and the soldiers of the prophets shall be ashamed of men.”
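Durendal's generator was reportedly a neural language model; as a much simpler, purely illustrative stand-in (the tiny corpus and function names below are invented), a bigram Markov chain shows how statistical text generators work, and why frequent words like “Lord” come to dominate the output:

```python
import random
from collections import defaultdict

# Toy corpus in the style of the training text
corpus = ("the lord said unto the prophets the lord of hosts "
          "shall be ashamed and the lord shall see the end of days").split()

# Build a bigram transition table: each word -> list of observed next words
table = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    table[a].append(b)

def generate(start, length, seed=0):
    """Sample a word sequence by repeatedly picking a random observed successor."""
    random.seed(seed)
    words = [start]
    for _ in range(length - 1):
        nxt = table.get(words[-1])
        if not nxt:  # dead end: word only ever appeared last
            break
        words.append(random.choice(nxt))
    return " ".join(words)

print(generate("the", 12))
```

Because “the” is followed by “lord” three times in this corpus, the chain keeps circling back to it, a crude version of the repetition the article describes.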

Nov 23, 2020

NASA Uses Powerful Supercomputers and AI to Map Earth’s Trees, Discovers Billions of Trees in West African Drylands

Posted in categories: information science, mapping, robotics/AI, supercomputing

Scientists from NASA’s Goddard Space Flight Center in Greenbelt, Maryland, and international collaborators demonstrated a new method for mapping the location and size of trees growing outside of forests, discovering billions of trees in arid and semi-arid regions and laying the groundwork for more accurate global measurement of carbon storage on land.

Using powerful supercomputers and machine learning algorithms, the team mapped the crown diameter – the width of a tree when viewed from above – of more than 1.8 billion trees across an area of more than 500,000 square miles, or 1,300,000 square kilometers. The team mapped how tree crown diameter, coverage, and density varied depending on rainfall and land use.
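The real pipeline ran a deep neural network over sub-meter commercial satellite imagery on supercomputers; the basic measurement, though, can be sketched at toy scale (synthetic image, simple thresholding instead of a trained network, all values invented): segment tree crowns from the background, then measure each crown's width in pixels.

```python
import numpy as np
from scipy import ndimage

# Toy "satellite image": dark circular tree crowns on bright sand
image = np.ones((40, 40))
for cy, cx, r in [(10, 10, 3), (25, 30, 5), (32, 8, 2)]:
    yy, xx = np.ogrid[:40, :40]
    image[(yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2] = 0.2

# Segment: threshold, then label connected components (candidate crowns)
mask = image < 0.5
labels, n_trees = ndimage.label(mask)

# Crown diameter: width of each crown's bounding box, in pixels
diameters = []
for sl in ndimage.find_objects(labels):
    h = sl[0].stop - sl[0].start
    w = sl[1].stop - sl[1].start
    diameters.append(max(h, w))

print(n_trees, sorted(diameters))
```

Scaling this idea to 1.8 billion trees is exactly where the supercomputers and the learned (rather than threshold-based) segmentation come in.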

Nov 17, 2020

Army-funded algorithm decodes brain signals responsible for behaviors

Posted in categories: biotech/medical, cyborgs, information science, neuroscience

The US Army is funding a project that could lead to brain-machine interfaces. It includes an algorithm capable of isolating brain signals and linking them to specific behaviors, like walking and breathing, and it could one day allow people to control prosthetics simply by thinking.
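The funded algorithm dissociates behavior-related signals from the rest of the recorded activity; as a heavily simplified illustration (simulated data, a plain least-squares decoder rather than the project's actual method), the core decoding step can look like this: fit a linear map from multichannel neural signals to a behavioral variable.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated recording: 5 channels, 200 time steps. One latent rhythm
# drives the "behavior" (e.g., gait phase); the rest is unrelated activity.
T, n_channels = 200, 5
latent = np.sin(np.linspace(0, 8 * np.pi, T))        # behavior-related rhythm
mixing = rng.normal(size=n_channels)                 # how each channel sees it
noise = rng.normal(scale=0.5, size=(T, n_channels))  # unrelated activity
signals = latent[:, None] * mixing + noise
behavior = latent                                    # target to decode

# Least-squares linear decoder: behavior ~ signals @ w
w, *_ = np.linalg.lstsq(signals, behavior, rcond=None)
decoded = signals @ w

corr = np.corrcoef(decoded, behavior)[0, 1]
print(f"decoding correlation: {corr:.2f}")
```

The hard part the Army-funded work addresses is doing this when many behaviors overlap in the same recording, which a single linear fit like this cannot untangle on its own.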


Nov 15, 2020

Get started on the new Advantage quantum computer. Try it for free

Posted in categories: computing, information science, quantum physics

Sign up for Leap™ and get a free minute of direct QC access time, which is enough to run between 400 and 4000 problems. Alternatively, get 20 minutes of free access to Leap’s quantum-classical hybrid solvers, which exploit the complementary strengths of both best-in-class classical algorithms and quantum resources.
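What these solvers actually consume is a quadratic binary optimization problem (QUBO). As a minimal sketch of the problem format, here is a QUBO for max-cut on a triangle, minimized by brute force in place of the quantum or hybrid sampler (no cloud access needed for the toy version):

```python
import itertools

# QUBO for max-cut on a triangle graph (vertices 0, 1, 2, all edges).
# Minimize sum of Q[i, j] * x_i * x_j over binary x. Each cut edge (i, j)
# contributes -1 via the terms -x_i - x_j + 2*x_i*x_j.
edges = [(0, 1), (1, 2), (0, 2)]
Q = {}
for i, j in edges:
    Q[(i, i)] = Q.get((i, i), 0) - 1
    Q[(j, j)] = Q.get((j, j), 0) - 1
    Q[(i, j)] = Q.get((i, j), 0) + 2

def energy(x):
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Brute force stands in for the quantum/hybrid sampler here
best = min(itertools.product([0, 1], repeat=3), key=energy)
print(best, energy(best))  # any 2-vs-1 split cuts 2 of the 3 edges
```

A dictionary in this `{(i, j): coefficient}` shape is, if D-Wave's Ocean SDK documentation is followed, what samplers such as `LeapHybridSampler` accept through their `sample_qubo` method; treat those API names as an assumption here, since the article itself doesn't cover them.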

Nov 14, 2020

The World Needs Nuclear Power, And We Shouldn’t Be Afraid Of It

Posted in categories: information science, nuclear energy, sustainability

Although many different approaches have been proposed to address this problem, it’s clear that any sustainable, long-term solution will include one important component: a transition to energy sources that don’t result in additional carbon dioxide emissions. While most of the ideas put forth — such as the hypothetical Green New Deal — focus on renewable energy sources like solar and wind power, there’s another option that we should seriously reconsider: nuclear fission power.


As we embrace green solutions, nuclear should absolutely be part of the equation.

Nov 13, 2020

CCNY team in quantum algorithm breakthrough

Posted in categories: computing, information science, particle physics, quantum physics

Researchers led by City College of New York physicist Pouyan Ghaemi report the development of a quantum algorithm with the potential to study a class of many-electron quantum systems using quantum computers. Their paper, entitled “Creating and Manipulating a Laughlin-Type ν=1/3 Fractional Quantum Hall State on a Quantum Computer with Linear Depth Circuits,” appears in the December issue of PRX Quantum, a journal of the American Physical Society.

“Quantum physics is the fundamental theory of nature which leads to formation of molecules and the resulting matter around us,” said Ghaemi, assistant professor in CCNY’s Division of Science. “It is already known that when we have a macroscopic number of quantum particles, such as electrons in the metal, which interact with each other, novel phenomena such as superconductivity emerge.”

However, until now, according to Ghaemi, tools to study systems with large numbers of interacting quantum particles and their novel properties have been extremely limited.
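“Linear depth” is the paper's key efficiency claim: the number of gate layers grows only linearly with the number of qubits. As a generic illustration only (this is not the authors' circuit, and the random gate below has no physical meaning), a brickwork pattern of nearest-neighbor two-qubit gates can be simulated on a small statevector to make the layer counting concrete:

```python
import numpy as np

def apply_two_qubit(state, gate, i, n):
    """Apply a 4x4 gate to adjacent qubits (i, i+1) of an n-qubit statevector."""
    psi = state.reshape([2] * n)
    psi = np.moveaxis(psi, [i, i + 1], [0, 1]).reshape(4, -1)
    psi = gate @ psi
    psi = np.moveaxis(psi.reshape([2, 2] + [2] * (n - 2)), [0, 1], [i, i + 1])
    return psi.reshape(-1)

n = 6
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0  # start in |000000>

# A random two-qubit unitary (QR decomposition of a random complex matrix)
rng = np.random.default_rng(0)
m = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
gate, _ = np.linalg.qr(m)

# Brickwork circuit: n layers (linear depth) of nearest-neighbor gates,
# alternating even and odd bonds each layer
gate_count = 0
for layer in range(n):
    for i in range(layer % 2, n - 1, 2):
        state = apply_two_qubit(state, gate, i, n)
        gate_count += 1

print(gate_count, abs(np.vdot(state, state)))  # O(n) depth, norm preserved
```

Linear depth matters because near-term quantum hardware loses coherence quickly: a state that can be prepared in O(n) layers is far more realistic to run than one requiring exponentially deep circuits.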

Nov 13, 2020

New study outlines steps higher education should take to prepare a new quantum workforce

Posted in categories: education, employment, information science, quantum physics

A new study outlines ways colleges and universities can update their curricula to prepare the workforce for a new wave of quantum technology jobs. Three researchers, including Rochester Institute of Technology Associate Professor Ben Zwickl, suggested steps that need to be taken in a new paper in Physical Review Physics Education Research after interviewing managers at more than 20 quantum technology companies across the U.S.

The study’s authors from University of Colorado Boulder and RIT set out to better understand the types of entry-level positions that exist in these companies and the educational pathways that might lead into those jobs. They found that while the companies still seek employees with traditional STEM degrees, they want the candidates to have a grasp of fundamental concepts in quantum information science and technology.

“For a lot of those roles, there’s this idea of being ‘quantum aware’ that’s highly desirable,” said Zwickl, a member of RIT’s Future Photon Initiative and Center for Advancing STEM Teaching, Learning and Evaluation. “The companies told us that many positions don’t need to have deep expertise, but students could really benefit from a one- or two-semester introductory sequence that teaches the foundational concepts, some of the hardware implementations, how the algorithms work, what a qubit is, and things like that. Then a graduate can bring in all the strength of a traditional STEM degree but can speak the language that the company is talking about.”

Nov 13, 2020

Google Brain Paper Demystifies Learned Optimizers

Posted in categories: information science, robotics/AI

Learned optimizers are algorithms that can be trained to solve optimization problems. Although learned optimizers can outperform baseline optimizers in restricted settings, the ML research community understands remarkably little about their inner workings or why they work as well as they do. In a paper currently under review for ICLR 2021, a Google Brain research team attempts to shed some light on the matter.

The researchers explain that optimization algorithms can be considered the basis of modern machine learning. A popular research area in recent years has focused on learning optimization algorithms by directly parameterizing and training an optimizer on a distribution of tasks.

Research on learned optimizers aims to replace the baseline “hand-designed” optimizers with a parametric optimizer trained on a set of tasks, which can then be applied more generally. In contrast to baseline optimizers that use simple update rules derived from theoretical principles, learned optimizers use flexible, high-dimensional, nonlinear parameterizations.
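The contrast can be made concrete with a deliberately tiny sketch (not the Google Brain setup, which trains high-dimensional neural optimizers across task distributions): here the "learned" optimizer is just a momentum-style update rule whose two parameters are meta-trained by grid search on a single quadratic task, then compared against a fixed hand-designed baseline.

```python
import numpy as np

def run_optimizer(lr, momentum, steps=50):
    """Minimize f(x) = x^T A x with a parameterized momentum update rule."""
    A = np.diag([1.0, 10.0])          # ill-conditioned quadratic task
    x = np.array([1.0, 1.0])
    v = np.zeros(2)
    for _ in range(steps):
        grad = 2 * A @ x
        v = momentum * v - lr * grad  # the update rule being "learned"
        x = x + v
    return x @ A @ x                  # final loss

# Meta-training: search over the optimizer's own parameters
candidates = [(lr, m) for lr in (0.001, 0.01, 0.05) for m in (0.0, 0.5, 0.9)]
best_lr, best_m = min(candidates, key=lambda p: run_optimizer(*p))

baseline = run_optimizer(lr=0.01, momentum=0.0)   # hand-designed SGD
learned = run_optimizer(best_lr, best_m)
print(f"baseline loss {baseline:.2e}, meta-trained loss {learned:.2e}")
```

Real learned optimizers replace the two scalars here with thousands of neural-network parameters and the grid search with gradient-based meta-training, which is precisely what makes their inner workings hard to interpret, the gap the paper tries to close.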