
Archive for the ‘information science’ category: Page 210

May 31, 2020

Self-driving laboratory for accelerated discovery of thin-film materials

Posted by in categories: information science, robotics/AI, solar power, sustainability

Discovering and optimizing commercially viable materials for clean energy applications typically takes more than a decade. Self-driving laboratories that iteratively design, execute, and learn from materials science experiments in a fully autonomous loop present an opportunity to accelerate this research process. We report here a modular robotic platform driven by a model-based optimization algorithm capable of autonomously optimizing the optical and electronic properties of thin-film materials by modifying the film composition and processing conditions. We demonstrate the power of this platform by using it to maximize the hole mobility of organic hole transport materials commonly used in perovskite solar cells and consumer electronics. This demonstration highlights the possibilities of using autonomous laboratories to discover organic and inorganic materials relevant to materials sciences and clean energy technologies.

Optimizing the properties of thin films is time intensive because of the large number of compositional, deposition, and processing parameters available (1, 2). These parameters are often correlated and can have a profound effect on the structure and physical properties of the film and any adjacent layers present in a device. There exist few computational tools for predicting the properties of materials with compositional and structural disorder, and thus, the materials discovery process still relies heavily on empirical data. High-throughput experimentation (HTE) is an established method for sampling a large parameter space (4, 5), but it is still nearly impossible to sample the full set of combinatorial parameters available for thin films. Parallelized methodologies are also constrained by the experimental techniques that can be used effectively in practice.
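
In spirit, the platform's closed loop is model-based optimization: propose processing conditions, let the robot make and measure a film, update a surrogate model, and repeat. The sketch below is only a minimal illustration of that loop, not the authors' code; the parameter names, ranges, the mock run_experiment() response, and the use of scikit-optimize's gp_minimize are all assumptions made for the example.

```python
# Minimal sketch of the "design -> run -> learn" loop, assuming a Bayesian
# optimizer (scikit-optimize's gp_minimize) stands in for the paper's
# model-based algorithm and run_experiment() stands in for the robot.
# Parameter names, ranges, and the mock response are illustrative only.
import numpy as np
from skopt import gp_minimize
from skopt.space import Real

def run_experiment(params):
    """Placeholder for the robotic workflow: deposit a film with the given
    composition/processing parameters, then measure its hole mobility."""
    dopant_fraction, anneal_temp_c = params
    # Mock response surface; a real platform would return a measurement.
    mobility = (np.exp(-((dopant_fraction - 0.12) ** 2) / 0.01)
                * np.exp(-((anneal_temp_c - 150.0) ** 2) / 2000.0))
    return -mobility  # minimize the negative to maximize mobility

search_space = [
    Real(0.0, 0.3, name="dopant_fraction"),
    Real(100.0, 220.0, name="anneal_temp_c"),
]

result = gp_minimize(run_experiment, search_space, n_calls=30, random_state=0)
print("best parameters:", result.x, "| estimated mobility:", -result.fun)
```

In the real platform, run_experiment() would drive the deposition and characterization hardware and return the measured hole mobility instead of a mock value.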

May 30, 2020

Scientists Use Artificial Intelligence and Computer Vision to Study Lithium-Ion Batteries

Posted by in categories: information science, particle physics, robotics/AI

New machine learning methods bring insights into how lithium-ion batteries degrade and show that the process is more complicated than many thought.

Lithium-ion batteries lose their juice over time, prompting scientists and engineers to work hard to understand that process in detail. Now, scientists at the Department of Energy’s SLAC National Accelerator Laboratory have combined sophisticated machine learning algorithms with X-ray tomography data to produce a detailed picture of how one battery component, the cathode, degrades with use.

The new study, published this month in Nature Communications, focused on how to better visualize what’s going on in cathodes made of nickel-manganese-cobalt, or NMC. In these cathodes, NMC particles are held together by a conductive carbon matrix, and researchers have speculated that one cause of performance decline could be particles breaking away from that matrix. The team’s goal was to combine cutting-edge capabilities at SLAC’s Stanford Synchrotron Radiation Lightsource (SSRL) and the European Synchrotron Radiation Facility (ESRF) to develop a comprehensive picture of how NMC particles break apart and break away from the matrix and how that might contribute to performance losses.
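
The underlying analysis task is essentially quantitative image analysis: segment individual NMC particles in the tomography data and then measure how they fracture and detach. The snippet below is a deliberately simple classical stand-in (thresholding and labeling a synthetic 2D slice), not the learned segmentation the SLAC team used; the synthetic data and the reported quantities are assumptions for illustration.

```python
# Illustrative only: a classical image-analysis stand-in for the kind of
# particle-level measurements described above. The study used machine-learning
# segmentation of X-ray tomography data; here we just threshold a synthetic
# 2D slice, label the "particles", and report simple shape statistics.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

rng = np.random.default_rng(0)
slice_2d = rng.normal(0.2, 0.05, size=(256, 256))        # "carbon matrix" background
yy, xx = np.mgrid[:256, :256]
for cy, cx, r in [(60, 80, 20), (150, 170, 30), (200, 60, 15)]:
    slice_2d[(yy - cy) ** 2 + (xx - cx) ** 2 < r ** 2] += 0.6   # synthetic "NMC particles"

mask = slice_2d > threshold_otsu(slice_2d)   # separate particles from matrix
labels = label(mask)
for region in regionprops(labels):
    print(f"particle {region.label}: area={region.area} px, "
          f"eccentricity={region.eccentricity:.2f}")
```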

May 30, 2020

What Do the Quark Oddities at the Large Hadron Collider Mean?

Posted by in categories: information science, particle physics

In their latest analysis, first presented at a seminar in March, the LHCb physicists found that several measurements involving the decay of B mesons conflict slightly with the predictions of the standard model of particle physics—the reigning set of equations describing the subatomic world. Taken alone, each oddity looks like a statistical fluctuation, and they may all evaporate with additional data, as has happened before. But their collective drift suggests that the aberrations may be breadcrumbs leading beyond the standard model to a more complete theory.

“For the first time in certainly my working life, there are a confluence of different decays that are showing anomalies that match up,” said Mitesh Patel, a particle physicist at Imperial College London who is part of LHCb.

The B meson is so named because it contains a bottom quark, one of six fundamental quark particles that account for most of the universe’s visible matter. For unknown reasons, quarks come in three generations: heavy, medium, and light, each containing a pair of quarks with different electric charges. Heavier quarks decay into their lighter variations, almost always switching their charge, too. For instance, when the negatively charged heavy bottom quark in a B meson drops a generation, it usually becomes a middleweight, positively charged “charm” quark.
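
The charge bookkeeping in that example is easy to check with textbook values (these are the standard quark charges, not anything specific to the LHCb measurements): the bottom quark carries charge -1/3 and the charm quark +2/3, with the emitted W boson carrying away the difference.

```latex
b\,\bigl(-\tfrac{1}{3}\bigr) \;\longrightarrow\; c\,\bigl(+\tfrac{2}{3}\bigr) \;+\; W^{-}\,(-1),
\qquad -\tfrac{1}{3} \;=\; +\tfrac{2}{3} \;+\; (-1).
```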

May 30, 2020

Algorithm quickly simulates a roll of loaded dice

Posted by in categories: encryption, finance, information science, robotics/AI

The fast and efficient generation of random numbers has long been an important challenge. For centuries, games of chance have relied on the roll of a die, the flip of a coin, or the shuffling of cards to bring some randomness into the proceedings. In the second half of the 20th century, computers started taking over that role, for applications in cryptography, statistics, and artificial intelligence, as well as for various simulations—climatic, epidemiological, financial, and so forth.
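
As a baseline for what "rolling a loaded die" means computationally, a standard approach is inverse-CDF sampling: build the cumulative distribution of the face weights and look up a uniform random draw in it. The snippet below shows only that textbook baseline, not the new algorithm the post describes, and the face weights are invented for the example.

```python
# Baseline illustration of "rolling a loaded die": inverse-CDF sampling from
# a given discrete distribution. This is NOT the new algorithm the post
# refers to, just the standard approach it improves upon; the face weights
# below are arbitrary.
import numpy as np

weights = np.array([1, 1, 1, 1, 1, 5], dtype=float)   # a die biased toward 6
probs = weights / weights.sum()
cdf = np.cumsum(probs)

def roll(rng):
    """Return a face 1-6 by locating a uniform draw in the cumulative distribution."""
    return int(np.searchsorted(cdf, rng.random(), side="right")) + 1

rng = np.random.default_rng(42)
rolls = [roll(rng) for _ in range(10_000)]
print("empirical frequency of 6:", rolls.count(6) / len(rolls))  # about 0.5
```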

May 30, 2020

OpenAI Finds Machine Learning Efficiency Is Outpacing Moore’s Law

Posted by in categories: entertainment, information science, robotics/AI

Eight years ago a machine learning algorithm learned to identify a cat—and it stunned the world. A few years later AI could accurately translate languages and take down world champion Go players. Now, machine learning has begun to excel at complex multiplayer video games like StarCraft and Dota 2 and subtle games like poker. AI, it would appear, is improving fast.

But how fast is fast, and what’s driving the pace? While better computer chips are key, AI research organization OpenAI thinks we should measure the pace of improvement of the actual machine learning algorithms too.

In a blog post and paper—authored by OpenAI’s Danny Hernandez and Tom Brown and published on the arXiv, an open repository for pre-print (or not-yet-peer-reviewed) studies—the researchers say they’ve begun tracking a new measure for machine learning efficiency (that is, doing more with less). Using this measure, they show AI has been getting more efficient at a wicked pace.
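
The headline figures, as reported in the OpenAI post, were roughly a 44-fold drop since 2012 in the compute needed to train a network to AlexNet-level image-classification performance, which works out to efficiency doubling about every 16 months. A quick back-of-the-envelope check of that framing, using the reported numbers rather than anything measured here:

```python
# Back-of-the-envelope check of the reported efficiency trend. The 44x gain
# over roughly 7 years is the figure quoted by OpenAI, not something computed
# from data in this post.
import math

gain = 44.0   # reported reduction in compute to reach AlexNet-level accuracy
years = 7.0   # 2012 to 2019

doublings = math.log2(gain)                      # about 5.5 halvings of compute
months_per_doubling = years * 12 / doublings
print(f"{doublings:.1f} doublings, one every {months_per_doubling:.1f} months")
# Prints roughly 15-16 months per doubling, versus the ~24 months
# usually quoted for Moore's Law.
```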

May 29, 2020

Solution to century-old math problem could predict transmission of infectious diseases

Posted by in categories: biotech/medical, information science, mathematics

A Bristol academic has achieved a milestone in statistical/mathematical physics by solving a 100-year-old physics problem—the discrete diffusion equation in finite space.
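
The post does not reproduce the equation itself, but for orientation, the discrete diffusion (lattice random-walk) equation on a finite one-dimensional domain is commonly written in the form below, where p_n(t) is the probability of occupying site n at time t and q is the hopping rate. This is the generic textbook form plus boundary conditions, not necessarily the exact setup treated in the Bristol work.

```latex
\frac{\mathrm{d}p_{n}(t)}{\mathrm{d}t}
  = q\,\bigl[\,p_{n+1}(t) + p_{n-1}(t) - 2\,p_{n}(t)\,\bigr],
\qquad n = 1, \dots, N .
```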

May 29, 2020

Quantum-Resistant Cryptography: Our Best Defense Against An Impending Quantum Apocalypse

Posted by in categories: computing, encryption, information science, quantum physics, security

As far back as 2015, the National Institute of Standards and Technology (NIST) began asking encryption experts to submit their candidate algorithms for testing against quantum computing’s expected capabilities — so this is an issue that has already been front of mind for security professionals and organizations. But even with an organization like NIST leading the way, working through all those algorithms to judge their suitability to the task will take time. Thankfully, others within the scientific community have also risen to the challenge and joined in the research.

It will likely take years for a consensus to coalesce around the most suitable algorithms, roughly the amount of time it took ECC encryption to gain mainstream acceptance. The good news is that such a timeframe should still leave the opportunity to arrive at, and widely deploy, quantum-resistant cryptography before quantum computers capable of sustaining the number of qubits necessary to seriously threaten RSA and ECC encryption become available to potential attackers.

The ongoing development of quantum-resistant encryption will be fascinating to watch, and security professionals will be sure to keep a close eye on which algorithms and encryption strategies ultimately prove most effective. The world of encryption is changing more quickly than ever, and it has never been more important for the organizations dependent on that encryption to ensure that their partners are staying ahead of the curve.

May 29, 2020

Eye-catching advances in some AI fields are not real

Posted by in categories: information science, robotics/AI

When tuned up, old algorithms can match the abilities of their successors.

May 29, 2020

Algorithm tracks down buried treasure among existing compounds

Posted by in categories: biotech/medical, chemistry, information science, robotics/AI, solar power

A machine-learning algorithm has been developed by scientists in Japan to breathe new life into old molecules. Called BoundLess Objective-free eXploration, or Blox, it allows researchers to search chemical databases for molecules with the right properties to see them repurposed. The team demonstrated the power of their technique by searching a database designed for drug discovery and finding molecules that could work in solar cells.

Chemical repurposing involves taking a molecule or material and finding an entirely new use for it. Suitable molecules for chemical repurposing tend to stand apart from the larger group when considering one property against another. These materials are said to be out-of-trend and can display previously undiscovered yet exceptional characteristics.

‘In public databases there are a lot of molecules, but each molecule’s properties are mostly unknown. These molecules have been synthesised for a particular purpose, for example drug development, so unrelated properties were not measured,’ explains Koji Tsuda of the Riken Centre for Advanced Intelligence, who led the development of Blox. ‘There are a lot of hidden treasures in databases.’
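
The "out-of-trend" idea is easy to picture with a toy calculation: plot one property against another for everything in a database, fit the bulk trend, and rank molecules by how far they sit from it. The sketch below is only that schematic picture, not the Blox algorithm itself; the two synthetic properties and the linear trend model are invented for the example.

```python
# Toy illustration of flagging "out-of-trend" molecules: fit the bulk trend
# between two properties and rank compounds by how far they fall from it.
# Schematic stand-in only, not Blox; the synthetic properties are invented.
import numpy as np

rng = np.random.default_rng(1)
prop_a = rng.uniform(0, 10, size=500)                 # e.g. a size-like property
prop_b = 0.8 * prop_a + rng.normal(0, 0.5, size=500)  # e.g. a correlated property
prop_b[:5] += rng.uniform(3, 5, size=5)               # a few genuine outliers

slope, intercept = np.polyfit(prop_a, prop_b, 1)      # the bulk trend
residuals = np.abs(prop_b - (slope * prop_a + intercept))
candidates = np.argsort(residuals)[::-1][:5]          # most out-of-trend molecules
print("indices of candidate molecules:", candidates)
```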

May 27, 2020

Why are neural networks so powerful?

Posted by in categories: information science, robotics/AI

It is common knowledge that neural networks are very powerful and can be used for almost any statistical learning problem with great results. But have you thought about why this is the case? Why is this method more powerful than many other algorithms in most scenarios?
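
One standard piece of the answer is flexibility: with enough hidden units, a network can approximate essentially any reasonable function, while a linear model cannot. The toy comparison below, with an invented target function and arbitrary model settings, is a sketch of that idea rather than a benchmark.

```python
# Toy comparison hinting at why neural networks are so flexible: a linear
# model cannot fit a nonlinear target, while a small multilayer network can.
# The target function and model settings are invented for this sketch.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(2000, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=2000)   # nonlinear target

linear = LinearRegression().fit(X, y)
mlp = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000,
                   random_state=0).fit(X, y)

print("linear model R^2:", round(linear.score(X, y), 3))   # near zero
print("small MLP  R^2:", round(mlp.score(X, y), 3))        # much higher
```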