
Archive for the ‘information science’ category: Page 77

May 23, 2023

Meta’s open-source speech AI recognizes over 4,000 spoken languages

Posted by in categories: information science, robotics/AI

Meta has created an AI language model that (in a refreshing change of pace) isn’t a ChatGPT clone. The company’s Massively Multilingual Speech (MMS) project can recognize over 4,000 spoken languages and produce speech (text-to-speech) in over 1,100. As with most of its other publicly announced AI projects, Meta is open-sourcing MMS today to help preserve language diversity and encourage researchers to build on its foundation. “Today, we are publicly sharing our models and code so that others in the research community can build upon our work,” the company wrote. “Through this work, we hope to make a small contribution to preserve the incredible language diversity of the world.”

Speech recognition and text-to-speech models typically require training on thousands of hours of audio with accompanying transcription labels. (Labels are crucial to machine learning, allowing the algorithms to correctly categorize and “understand” the data.) But for languages that aren’t widely used in industrialized nations — many of which are in danger of disappearing in the coming decades — “this data simply does not exist,” as Meta puts it.
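
For a sense of what such labeled training data looks like, here is a minimal sketch of a generic supervised speech-recognition corpus entry; the file names, transcripts, and layout are hypothetical and are not Meta’s actual MMS training format.

```python
from dataclasses import dataclass

@dataclass
class LabeledUtterance:
    """One supervised ASR training example: an audio clip plus its transcript."""
    audio_path: str   # path to a recorded utterance (e.g. a 16 kHz WAV file)
    transcript: str   # the text label the model learns to predict
    language: str     # ISO 639-3 code identifying the spoken language

# Hypothetical examples; real corpora need thousands of hours of such pairs.
dataset = [
    LabeledUtterance("clips/utt_0001.wav", "in the beginning was the word", "eng"),
    LabeledUtterance("clips/utt_0002.wav", "au commencement était la parole", "fra"),
]

for ex in dataset:
    print(f"[{ex.language}] {ex.audio_path} -> {ex.transcript!r}")
```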

Meta used an unconventional approach to collecting audio data: tapping into audio recordings of translated religious texts. “We turned to religious texts, such as the Bible, that have been translated in many different languages and whose translations have been widely studied for text-based language translation research,” the company said. “These translations have publicly available audio recordings of people reading these texts in different languages.” Incorporating the unlabeled recordings of the Bible and similar texts, Meta’s researchers increased the model’s available languages to over 4,000.

May 22, 2023

Will My Son’s Blood Make Me Younger?

Posted by in categories: biotech/medical, evolution, information science, life extension

At Blueprint we’ve explored and evaluated hundreds of anti-aging therapies.

Recently, we had a daring idea: what if my father, son and I completed the world’s first ever multi-generational plasma exchange?


May 20, 2023

Machine-learning program reveals genes responsible for sex-specific differences in Alzheimer’s disease progression

Posted by in categories: biotech/medical, genetics, information science, robotics/AI, sex

Alzheimer’s disease (AD) is a complex neurodegenerative illness with genetic and environmental origins. Females experience faster cognitive decline and cerebral atrophy than males, while males have greater mortality rates. Using a new machine-learning method they developed called “Evolutionary Action Machine Learning (EAML),” researchers at Baylor College of Medicine and the Jan and Dan Duncan Neurological Research Institute (Duncan NRI) at Texas Children’s Hospital have discovered sex-specific genes and molecular pathways that contribute to the development and progression of this condition. The study was published in Nature Communications.

“We have developed a unique machine-learning software that uses an advanced computational predictive metric called the evolutionary action (EA) score as a feature to identify genes that influence AD risk separately in males and females,” Dr. Olivier Lichtarge, MD, Ph.D., professor of biochemistry at Baylor College of Medicine, said. “This approach lets us exploit a massive amount of evolutionary data efficiently, so we can now probe smaller cohorts with greater accuracy and identify genes involved in AD.”

EAML is an ensemble computational approach that combines nine machine learning algorithms to analyze the functional impact of non-synonymous coding variants (DNA mutations that alter the structure and function of the resulting protein) and estimates their deleterious effect on protein function using the evolutionary action (EA) score.
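
As a rough sketch of the general workflow (not the published EAML implementation, which combines nine algorithms, real cohort data, and the EA metric), the snippet below trains a small heterogeneous ensemble on synthetic per-gene scores and ranks candidate genes by how much the ensemble relies on them:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in: rows are individuals, columns are per-gene EA-style
# variant-burden scores; y marks case (AD) vs. control status.
n_samples, n_genes = 300, 25
X = rng.normal(size=(n_samples, n_genes))
y = (X[:, 0] + 0.7 * X[:, 1] + rng.normal(size=n_samples) > 0).astype(int)

# A small heterogeneous ensemble (the published method combines nine learners).
models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:>20}: mean 5-fold CV accuracy = {acc:.3f}")

# Rank candidate genes by how much the tree ensemble relies on their scores.
rf = models["random_forest"].fit(X, y)
top_genes = np.argsort(rf.feature_importances_)[::-1][:5]
print("top-ranked gene indices:", top_genes.tolist())
```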

May 19, 2023

New data-driven algorithm can forecast the mortality risk for certain cardiac surgery patients

Posted by in categories: biotech/medical, health, information science, robotics/AI

A machine learning-based method developed by a Mount Sinai research team allows medical facilities to forecast the mortality risk for certain cardiac surgery patients. The new method is the first institution-specific model for assessing a cardiac patient’s risk before surgery and was developed using vast amounts of electronic health record (EHR) data.

Comparing the data-driven approach to the current population-derived models reveals a considerable performance improvement.
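
A minimal sketch of the general recipe, using synthetic data: train a gradient-boosted classifier on an institution’s own preoperative EHR features and compare it, by area under the ROC curve, with a fixed population-style score. The feature names, weights, and model choice here are illustrative assumptions, not the Mount Sinai team’s actual variables or pipeline.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic preoperative features standing in for an institution's EHR extract:
# age, creatinine, ejection fraction, and a redo-surgery flag (all illustrative).
n = 2000
age = rng.normal(65, 10, n)
creatinine = rng.normal(1.1, 0.4, n)
ejection_fraction = rng.normal(55, 10, n)
redo_surgery = rng.integers(0, 2, n)

risk = (
    0.04 * (age - 65)
    + 1.2 * (creatinine - 1.1)
    - 0.03 * (ejection_fraction - 55)
    + 0.8 * redo_surgery
)
mortality = (risk + rng.normal(size=n) > 1.0).astype(int)

X = np.column_stack([age, creatinine, ejection_fraction, redo_surgery])
X_train, X_test, y_train, y_test = train_test_split(
    X, mortality, test_size=0.3, random_state=0
)

# Institution-specific model trained on the local data.
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
local_auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

# Stand-in for a fixed population-derived score: a hand-weighted index.
population_score = 0.03 * X_test[:, 0] + 1.0 * X_test[:, 1] - 0.02 * X_test[:, 2]
population_auc = roc_auc_score(y_test, population_score)

print(f"institution-specific model AUC: {local_auc:.3f}")
print(f"population-derived score AUC:   {population_auc:.3f}")
```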

May 19, 2023

New algorithm-backed tool offers accurate tracking for deforestation crisis

Posted by in categories: information science, innovation

Approximately 27 football fields’ worth of forest is lost around the globe every minute, a massive annual loss of some 15 billion trees, according to the WWF.

Scientists have unveiled an innovative and comprehensive strategy to effectively detect and track large-scale forest disturbances, according to a new study published in the Journal of Remo.

Given this concerning context, the new forest monitoring approach could be a valuable tool for effectively monitoring and managing forests as they undergo changes over time.
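
As a quick sanity check on those headline numbers (a back-of-the-envelope calculation, not data from the study): 27 football fields per minute works out to roughly 14 million fields per year, so the 15-billion-tree figure implies on the order of a thousand trees per field.

```python
# Back-of-the-envelope check of the headline deforestation figures.
fields_per_minute = 27
minutes_per_year = 60 * 24 * 365

fields_per_year = fields_per_minute * minutes_per_year
trees_per_year = 15e9

print(f"fields lost per year: {fields_per_year:,}")                         # ~14.2 million
print(f"implied trees per field: {trees_per_year / fields_per_year:,.0f}")  # ~1,060
```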

May 18, 2023

A programmable surface plasmonic neural network to detect and process microwaves

Posted by in categories: information science, robotics/AI

AI tools based on artificial neural networks (ANNs) are being introduced in a growing number of settings, helping humans to tackle many problems faster and more efficiently. While most of these algorithms run on conventional digital devices and computers, electronic engineers have been exploring the potential of running them on alternative platforms, such as diffractive optical devices.

A research team led by Prof. Tie Jun Cui at Southeast University in China has recently developed a new programmable neural network based on a so-called spoof surface plasmon polariton (SSPP), a surface electromagnetic wave that propagates along planar interfaces. This newly proposed surface plasmonic neural network (SPNN) architecture, introduced in a paper in Nature Electronics, can detect and process microwaves, which could be useful for wireless communication and other technological applications.

“In digital hardware research for the implementation of artificial neural networks, optical neural networks and diffractive deep neural networks recently emerged as promising solutions,” Qian Ma, one of the researchers who carried out the study, told Tech Xplore. “Previous research focusing on optical neural networks showed that simultaneous high-level programmability and nonlinear computing can be difficult to achieve. Therefore, these ONN devices have usually been limited to designs without programmability, or only applied to simple recognition tasks (i.e., linear problems).”

May 17, 2023

How is human behaviour impacted by an unfair AI? A game of Tetris reveals all

Posted by in categories: entertainment, information science, robotics/AI

A team of researchers puts a spin on Tetris and observes how people play the modified game.

We live in a world run by machines. They make important decisions for us, like whom to hire, who gets approved for a loan, or which content to recommend on social media. Machines and computer programs have an increasing influence over our lives, now more than ever, as artificial intelligence (AI) makes inroads in new ways. And this influence goes far beyond the person directly interacting with the machine.


A Cornell University-led experiment in which two people play a modified version of Tetris revealed that players who get fewer turns perceived the other player as less likable, regardless of whether a person or an algorithm allocated the turns.


May 16, 2023

Compression algorithms run on AI hardware to simulate nature’s most complex systems

Posted by in categories: climatology, information science, robotics/AI, space

High-performance computing (HPC) has become an essential tool for processing large datasets and simulating nature’s most complex systems. However, researchers face difficulties in developing more intensive models because Moore’s Law—which states that computational power doubles every two years—is slowing, and memory bandwidth still cannot keep up with it. But scientists can speed up simulations of complex systems by using compression algorithms running on AI hardware.

A team led by computer scientist Hatem Ltaief is tackling this problem head-on by employing hardware designed for artificial intelligence (AI) to help scientists make their code more efficient. In a paper published in the journal High Performance Computing, the team reports making simulations up to 150 times faster in the diverse fields of climate modeling, astronomy, seismic imaging and wireless communications.

Previously, Ltaief and co-workers showed that many scientists were riding the wave of hardware development and “over-solving” their models, carrying out lots of unnecessary calculations.
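
As a toy illustration of the compression idea (not the team’s actual library or algorithms): many dense blocks arising in such simulations are numerically low-rank, so replacing them with truncated factorizations cuts both memory traffic and arithmetic at a controlled loss of accuracy, the kind of trade-off that maps well onto accelerators built for fast matrix multiplies.

```python
import numpy as np

rng = np.random.default_rng(2)

# A dense block from a smooth kernel, the kind of matrix that is numerically
# low-rank in many climate, seismic, and astronomy codes.
n = 512
x = np.linspace(0.0, 1.0, n)
A = 1.0 / (1.0 + 50.0 * np.abs(x[:, None] - x[None, :]) ** 2)

# Compress with a truncated SVD at a chosen accuracy threshold.
U, s, Vt = np.linalg.svd(A)
tol = 1e-6
rank = int(np.sum(s > tol * s[0]))
U_k, s_k, Vt_k = U[:, :rank], s[:rank], Vt[:rank, :]

# Apply the compressed operator to a vector and measure the error.
v = rng.normal(size=n)
exact = A @ v
approx = U_k @ (s_k * (Vt_k @ v))
rel_err = np.linalg.norm(exact - approx) / np.linalg.norm(exact)

full_storage = A.size
compressed_storage = U_k.size + s_k.size + Vt_k.size
print(f"numerical rank: {rank} of {n}")
print(f"storage ratio:  {compressed_storage / full_storage:.3f}")
print(f"matvec relative error: {rel_err:.2e}")
```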

May 16, 2023

Supercomputing simulations spot electron orbital signatures

Posted by in categories: information science, mathematics, particle physics, quantum physics, supercomputing

Something not Musk:

No one will ever be able to see a purely mathematical construct such as a perfect sphere. But now, scientists using supercomputer simulations and atomic resolution microscopes have imaged the signatures of electron orbitals, which are defined by mathematical equations of quantum mechanics and predict where an atom’s electron is most likely to be.

Scientists at UT Austin, Princeton University, and ExxonMobil have directly observed the signatures of electron orbitals in two different transition-metal atoms, iron (Fe) and cobalt (Co), present in metal-phthalocyanines. Those signatures are apparent in the forces measured by atomic force microscopes, which often reflect the underlying orbitals and can be interpreted accordingly.
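
For context, a hedged illustration using textbook hydrogen-like formulas rather than the paper’s simulations: an orbital is a function defined by the equations of quantum mechanics, and its squared magnitude gives the probability density that the measured signatures ultimately trace back to. The sketch below evaluates the familiar 2p_z orbital on a grid, in atomic units.

```python
import numpy as np

# Hydrogen-like 2p_z orbital (psi_210) in atomic units (Bohr radius a0 = 1):
# psi = (1 / (4 * sqrt(2*pi))) * Z**1.5 * (Z*r) * exp(-Z*r/2) * cos(theta)
def psi_2pz(x, y, z, Z=1.0):
    r = np.sqrt(x**2 + y**2 + z**2)
    cos_theta = np.divide(z, r, out=np.zeros_like(r), where=r > 0)
    return (1.0 / (4.0 * np.sqrt(2.0 * np.pi))) * Z**1.5 * (Z * r) * np.exp(-Z * r / 2.0) * cos_theta

# Evaluate |psi|^2 on a coarse grid and locate where the electron is most likely.
grid = np.linspace(-10.0, 10.0, 81)
X, Y, Z3 = np.meshgrid(grid, grid, grid, indexing="ij")
density = np.abs(psi_2pz(X, Y, Z3)) ** 2

i, j, k = np.unravel_index(np.argmax(density), density.shape)
print(f"density peaks near (x, y, z) = ({grid[i]:.1f}, {grid[j]:.1f}, {grid[k]:.1f}) a0")

# Check normalization numerically: the integral of |psi|^2 should be close to 1.
dv = (grid[1] - grid[0]) ** 3
print(f"integral of |psi|^2 over the box: {np.sum(density) * dv:.3f}")
```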


May 16, 2023

Quantum Computing Algorithm Breakthrough Brings Practical Use Closer to Reality

Posted by in categories: chemistry, computing, information science, quantum physics

Of all the common refrains in the world of computing, the phrase “if only software would catch up with hardware” would probably rank pretty high. And yet, software does sometimes catch up with hardware. In fact, it seems that this time, software can go as far as unlocking quantum computations for classical computers. That’s according to researchers with the RIKEN Center for Quantum Computing, Japan, who have published work on an algorithm that significantly accelerates a specific quantum computing workload. More significantly, the workload itself, computing time-evolution operators, has applications in condensed matter physics and quantum chemistry, two fields that can unlock new worlds within our own.
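
For context, and as standard textbook material rather than the RIKEN group’s specific construction: the time-evolution operator that such workloads approximate is the exponential of the system’s Hamiltonian, and a common way to break it into implementable steps is a Trotter-Suzuki product formula:

```latex
U(t) = e^{-iHt}, \qquad H = \sum_{j} H_j
\quad\Longrightarrow\quad
e^{-iHt} \approx \Bigl(\prod_{j} e^{-iH_j t/n}\Bigr)^{n},
\qquad \text{error } O\!\left(\tfrac{t^2}{n}\right).
```

Increasing n reduces the splitting error at the cost of applying more factors, which is the kind of trade-off that better algorithms, classical or quantum, try to shift.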

Normally, an improved algorithm wouldn’t be completely out of the ordinary; updates are everywhere, after all. Every app update, software update, or firmware upgrade is essentially bringing revised code that either solves problems or improves performance (hopefully). And improved algorithms are nice, as anyone with a graphics card from either AMD or NVIDIA can attest. But let’s face it: We’re used to being disappointed with performance updates.

Page 77 of 322