Blog

Archive for the ‘information science’ category: Page 158

Nov 19, 2021

Researchers Find Human Learning Can be Duplicated in Synthetic Matter

Posted by in categories: information science, robotics/AI

Rutgers researchers and their collaborators have found that learning — a universal feature of intelligence in living beings — can be mimicked in synthetic matter, a discovery that in turn could inspire new algorithms for artificial intelligence (AI).

The study appears in the journal PNAS.

One of the fundamental characteristics of humans is the ability to continuously learn from and adapt to changing environments. But until recently, AI has been narrowly focused on emulating human logic. Now, researchers are looking to mimic human cognition in devices that can learn, remember and make decisions the way a human brain does.

Nov 18, 2021

Understanding Bias in AI: What Is Your Role, and Should You Care?

Posted by in categories: information science, robotics/AI

There are billions of people around the world whose online experience is shaped by algorithms that use artificial intelligence (AI) and machine learning (ML). Some form of AI or ML is employed almost every time people go online, whether they are searching for content, watching a video, or shopping for a product. Not only do these technologies increase the efficiency and accuracy of consumption; in the online ecosystem, service providers also innovate upon and monetize behavioral data captured directly from a user’s device, from a website visit, or by third parties.

Advertisers are increasingly dependent on this data and the algorithms that adtech and martech employ to understand where their ads should be placed, which ads consumers are likely to engage with, which audiences are most likely to convert, and which publisher should get credit for conversions.

Additionally, the collection and better utilization of data helps publishers generate revenue, minimize data risks and costs, and provide relevant consumer-preference-based audiences for brands.

Nov 17, 2021

A computer algorithm that speeds up experiments on plasma

Posted by in categories: biotech/medical, computing, information science, nuclear energy

A team of researchers from Tri Alpha Energy Inc. and Google has developed an algorithm that can be used to speed up experiments conducted with plasma. In their paper published in the journal Scientific Reports, the group describes how they plan to use the algorithm in nuclear fusion research.

As research into harnessing fusion has progressed, scientists have found that some of the problems involved are too complex to be solved in a reasonable amount of time using current technology, so they have increasingly turned to computers for help. More specifically, they want an efficient way to adjust certain parameters in a device created to achieve fusion. Such a device, most in the field agree, must produce a certain type of plasma that is not too hot or too cold, is stable, and has a certain desired density.

Finding the right parameters to meet these conditions has involved an incredible amount of trial and error. In this new effort, the researchers sought to reduce the workload by using a computer to eliminate some of the needed trials. To that end, they created what they call the “optometrist’s algorithm.” In its most basic sense, it works like an optometrist measuring a patient’s vision by showing them images and asking whether each is better or worse than the others. The idea is to combine the number-crunching power of a computer with the judgment of a human being: the computer generates the options, and the human tells it whether a given option is better or worse.
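In outline, that human-in-the-loop search can be sketched as a simple pairwise-comparison loop. This is a minimal illustration under stated assumptions: the function names and the toy one-knob “judge” below are invented for the sketch and are not taken from the TAE/Google paper.

```python
import random

random.seed(0)  # reproducible demo

def optometrist_search(propose, is_better, n_rounds):
    """Keep whichever of two candidate settings the judge prefers.

    propose(current)  -> a new candidate (perturbation of `current`)
    is_better(a, b)   -> True if b is judged better than a (the "human")
    """
    current = propose(None)  # arbitrary starting candidate
    for _ in range(n_rounds):
        candidate = propose(current)
        if is_better(current, candidate):
            current = candidate  # "better" -> keep the new settings
    return current

# Toy demo: one knob in [0, 1]; the judge prefers values near a hidden optimum.
HIDDEN_OPTIMUM = 0.7

def propose(current):
    if current is None:
        return random.uniform(0.0, 1.0)
    return min(1.0, max(0.0, current + random.uniform(-0.1, 0.1)))

def judge(a, b):
    return abs(b - HIDDEN_OPTIMUM) < abs(a - HIDDEN_OPTIMUM)

best = optometrist_search(propose, judge, n_rounds=200)
```

In the real experiments, the “knob” is a whole vector of plasma-device settings and the judge is a physicist comparing the outcomes of successive shots.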

Nov 17, 2021

Do Androids Dream of Electric Sheep? Dr. Ben Goertzel with Philip K. Dick at the Web Summit 2019

Posted by in categories: bitcoin, information science, internet, robotics/AI, singularity

Dr. Ben Goertzel with Philip K. Dick at the Web Summit in Lisbon 2019.

Ben showcases the use of OpenCog within the SingularityNET environment, which powers the AI of the Philip K. Dick robot.


Nov 17, 2021

Mathematicians derive the formulas for boundary layer turbulence 100 years after the phenomenon was first formulated

Posted by in categories: information science, mathematics

Turbulence makes many people uneasy or downright queasy. And it’s given researchers a headache, too. Mathematicians have been trying for a century or more to understand the turbulence that arises when a flow interacts with a boundary, but a formulation has proven elusive.

Now an international team of mathematicians, led by UC Santa Barbara professor Björn Birnir and University of Oslo professor Luiza Angheluta, has published a complete description of boundary layer turbulence. The paper appears in Physical Review Research and synthesizes decades of work on the topic. The theory unites empirical observations with the Navier-Stokes equation—the mathematical foundation of fluid dynamics—into a single mathematical formulation.

This phenomenon was first described around 1920 by Hungarian physicist Theodore von Kármán and German physicist Ludwig Prandtl, two luminaries in fluid dynamics. “They were honing in on what’s called boundary layer turbulence,” said Birnir, director of the Center for Complex and Nonlinear Science. This is turbulence caused when a flow interacts with a boundary, such as the fluid’s surface, a pipe wall, the surface of the Earth and so forth.
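For context, the classical result that von Kármán and Prandtl were converging on is the logarithmic law of the wall, which describes the mean velocity profile in this boundary layer. This is the standard textbook form, not the new paper’s full formulation:

```latex
% Logarithmic law of the wall (Prandtl / von Kármán):
% mean streamwise velocity in the turbulent boundary layer, in wall units.
u^+ = \frac{1}{\kappa}\,\ln y^+ + B,
\qquad
u^+ = \frac{\bar{u}}{u_\tau},
\quad
y^+ = \frac{y\,u_\tau}{\nu},
```

where $u_\tau$ is the friction velocity, $\nu$ the kinematic viscosity, $\kappa \approx 0.41$ the von Kármán constant, and $B \approx 5$ for a smooth wall.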

Nov 16, 2021

New algorithms advance the computing power of early-stage quantum computers

Posted by in categories: chemistry, computing, information science, quantum physics

A group of scientists at the U.S. Department of Energy’s Ames Laboratory has developed computational quantum algorithms that are capable of efficient and highly accurate simulations of static and dynamic properties of quantum systems. The algorithms are valuable tools to gain greater insight into the physics and chemistry of complex materials, and they are specifically designed to work on existing and near-future quantum computers.

Scientist Yong-Xin Yao and his research partners at Ames Lab use the power of advanced computers to speed discovery in condensed matter physics, modeling incredibly complex quantum mechanics and how it changes over ultra-fast timescales. Current high-performance computers can model the properties of very simple, small quantum systems, but larger or more complex systems rapidly expand the number of calculations a computer must perform to arrive at an answer, slowing the pace not only of computation but also of discovery.
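The scaling problem is easy to quantify: storing the full state vector of an n-particle spin-1/2 system on a classical machine takes 2^n complex amplitudes, so memory (and work) grows exponentially with system size. A generic back-of-the-envelope sketch, not the Ames Lab algorithm itself:

```python
# Memory needed to hold the full quantum state vector of n spin-1/2 particles,
# assuming 16 bytes per amplitude (one complex128 number).
BYTES_PER_AMPLITUDE = 16

def state_vector_bytes(n_sites: int) -> int:
    return (2 ** n_sites) * BYTES_PER_AMPLITUDE

# 10 sites fit in kilobytes; 30 sites need tens of gigabytes;
# 50 sites exceed any classical machine's memory.
for n in (10, 30, 50):
    print(f"{n} sites -> {state_vector_bytes(n):,} bytes")
```

This exponential wall is precisely why algorithms tailored to near-term quantum hardware, which represents the state natively, are attractive.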

“This is a real challenge given the current early-stage of existing quantum computing capabilities,” said Yao, “but it is also a very promising opportunity, since these calculations overwhelm classical computer systems, or take far too long to provide timely answers.”

Nov 14, 2021

Physicists take the most detailed image of atoms to date

Posted by in categories: information science, mobile phones, particle physics

Physicists just put Apple’s latest iPhone to shame, taking the most detailed image of atoms to date with a device that magnifies images 100 million times. The researchers, who set the record for the highest-resolution microscope in 2018, outdid themselves with a study published last month. They used a method called electron ptychography, in which a beam of electrons is shot at a sample and the scattered electrons are recorded as a scan, which algorithms then use to reverse-engineer an image of the sample. Previously, scientists could only use this method to image objects a few atoms thick, but the new study lays out a technique that can image samples 30 to 50 nanometers wide, a more than 10-fold increase in resolution, they report. The breakthrough could help in developing more efficient electronics and batteries, a process that requires visualizing components at the atomic level.

Nov 13, 2021

With the Metaverse on the way, an AI Bill of Rights is urgent

Posted by in categories: information science, internet, robotics/AI, security, sustainability

AI is a classic double-edged sword, in much the same way as other major technologies have been since the start of the Industrial Revolution. Burning carbon drives the industrial world but leads to global warming. Nuclear fission provides cheap and abundant electricity, though it could be used to destroy us. The Internet boosts commerce and provides ready access to nearly infinite amounts of useful information, yet also offers an easy path for misinformation that undermines trust and threatens democracy. AI finds patterns in enormous and complex datasets to solve problems that people cannot, though it often reinforces inherent biases and is being used to build weapons in which life-and-death decisions could be automated. The danger of this dichotomy was best described by sociobiologist E.O. Wilson at a Harvard debate, where he said, “The real problem of humanity is the following: We have paleolithic emotions; medieval institutions; and God-like technology.”


Nov 13, 2021

Artificial Intelligence Predicts Eye Movements

Posted by in categories: biotech/medical, information science, robotics/AI

Summary: A newly developed AI algorithm can directly predict eye position and movement during an MRI scan. The technology could provide new diagnostics for neurological disorders that manifest in changes in eye-movement patterns.

Source: Max Planck Institute.

A large amount of information constantly flows into our brain via the eyes. Scientists can measure the resulting brain activity using magnetic resonance imaging (MRI). The precise measurement of eye movements during an MRI scan can tell scientists a great deal about our thoughts, memories and current goals, but also about diseases of the brain.

Nov 13, 2021

Crypto Miners Driving High Demand for AMD CPUs with Big L3 Caches

Posted by in categories: bitcoin, computing, cryptocurrencies, information science

Now that crypto miners and their scalping ilk have succeeded in taking all of our precious GPU stock, it appears they’re now setting their sights on one more thing gamers cherish: the AMD CPU supply. According to a report in the UK’s Bitcoin Press, part of the reason it’s so hard to find a current-gen AMD CPU for sale anywhere is a cryptocurrency named Raptoreum that uses the CPU to mine rather than an ASIC or a GPU. Apparently, its mining is sped up significantly by the large L3 cache embedded in CPUs such as AMD’s Ryzen, Epyc, and Threadripper.

Raptoreum was designed as an anti-ASIC currency, as its developers wanted to keep the more expensive hardware solutions off their blockchain, believing they lowered profits for everyone. To accomplish this, they chose the Ghostrider mining algorithm, a combination of the CryptoNight and x16r algorithms, and threw in some unique code to make it heavily randomized, hence its preference for L3 cache.
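A toy sketch of why that design rewards a big L3 cache (illustrative only: this is not Ghostrider’s actual code, and the scratchpad size and hash list are invented for the example). Each round uses the previous digest to pick the next hash function and to index pseudo-random reads and writes into a multi-megabyte scratchpad, so throughput depends on how much of that working set stays cache-resident:

```python
import hashlib

# Illustrative stand-ins: four hash functions chosen pseudo-randomly per round.
HASHES = [hashlib.sha256, hashlib.sha512, hashlib.blake2b, hashlib.sha3_256]
SCRATCH_SIZE = 1 << 20  # 1 MiB toy scratchpad; a real miner's working set is larger

def toy_randomized_chain(data: bytes, rounds: int = 8) -> bytes:
    scratch = bytearray(SCRATCH_SIZE)
    digest = hashlib.sha256(data).digest()
    for _ in range(rounds):
        # The previous digest selects the next hash function: this per-input
        # randomization is what frustrates fixed-pipeline ASIC designs.
        h = HASHES[digest[0] % len(HASHES)]
        # Pseudo-random scratchpad access: fast only if it stays in cache.
        idx = int.from_bytes(digest[:4], "big") % (SCRATCH_SIZE - 64)
        scratch[idx:idx + len(digest)] = digest
        digest = h(bytes(scratch[idx:idx + 64]) + digest).digest()
    return digest
```

The hash sequence differs per input, so hardware cannot hard-wire one pipeline; and because the scratchpad is touched at unpredictable offsets, a CPU whose L3 holds the whole working set avoids constant trips to main memory.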

In case you weren’t aware, AMD’s high-end CPUs have more cache than their Intel competitors, making them a hot item for miners of this specific currency. For example, a chip like the Threadripper 3990X has a chonky 256MB of L3 cache, but since that’s a $5,000 CPU, miners are settling for the still-beefy Ryzen chips. A CPU like the Ryzen 9 5900X has a generous 64MB of L3 cache, compared to just 30MB on Intel’s Alder Lake CPUs and just 16MB on Intel’s 11th-gen chips. Several AMD models have this much cache, not just the flagship silicon, including the previous-gen Ryzen 9 3900X. The more affordable models, such as the 5800X, have just 32MB of L3 cache, however.