Archive for the ‘information science’ category: Page 64

Aug 30, 2023

Quantum Computing May Help Protect AI From Attack

Posted by in categories: information science, quantum physics, robotics/AI

At a crucial time when the development and deployment of AI are rapidly evolving, experts are looking at ways we can use quantum computing to protect AI from its vulnerabilities.

Machine learning is a field of artificial intelligence where computer models become experts in various tasks by consuming large amounts of data, instead of a human explicitly programming their level of expertise. These algorithms do not need to be taught but rather learn from seeing examples, similar to how a child learns.
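
To make the "learning from examples" idea concrete, here is a minimal sketch (not from the article) of a 1-nearest-neighbour classifier: nothing about the classes is programmed in, and every prediction comes entirely from the labelled examples the model has seen.

```python
# A minimal sketch (not from the article): a 1-nearest-neighbour classifier.
# No rules about the classes are programmed in; predictions come entirely
# from the labelled examples the model has "seen".

def predict(examples, point):
    """Return the label of the training example closest to `point`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = min(examples, key=lambda ex: dist(ex[0], point))
    return nearest[1]

# Toy training data: (feature vector, label) pairs the model consumes.
training = [((1.0, 1.0), "cat"), ((1.2, 0.9), "cat"),
            ((5.0, 5.0), "dog"), ((4.8, 5.2), "dog")]

print(predict(training, (1.1, 1.0)))  # -> cat
print(predict(training, (5.1, 4.9)))  # -> dog
```

Real systems use far richer models, but the principle is the same: the decision rule is induced from data rather than written by hand.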

Aug 28, 2023

What is the next wave after artificial intelligence|The Singularity Is Near|#audiobooks

Posted by in categories: information science, Ray Kurzweil, robotics/AI, singularity, space

https://www.youtube.com/watch?v=lmmaHadPf_Y

This book, ‘The Singularity Is Near’, makes predictions about the future. Unlike the authors of most best-selling futurology books, however, Kurzweil is a renowned technology expert, and his insights into the future are not technocratic wild fantasies but are rooted in his deep contemplation of the principles behind technology.

This audiobook argues that, driven by Moore’s Law, the pace of human technological advancement in the future will far exceed our expectations. By 2045 we will reach the technological ‘Singularity’, which will profoundly alter the human condition, and technology may even enable humans to conquer the universe within a millennium.
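
The arithmetic behind that claim is simple exponential growth. A back-of-the-envelope sketch (the two-year doubling period is an illustrative Moore’s-Law-style figure, not a number taken from the book):

```python
# If a capability doubles every two years, growth compounds multiplicatively:
# the 22 years from 2023 to 2045 mean 11 doublings, a factor of 2**11 = 2048.
# This compounding is what makes exponential forecasts so counter-intuitive.
def growth_factor(start_year, end_year, doubling_period_years=2):
    return 2 ** ((end_year - start_year) / doubling_period_years)

print(growth_factor(2023, 2045))  # -> 2048.0
```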

Continue reading “What is the next wave after artificial intelligence|The Singularity Is Near|#audiobooks” »

Aug 28, 2023

How Powerful Will AI Be In 2030?

Posted by in categories: biotech/medical, finance, information science, robotics/AI, transportation

Welcome to our channel! In this exciting video, we delve into the fascinating realm of artificial intelligence (AI) and explore the question that has intrigued tech enthusiasts and experts alike: “How powerful will AI be in 2030?” Join us as we embark on a captivating journey into the future of AI, examining the possibilities, advancements, and potential impact that await us.

In the next decade, AI is poised to revolutionize numerous industries and transform the way we live and work. As we peer into the crystal ball of technological progress, we aim to shed light on the potential power and capabilities that AI could possess by 2030. Brace yourself for mind-blowing insights and expert analysis that will leave you in awe.

Continue reading “How Powerful Will AI Be In 2030?” »

Aug 27, 2023

What if AI becomes self-aware?

Posted by in categories: information science, robotics/AI

A glimpse into the dynamics between a man and a self-conscious machine.

Artificial Intelligence (AI) in a nutshell

Artificial intelligence (AI) and cognitive robotics are two prominent fields of design and engineering that have seized the spotlight lately. Artificial intelligence is the simulation of human intelligence by machines, whereas cognitive robotics combines robotics and cognitive science, dealing with cognitive phenomena such as learning, reasoning, perception, anticipation, memory, and attention. The two are intertwined: a robot is programmed with artificial intelligence to perform its tasks, and AI is the program or algorithm the robot employs to perform cognitive functions. In simpler terms, the robot is the machine, and AI is the intellect that gives the machine its perceptual abilities.

Aug 27, 2023

IBM develops a new 64-core mixed-signal in-memory computing chip

Posted by in categories: information science, robotics/AI

For decades, electronics engineers have been trying to develop increasingly advanced devices that can perform complex computations faster while consuming less energy. This has become even more pressing since the advent of artificial intelligence (AI) and deep learning algorithms, which typically have substantial requirements in terms of both data storage and computational load.

A promising approach for running these algorithms is known as analog in-memory computing (AIMC). As its name suggests, this approach consists of developing electronics that perform computations directly within the same devices that store the data. To realistically achieve improvements in both speed and energy consumption, this approach should ideally also support on-chip digital operations and communications.

Researchers at IBM Research Europe recently developed a new 64-core mixed-signal in-memory computing chip based on phase-change memory devices that could better support the computations of deep neural networks. Their 64-core chip, presented in a paper in Nature Electronics, has so far attained highly promising results, retaining the accuracy of deep learning algorithms, while reducing computation times and energy consumption.
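
The workhorse operation such chips typically accelerate is the matrix-vector multiply at the heart of neural networks: AIMC performs it where the weights are stored, at the cost of some analog read-out noise. A toy digital simulation of that trade-off (purely illustrative, not IBM's design):

```python
import numpy as np

# Toy model (purely illustrative, not IBM's design): AIMC performs the
# matrix-vector multiply in place, where the weights are stored, trading
# a small analog read-out error for speed and energy savings.
rng = np.random.default_rng(0)

def analog_matvec(weights, x, noise_std=0.01):
    """Ideal result plus Gaussian noise modelling analog read-out error."""
    ideal = weights @ x
    return ideal + rng.normal(0.0, noise_std, size=ideal.shape)

W = rng.standard_normal((64, 64))  # weights stored as device conductances
x = rng.standard_normal(64)        # inputs applied as voltages

digital = W @ x
analog = analog_matvec(W, x)
print(bool(np.max(np.abs(analog - digital)) < 0.1))  # -> True
```

Deep networks tolerate this small error well, which is why the IBM chip can keep accuracy while cutting time and energy.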

Aug 27, 2023

Android Focused Malware Could Extract Information From Calls

Posted by in categories: health, information science, mobile phones, robotics/AI

Many users who want more from their smartphones happily adopt a plethora of advanced features, mainly for health and entertainment. It turns out, however, that these features can create a security risk when making or receiving calls.

Researchers from Texas A&M University and four other institutions created malware called EarSpy, which uses machine learning algorithms to filter caller information from ear speaker vibration data recorded by an Android smartphone’s own motion sensors, without overcoming any safeguards or needing user permissions.
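
As an illustrative sketch only (not the EarSpy code), inferring a caller attribute from motion-sensor traces might look like the following, where a hard-coded threshold stands in for the decision boundary a trained model would learn from data:

```python
import numpy as np

# Illustrative sketch only (not the EarSpy code): infer a caller attribute
# from synthetic "ear-speaker vibration" traces via a spectral feature.
# The hard-coded threshold stands in for a boundary a model would learn.
rng = np.random.default_rng(1)

def dominant_freq(trace, fs=400):
    """Frequency (Hz) of the strongest non-DC bin in a motion-sensor trace."""
    spectrum = np.abs(np.fft.rfft(trace))
    spectrum[0] = 0.0  # ignore the DC component
    return np.fft.rfftfreq(len(trace), 1 / fs)[np.argmax(spectrum)]

t = np.arange(400) / 400  # one second sampled at 400 Hz
low = np.sin(2 * np.pi * 120 * t) + 0.1 * rng.standard_normal(400)
high = np.sin(2 * np.pi * 180 * t) + 0.1 * rng.standard_normal(400)

classify = lambda trace: "speaker A" if dominant_freq(trace) < 150 else "speaker B"
print(classify(low), classify(high))  # -> speaker A speaker B
```

The alarming part of the real attack is that accelerometer data requires no permission prompt, so the leak bypasses the usual microphone safeguards.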

Aug 26, 2023

New Quantum Computing Paradigm: Game-Changing Hardware for Faster Computation

Posted by in categories: computing, information science, quantum physics

Using natural quantum interactions allows faster, more robust computation for Grover’s algorithm and many others.

Los Alamos National Laboratory scientists have developed a groundbreaking approach to quantum computing hardware, which performs computation using quantum-mechanical phenomena such as superposition and entanglement.
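
For context, Grover's algorithm can be simulated classically on tiny instances. Here is a toy state-vector sketch on two qubits (four items), where a single Grover iteration already finds the marked item exactly; this illustrates the textbook algorithm itself, not the Los Alamos hardware:

```python
import numpy as np

# Toy state-vector simulation of Grover's algorithm on 2 qubits (4 items).
# Purely illustrative of the algorithm, not the Los Alamos hardware.
# With 4 items, a single Grover iteration finds the marked one exactly.
marked = 2  # index of the item the oracle "recognises"

state = np.full(4, 0.5)                            # uniform superposition
oracle = np.eye(4); oracle[marked, marked] = -1.0  # flip marked amplitude
diffuser = 2 * np.full((4, 4), 0.25) - np.eye(4)   # reflect about the mean

state = diffuser @ (oracle @ state)  # one Grover iteration
probs = state ** 2
print(int(np.argmax(probs)), round(float(probs[marked]), 6))  # -> 2 1.0
```

For N items the algorithm needs roughly √N iterations instead of the classical N/2 checks, which is the speed-up the new hardware aims to make more robust.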

Aug 25, 2023

Could the Universe be a giant quantum computer?

Posted by in categories: alien life, computing, information science, mathematics, particle physics, quantum physics

In their 1982 paper, Fredkin and Toffoli had begun developing their work on reversible computation in a rather different direction. It started with a seemingly frivolous analogy: a billiard table. They showed how mathematical computations could be represented by fully reversible billiard-ball interactions, assuming a frictionless table and balls interacting without friction.

This physical manifestation of the reversible concept grew from Toffoli’s idea that computational concepts could be a better way to encapsulate physics than the differential equations conventionally used to describe motion and change. Fredkin took things even further, concluding that the whole Universe could actually be seen as a kind of computer. In his view, it was a ‘cellular automaton’: a collection of computational bits, or cells, that can flip states according to a defined set of rules determined by the states of the cells around them. Over time, these simple rules can give rise to all the complexities of the cosmos — even life.
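
A cellular automaton of the kind Fredkin describes can be sketched in a few lines. This example uses Rule 110, a standard elementary automaton known to produce complex behaviour from a trivial update rule:

```python
# A minimal elementary cellular automaton: each cell's next state is a fixed
# function (here Rule 110, known to produce complex behaviour) of its own
# state and its two neighbours' states.
RULE = 110

def step(cells):
    """One synchronous update of every cell; the row wraps at the edges."""
    n = len(cells)
    return [(RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

row = [0] * 15 + [1] + [0] * 15  # a single live cell in the middle
for _ in range(8):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

The rule number encodes the whole update table: bit k of 110 gives the next state for the neighbourhood whose three cells spell k in binary, and that eight-bit table is all the "physics" the automaton has.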

He wasn’t the first to play with such ideas. Konrad Zuse — a German civil engineer who, before the Second World War, had developed one of the first programmable computers — suggested in his 1969 book Calculating Space that the Universe could be viewed as a classical digital cellular automaton. Fredkin and his associates developed the concept with intense focus, spending years searching for examples of how simple computational rules could generate all the phenomena associated with subatomic particles and forces [3].

Aug 25, 2023

Novel approach uses machine learning for quick and easy rheumatic disease diagnosis

Posted by in categories: biotech/medical, information science, robotics/AI

In a recent study published in the journal Frontiers in Medicine, researchers evaluated fluorescence optical imaging (FOI) as a method to accurately and rapidly diagnose rheumatic diseases of the hands.

They used machine learning algorithms to identify the minimum number of FOI features needed to differentiate between osteoarthritis (OA), rheumatoid arthritis (RA), and connective tissue disease (CTD). Of the 20 features identified as associated with the conditions, reduced sets of between five and 15 features proved sufficient to accurately diagnose each of the diseases under study.
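
As a hypothetical sketch of the feature-reduction idea (synthetic data, not the study's FOI features), greedy forward selection with a simple nearest-class-mean classifier shows how a small subset of 20 features can suffice when only a few carry the signal:

```python
import numpy as np

# Hypothetical sketch (synthetic data, not the study's FOI features): greedy
# forward selection with a nearest-class-mean classifier shows that a small
# subset of 20 features suffices when only a few carry the signal.
rng = np.random.default_rng(42)
n, d, informative = 200, 20, 3
y = rng.integers(0, 2, n)
X = rng.standard_normal((n, d))
X[:, :informative] += 3.0 * y[:, None]  # only 3 features separate the classes

def accuracy(features):
    """Nearest-class-mean accuracy using only the given feature columns."""
    Xs = X[:, features]
    m0, m1 = Xs[y == 0].mean(axis=0), Xs[y == 1].mean(axis=0)
    pred = np.linalg.norm(Xs - m1, axis=1) < np.linalg.norm(Xs - m0, axis=1)
    return float((pred == y).mean())

chosen = []
while len(chosen) < d and (not chosen or accuracy(chosen) < 0.95):
    best = max((f for f in range(d) if f not in chosen),
               key=lambda f: accuracy(chosen + [f]))
    chosen.append(best)
print(len(chosen), accuracy(chosen))  # far fewer than 20 features needed
```

The study's result has the same flavour: once the informative features are in, adding the rest buys little diagnostic accuracy.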

Aug 24, 2023

Why Is 1/137 One of the Greatest Unsolved Problems In Physics?

Posted by in categories: information science, quantum physics

Continue reading “Why Is 1/137 One of the Greatest Unsolved Problems In Physics?” »

Page 64 of 322