Archive for the ‘computing’ category: Page 834
Feb 22, 2016
RMIT Researchers Examine Environmental and Health Risks Posed by 3D Printing
Posted by Karen Hurst in categories: 3D printing, computing, health, materials
3D printing may be hazardous to the environment because of the toxins involved.
Three-dimensional (3D) printing, also known as additive manufacturing, refers to technologies that build 3D objects from raw materials such as metals and polymers, based on computerized 3D parametric models.
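The "computerized 3D parametric models" the excerpt mentions are simply geometry generated from a handful of parameters and exported in a mesh format a printer's slicer can read. Purely as a hypothetical illustration (the pyramid, its dimensions, and the file name are invented), the Python sketch below writes a parameter-driven pyramid out as an ASCII STL file, one common input format for 3D printers.

```python
# Hypothetical sketch: a "parametric model" is geometry generated from
# parameters. Here a pyramid of user-chosen size is written as an ASCII STL
# file, a triangle-mesh format many slicers accept. Shape and sizes invented.
def pyramid_stl(size=20.0, height=15.0):
    s, h = size, height
    base = [(0, 0, 0), (s, 0, 0), (s, s, 0), (0, s, 0)]
    apex = (s / 2, s / 2, h)
    # Four side faces plus two triangles covering the square base
    faces = [
        (base[0], base[1], apex),
        (base[1], base[2], apex),
        (base[2], base[3], apex),
        (base[3], base[0], apex),
        (base[0], base[2], base[1]),
        (base[0], base[3], base[2]),
    ]
    lines = ["solid pyramid"]
    for a, b, c in faces:
        lines.append("  facet normal 0 0 0")   # many tools recompute normals
        lines.append("    outer loop")
        for x, y, z in (a, b, c):
            lines.append(f"      vertex {x} {y} {z}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append("endsolid pyramid")
    return "\n".join(lines)

with open("pyramid.stl", "w") as f:
    f.write(pyramid_stl(size=30.0))
```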
Feb 22, 2016
Prosthetics: Amputee James Young unveils hi-tech synthetic arm inspired by Metal Gear Solid
Posted by Roman Mednitzer in categories: biotech/medical, computing, cyborgs, engineering
The job advertisement was highly specific: applicants had to be passionate about computer games and live in the UK. Oh, and they also had to be amputees who were interested in wearing a futuristic prosthetic limb.
James Young knew straight away he had a better shot than most. After losing an arm and a leg in a rail accident in 2012, the 25-year-old Londoner had taught himself to use a video-game controller with one hand and his teeth. “How many amputee gamers can there be?” he asked himself.
In the end, more than 60 people replied to the ad, which was looking for a games-mad amputee to become the recipient of a bespoke high-tech prosthetic arm inspired by Metal Gear Solid, one of the world’s best-selling computer games. Designed and built by a team of 10 experts led by London-based prosthetic sculptor Sophie de Oliveira Barata, the £60,000 carbon-fibre limb is part art project, part engineering marvel.
Feb 22, 2016
Don’t Set Your iPhone Back to 1970, No Matter What
Posted by Karen Hurst in categories: computing, mobile phones
Feb 22, 2016
IARPA Project Targets Hidden Algorithms of the Brain
Posted by Karen Hurst in categories: computing, information science, neuroscience, robotics/AI
Whether in the brain or in code, neural networks are shaping up to be one of the most critical areas of research in both neuroscience and computer science. An increasing amount of attention, funding, and development has been pushed toward technologies that mimic the brain in both hardware and software to create more efficient, high performance systems capable of advanced, fast learning.
One aspect of the push toward more scalable, efficient, and practical neural networks and deep learning frameworks that we have been tracking here at The Next Platform is how such systems might be implemented in research and enterprise over the next ten years. Based on the conversations that make their way into various pieces here, one missing element for those eventual end users is a reduction in the complexity of training neural networks, which would make them more practically useful without all of the computational overhead and specialized systems that training requires now. Crucial, then, is a whittling down of how neural networks are trained and implemented. Not surprisingly, the key answers lie in the brain, and specifically in functions of the brain, and how it "trains" its own networks, that are still not completely understood, even by top neuroscientists.
In many senses, neural networks, cognitive hardware and software, and advances in new chip architectures are shaping up to be the next important platform. But there are still fundamental gaps between what we know about our own brains and what has been developed in software to mimic them, and those gaps are holding research back. Accordingly, the Intelligence Advanced Research Projects Activity (IARPA) in the U.S. is getting behind an effort spearheaded by Tai Sing Lee, a computer science professor at Carnegie Mellon University's Center for the Neural Basis of Cognition, and researchers at Johns Hopkins University, among others, to make new connections between the brain's neural function and how those same processes might map to neural networks and other computational frameworks. The project is called Machine Intelligence from Cortical Networks (MICrONS).
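To make the training-overhead point concrete, here is a toy sketch (a hypothetical illustration in NumPy, not code from any project mentioned here): even teaching a minimal feed-forward network the four-example XOR problem takes thousands of backpropagation passes, which hints at why training today's far larger networks demands the computational overhead and specialized systems described above.

```python
# Toy illustration: training even a tiny network is an iterative,
# compute-hungry process. All sizes and hyperparameters are invented.
import numpy as np

rng = np.random.default_rng(0)

# XOR: a classic four-example problem a linear model cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 units, randomly initialized
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(10000):                 # thousands of passes for 4 examples
    h = sigmoid(X @ W1 + b1)              # forward pass: hidden activations
    out = sigmoid(h @ W2 + b2)            # forward pass: prediction
    d_out = (out - y) * out * (1 - out)   # backpropagate the squared error
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))   # typically approaches [[0], [1], [1], [0]]
```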
Continue reading “IARPA Project Targets Hidden Algorithms of the Brain” »
Feb 22, 2016
IARPA wants to improve human/machine forecasting
Posted by Karen Hurst in category: computing
Human and machine forecasting.
The agency’s Hybrid Forecasting Competition is intended to improve how humans and computers interact on geopolitical and geoeconomic analysis.
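The article does not say how the competition will combine the two kinds of forecasts, but a common baseline for hybrid forecasting is a weighted average of human judgments and a model's probability estimate. The sketch below is a hypothetical illustration of that baseline; the question, probabilities, and weights are invented.

```python
# Hypothetical illustration: blend human and machine probability forecasts
# for a yes/no geopolitical question with a simple weighted average.

def hybrid_forecast(human_probs, machine_prob, human_weight=0.6):
    """Combine several human forecasts with one model forecast.

    human_probs  -- probabilities (0..1) from human analysts
    machine_prob -- probability (0..1) from a statistical model
    human_weight -- share of the final answer taken from the crowd
    """
    crowd = sum(human_probs) / len(human_probs)   # average the analysts
    return human_weight * crowd + (1 - human_weight) * machine_prob

# "Will country X hold elections before year end?"
analysts = [0.70, 0.55, 0.80]   # three invented human judgments
model = 0.62                    # invented output of a forecasting model
print(f"hybrid estimate: {hybrid_forecast(analysts, model):.2f}")
```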
Feb 21, 2016
Robot chores: Machines tipped to take 15m Brit jobs in the next ten years
Posted by Karen Hurst in categories: computing, employment, habitats, robotics/AI
"No offense, but your robots are ugly."
Robots today, especially those intended for home and caregiver use, still need to improve drastically. We are still designing them as if they were a CPU for the home, which frankly freaks some kids out, strikes some of the elderly as too fragile to operate, and sends my own cat running. If home robots are ever going to be adopted by the mass of the population, they will need to look less like a piece of a manufacturer's assembly line; have a softer, low-noise voice with volume controls for the hard of hearing; include modifications for deaf and blind users; handle two or more kinds of housework (vacuuming, dusting, cooking, washing dishes, washing clothes); be simple to set up and operate; be reliable, without constant repairs or overheating; be less bulky; and have better sensors so they can detect and climb stairs.
From mowing the lawn to cooking dinner, experts say automatons are set to take over some of our most tedious tasks.
Feb 20, 2016
United Nations CITO: Artificial intelligence will be humanity’s final innovation
Posted by Karen Hurst in categories: computing, internet, quantum physics, robotics/AI, security
I hate to break the news to the UN's CITO, but has she ever heard of quantum technology? After AI floods onto the scene, the next innovation that I and others are working on is quantum computing, which will make AI, the internet, cybersecurity, devices, platforms, and medical technology far more advanced, with incredible performance.
The United Nations Chief Information Technology Officer spoke with TechRepublic about the future of cybersecurity, social media, and how to fix the internet and build global technology for social good.
Artificial intelligence, said United Nations chief information technology officer Atefeh Riazi, might be the last innovation humans create.
Feb 20, 2016
Gaming Chip Is Helping Raise Your Computer’s IQ
Posted by Karen Hurst in categories: computing, entertainment, mobile phones, robotics/AI
Using gaming chips to recognize people's images and the like definitely makes sense, especially as we move deeper into the AI-connected experience.
Facebook, Google and Microsoft are tapping the power of a vintage computer gaming chip to raise your smartphone’s IQ with artificially intelligent programs that recognize faces and voices, translate conversations on the fly and make searches faster and more accurate.
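As a rough illustration of why graphics chips suit this workload, the hypothetical sketch below (using PyTorch, not code from Facebook, Google, or Microsoft; the layer and batch sizes are invented) times the same fully connected layer on the CPU and, if one is present, on a CUDA-capable gaming GPU, whose parallel cores accelerate the dense matrix math behind face and speech recognition.

```python
# Hypothetical sketch: GPUs accelerate the dense matrix math that underlies
# image and speech recognition. Requires PyTorch; sizes are invented.
import time
import torch

batch = torch.randn(4096, 1024)          # a batch of feature vectors
layer = torch.nn.Linear(1024, 1024)      # one fully connected layer

def time_forward(device):
    x = batch.to(device)
    net = layer.to(device)
    if device == "cuda":
        torch.cuda.synchronize()         # GPU work is asynchronous
    start = time.time()
    with torch.no_grad():
        for _ in range(100):             # repeat to get a stable timing
            _ = net(x)
    if device == "cuda":
        torch.cuda.synchronize()
    return time.time() - start

print("cpu :", time_forward("cpu"))
if torch.cuda.is_available():
    print("cuda:", time_forward("cuda"))  # typically much faster
```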
Feb 20, 2016
Basic income may be needed to combat robot-induced unemployment, leading AI expert says
Posted by Karen Hurst in categories: computing, economics, employment, robotics/AI
I do believe there will be some expansion of social services to help retrain employees for the new positions that are coming, and to retrain lower-skilled workers as well. The larger question, however, is who should pay. Some say tech companies should assist governments with retooling, since AI technology created the situation; others say it is solely a government issue. How the retraining programs and other services end up being funded will be interesting, to say the least.
A leading artificial intelligence (AI) expert believes that societies may have to consider issuing a basic income to all citizens, in order to combat the threat to jobs posed by increased automation in the workplace.
Dr Moshe Vardi, a computer science professor at Rice University in Texas, believes that a basic income may be needed in the future as advances in automation and AI put human workers out of jobs.