
Archive for the ‘computing’ category: Page 810

Jan 21, 2016

Memory capacity of brain is 10 times more than previously thought

Posted in categories: bioengineering, computing, neuroscience

In a computational reconstruction of brain tissue in the hippocampus, Salk and UT-Austin scientists found the unusual occurrence of two synapses from the axon of one neuron (translucent black strip) forming onto two spines (arrows) on the same dendrite of a second neuron (yellow). The spine head volumes, synaptic contact areas (red), neck diameters (gray), and numbers of presynaptic vesicles (white spheres) of these two synapses are almost identical. (credit: Salk Institute)

Salk researchers and collaborators have achieved critical insight into the size of neural connections, putting the memory capacity of the brain far higher than common estimates. The new work also answers a longstanding question as to how the brain is so energy efficient, and could help engineers build computers that are incredibly powerful but also conserve energy.


Jan 20, 2016

Google is Back in the Virtual Reality Competition

Posted in categories: business, computing, virtual reality

Google is shifting employee responsibilities and forming its own dedicated division for virtual reality computing, promising intense competition for Facebook and Microsoft.

Not only has CEO Sundar Pichai moved a key deputy over to run it, but the move also signals Google’s intent to build a viable enterprise business: with the executive shift, Google’s massive consumer Web applications now fall under incoming SVP Diane Greene.


Jan 20, 2016

Breakthrough in direct conversion of human cells from one type to another

Posted in categories: biotech/medical, computing, innovation

Scientists have developed a computer system that predicts the reprogramming factors necessary to convert human cells from one type to another, without the need for trial and error.


Jan 20, 2016

Alibaba Teams With Nvidia in $1 Billion Bet on Cloud Computing

Posted in categories: computing, quantum physics, robotics/AI

The excitement keeps growing around quantum. Now the “magic” will happen.

Alibaba Group Holding Ltd. will work with Nvidia Corp. on cloud computing and artificial intelligence, and plans to enlist about 1,000 developers to work on its big-data platform during the next three years.

The arm of China’s biggest e-commerce operator, known as AliCloud, will boost investment in data analysis and machine learning, it said in a statement Wednesday. AliCloud is staking $1 billion on the belief that demand for processing and storage from governments and companies will boost growth during the next decade as it tries to compete with Amazon.com Inc. in computing services.


Jan 20, 2016

Evidence of a real ninth planet discovered

Posted in categories: computing, space

Caltech researchers have found evidence of a giant planet tracing a bizarre, highly elongated orbit in the outer solar system. The object, which the researchers have nicknamed Planet Nine, has a mass about 10 times that of Earth and orbits about 20 times farther from the sun on average than does Neptune (which orbits the sun at an average distance of 2.8 billion miles). In fact, it would take this new planet between 10,000 and 20,000 years to make just one full orbit around the sun.
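
As a rough consistency check (mine, not from the Caltech article), Kepler’s third law ties the quoted distance and period together. The sketch below assumes Neptune’s mean distance from the sun is about 30 AU:

```python
# Back-of-the-envelope check using Kepler's third law for a solar orbit:
# P^2 = a^3, with the period P in years and the semi-major axis a in AU.

NEPTUNE_AU = 30.1        # Neptune's mean distance from the sun, in AU
a = 20 * NEPTUNE_AU      # "about 20 times farther ... than does Neptune"
period_years = a ** 1.5  # Kepler's third law

print(f"semi-major axis: ~{a:.0f} AU")
print(f"orbital period:  ~{period_years:,.0f} years")  # roughly 15,000 years
```

That estimate lands inside the 10,000-to-20,000-year range quoted above.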

The researchers, Konstantin Batygin and Mike Brown, discovered the planet’s existence through mathematical modeling and computer simulations but have not yet observed the object directly.

“This would be a real ninth planet,” says Brown, the Richard and Barbara Rosenberg Professor of Planetary Astronomy. “There have only been two true planets discovered since ancient times, and this would be a third. It’s a pretty substantial chunk of our solar system that’s still out there to be found, which is pretty exciting.”


Jan 20, 2016

Switchable material could enable new memory chips

Posted in categories: computing, materials

Retaining information even when power is lost.

A small voltage can flip a thin film between two crystal states, one metallic and one semiconducting, new research indicates. The material involved is a thin-film strontium cobaltite, or SrCoOx.


Jan 20, 2016

Quantum computing is coming — are you prepared for it?

Posted in categories: business, computing, economics, quantum physics

Two weeks ago, I posted that a big announcement was coming; we have now officially received it. The question is: will you be ready? Within less than four years (by 2020), quantum computing will be available. Everyone needs to be planning and getting budgets and resources in place for this massive transformation. It will be expensive and time-consuming, and a lot of preparatory work around the business needs to be assessed, planned, and positioned so that organizations can move onto quantum quickly, because other countries (and hackers) will be on quantum as well, meaning more powerful networks and platforms for attacking older systems. https://lnkd.in/baSZrBY

Quantum computing will change lives, society, and the economy, and a working system is expected to be developed by 2020, according to a leading figure in the world of quantum computing who will speak tomorrow, Jan. 21, 2016, at the World Economic Forum (WEF) in Davos, Switzerland.

Professor O’Brien, Director of the Centre for Quantum Photonics at the University of Bristol and Visiting Fellow at Stanford University, is part of a European Research Council (ERC) Ideas Lab delegation who have been invited to talk at the annual meeting to industrial and political leaders of the world, including Prime Minister David Cameron. The session will discuss the future of computing and how new fields of computer sciences are paving the way for the next digital revolution.


Jan 20, 2016

Graphene ‘optical capacitors’ can make chips that mesh biophysics and semiconductors

Posted in categories: computing, materials, physics

Graphene’s properties make it a tantalizing target for semiconductor research. Now a team from Princeton has shown that flakes of graphene can work as fast, accurate optical capacitors for laser transistors in neuromorphic circuits.


Jan 20, 2016

Open-Source GPU Could Push Computing Power to the Next Level

Posted in category: computing

Researchers at Binghamton University are using an open-source graphics processing unit (GPU) to push the devices’ performance and applications further.

Binghamton University computer science assistant professor Timothy Miller, assistant professor Aaron Carpenter, and graduate student Philip Dexter, along with co-author Jeff Bush, have developed Nyami, a synthesizable GPU architectural model for general-purpose and graphics-specific workloads. This marks the first time a team has taken an open-source GPU design and run a series of experiments on it to see how different hardware and software configurations would affect the circuit’s performance.

According to Miller, the results will help other scientists make their own GPUs and push computing power to the next level.


Jan 19, 2016

The US Military Wants a Chip to Translate Your Brain Activity Into Binary Code

Posted in categories: biotech/medical, computing, engineering, military, neuroscience, supercomputing

It’s been a weird day for weird science. Not long after researchers claimed victory in performing a head transplant on a monkey, the US military’s blue-sky R&D agency announced a completely insane plan to build a chip that would enable the human brain to communicate directly with computers. What is this weird, surreal future?

It’s all real, believe it or not. Or at least DARPA desperately wants it to be. The first wireless brain-to-computer interface actually popped up a few years ago, and DARPA has worked on various brain chip projects over the years. But there are shortcomings to existing technology: According to today’s announcement, current brain-computer interfaces are akin to “two supercomputers trying to talk to each other using an old 300-baud modem.” They just aren’t fast enough for truly transformative neurological applications, like restoring vision to a blind person. This would ostensibly involve connecting a camera that can transmit visual information directly to the brain, and the implant would translate the data into neural language.
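
To put that modem analogy in rough numbers (my illustration, not the article’s; the one-megapixel frame is a hypothetical payload), a simple comparison looks like this:

```python
# Rough illustration of the "300-baud modem" analogy (not from the article).
# Assumes ~1 bit/s per baud for a simple modem and a hypothetical payload of
# one 1-megapixel, 8-bit grayscale image frame.

payload_bits = 1_000_000 * 8     # one hypothetical 1 MP, 8-bit frame
modem_bps = 300                  # old 300-baud modem, roughly 300 bits/s
broadband_bps = 100_000_000      # ~100 Mbit/s broadband link, for contrast

print(f"300-baud modem:       ~{payload_bits / modem_bps / 3600:.1f} hours")
print(f"100 Mbit/s broadband: ~{payload_bits / broadband_bps:.2f} seconds")
```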

To accomplish this magnificent feat, DARPA is launching a new program called Neural Engineering System Design (NESD) that stands to squeeze some characteristically bonkers innovation out of the science community. In a press release, the agency describes what’s undoubtedly the closest thing to a Johnny Mnemonic plotline you’ve ever seen in real life.
