Blog

Archive for the ‘supercomputing’ category: Page 84

Nov 14, 2013

The Disruptional Singularity

Posted by in categories: business, climatology, complex systems, cosmology, counterterrorism, cybercrime/malcode, defense, economics, education, engineering, ethics, existential risks, finance, futurism, nanotechnology, physics, policy, robotics/AI, science, singularity, supercomputing, sustainability, transparency

(Excerpt)

Beyond the managerial challenges (downside risks) posed by exponential technologies, as understood in the Technological Singularity and its inherent futuristic forces now impacting both the present and the future, there are also grave global risks that many forms of management must tackle immediately.

These grave global risks have nothing to do with advanced science or technology. Many of these hazards stem from nature, while others are man-made.

For instance, these grave global risks ─ embodying the Disruptional Singularity ─ are geological, climatological, political, geopolitical, demographic, social, economic, financial, legal and environmental, among others. The Disruptional Singularity’s major risks are gravely threatening us right now, not later.

Read the full document at http://lnkd.in/bYP2nDC

May 31, 2013

How Could WBE+AGI be Easier than AGI Alone?

Posted by in categories: complex systems, engineering, ethics, existential risks, futurism, military, neuroscience, singularity, supercomputing

This essay was also published by the Institute for Ethics & Emerging Technologies and by Transhumanity under the title “Is Price Performance the Wrong Measure for a Coming Intelligence Explosion?”.

Introduction

Most thinkers speculating on the coming of an intelligence explosion (whether via Artificial General Intelligence or Whole Brain Emulation/uploading), such as Ray Kurzweil [1] and Hans Moravec [2], typically use computational price-performance as the best measure of an impending intelligence explosion (e.g. Kurzweil’s threshold is the point at which the processing power he estimates is required to simulate the human brain costs $1,000). However, I think a lurking assumption lies here: that it won’t be much of an explosion unless it is available to the average person. I present a scenario below that may indicate that the imminence of a coming intelligence explosion is impacted more by raw processing speed, or instructions per second (IPS), regardless of the cost or resource requirements per unit of computation, than by computational price-performance. This scenario also yields some additional, counter-intuitive conclusions: for instance, that it may be easier (for a given amount of “effort” or funding) to implement WBE+AGI than AGI alone; or rather, that using WBE to mediate an increase in the rate of progress in AGI may yield an AGI faster, or more efficiently per unit of effort or funding, than implementing AGI directly.

Loaded Uploads:

Continue reading “How Could WBE+AGI be Easier than AGI Alone?” »

Mar 31, 2013

American Physical Society (APS) Conference in Denver

Posted by in categories: cosmology, defense, education, engineering, events, general relativity, nuclear energy, particle physics, philosophy, physics, policy, scientific freedom, space, supercomputing

The APS April Meeting 2013 (Bulletin Vol. 58, No. 4) will be held Saturday through Tuesday, April 13–16, 2013, in Denver, Colorado.

I am very pleased to announce that my abstract was accepted and that I will be presenting “Empirical Evidence Suggest A Need For A Different Gravitational Theory” at this prestigious conference.

For those of you who can make it to Denver, April 13–16, and are interested in alternative gravitational theories, let’s meet up.

I am especially interested in physicists and engineers who have the funding to test the gravity modification technologies proposed in my book An Introduction to Gravity Modification.

Continue reading “American Physical Society (APS) Conference in Denver” »

Mar 19, 2013

Ten Commandments of Space

Posted by in categories: asteroid/comet impacts, biological, biotech/medical, cosmology, defense, education, engineering, ethics, events, evolution, existential risks, futurism, geopolitics, habitats, homo sapiens, human trajectories, life extension, lifeboat, military, neuroscience, nuclear energy, nuclear weapons, particle physics, philosophy, physics, policy, robotics/AI, singularity, space, supercomputing, sustainability, transparency

1. Thou shalt first guard the Earth and preserve humanity.

Impact deflection and survival colonies hold the moral high ground above all other calls on public funds.

2. Thou shalt go into space with heavy lift rockets with hydrogen upper stages and not go extinct.

Continue reading “Ten Commandments of Space” »

Mar 4, 2013

Human Brain Mapping & Simulation Projects: America Wants Some, Too?

Posted by in categories: biological, biotech/medical, complex systems, ethics, existential risks, homo sapiens, neuroscience, philosophy, robotics/AI, singularity, supercomputing

YANKEE.BRAIN.MAP
The Brain Games Begin
Europe’s billion-Euro science-neuro Human Brain Project, mentioned here amongst machine morality last week, is basically already funded and well underway. Now the colonies over in the new world are getting hip, and they too have in the works a project to map/simulate/make their very own copy of the universe’s greatest known computational artifact: the gelatinous wad of convoluted electrical pudding in your skull.

The (speculated but not yet public) Brain Activity Map of America
About 300 different news sources are reporting that a Brain Activity Map project is outlined in the current administration’s to-be-presented budget, and will be detailed sometime in March. Hordes of journalists are calling it “Obama’s Brain Project,” which is stoopid, and probably only because some guy at the New Yorker did and they all decided that’s what they had to do, too. Or somesuch lameness. Or laziness? Deference? SEO?

For reasons both economic and nationalistic, America could definitely use an inspirational, large-scale scientific project right about now. Because seriously, aside from going full-Pavlov over the next iPhone, what do we really have to look forward to these days? Now, if some technotards or bible pounders monkeywrench the deal, the U.S. is going to continue that slide toward scientific… lesserness. So, hippies, religious nuts, and all you little sociopathic babies in politics: zip it. Perhaps, however, we should gently poke and prod the hard of thinking toward a marginally heightened Europhobia — that way they’ll support the project. And it’s worth it. Just, you know, for science.

Going Big. Not Huge, But Big. But Could be Massive.
Neither the Euro nor the American flavor is a Manhattan Project-scale undertaking, in the sense of urgency and motivational factors; they’re more like the Human Genome Project. Still, with clear directives and similar funding levels (€1 billion and $1–3 billion, respectively), they’re quite ambitious and potentially far more world changing than a big bomb. Like, seriously, man. Because brains build bombs. But hopefully an artificial brain would not. Spaceships would be nice, though.

Continue reading “Human Brain Mapping & Simulation Projects: America Wants Some, Too?” »

Feb 8, 2013

Machine Morality: a Survey of Thought and a Hint of Harbinger

Posted by in categories: biological, biotech/medical, engineering, ethics, evolution, existential risks, futurism, homo sapiens, human trajectories, robotics/AI, singularity, supercomputing

KILL.THE.ROBOTS
The Golden Rule is Not for Toasters

Simplistically nutshelled, talking about machine morality is picking apart whether or not we’ll someday have to be nice to machines or demand that they be nice to us.

Well, it’s always a good time to address human & machine morality vis-à-vis both the engineering and philosophical issues intrinsic to the qualification and validation of non-biological intelligence and/or consciousness that, if manifested, would wholly justify consideration thereof.

Uhh… yep!

But, whether at run-on sentence dorkville or any other tech forum, right from the jump one should know that a single voice rapping about machine morality is bound to get hung up in and blinded by its own perspective, e.g., splitting hairs to decide who or what deserves moral treatment (if a definition of that can even be nailed down), or perhaps yet another justification for the standard intellectual cul de sac:
“Why bother, it’s never going to happen.”
That’s tired and lame.

Continue reading “Machine Morality: a Survey of Thought and a Hint of Harbinger” »

Sep 6, 2012

GENCODE Apocalypse

Posted by in categories: biological, biotech/medical, business, chemistry, complex systems, counterterrorism, defense, ethics, events, evolution, existential risks, futurism, geopolitics, habitats, homo sapiens, human trajectories, life extension, lifeboat, media & arts, military, open source, policy, space, supercomputing, sustainability, transparency

http://www.sciencedaily.com/releases/2012/09/120905134912.htm

It is a race against time: will this knowledge save us or destroy us? Genetic modification may eventually reverse aging and bring about a new age, but it is more likely that the end of the world is coming.

The Fermi Paradox informs us that intelligent life may not be intelligent enough to keep from destroying itself. Nothing will destroy us faster or more certainly than an engineered pathogen (except possibly an asteroid or comet impact). The only answer to this threat is an off world survival colony. Ceres would be perfect.

Aug 19, 2012

Artilects Soon to Come

Posted by in categories: complex systems, counterterrorism, cybercrime/malcode, defense, engineering, ethics, events, evolution, existential risks, futurism, information science, military, neuroscience, supercomputing

Whether via spintronics or some quantum breakthrough, artificial intelligence and the bizarre idea of intellects far greater than ours will soon have to be faced.

http://www.sciencedaily.com/releases/2012/08/120819153743.htm

Aug 13, 2012

The Electric Septic Spintronic Artilect

Posted by in categories: biological, biotech/medical, business, chemistry, climatology, complex systems, counterterrorism, defense, economics, education, engineering, ethics, events, evolution, existential risks, futurism, geopolitics, homo sapiens, human trajectories, information science, military, neuroscience, nuclear weapons, policy, robotics/AI, scientific freedom, singularity, space, supercomputing, sustainability, transparency

AI scientist Hugo de Garis has prophesied that the next great historical conflict will be between those who would build gods and those who would stop them.

It seems to be happening before our eyes as the incredible pace of scientific discovery leaves our imaginations behind.

We need only flush the toilet to power the artificial mega-mind coming into existence within the next few decades. I am actually not intentionally trying to write anything bizarre; it is just this strange planet we are living on.

http://www.sciencedaily.com/releases/2012/08/120813155525.htm

http://www.sciencedaily.com/releases/2012/08/120813123034.htm

Jun 9, 2012

The Crisis in Education in Korea and the World

Posted by in categories: education, philosophy, supercomputing, sustainability

Emanuel Pastreich

Professor

Kyung Hee University

June 9, 2012

Continue reading “The Crisis in Education in Korea and the World” »

Page 84 of 85