
Our interstellar challenge is this: how do we, as planet-confined humans, become an interstellar species? The challenge encompasses all human endeavors, and it depends vitally upon interstellar propulsion physics if we are to come of age as an interstellar species.

There are many competing ideas for realizing interstellar propulsion: chemical rockets, ion propulsion, nuclear engines, solar sails, nuclear-pulse (atomic bomb) detonation, antimatter drives, small black holes, warp drives and much more.

How do we sift through all these competing ideas?

I have named the approach to solving this interstellar challenge the Kline Directive, in honor of the late Prof. Morris Kline, for his objectivity and courage in stating that mathematics has become so sophisticated that it can now be used to prove anything.

To achieve interstellar travel, the Kline Directive instructs us to be bold: to explore what others have not, to seek what others will not, and to change what others dare not. It instructs us to extend the boundaries of our knowledge, to advocate new methods, techniques and research, and to sponsor change rather than the status quo, on five fronts: (1) Legal Standing, (2) Safety Awareness, (3) Economic Viability, (4) Theoretical-Empirical Relationship and (5) Technological Feasibility.

Legal Standing: Do we have the protection of the law?

Mr. Gregory W. Nemitz of The Eros Project is the first person I know of who pushed the limits of the law. As a US taxpayer, Nemitz claimed ownership of Asteroid 433, Eros, and invoiced NASA $20,000 for parking and storage of the NEAR Shoemaker spacecraft. Citing what I consider a faulty interpretation of the Outer Space Treaty of 1967, NASA refused to pay, and on April 26, 2004, U.S. District Court Judge Howard McKibben dismissed the case. We have to address this. What is to stop other governments from imposing taxes on our outer space commercial activities that are “for the benefit and in the interests of all countries”?

Safety Awareness: Can we protect our crew and our planet?

In the heady rush to propose ideas for new propulsion systems or star drives, it is very easy to overlook safety considerations. Quoting E. J. Öpik (“Is Interstellar Travel Possible?”, Irish Astronomical Journal, Vol. 6, p. 299): “The exhaust power of the antimatter rocket would equal the solar energy power received by the earth — all in gamma rays.” And Öpik quotes the eminent Carl Sagan (Planet. Space Sci., pp. 485–498, 1963): “So the problem is not to shield the payload, the problem is to shield the earth.”
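To put Öpik’s comparison in rough numbers, here is a minimal sketch; the solar constant and Earth radius are standard figures I am assuming, not numbers from the article:

```python
import math

# Assumed values (not from the article): standard solar constant and
# Earth's mean radius.
SOLAR_CONSTANT_W_M2 = 1361.0   # W/m^2 at 1 AU
EARTH_RADIUS_M = 6.371e6       # metres

# Earth intercepts sunlight over its cross-sectional disc, pi * R^2.
cross_section_m2 = math.pi * EARTH_RADIUS_M ** 2
solar_power_w = SOLAR_CONSTANT_W_M2 * cross_section_m2

print(f"Solar power intercepted by Earth: {solar_power_w:.2e} W")
# ~1.7e17 W: by Opik's comparison, the exhaust power of an antimatter
# rocket, delivered largely as penetrating gamma rays.
```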

Economic Viability: Can realistic commercial viability be determined?

The economic viability of space exploration is not an accounting problem to be solved by CFOs and accountants; it is a physics and engineering problem. For example, chemical rocket propulsion to Alpha Centauri, our nearest star system, would cost about $1.19×10^14, or 23 times the 2011 world GDP.
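Why the physics dooms the chemical option can be seen from the Tsiolkovsky rocket equation. The sketch below assumes a 450 s specific impulse (a good chemical engine) and a one-way cruise at 10% of light speed; both are my illustrative choices, not the article’s figures:

```python
import math

G0 = 9.81          # standard gravity, m/s^2
C = 2.998e8        # speed of light, m/s

isp_s = 450.0      # assumed specific impulse of a high-end chemical engine
delta_v = 0.1 * C  # assumed mission delta-v: 10% of c, one way, no braking

v_exhaust = isp_s * G0  # effective exhaust velocity, ~4.4 km/s

# Tsiolkovsky: delta_v = v_e * ln(m0/mf). The required mass ratio is
# exp(delta_v / v_e), far too large to evaluate directly, so report its log.
ln_ratio = delta_v / v_exhaust
log10_ratio = ln_ratio / math.log(10)
print(f"Required initial-to-final mass ratio: 10^{log10_ratio:.0f}")
# ~10^2950 -- no economy, or universe, supplies that much propellant.
```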

Theoretical-Empirical Relationship: Is the hypothesis empirically sustainable?

String theories are a good example of a theoretical-empirical relationship that is yet to be proved. Let us remember Prof. Morris Kline’s words when theorists claim that a velocity of 10^32 × c (c being the velocity of light) is achievable. Don’t get me wrong: mathematics is vital to the progress of the sciences, but it needs to be tempered with real-world experimental evidence; otherwise it is just conjecture, and it bogs down our search for interstellar propulsion technologies.

The reverse is equally valid: without theoretical underpinnings, there will not be much experimental progress. Podkletnov’s gravity-shielding experiments are a good example. In the two decades since Podkletnov published his experiments, there has been no experimental progress. My inference is that none of the proposed theoretical explanations addressed all the observations and therefore could not guide future experiments.

Technological Feasibility: Does it work?

Technological feasibility within a realistic, finite time frame is vital, and it quickly leads back to the question of commercial viability. Developing feasible future technologies is an iterative process between technological feasibility and commercial viability, until we can reach the stars without having to ask the question: whom do we select to leave Earth?

Applying the Kline Directive, a quick method of eliminating competing technologies is to construct an Interstellar Challenge Matrix that compares the pros and cons of each competing propulsion technology, as sketched below.
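Such a matrix might be laid out in code like this; the five fronts come from the Directive above, while the technologies and the 1-to-5 scores are hypothetical placeholders, not the article’s actual assessments:

```python
# Hypothetical Interstellar Challenge Matrix: each candidate technology is
# scored 1 (weak) to 5 (strong) against the five Kline Directive fronts.
# All scores are illustrative placeholders.
FRONTS = ["Legal", "Safety", "Economic", "Theory-Empirical", "Feasibility"]

matrix = {
    "Chemical rockets":  [5, 4, 1, 5, 5],
    "Solar sails":       [5, 4, 3, 5, 4],
    "Antimatter drives": [3, 1, 1, 3, 1],
    "Warp drives":       [3, 2, 1, 1, 1],
}

print(f"{'Technology':<20}" + "".join(f"{f:>18}" for f in FRONTS))
for tech, scores in matrix.items():
    print(f"{tech:<20}" + "".join(f"{s:>18}" for s in scores))
```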

Can we hasten the development of interstellar propulsion technologies? Yes.

Having disproved the validity of Alcubierre-type warp drives, I concluded that interstellar propulsion physics is currently non-existent. To birth this propulsion physics, in 2012 I classified physical hypotheses and theories into three categories: (1) Type 1, the Millennium Theories; (2) Type 2, the 100-Year Theories; and (3) Type 3, the Engineering Feasible Theories.

Type 1, the Millennium Theories, would require more than 100 years, and maybe up to 1,000 years, to prove or disprove. They are mathematically correct but inscrutable to physically verifiable experiments, even in the distant future. String and quantum gravity theories fall into this category. Why? If we cannot even figure out how to engineer-modify 4-dimensional spacetime, how are we going to engineer-modify a 5-, 6-, 9-, 11- or 23-dimensional universe?

Type 2, the 100-Year Theories, show promise of being verified with technologies that would require several decades to engineer, test and validate; they do not lend themselves to an immediate engineering solution. The engineering solution is theoretically feasible, but a working experiment or technology is decades away because the experimental or physical implementation is not fully understood.

Type 3, the Engineering Feasible Theories, lend themselves to engineering solutions today. They are falsifiable today with our current engineering technologies, if one knows what to test for and how to test for it.
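In code, the taxonomy might look like the sketch below; the Type 1 entries follow the article’s own placement of string and quantum gravity theories, while the Type 2 and Type 3 entries are unnamed placeholders:

```python
from enum import Enum

class TheoryType(Enum):
    MILLENNIUM = 1            # Type 1: 100 to ~1,000 years to prove or disprove
    HUNDRED_YEAR = 2          # Type 2: verifiable within several decades
    ENGINEERING_FEASIBLE = 3  # Type 3: falsifiable with today's technology

classification = {
    "string theory": TheoryType.MILLENNIUM,                    # per the article
    "quantum gravity": TheoryType.MILLENNIUM,                  # per the article
    "hypothetical theory X": TheoryType.HUNDRED_YEAR,          # placeholder
    "hypothetical theory Y": TheoryType.ENGINEERING_FEASIBLE,  # placeholder
}

for theory, ttype in classification.items():
    print(f"{theory}: Type {ttype.value} ({ttype.name})")
```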

We as a society need to apply the Kline Directive to invent new propulsion physics theories that are at best Engineering Feasible and at worst 100-Year theories.

We now have the tools to quickly eliminate both theoretical and experimental proposals that are highly likely to be unproductive, and to focus on those that truly have the potential of delivering commercial interstellar propulsion technologies. In the US, money is no object: the combined 2015 DARPA and NSF budgets come to $10.26 billion. Allocating even a very small slice of these budgets to propulsion physics would be an enormous step forward.

(This article was originally published in the Huffington Post.)

(Phys.org)—A trio of physicists with the Autonomous University of Barcelona has built what they claim is the first artificial magnetic wormhole. In their paper published in the journal Scientific Reports, Jordi Prat-Camps, Carles Navau and Alvaro Sanchez describe how they built the device and why they believe it might prove useful in building a more user-friendly MRI machine.

Read more

One of the unsung workhorses of modern technology is the humble interconnect. This is essentially a wire, or set of wires, that links one part of an electronic system to another. In ordinary silicon chips, interconnects can take up most of the area of a chip, and the speed and efficiency with which information travels along them is a major limiting factor in computing performance.
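To see why, consider the classic RC (Elmore) delay of an on-chip wire, which grows with the square of wire length. The dimensions and material constants in this sketch are generic textbook assumptions, not values from the article:

```python
# Rough RC delay of an on-chip copper interconnect (Elmore model).
# All dimensions and constants below are generic textbook assumptions.
RHO_CU = 1.7e-8           # copper resistivity, ohm*m
EPS_OX = 3.9 * 8.85e-12   # SiO2 permittivity, F/m

length = 1e-3             # 1 mm wire
width = 100e-9            # 100 nm wide
thickness = 100e-9        # 100 nm thick
gap = 100e-9              # 100 nm to the neighbouring conductor

R = RHO_CU * length / (width * thickness)   # wire resistance
C = EPS_OX * (length * width) / gap         # crude parallel-plate capacitance
delay = 0.5 * R * C                         # distributed-line (Elmore) delay

print(f"R = {R:.0f} ohm, C = {C * 1e15:.1f} fF, delay ~ {delay * 1e12:.1f} ps")
# Since both R and C scale with length, delay grows as length squared.
```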

So it’s no wonder that physicists and engineers are creating new generations of interconnect that will become the backbone of information processing machines of the future.

Read more


July 2015, as you know, was all systems go for CERN’s Large Hadron Collider (LHC). On a Saturday evening, proton collisions resumed at the LHC and the experiments began collecting data once again. With the observation of the Higgs already in our back pocket, it was time to turn up the dial and push the LHC into double-digit (TeV) energy levels. From a personal standpoint, I didn’t blink an eye hearing that large amounts of data were being collected at every turn. But I was quite surprised to learn the amount being collected and processed each day: about one petabyte.

Approximately 600 million times per second, particles collide within the LHC. The digitized summary of each collision is recorded as a “collision event”. Physicists must then sift through the 30 petabytes or so of data produced annually to determine whether the collisions have thrown up any interesting physics. Needless to say, the hunt is on!

The Data Centre processes about one petabyte of data every day, the equivalent of around 210,000 DVDs. The centre hosts 11,000 servers with 100,000 processor cores, and some 6,000 database changes are performed every second.
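Those figures are easy to sanity-check; in the sketch below I assume a standard 4.7 GB single-layer DVD, which is roughly how the ~210,000 figure falls out:

```python
PB = 1e15            # bytes per petabyte (decimal convention)
DVD_BYTES = 4.7e9    # assumed single-layer DVD capacity

daily_bytes = 1 * PB  # "about one petabyte every day"

print(f"DVDs per day: {daily_bytes / DVD_BYTES:,.0f}")          # ~212,800
print(f"Sustained rate: {daily_bytes / 86400 / 1e9:.1f} GB/s")  # ~11.6 GB/s
# Note: this is processing throughput; the ~30 PB/year quoted above is the
# volume of physics data retained annually, a much smaller number.
```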

With experiments at CERN generating such colossal amounts of data, the Data Centre stores it and then sends it around the world for analysis. CERN simply does not have the computing or financial resources to crunch all of the data on site, so in 2002 it turned to grid computing to share the burden with computer centres around the world. The Worldwide LHC Computing Grid (WLCG), a distributed computing infrastructure arranged in tiers, gives a community of over 8,000 physicists near-real-time access to LHC data. The Grid runs more than two million jobs per day, and at peak rates 10 gigabytes of data may be transferred from its servers every second.

By early 2013 CERN had increased the power capacity of the centre from 2.9 MW to 3.5 MW, allowing the installation of more computers. In parallel, improvements in energy-efficiency implemented in 2011 have led to an estimated energy saving of 4.5 GWh per year.

Image: CERN

PROCESSING THE DATA (processing info via CERN): Hundreds of thousands of computers from around the world then come into action. Harnessed in a distributed computing service, they form the Worldwide LHC Computing Grid (WLCG), which provides the resources to store, distribute, and process the LHC data. WLCG combines the power of more than 170 collaborating centres in 36 countries around the world, all linked to CERN. Every day WLCG processes more than 1.5 million ‘jobs’, corresponding to a single computer running for more than 600 years.
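The “600 years” equivalence implies an average job length, which two lines of arithmetic recover; the per-job figure is my inference, not a CERN number:

```python
jobs_per_day = 1.5e6          # "more than 1.5 million jobs" per day
single_computer_years = 600   # "a single computer running for more than 600 years"

total_cpu_hours = single_computer_years * 365 * 24
print(f"Implied average job length: {total_cpu_hours / jobs_per_day:.1f} hours")
# ~3.5 hours of CPU time per job, on average.
```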

Racks of servers at the CERN Data Centre (Image: CERN)

CERN DATA CENTRE: The server farm in the 1,450 m² main room of the Data Centre (pictured) forms Tier 0, the first point of contact between experimental data from the LHC and the Grid. As well as servers and data storage systems for Tier 0 and further physics analysis, the Data Centre houses systems critical to the daily functioning of the laboratory. (Image: CERN)

The data flow from all four experiments for Run 2 is anticipated to be about 25 GB/s (gigabytes per second); the per-experiment peaks listed below are tallied in the sketch that follows:

  • ALICE: 4 GB/s (Pb-Pb running)
  • ATLAS: 800 MB/s – 1 GB/s
  • CMS: 600 MB/s
  • LHCb: 750 MB/s
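A minimal sketch tallying those peaks into daily volumes (the ATLAS figure uses the upper end of its quoted range; note the four peaks sum to well under the anticipated ~25 GB/s total flow):

```python
# Peak per-experiment data rates quoted for Run 2, converted to GB/s.
rates_gb_s = {
    "ALICE": 4.0,    # Pb-Pb running
    "ATLAS": 1.0,    # upper end of the 800 MB/s - 1 GB/s range
    "CMS": 0.6,
    "LHCb": 0.75,
}

for exp, rate in rates_gb_s.items():
    print(f"{exp}: {rate * 86400 / 1000:.1f} TB/day at peak")

total = sum(rates_gb_s.values())
print(f"Sum of experiment peaks: {total:.2f} GB/s "
      f"(vs. the ~25 GB/s anticipated total flow)")
```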

In July, the LHCb experiment reported the observation of an entirely new class of particles:
Exotic pentaquark particles (Image: CERN)

Possible layouts of the quarks in a pentaquark particle. The five quarks might be tightly bound (left). They might also be assembled into a meson (one quark and one antiquark) and a baryon (three quarks), weakly bound together (right).

The LHCb experiment at CERN’s LHC has reported the discovery of a class of particles known as pentaquarks. In short, “The pentaquark is not just any new particle,” said LHCb spokesperson Guy Wilkinson. “It represents a way to aggregate quarks, namely the fundamental constituents of ordinary protons and neutrons, in a pattern that has never been observed before in over 50 years of experimental searches. Studying its properties may allow us to understand better how ordinary matter, the protons and neutrons from which we’re all made, is constituted.”

Our understanding of the structure of matter was revolutionized in 1964 when American physicist Murray Gell-Mann proposed that a category of particles known as baryons, which includes protons and neutrons, is composed of three fractionally charged objects called quarks, and that another category, mesons, is formed of quark-antiquark pairs. This quark model also allows the existence of other quark composite states, such as pentaquarks, composed of four quarks and an antiquark.

Until now, however, no conclusive evidence for pentaquarks had been seen; earlier searches for them proved inconclusive. The next step in the analysis will be to study how the quarks are bound together within the pentaquarks.

“The quarks could be tightly bound,” said LHCb physicist Liming Zhang of Tsinghua University, “or they could be loosely bound in a sort of meson-baryon molecule, in which the meson and baryon feel a residual strong force similar to the one binding protons and neutrons to form nuclei.” More studies will be needed to distinguish between these possibilities, and to see what else pentaquarks can teach us!

August 18th, 2015
CERN Experiment Confirms Matter-Antimatter CPT Symmetry for Light Nuclei
Light nuclei and antinuclei (Image: CERN)

Days after scientists at CERN’s Baryon-Antibaryon Symmetry Experiment (BASE) measured the mass-to-charge ratio of a proton and its antimatter particle, the antiproton, the ALICE experiment at the European organization reported similar measurements for light nuclei and antinuclei.

The measurements, made with unprecedented precision, add to growing scientific data confirming that matter and antimatter are true mirror images.

Antimatter shares the same mass as its matter counterpart but has opposite electric charge. The electron, for instance, has a positively charged antimatter equivalent called the positron. Scientists believe that the Big Bang created equal quantities of matter and antimatter 13.8 billion years ago. However, for reasons yet unknown, matter prevailed, creating everything we see around us today, from the smallest microbe on Earth to the largest galaxy in the universe.

Last week, in a paper published in the journal Nature, researchers reported a significant step toward solving this long-standing mystery of the universe. According to the study, 13,000 measurements over a 35-day period show, with unparalleled precision, that protons and antiprotons have identical mass-to-charge ratios.

The experiment tested a central tenet of the Standard Model of particle physics known as Charge, Parity, and Time Reversal (CPT) symmetry. If CPT symmetry holds, a system remains unchanged when three fundamental properties are reversed: charge, parity (a mirror-like inversion of spatial configuration), and time.
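In symbols, the tenet under test is that the combined operation commutes with the dynamics, which for the proton and antiproton reduces to an equality of mass-to-charge ratio magnitudes (standard notation, not drawn from the article):

```latex
% CPT invariance: the combined charge-conjugation (C), parity (P) and
% time-reversal (T) operation leaves the Hamiltonian unchanged.
(\mathcal{CPT})\, H \,(\mathcal{CPT})^{-1} = H
% For a proton and an antiproton this implies equal masses and charges of
% equal magnitude but opposite sign, hence
\left|\frac{m_p}{q_p}\right| = \left|\frac{m_{\bar{p}}}{q_{\bar{p}}}\right|
```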

The latest study takes research on this symmetry further. The ALICE measurements show that CPT symmetry holds true for light nuclei such as deuterons (a hydrogen nucleus with an additional neutron) and antideuterons, as well as for helium-3 nuclei (two protons plus a neutron) and antihelium-3 nuclei. The experiment, which also analyzed the curvature of these particles’ tracks in the ALICE detector’s magnetic field and their time of flight, improves on the existing measurements by a factor of up to 100.

IN CLOSING..

A violation of CPT would not only hint at the existence of physics beyond the Standard Model — which isn’t complete yet — it would also help us understand why the universe, as we know it, is completely devoid of antimatter.

UNTIL THEN…

ORIGINAL ARTICLE POSTING via Michael Phillips, LinkedIn Pulse @

  • The universe radiates only half as much energy as 2 billion years ago
  • New findings establish cosmos’ decline with unprecedented precision


From CNN
The universe came in with the biggest bang ever. But now, with a drooping fizzle, it is in its swan song. The conclusion of a new astronomical study pulls no punches on this: “The Universe is slowly dying,” it reads.

Astronomers have believed as much for years, but the new findings establish the cosmos’ decline with unprecedented precision. An international team of 100 scientists used data from the world’s most powerful telescopes, on land and in space, to study energy coming from more than 200,000 galaxies in a large sliver of the observable universe. [Full story below or at CNN.com]…

Based on those observations, they have confirmed the cosmos is radiating only half as much energy as it was 2 billion years ago. The astronomers published their study on Monday on the website of the European Southern Observatory.
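For a sense of scale, a toy exponential model fitted to that single “half in two billion years” data point extrapolates as below; this is purely my illustration, not how the study’s authors model the decline:

```python
HALVING_TIME_GYR = 2.0  # output halved over the last 2 billion years (per the study)

# Toy model: energy output decays by half every HALVING_TIME_GYR.
for t_gyr in (2, 10, 100):
    fraction = 0.5 ** (t_gyr / HALVING_TIME_GYR)
    print(f"after {t_gyr:>3} Gyr: {fraction:.2e} of today's output")
```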

The team checked the energy across a broad spectrum of lightwaves and other electromagnetic radiation and says it is fading through all wavelengths, from ultraviolet to far infrared.

Analysis across many wavelengths shows the universe’s electromagnetic energy output is dropping.

‘A cold, dark and desolate place’

At the ripe old age of nearly 13.8 billion years, the universe has arrived in its sunset years.

“The universe has basically sat down on the sofa, pulled up a blanket and is about to nod off for an eternal doze,” said astronomer Simon Driver, who led the team.

Death does not mean the universe will go away. It will still be there, but its stars and all else that produces light and stellar fire will fizzle out.

“It will just grow old forever, slowly converting less and less mass into energy as billions of years pass by until eventually, it will become a cold, dark and desolate place, where all of the lights go out,” said astronomer Luke Davies.

But don’t cry for the universe anytime soon. Astrophysicists say this will take trillions of years.

Bursting with energy

Go all the way back to its birth, and you find a vast contrast. In an infinitesimal fraction of a second, our entire cosmos blasted into existence in the Big Bang.

And the totality of the energy and mass in the universe originates from that moment, astronomers say.

Since that natal explosion, the cosmos has generated other sources of brilliant radiation — most notably stars — by converting some of the mass into energy when extreme gravity causes matter to burst into nuclear fusion.

But the universe is speckled by radiance from seething gas clouds, supernovas and, most spectacularly, the discs of hot matter that rotate around black holes to form quasars, which can be as bright as whole galaxies.

“While most of the energy sloshing around in the universe arose in the aftermath of the Big Bang, additional energy is constantly being generated by stars as they fuse elements like hydrogen and helium together,” Driver said.
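Driver’s point is E = mc² at work: fusing hydrogen into helium releases about 0.7% of the fuel’s rest mass as energy, a standard astrophysics figure that I am supplying here, not one from the article:

```python
C = 2.998e8         # speed of light, m/s
EFFICIENCY = 0.007  # ~0.7% of hydrogen's rest mass released by fusion to helium

mass_kg = 1.0  # fuse one kilogram of hydrogen
energy_j = EFFICIENCY * mass_kg * C ** 2
print(f"Energy from fusing 1 kg of hydrogen: {energy_j:.2e} J")  # ~6.3e14 J
```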

Fizzling into space

The size and number of those sources of radiation so boggle the mind that it might be hard to imagine that the entirety of that vividness appears to be fading, as its energy flies off through space.

“This new energy is either absorbed by dust as it travels through the host galaxy, or escapes into intergalactic space and travels until it hits something, such as another star, a planet, or, very occasionally, a telescope mirror,” Driver said.

His team observed it from seven of the world’s mammoth telescopes spread out between Australia, the United States, Chile and Earth’s orbit. Many of the instruments specialize in receiving certain wavelengths of light and other electromagnetic waves.

Compiling the data from the collective wavelengths gives the scientists a more complete picture from across a broad spectrum of energy.

Their findings on the universe’s energy slump were part of the larger Galaxy And Mass Assembly, or GAMA, project to study how galaxies are formed. It has mapped out the position of 4 million galaxies so far.

Serendipity has as much a place in science as in love. That’s what Northeastern physicists Swastik Kar and Srinivas Sridhar found during their four-year project to modify graphene, a stronger-than-steel, infinitesimally thin lattice of tightly packed carbon atoms. Primarily funded by the Army Research Laboratory and the Defense Advanced Research Projects Agency, or DARPA, the researchers were charged with imbuing the decade-old material with thermal sensitivity for use in infrared imaging devices such as night-vision goggles for the military.

What they unearthed, published Friday in the journal Science Advances, was so much more: an entirely new material spun out of boron, nitrogen, carbon, and oxygen that shows evidence of magnetic, optical, and electrical properties, as well as DARPA’s sought-after thermal ones. Its potential applications run the gamut: from 20-megapixel arrays for cellphone cameras, to photodetectors, to atomically thin transistors that, when multiplied by the billions, could fuel computers.

Read more