Blog

Archive for the ‘augmented reality’ category: Page 36

Aug 23, 2020

Stanford Scientists Slow Light Down and Steer It With Resonant Nanoantennas

Posted in categories: augmented reality, biotech/medical, computing, internet, nanotechnology, quantum physics, virtual reality

Researchers have fashioned ultrathin silicon nanoantennas that trap and redirect light, for applications in quantum computing, LIDAR and even the detection of viruses.

Light is notoriously fast. Its speed is crucial for rapid information exchange, but as light zips through materials, its chances of interacting and exciting atoms and molecules can become very small. If scientists can put the brakes on light particles, or photons, it would open the door to a host of new technology applications.

Now, in a paper published on August 17, 2020, in Nature Nanotechnology, Stanford scientists demonstrate a new approach to slow light significantly, much like an echo chamber holds onto sound, and to direct it at will. Researchers in the lab of Jennifer Dionne, associate professor of materials science and engineering at Stanford, structured ultrathin silicon chips into nanoscale bars to resonantly trap light and then release or redirect it later. These “high-quality-factor” or “high-Q” resonators could lead to novel ways of manipulating and using light, including new applications for quantum computing, virtual reality and augmented reality; light-based WiFi; and even the detection of viruses like SARS-CoV-2.
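
To see why a higher quality factor translates into “slower” light, note that the photon lifetime of a resonator scales as tau = Q / omega_0, where omega_0 is the angular resonance frequency: the larger the Q, the longer light rattles around inside the nanostructure before it leaks back out. The short sketch below works through that arithmetic; the telecom-band wavelength and the range of Q values are illustrative assumptions, not figures taken from the paper.

```python
# Back-of-the-envelope sketch of why a high quality factor (Q) "slows" light:
# the photon lifetime of a resonator is tau = Q / omega_0, so a higher Q holds
# energy in the structure for longer before it escapes. The wavelength and Q
# values below are illustrative assumptions, not numbers from the paper.
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def photon_lifetime(wavelength_m: float, q_factor: float) -> float:
    """Return the resonator photon lifetime tau = Q / omega_0, in seconds."""
    omega_0 = 2 * math.pi * C / wavelength_m  # angular resonance frequency, rad/s
    return q_factor / omega_0

wavelength = 1.55e-6  # 1550 nm telecom-band light (assumed)
for q in (10, 1_000, 100_000):
    tau = photon_lifetime(wavelength, q)
    print(f"Q = {q:>7,}: tau ≈ {tau * 1e12:9.4f} ps, "
          f"free-space distance in that time ≈ {C * tau * 1e3:9.4f} mm")
```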

Aug 18, 2020

Scientists slow and steer light with resonant nanoantennas

Posted in categories: augmented reality, biotech/medical, computing, internet, nanotechnology, quantum physics, virtual reality

“We’re essentially trying to trap light in a tiny box that still allows the light to come and go from many different directions,” said postdoctoral fellow Mark Lawrence, who is also lead author of the paper. “It’s easy to trap light in a box with many sides, but not so easy if the sides are transparent—as is the case with many silicon-based applications.”

Jun 27, 2020

Nissan Invisible-To-Visible Technology

Posted in categories: augmented reality, futurism

Nissan’s new AR technology could help drivers prevent blind spot accidents in the future.

Jun 27, 2020

Augmented Reality Pool Table

Posted in category: augmented reality

Every pool table should have the ability to turn on augmented reality guidance!

May 26, 2020

‘Digital smell’ technology could let us transmit odors in online chats

Posted in categories: augmented reality, food, internet, neuroscience, virtual reality

“It’s not just about the smell,” said Adrian Cheok, one of the scientists behind the experiments. “It is part of a whole, integrated virtual reality or augmented reality. So, for example, you could have a virtual dinner with your friend through the internet. You can see them in 3D and also share a glass of wine together.”

In real life, odors are transmitted when airborne molecules waft into the nose, prompting specialized nerve cells in the upper airway to fire off impulses to the brain. In the recent experiments, performed on 31 test subjects at the Imagineering Institute in the Malaysian city of Nusajaya, researchers used electrodes inserted into the nostrils to deliver weak electrical currents to the region above and behind the nostrils, where these neurons are found.

The researchers were able to evoke 10 different virtual odors, including fruity, woody and minty.

May 20, 2020

Huge leak reveals the release date and price of Apple’s AR glasses

Posted in categories: augmented reality, biotech/medical, health, mobile phones

Apple’s AR glasses are supposedly called Apple Glass, a leaker revealed, and the product is set to be unveiled during the iPhone 12 launch event. The coronavirus health crisis might force Apple to postpone the reveal to the first quarter of next year.

May 15, 2020

New Deep Learning Research Breaks Records In Image Recognition Ability Of Self-Driving Cars

Posted in categories: augmented reality, biotech/medical, robotics/AI

People, bicycles, cars or road, sky, grass: Which pixels of an image represent distinct foreground persons or objects in front of a self-driving car, and which pixels represent background classes?

This task, known as panoptic segmentation, is a fundamental problem that has applications in numerous fields such as self-driving cars, robotics, augmented reality and even in biomedical image analysis.
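
As a rough illustration of what a panoptic result looks like: every pixel is assigned a semantic class, pixels belonging to countable “thing” classes (people, cars, bicycles) also receive an instance id, and “stuff” classes (road, sky, grass) are left uncounted. The sketch below encodes such a result with the common class-times-divisor-plus-instance scheme; the class list and toy label maps are assumptions for illustration and are not taken from EfficientPS, which is a neural network not reproduced here.

```python
# Minimal illustration of the panoptic segmentation output format: each pixel
# gets a semantic class, and pixels of countable "thing" classes additionally
# get an instance id, while "stuff" classes do not. Class ids and toy arrays
# are illustrative assumptions.
import numpy as np

STUFF = {0: "road", 1: "sky", 2: "grass"}
THINGS = {3: "person", 4: "car", 5: "bicycle"}

def merge_panoptic(semantic: np.ndarray, instance: np.ndarray,
                   divisor: int = 1000) -> np.ndarray:
    """Encode class and instance into one id: class * divisor + instance.

    Stuff pixels keep instance 0, so each stuff class forms one region;
    every detected thing keeps its own id.
    """
    panoptic = semantic.astype(np.int64) * divisor
    thing_mask = np.isin(semantic, list(THINGS))
    panoptic[thing_mask] += instance[thing_mask]
    return panoptic

# Toy 2x4 scene: top row mostly sky, bottom row road, with two separate cars.
semantic = np.array([[1, 1, 4, 4],
                     [0, 4, 4, 0]])
instance = np.array([[0, 0, 1, 1],
                     [0, 2, 2, 0]])
print(merge_panoptic(semantic, instance))
# [[1000 1000 4001 4001]
#  [   0 4002 4002    0]]
```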

At the Department of Computer Science at the University of Freiburg, Dr. Abhinav Valada, Assistant Professor for Robot Learning and member of BrainLinks-BrainTools, focuses on this research question. Valada and his team have developed “EfficientPS”, a state-of-the-art artificial intelligence (AI) model that performs coherent recognition of visual scenes more quickly and effectively than previous approaches.

May 4, 2020

Copy and paste the real world with your phone using augmented reality

Posted in categories: augmented reality, mobile phones, robotics/AI

Augmented reality has been the next big thing for a while, but we haven’t seen many practical applications. Here’s a tool that looks useful, though: it uses AR and AI to copy and paste objects from the real world to your computer with just your phone.
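
For a rough idea of what such a copy-and-paste pipeline involves, the sketch below photographs an object, estimates a foreground mask, and writes a transparent cutout that could then be pasted into a desktop document. The crude luminance threshold stands in for the salient-object segmentation network a real tool would use; the function names, file paths, and steps are assumptions, not the tool’s actual code.

```python
# Hedged sketch of an AR "copy-paste" pipeline: load a photo of an object,
# estimate a foreground mask, and save a transparent-PNG cutout. The mask step
# is a crude luminance threshold standing in for a real segmentation model.
from PIL import Image
import numpy as np

def cut_object(photo_path: str, out_path: str, threshold: float = 60.0) -> None:
    """Save a transparent-PNG cutout of the photographed object."""
    rgb = np.asarray(Image.open(photo_path).convert("RGB"), dtype=np.uint8)
    # Placeholder "AI": keep pixels whose brightness differs strongly from the
    # background median. A real tool would run a segmentation network here.
    luma = rgb.mean(axis=2)
    mask = (np.abs(luma - np.median(luma)) > threshold).astype(np.uint8) * 255
    rgba = np.dstack([rgb, mask])  # RGB plus alpha channel from the mask
    Image.fromarray(rgba, mode="RGBA").save(out_path)

cut_object("object_photo.jpg", "cutout.png")  # paths are illustrative
```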

Apr 24, 2020

Mojo Vision’s Augmented Reality Contact Lenses Kick off a Race to AR on Your Eye

Posted in category: augmented reality

In what may be the starting pistol in a race to AR for your eyes, startup Mojo Vision unveiled their augmented reality contact lenses for the first time.

Apr 16, 2020

AR vision system for quiet supersonic X-59 plane gets put to the test

Posted in categories: augmented reality, transportation

A key component of NASA’s X-59 Quiet SuperSonic Technology (QueSST) aircraft is undergoing vibration tests at the space agency’s Langley Research Center in Hampton, Virginia. The eXternal Vision System (XVS) is a special camera system that the pilot of the X-plane will use to see forward while the experimental supersonic craft is in flight.

When the X-59 takes to the skies in 2021, the pilot will be faced with a problem not often encountered since the Concorde fleet of supersonic passenger jetliners was retired. The X-59 is meant to test new technologies to build a new generation of supersonic commercial aircraft and, while it promises to overcome some of the drawbacks of Concorde, it will still share some of its difficulties.

One is that the ideal design of a long-range supersonic liner is essentially that of a needle-nosed dart. The annoying thing is that, though this shape may be fine from an aerodynamic point of view, it makes it extremely difficult for the pilot to see forward without a lot of complex mechanics, like Concorde’s droop nose and special sliding windscreen.

Page 36 of 67