
Japan’s SoftBank Group and Saudi Arabia’s sovereign-wealth fund on Saturday launched the world’s largest technology fund, a nearly $100 billion vehicle that will steer capital to cutting-edge technologies in U.S. startups and other global firms.

In a statement, SoftBank said the fund had secured $93 billion of committed capital. The so-called SoftBank Vision Fund is targeting a total of $100 billion within six months. Ahead of the fund’s launch on Saturday, SoftBank Chief Executive Masayoshi Son said it would focus on investing in sectors including artificial intelligence, smart devices and semiconductors. “We already have lots in the pipeline,” he said. “We are investing into genome sequencing. We are investing in virtual-reality simulations, the games, and so on.”

The fund’s creation coincides with U.S. President Donald Trump’s two-day visit to Saudi Arabia, where he is…

Read more

Mobile phone apps took center stage at Google’s annual developer conference on Wednesday as the search giant announced new features for its digital assistant and its popular photo app while devoting little time to the Android mobile operating system.

Addressing an audience of thousands of developers in Mountain View, California, Google executives delivered a broad-based update to their product portfolio, which also included a slate of new features for the Google Home speaker, a job search tool and even a set of new virtual reality headsets.

In a sign of the ongoing strategic importance of Google Assistant, the company’s artificial intelligence-driven, voice-controlled digital assistant, Google announced it would make the product available on Apple Inc’s (AAPL.O) iPhone, making a play for the higher end of the smartphone market and challenging Apple’s Siri feature on its own devices.

Read more

  • Second Livestock is a unique application of virtual reality (VR) that could change animal husbandry and livestock farming.
  • Developed by design professor Austin Stewart, this VR free-range farm world is a safe haven for chickens.

Free-range livestock farming is going to the next level, thanks to a unique, if seemingly silly, idea that has recently gone viral. Second Livestock is a free-range world for chickens in virtual reality (VR). And yes, just like most of VR’s current applications, it works like a game: a massively multiplayer one full of chickens and with no AI bots.

Read more

Last month, we showed an earlier version of this robot where we’d trained its vision system using domain randomization, that is, by showing it simulated objects with a variety of colors, backgrounds, and textures, without the use of any real images.
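As a rough illustration (not OpenAI’s actual pipeline), domain randomization amounts to rendering the same synthetic scene under freshly sampled colors, backgrounds, and noise each time, so a vision model never overfits to any one appearance. A minimal NumPy sketch, where the “scene” is just a colored square on a random background:

```python
import numpy as np

def randomized_scene(rng, size=64):
    """Render a toy 'object on background' image with randomized appearance.

    Background color, object color, object size, position, and pixel noise
    are all sampled fresh each call, mimicking domain randomization in
    simulation. Returns the image and the object's center (the label).
    """
    img = np.empty((size, size, 3), dtype=np.float32)
    img[:] = rng.uniform(0.0, 1.0, size=3)           # random background color
    obj_color = rng.uniform(0.0, 1.0, size=3)        # random object color
    w = int(rng.integers(8, 17))                     # random object size
    x, y = rng.integers(0, size - w, size=2)         # random position
    img[y:y + w, x:x + w] = obj_color
    img += rng.normal(0.0, 0.02, img.shape)          # sensor-like noise
    center = np.array([x + w / 2, y + w / 2], dtype=np.float32)
    return np.clip(img, 0.0, 1.0), center

rng = np.random.default_rng(0)
batch = [randomized_scene(rng) for _ in range(32)]   # a randomized training batch
```

A detector trained only on such randomized renders can then be pointed at real camera images, on the theory that reality looks like just one more randomization.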

Now, we’ve developed and deployed a new algorithm, one-shot imitation learning, allowing a human to communicate how to do a new task by performing it in VR. Given a single demonstration, the robot is able to solve the same task from an arbitrary starting configuration.

Caption: Our system can learn a behavior from a single demonstration delivered within a simulator, then reproduce that behavior in different setups in reality.

Read more

Startup Vivid Vision said today it has raised $2.2 million in a seed round to build VR tools that could be used to treat eye problems known as “lazy eye.”

San Francisco-based Vivid Vision raised the money from SoftTech VC’s Jeff Clavier, as well as The Venture Reality Fund (The VR Fund), CRCM Ventures, SOS Ventures, Anorak Ventures, and Liquid 2 Ventures, a seed-stage venture capital firm cofounded by Hall of Fame NFL quarterback Joe Montana.

The company’s VR treatment for binocular vision disorders is now available at more than 90 clinics across the world. More than 10 percent of Americans suffer from one or more binocular vision disorders, such as amblyopia, strabismus, or convergence insufficiency. These disorders, commonly known as “lazy eye” and “crossed eyes,” can cause issues with driving and playing sports, and can even limit career choices.

Read more

The “WatchSense” prototype uses a small depth camera attached to the arm, mimicking a depth camera on a smartwatch. It could make typing easier, or let a user turn up the volume in a music program by simply raising a finger. (credit: Srinath Sridhar et al.)

If you wear a smartwatch, you know how limiting it is to type on it or otherwise operate it. Now European researchers have developed an input method that uses a depth camera (similar to the Kinect game controller) to track fingertip touch and location on the back of the hand or in mid-air, allowing for precision control.

The researchers have created a prototype called “WatchSense,” worn on the user’s arm. It captures the movements of the thumb and index finger on the back of the hand or in the space above it. It would also work with smartphones, smart TVs, and virtual-reality or augmented-reality devices, explains Srinath Sridhar, a researcher in the Graphics, Vision and Video group at the Max Planck Institute for Informatics.
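To make the interaction concrete, here is a hypothetical sketch (not the researchers’ code) of how fingertip heights reported by a depth camera might be mapped to input events; the thresholds and the volume action are invented for illustration:

```python
# Hypothetical mapping from depth-camera fingertip data to input events.
# z_mm is the fingertip's height in millimeters above the back of the hand.

TOUCH_Z_MM = 5.0      # at or below this height, treat the finger as touching
RAISE_Z_MM = 30.0     # at or above this height, treat the finger as "raised"

def classify_sample(z_mm):
    """Classify a single fingertip height into an interaction state."""
    if z_mm <= TOUCH_Z_MM:
        return "touch"        # e.g. select a key drawn on the back of the hand
    if z_mm >= RAISE_Z_MM:
        return "raised"       # e.g. the volume-up gesture from the article
    return "hover"            # in between: keep tracking, emit nothing

def volume_from_height(z_mm, z_min=RAISE_Z_MM, z_max=150.0):
    """Map how high the finger is raised to a 0-100 volume level."""
    clamped = min(max(z_mm, z_min), z_max)
    return round(100 * (clamped - z_min) / (z_max - z_min))
```

For example, `classify_sample(2.0)` yields `"touch"`, while holding the finger at 90 mm maps to a mid-range volume; separating state classification from the continuous mapping keeps taps from jittering the volume.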

Read more

Security analysts could soon become the first employees asked to show up to work inside virtual reality.

Thanks to a new virtual reality tool built by the Colorado-based startup ProtectWise, cybersecurity professionals may soon be patrolling computer networks — like real world beat cops — inside a three-dimensional video game world.

Scott Chasin, CEO and co-founder of ProtectWise, sees a future in which companies might even have war-rooms of Oculus Rift-wearing security analysts who patrol their networks in VR. “I see an opportunity in the not-too-distant future in which a large organization who has a lot of IT infrastructure might have rooms full of security analysts with augmented reality and VR headsets on,” he told me.

Read more

Hardly a day has gone by this month without the announcement of a new virtual reality (VR) camera system. Facebook, Google and GoPro all aim to make VR more immersive with new cameras, some of which won’t be commercially released for the foreseeable future. However, researchers at Adobe believe that you may not need new camera hardware at all for a big leap in immersion.

Adobe’s head of research Gavin Miller is going to present new cutting-edge technology at NAB in Las Vegas this Tuesday that could one day be used to turn flat, monoscopic 360-degree videos shot with consumer-grade spherical cameras into fully immersive VR video. That includes the ability to lean into the video, something industry insiders call six degrees of freedom (6DoF).

The difference between monoscopic 360-degree video and VR experiences offering six degrees of freedom is especially important for users of high-end VR headsets like the Oculus Rift and HTC Vive. These headsets offer room-scale tracking, which means that the headset knows where in the room the viewer is, accurately translating a motion like “leaning forward” into corresponding visuals.
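The parallax cue behind that "leaning" effect can be illustrated with a toy reprojection (a deliberate simplification, not Adobe’s method): given a per-pixel depth estimate, a small sideways head translation shifts each pixel by an amount proportional to translation divided by depth, so near objects slide more than far ones:

```python
import numpy as np

def translate_view(image, depth, shift_px=1.0):
    """Toy parallax warp for a sideways head translation.

    image: (H, W) grayscale frame; depth: (H, W) positive depth estimates.
    Each pixel is shifted horizontally by shift_px / depth, so nearer
    pixels (small depth) move more than distant ones -- the cue a 6DoF
    renderer must reproduce when the viewer leans to the side.
    """
    h, w = image.shape
    out = np.zeros_like(image)
    cols = np.arange(w)
    for y in range(h):
        disparity = shift_px / depth[y]                       # per-pixel shift
        src = np.clip(np.round(cols - disparity).astype(int), 0, w - 1)
        out[y] = image[y, src]                                # nearest-neighbor resample
    return out
```

With a flat monoscopic 360 video there is no depth channel at all, which is exactly the gap such research has to fill by estimating depth before any reprojection like this is possible.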

Read more