Microsoft Drones Augment Virtual Reality

Let’s face it: virtual reality can only go so far. Your mind may be immersed in another universe, but your body hasn’t left the room. You might see a dragon, but if you reach out to pet its nose, all you will touch is air. So how do you deepen the suspension of disbelief? How about being able to interact with physical objects you see in your VR goggles?

This is where Microsoft’s invention comes in: the Tactile Autonomous Drone (TAD), a system that is aware of where you are, what you see, and what you should be able to interact with. The user doesn’t see the TAD, only what it’s carrying, which adds to the illusion of being in another world. The TADs know your position in 3D space thanks to a wide variety of motion-tracking sensors, and “one or more TADs apply the resulting tracking information to automatically position themselves, or one or more physical surfaces or objects carried by the TADs, in a way that enables physical contact between those surfaces or objects and one or more portions of the user’s body.”
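The application stays at the level of tracking and positioning, but the geometry behind the idea is simple. The sketch below is illustrative only, not the patent’s method: the names (`Vec3`, `drone_setpoint`, `step_toward`) and the fixed-step controller are my own assumptions about how tracked poses could steer a drone so that a carried surface ends up where the headset is rendering a virtual object.

```python
# Illustrative only: steer a drone so the surface it carries lands where the
# headset shows a virtual object. Names and the fixed-step controller are
# assumptions, not anything specified in the application.

from dataclasses import dataclass


@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def __add__(self, other):
        return Vec3(self.x + other.x, self.y + other.y, self.z + other.z)

    def __sub__(self, other):
        return Vec3(self.x - other.x, self.y - other.y, self.z - other.z)

    def scaled(self, k):
        return Vec3(self.x * k, self.y * k, self.z * k)

    def length(self):
        return (self.x ** 2 + self.y ** 2 + self.z ** 2) ** 0.5


def drone_setpoint(virtual_contact_point: Vec3, carried_surface_offset: Vec3) -> Vec3:
    """Where the drone body must hover so the surface it carries sits exactly
    at the point the user sees in the virtual environment."""
    return virtual_contact_point - carried_surface_offset


def step_toward(drone_position: Vec3, setpoint: Vec3, max_step: float) -> Vec3:
    """Advance the drone a bounded distance toward its setpoint each control
    tick, so it arrives before the user's hand does without overshooting."""
    delta = setpoint - drone_position
    distance = delta.length()
    if distance <= max_step:
        return setpoint
    return drone_position + delta.scaled(max_step / distance)
```

In practice, the headset and drone tracking data would feed `drone_setpoint` every frame, with `step_toward` (or a real flight controller) closing the loop.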

“In the example of FIG. 1, the user 160 is reaching out to contact the hand of a virtual character in the immersive virtual environment. In response, TAD 100 positions an artificial hand contained in the plurality of surfaces and objects 110 of that TAD. In various implementations, this artificial hand may be passive (e.g., no moving parts), or it may be robotic in the sense that it can move, high-five or grip the user’s hand, etc., so as to provide lifelike tactile feedback to the user when the user reaches out to contact the hand of a virtual character in the immersive virtual environment.”
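To make the passive-versus-robotic distinction concrete, here is a hypothetical trigger for the robotic case; the `hand_actuator` interface and the 5 cm threshold are mine, not the application’s. A passive hand would have no such logic and would simply be held in place by the drone.

```python
# Hypothetical sketch of a robotic artificial hand: close the grip when the
# tracked user hand comes within a small threshold, open it otherwise.
# A passive hand (no moving parts) would skip all of this.

GRIP_DISTANCE_M = 0.05  # assumed trigger distance of 5 cm


def update_robotic_hand(user_hand_pos, artificial_hand_pos, hand_actuator):
    """user_hand_pos and artificial_hand_pos are (x, y, z) tuples in metres;
    hand_actuator stands in for whatever drives the gripper."""
    gap = sum((u - a) ** 2 for u, a in zip(user_hand_pos, artificial_hand_pos)) ** 0.5
    if gap < GRIP_DISTANCE_M:
        hand_actuator.close_grip()  # return the handshake or high-five
    else:
        hand_actuator.open_grip()   # stay relaxed until the user reaches in
```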

The TAD does not have to be a flying multi-copter: it can be a wheeled or tracked vehicle with the tactile object mounted on it, or an array of objects on an “optionally rotatable and optionally telescoping turret 430 extending from the TAD. This turret 430 includes a plurality of surfaces and objects 420 that may be positioned by moving the TAD 400 and rotating and/or extending the turret 430 to position one or more objects or surfaces 420 relative to the user to provide the aforementioned tactile feedback upon contact or interaction with the user.”
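The turret variant is essentially polar geometry: rotate to the bearing of the target point, then telescope out to its range. A minimal sketch follows; the function name and the reach limits are assumptions, not figures from the application.

```python
# Hypothetical turret math for the wheeled/tracked variant: given where the
# tactile object should sit in the vehicle's own frame, work out how far to
# rotate and extend the turret. Planar (x, y) geometry only, for brevity.

import math


def turret_command(target_x: float, target_y: float,
                   min_reach: float, max_reach: float):
    """Return (yaw_radians, extension_metres) placing the mounted object at
    (target_x, target_y) relative to the turret base, or None if the point
    is outside the telescoping range and the vehicle must drive first."""
    yaw = math.atan2(target_y, target_x)
    reach = math.hypot(target_x, target_y)
    if not (min_reach <= reach <= max_reach):
        return None
    return yaw, reach
```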

Query whether the downwash from a flying drone will disrupt the user’s illusion that they are shaking hands with anything other than a ghost. The first claim is a little on the bulky side, but captures the invention well; a rough control-loop reading of it follows the claim text.

  1. A system, comprising:
  • a real autonomous mobile drone having one or more tactile virtual extensions;
  • one or more sensors that capture real-time tracking information relating to real motions and positions of a user immersed in a real-time rendering of a virtual environment, and further relating to motions and positions of the autonomous mobile drone in a real-world environment around the user;
  • applying the real-time tracking information to cause the autonomous mobile drone to move relative to the user based on the tracking information;
  • the autonomous mobile drone automatically positioning itself relative to the user in a manner that causes one of tactile virtual extensions of the autonomous mobile drone to move relative to the user and to contact one or more user body parts based on real user motions while the user interacts with the virtual environment;
  • and wherein the contact between the tactile virtual extension and the one or more user body parts generates a physically tactile sensation for one or more virtual elements of the virtual environment.
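Stripped of claim language, the steps amount to something like the control loop below. The `sensors`, `drone`, and `virtual_env` interfaces are stand-ins of my own; the application does not name them.

```python
# A rough, non-authoritative reading of claim 1 as a control loop.
# None of these interfaces are named in the application.

def tactile_feedback_loop(sensors, drone, virtual_env):
    while virtual_env.session_active():
        # Capture real-time tracking info for both the user and the drone.
        user_pose = sensors.track_user()
        drone_pose = sensors.track_drone()

        # Apply the tracking info: move the drone so that one of its tactile
        # virtual extensions sits where the user's body is about to be.
        contact_point = virtual_env.expected_contact_point(user_pose)
        if contact_point is not None:
            drone.move_extension_to(contact_point, current_pose=drone_pose)

        # The resulting physical contact between the extension and the body
        # part supplies the tactile sensation for the virtual element.
```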


Title: “AUTONOMOUS DRONES FOR TACTILE FEEDBACK IN IMMERSIVE VIRTUAL REALITY”

US Patent Application No: 20160349835

Filed (USA Reg): May 28, 2015

Published: December 1, 2016

Applicant: Microsoft Technology Licensing, LLC