Artwork by Adi Granov, Source: Facebook

Hand-Rocket Propulsion 🚀 Soar Through The Skies Like Iron-Man

Radu Coandă
7 min read · Jun 23, 2021


Gaze Steering, Teleportation, and Joysticks are the most common movement techniques used in today's virtual reality (VR) applications. Heck, both Unity and Unreal Engine, the top game engines that small studios and indie devs use for developing VR apps, come with these techniques as default. But, there are many more locomotion techniques out there, especially coming from academia, that new and old devs might get inspired by.

In this post, I'll explore one technique called Hand-Rocket Locomotion, a.k.a. Iron-Man propulsion. This technique has been featured in Iron Man VR, Rocket Skate VR, and Megaton Rainfall. Imagine rockets strapped to your hands. Essentially, the user changes the movement direction by orienting their hand in said direction. This technique falls under the category of hand-directed steering.

Now, if we want to implement this, we should first get an intuition for it.

How it Works

Looking at Tony Stark in his first test flight, we notice jets coming out of his palms and soles. He rises as the jets are exhausted downwards.

This might remind us of rockets, which work the same way; they burn some fuel, push the exhaust backward, and in turn, the exhaust pushes the rocket forward. Summed up more neatly by Newton:

"For every action, there is an equal and opposite reaction."

This law governs both the velocity of Iron-Man and his direction of travel. Increasing the exhaust's force will increase the opposite reaction force (also called thrust) and thus increase Iron-Man's velocity. Changing the orientation of the exhaust also changes the direction of thrust.

So for our virtual Iron-Man propulsion, we need both (1) the magnitude of the thrust to exert on our user so that they can lift off the ground and soar through the air, and (2) the direction of the thrust.

NOTE: If you don't have any experience setting up an XR Rig with the XR Interaction Toolkit and configuring Mappings and Actions with the new Input Action System, I highly recommend getting acquainted with their documentation. For the video tutorial folks, I can recommend tutorials from both LevelUp and VR with Andrew.

Thrust Direction

To compute the thrust direction, we have to figure out in which direction we push out the exhaust. In Iron-Man's case, the exhaust comes out of his palms. Therefore, we proceed in three steps:

  1. Get the orientation of our hands (via the rotation of the controllers).
  2. Isolate the direction the palms are facing.
  3. Compute the thrust direction: just like in Newton's third law, we take the opposite direction, i.e., we multiply the palm-facing direction by minus one.

First, to get the orientation of the left-hand controller (analogous for the right-hand controller), weā€™ll use Unityā€™s Input Action System. The XRI Default Input Actions map the rotation of the controller to the Rotation Action. Querying this action gives us a quaternion, representing the controller's rotation from the default upright orientation.

If we create a reference to this action in our script as an InputActionProperty _leftHandRotation, we can then access it in code as follows:
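As a rough sketch (assuming _leftHandRotation is wired to the XRI LeftHand/Rotation action in the Inspector, and that this lives inside the provider class shown in full at the end of the post), the query could look like this:

```csharp
// Field on our provider MonoBehaviour, assigned in the Inspector
// to the XRI LeftHand/Rotation action.
[SerializeField] private InputActionProperty _leftHandRotation;

// Inside Update(): read the controller's rotation,
// falling back to identity if nothing is bound.
Quaternion leftHandOrientation =
    _leftHandRotation.action?.ReadValue<Quaternion>() ?? Quaternion.identity;
```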

We need the null-coalescing operator in case no value is returned from the binding.

In the second step, now that we have the orientation of the hands, we need to isolate the direction of the palms. As mentioned above, the returned quaternion represents the rotation the controller has undergone from its default upright position. In particular, if we take the default palm-facing direction and rotate it by this leftHandOrientation, we should get the current palm-facing direction. The default palm-facing direction of the left hand is to the right, from the point of view of the user. To compute the current direction, we simply multiply the two:
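In the sketch, using Vector3.right as the assumed default palm direction for the left hand (as described above):

```csharp
// Default palm-facing direction of the left hand: to the user's right.
Vector3 defaultPalmDirection = Vector3.right;

// Rotating the default direction by the controller's orientation
// yields the current palm-facing direction.
Vector3 palmDirection = leftHandOrientation * defaultPalmDirection;
```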

In the third and last step, we multiply the current palm-facing direction by minus one to get the opposite direction.
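Continuing the sketch, that is a one-liner:

```csharp
// Newton's third law: the thrust points opposite to the exhaust (palm) direction.
Vector3 thrustDirection = -palmDirection;
```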

Also, don't forget to hook up the reference to the input action in the Inspector.

Thrust Magnitude

Tony and our user need some control to accelerate, i.e., to increase the thrust force. Holding the Oculus Touch controllers, I found it pretty intuitive to use the side grab buttons, so that's what I used. The XRI Default Input Actions map the grab button of the left-hand controller onto the Select action. Unfortunately, if we look at the properties of this action (see the figure below), the Action Type is Button. Reading the documentation, we find out that this is a boolean value. We could use it, but then we would either go full power or have no thrust at all.

More interesting would be to have multiple values in between, so that the user can control the acceleration to some extent. We therefore change the Action Type to Value and set the Control Type to Axis.

In code, we get a reference to the Select Action, just like we did for the Rotation Action. We then ask for its value as follows:
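Sketched out (assuming _leftHandSelect references the XRI LeftHand/Select action, with the Action Type changed to Value and the Control Type to Axis as above):

```csharp
// Field on our provider MonoBehaviour, assigned in the Inspector
// to the XRI LeftHand/Select action.
[SerializeField] private InputActionProperty _leftHandSelect;

// Inside Update(): grab value in [0, 1]; 0 if nothing is bound.
float thrustMagnitude = _leftHandSelect.action?.ReadValue<float>() ?? 0f;
```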

To finally get the thrust vector, we multiply the direction and the magnitude:
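In the sketch, with a hypothetical _thrustStrength tuning factor (not part of the original post) so the resulting force is strong enough to lift the Rigidbody:

```csharp
// Hypothetical scale factor; tune to taste for the rig's mass.
[SerializeField] private float _thrustStrength = 500f;

// Thrust vector: palm-opposite direction scaled by the grab value.
Vector3 thrust = thrustDirection * thrustMagnitude * _thrustStrength;
```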

Constructing the Locomotion

All that remains is to apply the thrust force computed above to our player game object. For a game object to interact with the physics system, and therefore with our force, it needs a Rigidbody. Responsible for interacting with said Rigidbody, and thus for managing the XR Rig's position, is the Locomotion System, which is part of the XR Interaction Toolkit. We then use a LocomotionProvider to access the Locomotion System, which in turn manages access to the XR Rig.

Unity introduced this extra layer of indirection for scenarios with multiple locomotion techniques all wanting to modify the XR Rig, e.g., an application with both teleportation locomotion and snap turning.

In our Update method we compute the thrust force. Although the HandPropulsionProvider is the only LocomotionProvider on our XR Rig, we still make sure to check, via CanBeginLocomotion(), whether we can take control of its locomotion. If the system isn't busy with another LocomotionProvider, we lock it to our HandPropulsionProvider with BeginLocomotion(). We apply our force to the Rigidbody, and then unlock the Locomotion System.
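Putting it all together, a minimal sketch of such a provider could look like the following (the class and field names are my own; the actual implementation is in the GitHub repository linked below):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch of a hand-propulsion LocomotionProvider (left hand only; the right hand is analogous).
// Assumes the XR Rig root has a Rigidbody and that the input actions are wired up in the Inspector.
public class HandPropulsionProvider : LocomotionProvider
{
    [SerializeField] private InputActionProperty _leftHandRotation; // XRI LeftHand/Rotation
    [SerializeField] private InputActionProperty _leftHandSelect;   // XRI LeftHand/Select (Value/Axis)
    [SerializeField] private Rigidbody _rigidbody;                  // Rigidbody on the XR Rig
    [SerializeField] private float _thrustStrength = 500f;          // hypothetical tuning factor

    private void Update()
    {
        // (1) Current palm-facing direction: rotate the default direction
        //     (the user's right, for the left hand) by the controller's orientation.
        Quaternion leftHandOrientation =
            _leftHandRotation.action?.ReadValue<Quaternion>() ?? Quaternion.identity;
        Vector3 palmDirection = leftHandOrientation * Vector3.right;

        // (2) Thrust direction: the opposite of the palm-facing direction.
        Vector3 thrustDirection = -palmDirection;

        // (3) Thrust magnitude from the grab axis, scaled to a usable force.
        float thrustMagnitude = _leftHandSelect.action?.ReadValue<float>() ?? 0f;
        Vector3 thrust = thrustDirection * thrustMagnitude * _thrustStrength;

        // Take control of the Locomotion System only if no other provider holds it,
        // apply the force to the Rigidbody, then release the system again.
        if (CanBeginLocomotion() && BeginLocomotion())
        {
            _rigidbody.AddForce(thrust);
            EndLocomotion();
        }
    }
}
```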

Test Flight

I've set up 3 scenes for testing various control aspects of the locomotion technique. Additionally, I've added the Oculus Hands as hand models, to increase presence, and particle effects coming out of the palms to suggest the direction of travel based on the hand orientation.

Each scene has checkpoints denoted by red smoke clouds. Once the user is in the vicinity of a cloud, it changes colour to a bright white, to signal that the checkpoint is complete. After arriving at the last checkpoint, the level ends.

I completed all three levels. The locomotion technique triggered little to no motion sickness, with the large terrain level triggering the most. Small movements and rotations of the wrist can control the direction of movement with ease. This is most noticeable in the ring scene, where you have to take a sharp turn at one point. As there is drag acting on the user, once the acceleration is released the user object quickly comes to a halt. The user can then reorient their palms and thus change direction precisely.

If you, the reader, are interested in evaluating this technique, you can grab the code on GitHub. Test all 3 levels and please complete this questionnaire.

Future Work

One issue noted by other users was the difficulty of knowing the orientation of the hands when they were at their side, i.e., similar to the Iron-Man pose. This could be fixed by adding some information to a HUD interface, like the ones fighter pilots have in their helmets. Another complaint was regarding the position of the propulsion: some users suggested that it could move to the bottom of the motion controllers instead of the palms. Lastly, some users noticed the lack of audio and haptic feedback, which could increase immersion.
