MixedRealityToolkit-Unity: Simulated hands do not produce correct joint rotations

Created on 30 Apr 2019 · 3 comments · Source: microsoft/MixedRealityToolkit-Unity

Describe the bug

The joints on the simulated hands do not produce the same rotations as the ones in the detected hands on device.

Notably, the right and forward vectors from the palm transform generated by simulated hand are not representative of what's produced by the system.

To reproduce

Display the transforms returned by the simulated hands; they will differ from the transforms the system reports in a build on device.
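As a concrete way to display them, a minimal debug script along these lines can draw the palm axes each frame; this is a sketch that assumes the MRTK v2 `HandJointUtils.TryGetJointPose` helper is available in the branch being tested:

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Draws the right (red) and forward (blue) vectors of the palm joint every
// frame, so the simulated axes can be compared against a build on device.
public class PalmAxesDebug : MonoBehaviour
{
    private void Update()
    {
        if (HandJointUtils.TryGetJointPose(TrackedHandJoint.Palm, Handedness.Right, out MixedRealityPose palm))
        {
            Debug.DrawRay(palm.Position, palm.Rotation * Vector3.right * 0.05f, Color.red);
            Debug.DrawRay(palm.Position, palm.Rotation * Vector3.forward * 0.05f, Color.blue);
        }
    }
}
```

Attaching this to any scene object and comparing the rays in the editor against the same build on device makes the rotation discrepancy visible.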

Expected behavior

Simulated hands produce accurate transforms.

Screenshots

Your Setup (please complete the following information)

  • Unity Version: 2018.3.8f1
  • MRTK Version v2.0 (latest dev branch)

Target Platform (please complete the following information)

  • HoloLens 2

Additional context

Labels: Bug, ISV, Urgency-Soon

All 3 comments

The joint rotations for the WindowsMixedRealityArticulatedHand are computed under the hood by Windows, whereas the simulated hand rotations are computed in MRTK. I'm going to try to track down the Windows code that calculates these and emulate it if possible.
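For context, here is a sketch of the kind of derivation the simulation would need: building a palm rotation purely from joint positions. The joint choices, axis conventions, and handedness sign below are assumptions for illustration, not the actual Windows computation:

```csharp
using UnityEngine;

// Hypothetical sketch: one way a simulation could derive a palm rotation
// from joint positions alone, approximating what the platform does natively.
public static class SimulatedPalmRotation
{
    public static Quaternion FromJointPositions(
        Vector3 wrist, Vector3 indexKnuckle, Vector3 middleKnuckle,
        Vector3 pinkyKnuckle, bool isRightHand)
    {
        // Forward points from the wrist toward the middle-finger knuckle.
        Vector3 forward = (middleKnuckle - wrist).normalized;

        // The up vector comes from the cross product of forward and the
        // index->pinky span; the sign flips between left and right hands.
        Vector3 span = (pinkyKnuckle - indexKnuckle).normalized;
        Vector3 up = Vector3.Cross(forward, span) * (isRightHand ? 1f : -1f);

        return Quaternion.LookRotation(forward, up);
    }
}
```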

Also note that some joint positions in the simulated hand data don't match the positions of the WMR hand, most notably the metacarpal joints (the base of each finger). That's because the simulated hand data was partly recorded with a LeapMotion device, which has a slightly different hand model. Thankfully, the more important joints (the fingertips) are unambiguous enough that the hand models don't diverge significantly.

On second thought: we should probably just explicitly emulate the HL2 hands, and simulation should interpolate the rotations as reported by the device. If/when LeapMotion is supported, its controller in MRTK should then be made to conform to the official MRTK hand model by mapping the LeapMotion data onto our own hand model (see the sketch below).
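A rough, purely illustrative sketch of what that mapping layer could look like; the source joint names are invented placeholders, and only the `TrackedHandJoint` targets come from MRTK:

```csharp
using System.Collections.Generic;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Hypothetical mapping layer: conform externally recorded (e.g. LeapMotion)
// joint data to MRTK's TrackedHandJoint layout.
public static class LeapToMrtkJointMap
{
    // Assumed source naming on the LeapMotion side; the real SDK layout differs.
    private static readonly Dictionary<string, TrackedHandJoint> map =
        new Dictionary<string, TrackedHandJoint>
        {
            { "palm", TrackedHandJoint.Palm },
            { "wrist", TrackedHandJoint.Wrist },
            { "index_metacarpal", TrackedHandJoint.IndexMetacarpal },
            { "index_tip", TrackedHandJoint.IndexTip },
            // ... remaining joints would be mapped the same way.
        };

    public static bool TryMap(string sourceJoint, Vector3 position, Quaternion rotation,
        out TrackedHandJoint joint, out MixedRealityPose pose)
    {
        pose = new MixedRealityPose(position, rotation);
        return map.TryGetValue(sourceJoint, out joint);
    }
}
```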

Leap motion editor support for interaction prototyping sounds fantastic!
