MixedRealityToolkit-Unity: Articulated hands do not allow custom interaction mappings

Created on 26 May 2020 · 10 Comments · Source: microsoft/MixedRealityToolkit-Unity

Describe the bug

Articulated hands do not allow custom interaction mappings

To reproduce

Define your own input system and controller mapping; even then, you cannot add new interaction entries to articulated hands.

Expected behavior

If you go to the trouble of defining custom mappings, that ability should be supported.

Note that this is increasingly painful for anyone who wants to build on articulated hands: the lack of custom interaction mappings on the base class, and the enforced merging of behaviors that results, forces the creation of a new input source type in MRTK.

Screenshots

Your setup (please complete the following information)

  • Unity Version 2018.4.23f1
  • MRTK Version 2.3.0+

Target platform (please complete the following information)

  • HoloLens 2
  • Leap Motion (pre-2.4.0)
  • also Quest (see https://github.com/provencher/MRTK-Quest)

Additional context

Again, thanks for the excellent discussion in HoloDevelopers Slack
https://holodevelopers.slack.com/archives/CTW7K59U4/p1590468076355100?thread_ts=1590361318.325600&cid=CTW7K59U4

Bug

All 10 comments

Note that it looks like this very feature is being introduced in XRTK: https://github.com/XRTK/XRTK-Core/pull/552

I'm working on making custom interaction mappings and custom controller / hand definitions possible. I'll get my proposal fully written up and posted here.

@provencher I'm not sure I see what in that PR would enable this feature request. It looks like they're adding essentially the same set of explicitly defined interactions (similar, with one additional entry) to their MixedRealityHandController class that we defined a while back in MRTK.

@keveleigh yes, for the interaction parts you're correct, but there are a few useful elements in that PR that relate to this:

HandDataPostProcessor, which does a post process on hand joints, independent of the platform, to provide things like grip and pinch strength, which can be leveraged as interaction mappings.

HandPoseRecognizer allows you to leverage poses defined in the editor to map to interactions. This part is less concretely utilized, and it remains to be seen how well this concept works in and of itself.
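To make this a bit more concrete, here's a rough, purely illustrative sketch of what a platform-independent joint post-process and a naive pose match could look like. None of these types or names come from the XRTK PR or from MRTK; they're just placeholders.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical joint identifiers; not MRTK's TrackedHandJoint enum.
public enum HandJoint { Palm, ThumbTip, IndexTip, MiddleTip, RingTip, PinkyTip }

public static class HandDataPostProcessorSketch
{
    // Derives a 0..1 pinch strength from the thumb-index tip distance,
    // independent of which platform supplied the joint poses.
    public static float GetPinchStrength(
        IReadOnlyDictionary<HandJoint, Vector3> joints,
        float openDistance = 0.08f,    // metres; assumed tuning value
        float closedDistance = 0.015f) // metres; assumed tuning value
    {
        if (!joints.TryGetValue(HandJoint.ThumbTip, out Vector3 thumb) ||
            !joints.TryGetValue(HandJoint.IndexTip, out Vector3 index))
        {
            return 0f;
        }

        float distance = Vector3.Distance(thumb, index);
        return Mathf.Clamp01((openDistance - distance) / (openDistance - closedDistance));
    }

    // Naive pose match: every joint in the recorded pose (stored palm-relative)
    // must be within a tolerance of the current palm-relative joint position.
    public static bool MatchesPose(
        IReadOnlyDictionary<HandJoint, Vector3> joints,
        IReadOnlyDictionary<HandJoint, Vector3> recordedPalmRelativePose,
        float tolerance = 0.02f)
    {
        if (!joints.TryGetValue(HandJoint.Palm, out Vector3 palm))
        {
            return false;
        }

        foreach (KeyValuePair<HandJoint, Vector3> entry in recordedPalmRelativePose)
        {
            if (entry.Key == HandJoint.Palm) { continue; }
            if (!joints.TryGetValue(entry.Key, out Vector3 current)) { return false; }
            if (Vector3.Distance(current - palm, entry.Value) > tolerance) { return false; }
        }

        return true;
    }
}
```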

> does a post process on hand joints, independent of the platform, to provide things like grip and pinch strength, which can be leveraged as interaction mappings.

While we don't currently support a grip / pinch strength interaction, we do have something similar which is currently utilized in our Leap Motion support to determine grab / pinch (but is platform-independent):

https://github.com/microsoft/MixedRealityToolkit-Unity/blob/43e8b9d46fdbe7848d7540a6d580ce61fba56e9b/Assets/MRTK/Core/Providers/Hands/ArticulatedHandDefinition.cs#L131-L160

WMR articulated hands use this class as a base as well. The intent is definitely for this definition to grow as more features are needed. It was the first step in the controller mapping rework I've been referring to: refactoring the interaction definition out into a single class, so that all classes that choose to have "articulated hand interactions" behave as similarly as possible and don't need to re-define identical interactions in each class. It will eventually also let you mark controller classes as the same physical controller via a definition class instead of via the SupportedControllerTypes enum, which is how it's handled today and isn't extendable.
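To sketch the shape I have in mind (with simplified placeholder types, not the actual MRTK classes): the definition class owns the shared interaction set, and a platform can append its own mappings rather than redefining everything.

```csharp
using System.Collections.Generic;

// Placeholder types standing in for MRTK's interaction mapping concepts.
public enum InputAxisType { Digital, SingleAxis, SixDof }

public sealed class InteractionMappingSketch
{
    public InteractionMappingSketch(uint id, string description, InputAxisType axisType)
    {
        Id = id;
        Description = description;
        AxisType = axisType;
    }

    public uint Id { get; }
    public string Description { get; }
    public InputAxisType AxisType { get; }
}

// A single definition class that any "articulated hand" controller can delegate to,
// so WMR, Leap Motion, Quest, etc. expose the same base interactions without
// re-defining them in each controller class.
public class ArticulatedHandDefinitionSketch
{
    public virtual IReadOnlyList<InteractionMappingSketch> DefaultInteractions => new[]
    {
        new InteractionMappingSketch(0, "Spatial Pointer", InputAxisType.SixDof),
        new InteractionMappingSketch(1, "Spatial Grip", InputAxisType.SixDof),
        new InteractionMappingSketch(2, "Select", InputAxisType.Digital),
        new InteractionMappingSketch(3, "Grab", InputAxisType.SingleAxis),
        new InteractionMappingSketch(4, "Index Finger Pose", InputAxisType.SixDof),
    };
}

// A platform-specific definition can extend the shared set with custom mappings,
// which is the extensibility this issue is asking for.
public class CustomHandDefinitionSketch : ArticulatedHandDefinitionSketch
{
    public override IReadOnlyList<InteractionMappingSketch> DefaultInteractions
    {
        get
        {
            var interactions = new List<InteractionMappingSketch>(base.DefaultInteractions)
            {
                new InteractionMappingSketch(5, "Pinch Strength", InputAxisType.SingleAxis)
            };
            return interactions;
        }
    }
}
```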

> allows you to leverage poses defined in the editor to map to interactions. This part is less concretely utilized, and it remains to be seen how well this concept works in and of itself.

This one is interesting! It sounds like it uses some version of our ArticulatedHandPoses to match runtime poses against.

Super happy we're having these conversations, as there's plenty of room to grow our hand support!

On the note of the IsPinching code, I think there is value in generalizing the logic in that getter to do a few things:

  • Customize which fingers are checked for the pinch. The Oculus SDK lets you check for pinching between the thumb and middle finger, for instance. Why not expose that capability here as well?
  • Easily override the pinch enter and exit distances on a case-by-case basis, without changing a class variable.

You could leverage the same logic with, say, the palm joint to determine whether any given finger is closed, and from there derive an IsGripping pose. I've gotten feedback on MRTK Quest that folks want to easily grab objects by closing their hands, like they can on HL2, which isn't supported out of the box on Quest.
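As a minimal sketch of what I mean (made-up names and tuning values, not MRTK's actual IsPinching code), the same hysteresis check can be pointed at any pair of joints:

```csharp
using UnityEngine;

// Hysteresis-based "closed" check between any two joints, so the same logic can
// drive a thumb-index pinch, a thumb-middle pinch, or a finger-to-palm grip check.
public class JointProximityDetectorSketch
{
    private readonly float enterDistance; // metres; start reporting "closed" below this
    private readonly float exitDistance;  // metres; stop reporting "closed" above this
    private bool isClosed;

    public JointProximityDetectorSketch(float enterDistance = 0.02f, float exitDistance = 0.05f)
    {
        this.enterDistance = enterDistance;
        this.exitDistance = exitDistance;
    }

    // Call once per frame with the two joint positions you care about,
    // e.g. (thumbTip, indexTip) for a pinch or (indexTip, palm) for a closed finger.
    public bool Update(Vector3 jointA, Vector3 jointB)
    {
        float distance = Vector3.Distance(jointA, jointB);

        if (isClosed)
        {
            if (distance > exitDistance) { isClosed = false; }
        }
        else if (distance < enterDistance)
        {
            isClosed = true;
        }

        return isClosed;
    }
}
```

An IsGripping check could then just run one of these per finger against the palm joint and report gripping when all of them are closed.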

Glad these conversations are allowing the capabilities of articulated hands to grow!

Hi folks, take a look at https://github.com/provencher/MRTK-Quest/pull/52 and feel free to share your thoughts.

The question of what to do with incomplete information has come up there, and @keveleigh, while you are considering refactoring, I thought it might also be valuable to consider detecting, and behaving differently in response to, temporary tracking interruptions (flickers/blips) vs. longer-duration outages.
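For distinguishing flickers from real outages, even a simple grace-period filter might be enough. A sketch with assumed timings (nothing here is existing MRTK code):

```csharp
using UnityEngine;

// Distinguishes brief tracking blips from longer-duration loss, so short flickers
// can keep the last known hand state instead of tearing the hand/controller down.
public class TrackingLossFilterSketch
{
    private readonly float blipThresholdSeconds;
    private float lostSince = -1f;

    public TrackingLossFilterSketch(float blipThresholdSeconds = 0.25f) // assumed grace period
    {
        this.blipThresholdSeconds = blipThresholdSeconds;
    }

    // Feed the raw per-frame tracking state; returns the filtered state.
    public bool UpdateIsTracked(bool rawIsTracked)
    {
        if (rawIsTracked)
        {
            lostSince = -1f;
            return true;
        }

        if (lostSince < 0f)
        {
            lostSince = Time.time;
        }

        // Treat short interruptions as still tracked; report loss only after the grace period.
        return Time.time - lostSince < blipThresholdSeconds;
    }
}
```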

@provencher has merged https://github.com/provencher/MRTK-Quest/pull/52 - however, the change is not Quest-specific; @keveleigh, it might be good to provide it for all ArticulatedHand controllers.

This issue has been marked as stale by an automated process because it has not had any recent activity. It will be automatically closed in 30 days if no further activity occurs. If this is still an issue please add a new comment with more recent details and repro steps.

I don't believe this issue is resolved - I'd suggest not closing it yet.
