MixedRealityToolkit-Unity: Oculus Touch controller can't rotate objects with one hand

Created on 8 Jan 2021 · 4 comments · Source: microsoft/MixedRealityToolkit-Unity

Describe the bug

I am using ObjectManipulator.cs to move and rotate objects. When using hand tracking, objects can be rotated perfectly with one hand. However, when I use an Oculus Touch controller, one-handed rotation is not possible.

I have identified the code segment that causes the issue, in ObjectManipulator.cs:

    private bool TryGetGripRotation(IMixedRealityPointer pointer, out Quaternion rotation)
    {
        for (int i = 0; i < pointer.Controller.Interactions.Length; i++)
        {
            if (pointer.Controller.Interactions[i].InputType == DeviceInputType.SpatialGrip)
            {
                rotation = pointer.Controller.Interactions[i].RotationData;
                return true;
            }
        }
        rotation = Quaternion.identity;
        return false;
    }

As can be seen above, only input providers that define the SpatialGrip input type can provide grip rotation. For Oculus Touch controllers, I couldn't find a way to define SpatialGrip in the Input Mapping profile.

If I replace SpatialGrip with SpatialPointer in the above code, rotation works on both hand tracking and Oculus Touch controller.

Is there a way to define SpatialGrip for Oculus Touch controllers, or would it be possible to update the code segment above to fall back to SpatialPointer?
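A minimal sketch of what such a fallback might look like (this is not the shipped MRTK code; it assumes, per the experiment above, that `RotationData` is also populated for `SpatialPointer` mappings on Oculus Touch controllers):

```csharp
// Hypothetical fallback sketch: prefer SpatialGrip, but fall back to
// SpatialPointer for controllers that don't map a SpatialGrip interaction.
private bool TryGetGripRotation(IMixedRealityPointer pointer, out Quaternion rotation)
{
    var interactions = pointer.Controller?.Interactions;
    if (interactions != null)
    {
        // First pass: use SpatialGrip if the controller provides it,
        // matching the original behavior.
        for (int i = 0; i < interactions.Length; i++)
        {
            if (interactions[i].InputType == DeviceInputType.SpatialGrip)
            {
                rotation = interactions[i].RotationData;
                return true;
            }
        }

        // Second pass: fall back to SpatialPointer (e.g. Oculus Touch,
        // which has no SpatialGrip mapping in the Input Mapping profile).
        for (int i = 0; i < interactions.Length; i++)
        {
            if (interactions[i].InputType == DeviceInputType.SpatialPointer)
            {
                rotation = interactions[i].RotationData;
                return true;
            }
        }
    }

    rotation = Quaternion.identity;
    return false;
}
```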

To reproduce

Steps to reproduce the behavior:

  1. Place a cube in the scene
  2. Add ObjectManipulator component
  3. Play
  4. Manipulate object with one Oculus Touch Controller
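For convenience, the repro steps above can be sketched as a setup script (assuming MRTK 2.5 namespaces; `NearInteractionGrabbable` is added so near grab works, though it is not strictly part of the listed steps):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class ReproSetup : MonoBehaviour
{
    private void Start()
    {
        // 1. Place a cube in the scene, in front of the user.
        var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 0f, 1f);
        cube.transform.localScale = Vector3.one * 0.2f;

        // 2. Add the ObjectManipulator component (plus the grabbable
        //    component it relies on for near interaction).
        cube.AddComponent<NearInteractionGrabbable>();
        cube.AddComponent<ObjectManipulator>();

        // 3./4. Press Play and manipulate the cube with one
        //       Oculus Touch controller.
    }
}
```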

Expected behavior

To be able to rotate objects with one Oculus Touch Controller.

Setup

  • Unity Version 2019.4.13f1
  • MRTK Version v2.5.1

Target platform

  • Oculus Quest
Bug


All 4 comments

If I replace SpatialGrip with SpatialPointer in the above code, rotation works on both hand tracking and Oculus Touch controller.

Ah good callout on this. Definitely a few approaches here:

  1. Duplicate the SpatialPointer data in a new SpatialGrip interaction
  2. Fall back to SpatialPointer in ObjectManipulator if SpatialGrip isn't provided
  3. Rewrite this code to listen for source pose change events instead of iterating through the interactions, and cache that data. Source pose events should be agnostic between SpatialPointer and SpatialGrip, and each source should provide its own implementation of that

I'd lean towards number 3, since we should avoid iterating through the interactions manually wherever possible.
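A hedged sketch of what approach 3 could look like, using MRTK's `IMixedRealitySourcePoseHandler` to cache rotations per input source instead of iterating interaction mappings (the class name and dictionary layout are illustrative, not the proposed implementation):

```csharp
using System.Collections.Generic;
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Sketch: listen for source pose change events globally and cache the
// latest rotation per source ID. A consumer (e.g. ObjectManipulator)
// would then query the cache instead of walking Interactions[].
public class SourcePoseCache : MonoBehaviour, IMixedRealitySourcePoseHandler
{
    private readonly Dictionary<uint, Quaternion> rotationBySource =
        new Dictionary<uint, Quaternion>();

    private void OnEnable() =>
        CoreServices.InputSystem?.RegisterHandler<IMixedRealitySourcePoseHandler>(this);

    private void OnDisable() =>
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealitySourcePoseHandler>(this);

    public void OnSourcePoseChanged(SourcePoseEventData<Quaternion> eventData) =>
        rotationBySource[eventData.SourceId] = eventData.SourceData;

    public void OnSourcePoseChanged(SourcePoseEventData<MixedRealityPose> eventData) =>
        rotationBySource[eventData.SourceId] = eventData.SourceData.Rotation;

    // What TryGetGripRotation could call instead of iterating interactions.
    public bool TryGetRotation(uint sourceId, out Quaternion rotation) =>
        rotationBySource.TryGetValue(sourceId, out rotation);

    public void OnSourceLost(SourceStateEventData eventData) =>
        rotationBySource.Remove(eventData.SourceId);

    // Remaining interface members, unused in this sketch.
    public void OnSourceDetected(SourceStateEventData eventData) { }
    public void OnSourcePoseChanged(SourcePoseEventData<TrackingState> eventData) { }
    public void OnSourcePoseChanged(SourcePoseEventData<Vector2> eventData) { }
    public void OnSourcePoseChanged(SourcePoseEventData<Vector3> eventData) { }
}
```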

If I may, I would also add a configuration option to manipulate the object with actions other than Select/Pointer, e.g. trigger or grip.
For example, to enable this scenario: Select performs an action on the object (like highlighting it), while trigger/grip moves it. Currently this is not possible, because ObjectManipulator uses the pointer callbacks (which are Select-related); one would have to extend it, block the OnPointer callbacks, and implement action handlers that call the OnPointer callbacks, which is not very clean. :)

@keveleigh thank you for your comment! Can you think of any workaround until the new version comes out?
