The air tap gesture is not recognised on an HL1 device or when using Holographic Remoting in Unity. In the emulator everything seems to work fine, but as soon as the app is deployed to the device or run in Holographic Remoting mode in Unity, the "click" Unity event does not seem to be fired. Strangely, the cursor animation still works: when air tapping (pressing down) the cursor gets smaller, and when releasing the air tap (pressing up) the cursor gets bigger again.
Air tap should correspond to "click" event
Follow the instructions of the community-written step-by-step guide linked in the official docs. Link to the step-by-step guide: here. Here's a link to the GitHub repository from the step-by-step guide.
I tried with both 2018.3.8f1 and 2018.3.6f1
Microsoft Mixed Reality Toolkit v2.0.0 Beta 2 (0c0152f)
Seems like the opposite problem, but I wonder if this is related to #3652
I saw your issue post as well and was wondering how you got air tapping to work on an actual HL device. I haven't had any luck so far using the step-by-step guide referenced above.
I just now created a bare bones project to test this.
With a new Unity project (2018.3.9f1) I imported the vNext beta 2 package. I used the Configure menu item to add the MixedRealityPlayspace and MixedRealityToolkit gameobjects but made no changes to them -- so just the DefaultMixedRealityConfigurationProfile.
In the Build Settings menu I switched the platform to Universal Windows Platform, and in Player Settings | XR Settings I checked the Virtual Reality Supported checkbox (and made sure it shows the Windows Mixed Reality SDK, and that IL2CPP and .NET 4.x are selected in the Other Settings section).
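For what it's worth, the same Build Settings / Player Settings steps can also be applied from an editor script. This is only a rough sketch of the manual steps above -- the class name UwpBuildConfig and the Tools menu path are made up, and the script has to live in an Editor folder:

using UnityEditor;

// Hypothetical helper mirroring the manual steps: UWP platform, VR supported,
// IL2CPP scripting backend, .NET 4.x API compatibility.
public static class UwpBuildConfig {

    [MenuItem("Tools/Configure UWP for HoloLens")]
    public static void Configure() {
        // Switch the active build target to Universal Windows Platform.
        EditorUserBuildSettings.SwitchActiveBuildTarget(BuildTargetGroup.WSA, BuildTarget.WSAPlayer);

        // Equivalent of ticking "Virtual Reality Supported" in XR Settings
        // (the Windows Mixed Reality SDK still has to be in the SDK list).
        PlayerSettings.virtualRealitySupported = true;

        // Other Settings: IL2CPP backend and .NET 4.x API compatibility level.
        PlayerSettings.SetScriptingBackend(BuildTargetGroup.WSA, ScriptingImplementation.IL2CPP);
        PlayerSettings.SetApiCompatibilityLevel(BuildTargetGroup.WSA, ApiCompatibilityLevel.NET_4_6);
    }
}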
Added a cube (with a material other than white so I could see the cursor) and created and attached a single script to it. The only code is the following implementation of the IMixedRealityPointerHandler interface (usings and closing brace included here so it compiles as-is; the exact namespaces differ slightly between the beta 2 package and later MRTK versions, which also add an OnPointerDragged method to the interface):

using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class Clicker : MonoBehaviour, IMixedRealityPointerHandler {

    public void OnPointerClicked(MixedRealityPointerEventData eventData) {
        // Grow the cube a little on each air tap so the click is easy to spot.
        transform.localScale *= 1.2f;
    }

    public void OnPointerDown(MixedRealityPointerEventData eventData) { }

    public void OnPointerUp(MixedRealityPointerEventData eventData) { }
}
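(For what it's worth, the default cube already carries a BoxCollider, and the pointer needs a collider on the object to give it focus and deliver the click, so nothing extra is required there.)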
With that, I ran via Holographic Emulation | Remote to Device: I can focus on the cube, see the cursor change, then air tap, and OnPointerClicked gets called (well, twice, as per my other issue).
A few differences from the guide you linked: I didn't change the target device to HoloLens and I didn't change the graphics settings. But probably the biggest one is that I just used the DefaultMixedRealityConfigurationProfile.
I've been able to clone the profile (Copy & Customize) and then make minor changes to my local versions, but a couple of times I created a new one using the "Create new profiles" option and ended up -- well, mostly confused -- trying to get it set up correctly.
Thanks a lot for your effort! I tried to follow your setup; however, when starting the application I see the view of the default Unity camera with the cube in the center, and it stays static even though I turn my head, so apparently it's not recognised as a Mixed Reality camera. You mention that you use the "vNext beta 2 package" -- is that the latest one from the Releases, i.e. "Microsoft Mixed Reality Toolkit v2.0.0 Beta 2", or another one? That's the only difference I see at the moment.
Yes, I think you have to use the Mixed Reality camera setup. Usually this can be brought in from the Mixed Reality Toolkit | Configure menu item. However, in some cases this fails for me and doesn't bring in the camera rig gameobject (MixedRealityPlayspace), though it does still bring in the Toolkit gameobject.
I'm not sure what causes it to fail, but when it happens I recreate the camera rig by hand, copying the settings from another project (see the sketch after this list):
MixedRealityPlayspace (empty gameobject at 0,0,0)
Main Camera (camera with Solid Color clear flags, black background, 0.1 near clip, Audio Listener, Event System, Standalone Input Module, Gaze Provider)
UIRaycastCamera (camera, no Audio Listener)
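In case it helps, here's roughly what that manual rebuild looks like as an editor script. It's only a sketch -- the class name PlayspaceRebuilder and the Tools menu path are made up, and the Gaze Provider / UIRaycastCamera parts are left as comments because their types moved around between the beta and RC packages:

using UnityEditor;
using UnityEngine;

public static class PlayspaceRebuilder {

    [MenuItem("Tools/Recreate MixedRealityPlayspace")]
    public static void Recreate() {
        // Empty root gameobject at 0,0,0.
        var playspace = new GameObject("MixedRealityPlayspace");
        playspace.transform.position = Vector3.zero;

        // Main Camera: solid colour clear, black background, 0.1 near clip.
        var cameraObject = new GameObject("Main Camera") { tag = "MainCamera" };
        cameraObject.transform.SetParent(playspace.transform, false);
        var camera = cameraObject.AddComponent<Camera>();
        camera.clearFlags = CameraClearFlags.SolidColor;
        camera.backgroundColor = Color.black;
        camera.nearClipPlane = 0.1f;
        cameraObject.AddComponent<AudioListener>();
        cameraObject.AddComponent<UnityEngine.EventSystems.EventSystem>();
        cameraObject.AddComponent<UnityEngine.EventSystems.StandaloneInputModule>();
        // The Gaze Provider component and the UIRaycastCamera child (camera, no
        // Audio Listener) would be added here as well, copied from the working project.
    }
}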
It would be nice if this were a prefab for cases where you need to recreate it, but I suppose, because it is created on the fly from your Toolkit settings, that wouldn't make sense.
For the import, I usually just use the latest code from GitHub, but in this case, to make sure it wasn't an issue that had been fixed in the current code, I used the beta 2 package:
Microsoft.MixedReality.Toolkit.Unity.Foundation-v2.0.0-Beta2.unitypackage
With your settings, everything works fine! I'm still wondering, though, whether this is actually the intended way of working with MRTK v2, or if configuring the profiles is preferred. Anyway, thanks a lot for your effort!
I think this is fixed in RC1. Closing.
Please reopen if you hit this again.
It isn't; there's #3909 with lots more information about the problem. A couple of related issues were closed, but the last version with a working air tap on HL1 in builds / remoting is still the last beta; none of the RCs work.