The joints on the simulated hands do not report the same rotations as the joints of hands detected on device.
Notably, the right and forward vectors derived from the palm transform of a simulated hand do not match what the system produces.
To reproduce: display the transforms returned by the simulated hands and compare them with a build on device; they will be different. The expected behavior is that simulated hands produce accurate transforms.
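
A minimal way to inspect those transforms in the editor, assuming MRTK's `HandJointUtils.TryGetJointPose` and `TrackedHandJoint.Palm` (the script name and logging details are illustrative):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Debug helper: logs the right-hand palm pose every frame so the
// editor (simulated) values can be compared against a device build.
public class PalmPoseLogger : MonoBehaviour
{
    private void Update()
    {
        if (HandJointUtils.TryGetJointPose(TrackedHandJoint.Palm, Handedness.Right, out MixedRealityPose pose))
        {
            // Right/forward vectors derived from the palm rotation; these are
            // the values that diverge between simulation and the device.
            Vector3 right = pose.Rotation * Vector3.right;
            Vector3 forward = pose.Rotation * Vector3.forward;
            Debug.Log($"Palm position: {pose.Position}, right: {right}, forward: {forward}");
        }
    }
}
```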
The joint rotations for the WindowsMixedRealityArticulatedHand are computed under the hood by Windows, whereas the simulated hand rotations are computed in MRTK. I'm going to try to track down the Windows code that calculates these and emulate it if possible.
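
A hypothetical illustration of why the two can disagree, not the actual MRTK code: if the simulation derives a joint rotation from joint positions, the roll around the bone axis is pinned by whatever up vector the code picks, which need not match the rotation the platform reports directly.

```csharp
using UnityEngine;

// Hypothetical sketch only: deriving a joint rotation from bone direction.
public static class SimulatedJointRotationSketch
{
    public static Quaternion DeriveJointRotation(Vector3 joint, Vector3 nextJoint, Vector3 palmUp)
    {
        Vector3 boneDirection = (nextJoint - joint).normalized;
        // Any choice of up vector fixes the roll differently than the device does,
        // so the resulting right/forward vectors can disagree with device data.
        return Quaternion.LookRotation(boneDirection, palmUp);
    }
}
```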
Also note that some joint positions in the simulated hand data don't match the positions of the WMR hand, most notably the metacarpal joints (base of each finger). That's because the simulated hand data was partly recorded using a LeapMotion device, which has a slightly different hand model. Thankfully, the more important joints (the fingertips) are unambiguous enough that the hand models don't diverge significantly there.
On second thought: we should probably just explicitly emulate the HL2 hands. Simulation should interpolate the rotations as reported by the device. If/when LeapMotion is supported, its controller in MRTK should then be made to conform to the official MRTK hand model, by mapping the LeapMotion data onto our own hand model.
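
A minimal sketch of that interpolation idea, assuming MRTK's `MixedRealityPose` type (the `RecordedHandPose` class and `BlendJoint` helper are illustrative names, not existing MRTK APIs):

```csharp
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Sketch: instead of recomputing rotations in the editor, store the joint
// poses as reported by the device and blend between recorded keyframes.
public class RecordedHandPose
{
    public MixedRealityPose[] Joints; // one pose per TrackedHandJoint

    // Interpolates between two recorded joint poses, preserving the
    // device-reported rotations rather than re-deriving them.
    public static MixedRealityPose BlendJoint(MixedRealityPose from, MixedRealityPose to, float t)
    {
        return new MixedRealityPose(
            Vector3.Lerp(from.Position, to.Position, t),
            Quaternion.Slerp(from.Rotation, to.Rotation, t));
    }
}
```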
LeapMotion editor support for interaction prototyping sounds fantastic!