MixedRealityToolkit-Unity: UX and Input feature areas

Created on 22 Jan 2018 · 5 comments · Source: microsoft/MixedRealityToolkit-Unity

Overview

There's a lot of overlap and fuzziness about what belongs where in the UX vs. Input feature folders.

I think we should have a discussion to clearly delineate the two and provide a path to organize them.

All 5 comments

Anything I put into UX is focused more on design than basic functionality. It's definitely a fuzzy line, though.

I think the description of the feature areas is clear: Input should include all inputs (gaze, voice, motion, etc.). For me this excludes visuals. UX should provide building blocks for common controls that give user feedback, i.e. the visuals for the components. Of course UX needs Input to work; otherwise it would just be disabled controls.
So I think cursors (which are currently in Input) belong in UX according to this definition.

@StephenHodgson What are some concrete examples of the overlap you're seeing?

Stuff in the Input script folder that probably belongs in UX, such as Cursors and the like.
I agree with the line that @brean has laid out.

Will address in vNEXT.
