Mixed Reality Toolkit Input System Overview

The Mixed Reality Toolkit contains a fully-featured input system, which allows you to handle various types of input and send them to the GameObject currently being gazed at, or to a fallback object. It also includes a few example cursors that fully leverage Unity’s animation system.

  • It uses and extends Unity’s default EventSystem so there’s no need for a custom input module.
  • It is cross platform and can be easily extended with custom input sources, events, and input handlers.
  • It works with Unity’s native uGUI.

Input Module Design

Each input source (Motion Controllers, Hands, Gestures, Mouse, Keyboard, etc.) implements an IInputSource interface. The interface defines various events that the input sources can trigger. The input sources register themselves with the InputManager, whose role it is to forward input events to the appropriate GameObjects. Input sources can be dynamically enabled / disabled as necessary, and new input sources can be created to support different input devices.

The InputManager listens to the various events coming from the input sources, and also takes into account which GameObject the current gaze and pointers are focused on.

By default, input events are sent to the currently focused GameObject, if that object implements the appropriate interface. Any event sent by the InputManager bubbles up from the focused GameObject to each of its ancestors.

To send input events to handlers that do not require gaze (such as voice commands), you will need to register the handler as a “Global Listener”.

Modal input handlers can also be added to the InputManager. These modal handlers will take priority over the currently focused object. Fallback handlers can also be defined, so that the application can react to global inputs that aren’t targeting a specific element.
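As a sketch, these registrations look like the following (method names follow the HoloToolkit-era InputManager API; verify them against your toolkit version):

```csharp
using HoloToolkit.Unity.InputModule;
using UnityEngine;

// Registers this GameObject at the different priority levels in the
// InputManager dispatch order described below.
public class InputRegistrationExample : MonoBehaviour
{
    private void OnEnable()
    {
        // Receive all input events, regardless of gaze/focus.
        InputManager.Instance.AddGlobalListener(gameObject);

        // Take priority over the currently focused object (LIFO stack).
        InputManager.Instance.PushModalInputHandler(gameObject);

        // React to input that no other handler consumed (LIFO stack).
        InputManager.Instance.PushFallbackInputHandler(gameObject);
    }

    private void OnDisable()
    {
        InputManager.Instance.RemoveGlobalListener(gameObject);
        InputManager.Instance.PopModalInputHandler();
        InputManager.Instance.PopFallbackInputHandler();
    }
}
```

A typical component would register at only one of these levels; all three are shown here purely to illustrate the available calls.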

The InputManager forwards input source events to the appropriate GameObjects in the following order:

  1. Registered Global Listeners
  2. The registered modal input handlers, in LIFO (Last-In First-Out) order of registration
  3. The currently focused GameObject
  4. The fallback input handlers, in LIFO (Last-In First-Out) order of registration

GameObjects that want to consume input events can implement one or more input interfaces. These interfaces are grouped below by platform support:

Generic Cross Platform Interfaces

  • ISourceStateHandler for all source detected/source lost events.
  • IFocusable for focus enter and exit. The focus can be triggered by the user’s gaze or any other gaze source.
  • IInputHandler for source up, down, and pressed events.
    • Similar to Unity’s Input class GetButtonDown, GetButtonUp, and GetButton methods.
    • Only Pointers will raise UI events for Source Up and Source Down.
  • IInputClickHandler for source “clicked”. The Default Pointer Sources that raise the click handler include:
    • Mouse Clicks
    • When a user says “Select” (UWP Only)
    • Hand Taps
    • Clicker Presses
    • Motion Controller Trigger Presses
  • IXboxControllerHandler for Cross Platform Xbox One Controller events.
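A minimal component implementing a few of the generic interfaces might look like this (interface and event data names follow the HoloToolkit-era API; check them against your toolkit version):

```csharp
using HoloToolkit.Unity.InputModule;
using UnityEngine;

// Responds to focus, press, and click events on this GameObject
// (or, via bubbling, on any of its descendants).
public class FocusClickResponder : MonoBehaviour,
    IFocusable, IInputHandler, IInputClickHandler
{
    public void OnFocusEnter() { Debug.Log("Gaze/pointer entered"); }
    public void OnFocusExit()  { Debug.Log("Gaze/pointer exited"); }

    public void OnInputDown(InputEventData eventData) { Debug.Log("Source down"); }
    public void OnInputUp(InputEventData eventData)   { Debug.Log("Source up"); }

    public void OnInputClicked(InputClickedEventData eventData)
    {
        Debug.Log("Clicked");
        eventData.Use(); // stop the event from bubbling further up
    }
}
```

Calling `Use()` on the event data marks the event as consumed, so it stops bubbling up the hierarchy.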

Windows Specific Interfaces

  • ISpeechHandler for voice commands.
  • IDictationHandler for speech to text dictation.
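For example, a voice command handler could be sketched as follows. Since speech does not require gaze, the object registers itself as a global listener (names follow the HoloToolkit-era API; verify against your version):

```csharp
using HoloToolkit.Unity.InputModule;
using UnityEngine;

// Reacts to voice commands routed by the input system.
public class SpeechResponder : MonoBehaviour, ISpeechHandler
{
    private void OnEnable()  { InputManager.Instance.AddGlobalListener(gameObject); }
    private void OnDisable() { InputManager.Instance.RemoveGlobalListener(gameObject); }

    public void OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        Debug.Log("Heard keyword: " + eventData.RecognizedText);
    }
}
```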

Universal Windows Platform Specific Interfaces

  • IHoldHandler for the Windows hold gesture.
  • IManipulationHandler for the Windows manipulation gesture.
  • INavigationHandler for the Windows navigation gesture.

Motion Controller Specific Interfaces

  • IControllerInputHandler for input position events from the thumbstick and TouchPad.
  • ISelectHandler for selection pressed amount changes.
  • IControllerTouchpadHandler for TouchPad press and touch events.
  • ISourcePositionHandler to get pointer and grip position events.
  • ISourceRotationHandler to get pointer and grip rotation events.

Next, check out how to get started implementing these interfaces in your own projects.
