Category: Tutorial

Motion Controller Input With The Mixed Reality Toolkit

What you’ll need

If you haven’t done so already, be sure you’ve properly set up your development environment and imported the Mixed Reality Toolkit into your project.  You’ll also need to be familiar with the Unity Editor and its interface controls.  If you are not, there is a great tutorial series to get you started.

Getting Started

Note: Only available for the UWP Build Target.

  1. Create a new scene
  2. Run the MRTK scene wizard via:
    MixedRealityToolkit/Configure/Apply Scene Settings
  3. Create an empty GameObject
  4. Rename the GameObject to MotionControllerHandler
  5. Create a new script named MotionControllerHandler
  6. Attach the new MotionControllerHandler script to your MotionControllerHandler GameObject
  7. Open the new script in any text editor
  8. Implement the ISelectHandler, IControllerTouchpadHandler, ISourcePositionHandler, and ISourceRotationHandler interfaces
  9. Add logic for each event
using UnityEngine;
using HoloToolkit.Unity.InputModule;

public class MotionControllerHandler : MonoBehaviour, ISelectHandler, IControllerTouchpadHandler, ISourcePositionHandler, ISourceRotationHandler
{
    void IControllerInputHandler.OnInputPositionChanged(InputPositionEventData eventData)
    {
        Debug.LogFormat("OnRotationChanged\r\nSource: {0}  SourceId: {1}  Input Position: {2}",
            eventData.InputSource, eventData.SourceId, eventData.Position);
    }

    void ISelectHandler.OnSelectPressedAmountChanged(SelectPressedEventData eventData)
    {
        Debug.LogFormat("OnRotationChanged\r\nSource: {0}  SourceId: {1}  Select Press Amount: {2}",
            eventData.InputSource, eventData.SourceId, eventData.PressedAmount);
    }

    void IControllerTouchpadHandler.OnTouchpadTouched(InputEventData eventData)
    {
        Debug.LogFormat("OnRotationChanged\r\nSource: {0}  SourceId: {1}  InteractionPressKind: {2}",
            eventData.InputSource, eventData.SourceId, eventData.PressType);
    }

    void IControllerTouchpadHandler.OnTouchpadReleased(InputEventData eventData)
    {
        Debug.LogFormat("OnRotationChanged\r\nSource: {0}  SourceId: {1}  InteractionPressKind: {2}",
            eventData.InputSource, eventData.SourceId, eventData.PressType);
    }

    void ISourcePositionHandler.OnPositionChanged(SourcePositionEventData eventData)
    {
        Debug.LogFormat("OnRotationChanged\r\nSource: {0}  SourceId: {1}  Pointer Rotation: {2}  Grip Rotation {3}",
            eventData.InputSource, eventData.SourceId, eventData.PointerPosition, eventData.GripPosition);
    }

    void ISourceRotationHandler.OnRotationChanged(SourceRotationEventData eventData)
    {
        Debug.LogFormat("OnRotationChanged\r\nSource: {0}  SourceId: {1}  Pointer Rotation: {2}  Grip Rotation {3}",
            eventData.InputSource, eventData.SourceId, eventData.PointerRotation, eventData.GripRotation);
    }
}
Hold, Navigation, and Manipulation Input With The Mixed Reality Toolkit

What you’ll need

If you haven’t done so already, be sure you’ve properly set up your development environment and imported the Mixed Reality Toolkit into your project.  You’ll also need to be familiar with the Unity Editor and its interface controls.  If you are not, there is a great tutorial series to get you started.

Getting Started

  1. Create a new scene
  2. Run the MRTK scene wizard via:
    MixedRealityToolkit/Configure/Apply Scene Settings
  3. Create a Cube
  4. Create a new script named InputHandler
  5. Attach the InputHandler to your Cube
  6. Open the new script in any text editor
  7. Implement the IHoldHandler, IManipulationHandler, and INavigationHandler interfaces
  8. Add logic for each event
using UnityEngine;
using HoloToolkit.Unity.InputModule;

public class InputHandler : MonoBehaviour, IHoldHandler, IManipulationHandler, INavigationHandler
{
    void IHoldHandler.OnHoldStarted(HoldEventData eventData)
    {
        Debug.LogFormat("OnHoldStarted\r\nSource: {0}  SourceId: {1}", eventData.InputSource, eventData.SourceId);
        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.
    }

    void IHoldHandler.OnHoldCompleted(HoldEventData eventData)
    {
        Debug.LogFormat("OnHoldCompleted\r\nSource: {0}  SourceId: {1}", eventData.InputSource, eventData.SourceId);
        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.
    }

    void IHoldHandler.OnHoldCanceled(HoldEventData eventData)
    {
        Debug.LogFormat("OnHoldCanceled\r\nSource: {0}  SourceId: {1}", eventData.InputSource, eventData.SourceId);
        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.
    }

    void IManipulationHandler.OnManipulationStarted(ManipulationEventData eventData)
    {
        Debug.LogFormat("OnManipulationStarted\r\nSource: {0}  SourceId: {1}\r\nCumulativeDelta: {2} {3} {4}",
            eventData.InputSource,
            eventData.SourceId,
            eventData.CumulativeDelta.x,
            eventData.CumulativeDelta.y,
            eventData.CumulativeDelta.z);

        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.
    }

    void IManipulationHandler.OnManipulationUpdated(ManipulationEventData eventData)
    {
        Debug.LogFormat("OnManipulationUpdated\r\nSource: {0}  SourceId: {1}\r\nCumulativeDelta: {2} {3} {4}",
            eventData.InputSource,
            eventData.SourceId,
            eventData.CumulativeDelta.x,
            eventData.CumulativeDelta.y,
            eventData.CumulativeDelta.z);

        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.
    }

    void IManipulationHandler.OnManipulationCompleted(ManipulationEventData eventData)
    {
        Debug.LogFormat("OnManipulationCompleted\r\nSource: {0}  SourceId: {1}\r\nCumulativeDelta: {2} {3} {4}",
            eventData.InputSource,
            eventData.SourceId,
            eventData.CumulativeDelta.x,
            eventData.CumulativeDelta.y,
            eventData.CumulativeDelta.z);

        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.
    }

    void IManipulationHandler.OnManipulationCanceled(ManipulationEventData eventData)
    {
        Debug.LogFormat("OnManipulationCanceled\r\nSource: {0}  SourceId: {1}\r\nCumulativeDelta: {2} {3} {4}",
            eventData.InputSource,
            eventData.SourceId,
            eventData.CumulativeDelta.x,
            eventData.CumulativeDelta.y,
            eventData.CumulativeDelta.z);

        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.
    }

    void INavigationHandler.OnNavigationStarted(NavigationEventData eventData)
    {
        Debug.LogFormat("OnNavigationStarted\r\nSource: {0}  SourceId: {1}\r\nCumulativeDelta: {2} {3} {4}",
            eventData.InputSource,
            eventData.SourceId,
            eventData.NormalizedOffset.x,
            eventData.NormalizedOffset.y,
            eventData.NormalizedOffset.z);

        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.
    }

    void INavigationHandler.OnNavigationUpdated(NavigationEventData eventData)
    {
        Debug.LogFormat("OnNavigationUpdated\r\nSource: {0}  SourceId: {1}\r\nCumulativeDelta: {2} {3} {4}",
            eventData.InputSource,
            eventData.SourceId,
            eventData.NormalizedOffset.x,
            eventData.NormalizedOffset.y,
            eventData.NormalizedOffset.z);

        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.
    }

    void INavigationHandler.OnNavigationCompleted(NavigationEventData eventData)
    {
        Debug.LogFormat("OnNavigationCompleted\r\nSource: {0}  SourceId: {1}\r\nCumulativeDelta: {2} {3} {4}",
            eventData.InputSource,
            eventData.SourceId,
            eventData.NormalizedOffset.x,
            eventData.NormalizedOffset.y,
            eventData.NormalizedOffset.z);

        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.
    }

    void INavigationHandler.OnNavigationCanceled(NavigationEventData eventData)
    {
        Debug.LogFormat("OnNavigationCanceled\r\nSource: {0}  SourceId: {1}\r\nCumulativeDelta: {2} {3} {4}",
            eventData.InputSource,
            eventData.SourceId,
            eventData.NormalizedOffset.x,
            eventData.NormalizedOffset.y,
            eventData.NormalizedOffset.z);

        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.
    }
}

Last, we will cover how to get input from the Windows Mixed Reality Motion Controllers.

Dictation With The Mixed Reality Toolkit

What you’ll need

If you haven’t done so already, be sure you’ve properly set up your development environment and imported the Mixed Reality Toolkit into your project.  You’ll also need to be familiar with the Unity Editor and its interface controls.  If you are not, there is a great tutorial series to get you started.

Getting Started

Note: Only available for Windows Standalone and UWP Build Targets.

  1. Create a new scene
  2. Run the MRTK scene wizard via:
    MixedRealityToolkit/Configure/Apply Scene Settings
  3. Create an empty GameObject
  4. Rename the new GameObject to DictationHandler
  5. Create a new script named DictationHandler
  6. Attach the new DictationHandler script to your DictationHandler GameObject
  7. Open the new script in any text editor
  8. Implement the IInputClickHandler and IDictationHandler interfaces
  9. Add fields for the initial silence timeout, auto silence timeout, and total allowable recording time.
  10. Add fields for the text output
  11. Add a flag for recording
using UnityEngine;
using HoloToolkit.Unity.InputModule;

public class DictationHandler : MonoBehaviour, IInputClickHandler, IDictationHandler
{
    [SerializeField]
    [Range(0.1f, 5f)]
    [Tooltip("The time length in seconds before dictation recognizer session ends due to lack of audio input in case there was no audio heard in the current session.")]
    private float initialSilenceTimeout = 5f;

    [SerializeField]
    [Range(5f, 60f)]
    [Tooltip("The time length in seconds before dictation recognizer session ends due to lack of audio input.")]
    private float autoSilenceTimeout = 20f;

    [SerializeField]
    [Range(1, 60)]
    [Tooltip("Length in seconds for the manager to listen.")]
    private int recordingTime = 10;

    private string lastOutput;
    private string speechToTextOutput = string.Empty;
    public string SpeechToTextOutput { get { return speechToTextOutput; } }

    private bool isRecording;
}
  12. Add logic for toggling recording when the DictationHandler GameObject is clicked
public void OnInputClicked(InputClickedEventData eventData)
{
    ToggleRecording();
}

private void ToggleRecording()
{
    if (isRecording)
    {
        isRecording = false;
        StartCoroutine(DictationInputManager.StopRecording());
    }
    else
    {
        isRecording = true;
        StartCoroutine(DictationInputManager.StartRecording(initialSilenceTimeout, autoSilenceTimeout, recordingTime));
    }
}
  13. Add logic for handling dictation results
void IDictationHandler.OnDictationHypothesis(DictationEventData eventData)
{
    speechToTextOutput = eventData.DictationResult;
}

void IDictationHandler.OnDictationResult(DictationEventData eventData)
{
    speechToTextOutput = eventData.DictationResult;
}

void IDictationHandler.OnDictationComplete(DictationEventData eventData)
{
    speechToTextOutput = eventData.DictationResult;
}

void IDictationHandler.OnDictationError(DictationEventData eventData)
{
    isRecording = false;
    speechToTextOutput = eventData.DictationResult;
    Debug.LogError(eventData.DictationResult);
    StartCoroutine(DictationInputManager.StopRecording());
}
  14. Add logic for displaying the results
private void Update()
{
    if (!string.IsNullOrEmpty(speechToTextOutput) && !speechToTextOutput.Equals(lastOutput))
    {
        Debug.Log(speechToTextOutput);
        lastOutput = speechToTextOutput;
    }
}
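
If you want to surface the recognized text somewhere other than the console, the public SpeechToTextOutput property above can be read from any other component. Below is a minimal sketch, assuming a uGUI Text element and the DictationHandler are assigned in the Inspector; the class name is just an example.

using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: mirrors the DictationHandler's output into a uGUI Text element.
// Both references below are assumed to be assigned in the Inspector.
public class DictationOutputDisplay : MonoBehaviour
{
    [SerializeField]
    private DictationHandler dictationHandler;

    [SerializeField]
    private Text outputText;

    private void Update()
    {
        // SpeechToTextOutput is the public property exposed by DictationHandler above.
        outputText.text = dictationHandler.SpeechToTextOutput;
    }
}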

Next, we’ll take a look at handling the Hold, Navigation, and Manipulation Inputs.

Voice Commands With the Mixed Reality Toolkit

What you’ll need

If you haven’t done so already, be sure you’ve properly set up your development environment and imported the Mixed Reality Toolkit into your project.  You’ll also need to be familiar with the Unity Editor and its interface controls.  If you are not, there is a great tutorial series to get you started.

Getting Started

Note: This feature is only available for Windows Standalone and UWP Build Targets.

  1. Create a new scene
  2. Run the MRTK scene wizard via:
    MixedRealityToolkit/Configure/Apply Scene Settings
  3. Create a Cube
  4. Create a new script named SpeechHandler
  5. Attach the SpeechHandler to your Cube
  6. Open the new script in any text editor
  7. Implement the ISpeechHandler interface
  8. Add a switch statement for eventData.RecognizedText with a case for each command you wish to use.

Note: Select and other Voice Commands are reserved by the OS and cannot be used.

ProTip: Use simple Voice Commands that are one or two words.

using UnityEngine;
using HoloToolkit.Unity.InputModule;

public class SpeechHandler : MonoBehaviour, ISpeechHandler
{
    void ISpeechHandler.OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        switch (eventData.RecognizedText.ToLower())
        {
            case "your voice command":
                DoAction();
                break;
        }
    }

    public void DoAction()
    {
        // TODO: Action
    }
}

Now, let’s take it a step further and use Dictation to input text.

Xbox Controller Input With The Mixed Reality Toolkit

What you’ll need

If you haven’t done so already, be sure you’ve properly set up your development environment and imported the Mixed Reality Toolkit into your project.  You’ll also need to be familiar with the Unity Editor and its interface controls.  If you are not, there is a great tutorial series to get you started.

Getting Started

  1. Create a new scene
  2. Run the MRTK scene wizard via:
    MixedRealityToolkit/Configure/Apply Scene Settings
  3. Create an empty GameObject
  4. Rename the GameObject to XboxControllerSource
  5. Attach an XboxControllerInputSource component to the XboxControllerSource GameObject
  6. Create a new script named XboxControllerHandler
  7. Attach the XboxControllerHandler to the XboxControllerSource GameObject
  8. Open the new script in any text editor
  9. Your new XboxControllerHandler class should inherit from XboxControllerHandlerBase
  10. Add logic for handling source detection/loss and XboxInputUpdate
using UnityEngine;
using HoloToolkit.Unity.InputModule;

public class XboxControllerHandler : XboxControllerHandlerBase
{
    public override void OnSourceDetected(SourceStateEventData eventData)
    {
        base.OnSourceDetected(eventData);
        Debug.LogFormat("Joystick {0} with id: \"{1}\" Connected", GamePadName, eventData.SourceId);
    }

    public override void OnSourceLost(SourceStateEventData eventData)
    {
        Debug.LogFormat("Joystick {0} with id: \"{1}\" Disconnected", GamePadName, eventData.SourceId);
        base.OnSourceLost(eventData);
    }

    public override void OnXboxInputUpdate(XboxControllerEventData eventData)
    {
        // FYI, eventData.Use() is called in the base implementation.
        base.OnXboxInputUpdate(eventData);

        Debug.LogFormat(
            "{19}\n" +
            "LS Horizontal: {0:0.000} Vertical: {1:0.000}\n" +
            "RS Horizontal: {2:0.000} Vertical: {3:0.000}\n" +
            "DP Horizontal: {4:0.000} Vertical: {5:0.000}\n" +
            "Left Trigger:  {6:0.000} Right Trigger: {7:0.000} Shared Trigger: {8:0.00}\n" +
            "A: {9} B: {10} X: {11} Y: {12}\n" +
            "LB: {13} RB: {14} " +
            "LS: {15} RS: {16}\n" +
            "View: {17} Menu: {18}\n",
            eventData.XboxLeftStickHorizontalAxis, eventData.XboxLeftStickVerticalAxis,
            eventData.XboxRightStickHorizontalAxis, eventData.XboxRightStickVerticalAxis,
            eventData.XboxDpadHorizontalAxis, eventData.XboxDpadVerticalAxis,
            eventData.XboxLeftTriggerAxis, eventData.XboxRightTriggerAxis, eventData.XboxSharedTriggerAxis,
            eventData.XboxA_Pressed, eventData.XboxB_Pressed, eventData.XboxX_Pressed, eventData.XboxY_Pressed,
            eventData.XboxLeftBumper_Pressed, eventData.XboxRightBumper_Pressed,
            eventData.XboxLeftStick_Pressed, eventData.XboxRightStick_Pressed,
            eventData.XboxView_Pressed, eventData.XboxMenu_Pressed,
            GamePadName);
    }
}

Next, let’s implement Voice Commands.

Generic Input and Pointer Clicks With The Mixed Reality Toolkit

What you’ll need

If you haven’t done so already, be sure you’ve properly set up your development environment and imported the Mixed Reality Toolkit into your project.  You’ll also need to be familiar with the Unity Editor and its interface controls.  If you are not, there is a great tutorial series to get you started.

Getting Started

  1. Create a new scene
  2. Run the MRTK scene wizard via:
    MixedRealityToolkit/Configure/Apply Scene Settings
  3. Create a Cube
  4. Create a new script named InputHandler
  5. Attach the InputHandler to your Cube
  6. Open the new script in any text editor
  7. Implement the IInputClickHandler and IInputHandler interfaces
  8. Add logic for the Input Click and Input Up/Down events
using UnityEngine;
using HoloToolkit.Unity.InputModule;

public class InputHandler : MonoBehaviour, IInputClickHandler, IInputHandler
{
    void IInputHandler.OnInputUp(InputEventData eventData)
    {
        Debug.LogFormat("OnInputUp\r\nSource: {0}  SourceId: {1}  InteractionPressKind: {2}",
            eventData.InputSource,
            eventData.SourceId,
            eventData.PressType);
        // Mark the event as used, so it doesn't fall through to other handlers.
        eventData.Use();
    }

    void IInputHandler.OnInputDown(InputEventData eventData)
    {
        Debug.LogFormat("OnInputDown\r\nSource: {0}  SourceId: {1}  InteractionPressKind: {2}",
            eventData.InputSource,
            eventData.SourceId,
            eventData.PressType);
        // Mark the event as used, so it doesn't fall through to other handlers.
        eventData.Use();
    }

    void IInputClickHandler.OnInputClicked(InputClickedEventData eventData)
    {
        Debug.LogFormat("OnInputClicked\r\nSource: {0}  SourceId: {1}  InteractionPressKind: {2}  TapCount: {3}",
            eventData.InputSource,
            eventData.SourceId,
            eventData.PressType,
            eventData.TapCount);
        // Mark the event as used, so it doesn't fall through to other handlers.
        eventData.Use();
    }
}

Next, let’s look at how to get input from an Xbox Controller.

Getting Focus With The Mixed Reality Toolkit

What you’ll need

If you haven’t done so already, be sure you’ve properly set up your development environment and imported the Mixed Reality Toolkit into your project.  You’ll also need to be familiar with the Unity Editor and its interface controls.  If you are not, there is a great tutorial series to get you started.

Getting Started

  1. Create a new scene
  2. Run the MRTK scene wizard via:
    MixedRealityToolkit/Configure/Apply Scene Settings
  3. Create a Cube
  4. Create a new script named HighlightHandler
  5. Attach the HighlightHandler to your Cube
  6. Open the new script in your preferred text editor
  7. Implement the IFocusable interface
using UnityEngine;
using HoloToolkit.Unity.InputModule;

public class HighlightHandler : MonoBehaviour, IFocusable
{
    void IFocusable.OnFocusEnter()
    {
    }

    void IFocusable.OnFocusExit()
    {
    }
}
  8. Add Color, Renderer, and Material fields
using UnityEngine;
using HoloToolkit.Unity.InputModule;

[RequireComponent(typeof(Renderer))]
public class HighlightHandler : MonoBehaviour, IFocusable
{
    [SerializeField]
    private Color highlightColor;

    private Color initialColor;
    private Renderer rendererInstance;
    private Material materialInstance;

    void IFocusable.OnFocusEnter()
    {
    }

    void IFocusable.OnFocusExit()
    {
    }
}
  9. Cache the initial Color, Renderer, and Material in Awake
  10. Properly destroy the Material instance in OnDestroy
private void Awake()
{
    rendererInstance = GetComponent<Renderer>();
    materialInstance = rendererInstance.material;
    initialColor = materialInstance.color;
}

private void OnDestroy()
{
    Destroy(materialInstance);
}
  11. Handle the color changes in the implemented methods
void IFocusable.OnFocusEnter()
{
    materialInstance.color = highlightColor;
}

void IFocusable.OnFocusExit()
{
    materialInstance.color = initialColor;
}

Next, let’s take a look at getting Generic Input and Clicking.

Detecting Input Sources with the Mixed Reality Toolkit

What You’ll Need

If you haven’t done so already, be sure you’ve properly set up your development environment and imported the Mixed Reality Toolkit into your project.  You’ll also need to be familiar with the Unity Editor and its interface controls.  If you are not, there is a great tutorial series to get you started.

Getting Started

  1. Create a new scene
  2. Run the MRTK scene wizard via:
    MixedRealityToolkit/Configure/Apply Scene Settings
  3. Create an empty GameObject
  4. Rename the GameObject to InputSourceHandler
  5. Attach a SetGlobalListener component to the InputSourceHandler GameObject
  6. Create a new script named InputSourceHandler
  7. Attach the InputSourceHandler to your InputSourceHandler GameObject
  8. Open the new script in your preferred text editor
  9. Implement the ISourceStateHandler interface after the MonoBehaviour class inheritance declaration
  10. Add logic for handling source detection and loss
using UnityEngine;
using HoloToolkit.Unity.InputModule;

public class InputSourceHandler : MonoBehaviour, ISourceStateHandler
{
    public void OnSourceDetected(SourceStateEventData eventData)
    {
        Debug.LogFormat("OnSourceDetected\r\nSource: {0}  SourceId: {1}",
            eventData.InputSource,
            eventData.SourceId);
        // Mark the event as used, so it doesn't fall through to other handlers.
        eventData.Use();
    }

    public void OnSourceLost(SourceStateEventData eventData)
    {
        Debug.LogFormat("OnSourceLost\r\nSource: {0}  SourceId: {1}",
            eventData.InputSource,
            eventData.SourceId);
        // Mark the event as used, so it doesn't fall through to other handlers.
        eventData.Use();
    }
}

Next, let’s get some Focus.

Handling Input with the Mixed Reality Toolkit

What You’ll need

  • You’ll need to have the default scene setup as outlined in this post.
    • The InputManager.prefab in your scene
    • A cursor that implements the ICursor interface in your scene
  • GameObject(s) you want to interact with are required to have a Collider component.
  • When using uGUI, all world space canvases are required to use the UIRaycastCamera as their event camera.
  • When consuming events in your input handlers, it’s highly suggested you call eventData.Use(); so the event doesn’t fall through to other input handlers, as shown in the sketch after this list.
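
As a minimal sketch of that last point, here is a click handler that consumes the event so it doesn’t propagate further; the class name is illustrative and not part of the Toolkit.

using UnityEngine;
using HoloToolkit.Unity.InputModule;

// Minimal sketch: consume the click so it doesn't fall through to other handlers.
public class ConsumeClickExample : MonoBehaviour, IInputClickHandler
{
    public void OnInputClicked(InputClickedEventData eventData)
    {
        Debug.Log("Click handled here.");
        eventData.Use(); // Mark the event as used, so it doesn't fall through to other handlers.
    }
}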

Getting Started

First, it’s important to read and review Microsoft’s developer guidelines for Gaze, Gestures, Voice, and Motion Controllers.

Detecting Input Sources

Getting Focus

Getting Generic Input and Pointer Clicks

Xbox Controller Input

Voice Input

Dictation Input

Hold, Navigation, and Manipulation Inputs

Motion Controller Input

Preparing Your Project for Mixed Reality

This post will cover how to prepare your Unity project to use the Mixed Reality Toolkit.  We will be modifying the project’s settings and ensuring we’re all set to create our first scene.

What you’ll need

If you haven’t done so already, be sure you’ve properly set up your development environment and imported the Mixed Reality Toolkit into your project.  You’ll also need to be familiar with the Unity Editor and its interface controls.  If you are not, there is a great tutorial series to get you started.

Getting Started

Open your project and navigate to the Mixed Reality Project Settings Wizard via:

Mixed Reality Toolkit/Configure/Apply Mixed Reality Project Settings...

Next you’ll see the Project Settings Wizard.  You’ll want to enable the following options:

  • Target Windows Universal UWP
  • Enable XR
  • Build for Direct3D
  • Target Occluded Devices
    • If you’re also targeting the HoloLens, these settings will be automatically detected and updated for you when the app runs on the device
  • Use the Toolkit specific input manager axes
  • Enable .NET scripting back-end

Then press Apply.  It’ll take a moment for the editor to switch the build platform and re-import and serialize the assets.  Once finished, navigate to the Editor Build window to double-check that the wizard completed successfully via File/Build Settings...

ProTip: You can access the build window via Ctrl + Shift + B

Next, we will configure the scene.  Navigate to the Scene Settings Wizard via:

Mixed Reality Toolkit/Configure/Apply Mixed Reality Scene Settings...

You’ll see the Scene Settings Wizard, with all the default settings already enabled for you.  Press Apply.

In your scene’s Hierarchy you should now see the standard scene prefabs.

Next, you’ll want to create a new folder under the Assets root folder and name it the same as your project.  This folder will contain all the assets specific to your project.  Inside that folder, create the following folders:

  • Animations
  • Materials
  • Meshes
  • Prefabs
  • Scenes
  • Scripts
  • Textures

Then save your scene in the Assets/<Your Project Name>/Scenes folder.
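
If you’d rather script the folder creation step above, a small editor utility like the following sketch can do it for you.  The menu path, class name, and project folder name are only examples, and the script must live in an Editor folder.

using UnityEditor;
using UnityEngine;

// Illustrative editor utility that creates the recommended folder layout.
// Place this script in an Editor folder; the names below are examples only.
public static class ProjectFolderSetup
{
    [MenuItem("Tools/Create Project Folders")]
    private static void CreateFolders()
    {
        const string projectFolder = "MyMixedRealityProject"; // Replace with your project's name.
        string root = "Assets/" + projectFolder;

        if (!AssetDatabase.IsValidFolder(root))
        {
            AssetDatabase.CreateFolder("Assets", projectFolder);
        }

        string[] subFolders = { "Animations", "Materials", "Meshes", "Prefabs", "Scenes", "Scripts", "Textures" };

        foreach (string subFolder in subFolders)
        {
            if (!AssetDatabase.IsValidFolder(root + "/" + subFolder))
            {
                AssetDatabase.CreateFolder(root, subFolder);
            }
        }

        Debug.Log("Project folders created under " + root);
    }
}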

From here, you’re all set to start creating custom content for your project. Good luck!