Interacting with 3D Objects in Magic Leap with Unity (Gestures)

If you’re building an app for Magic Leap One, then you might want users to be able to interact with their new environment by “touching” objects you’ve placed in it. In this tutorial we’ll use a combination of gestures and colliders to interact with these not-real-world objects. (Using Lumin SDK 0.13.0.)

1. Prerequisites

To get started, make sure you’ve completed the Magic Leap Hello, Cube tutorial, as this tutorial follows on from it.

You should have a cube in your virtual room in the Magic Leap Simulator.

This tutorial contains the full scripts, but if you want to download them they’re available on GitHub.

2. Collisions

We need to add a Rigidbody to the cube so we can detect collisions.

Make sure the cube is highlighted in the hierarchy. In the inspector click Add Component.

Adding a Rigidbody to a Game Object in Unity

In the dropdown list choose Physics > Rigidbody.

Adding a Rigidbody to a Game Object in Unity

Make sure you uncheck the Use Gravity box in the Rigidbody panel in the inspector, otherwise your cube will fall as soon as you press Play.
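If you’d rather configure this from code, gravity can also be switched off in a script attached to the cube. A minimal sketch (the NoGravity script name is just an illustration):

```csharp
using UnityEngine;

public class NoGravity : MonoBehaviour
{
    void Start()
    {
        // Equivalent to unchecking Use Gravity in the inspector
        GetComponent<Rigidbody>().useGravity = false;
    }
}
```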

Disabling Use Gravity on a Rigidbody in Unity

3. Gestures

We’re going to be looking for two gestures: the pointing finger and the L shape.

Magic Leap Gesture Hand Positions

Create a new Empty Game Object by clicking Create at the top of the Hierarchy window and choosing Create Empty.

Creating an Empty Game Object in Unity

Rename the object to Gestures.

Renaming a Game Object in Unity

We need a physical object to do the colliding, so create a 3D Sphere as a child of Gestures.

Creating a 3D Sphere in Unity

Change the name of the sphere to Finger.

Making an object a child of another object in Unity

In the inspector, scale the sphere down. I’m making mine 0.2, 0.2, 0.2.

Set a Game Object's scale in Unity

Add a Rigidbody to the sphere and make sure Use Gravity is unchecked.

Adding a Rigidbody to a 3D Sphere in Unity

Adding a Rigidbody to a Sphere in Unity

Disable Use Gravity on RigidBody in Unity

4. Detecting Gestures

The first thing we need to be able to do is detect gestures. We’ll need to import a few things from the Magic Leap Unity Examples package. In the Assets menu choose Import Package > Custom Package…, navigate to the Unity package, and open it to bring up the import window.

We only need the Libs folder, so click None in the bottom left of the import window to clear all the checkboxes, then recheck Libs. Click Import to start the import.

Import the Libs directory from Magic Leap's Unity package

Create a new script on the Gestures Game Object by clicking Add Component in the inspector, choosing New Script, then entering a name for the script. I’m calling mine GestureDetection.

Create a new script on the Gestures Game Object

Add a New Script to a Game Object in Unity

Open up your newly created script in your preferred editor.

First we’re going to include the Magic Leap namespace by adding a using statement.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Experimental.XR.MagicLeap;

public class GestureDetection : MonoBehaviour 
{
    // Use this for initialization
    void Start () {}
    
    // Update is called once per frame
    void Update () {}
}

In the Start function we start up MLHands and check that it started successfully. In this case we simply log a message and return, but in a real-world application you’ll probably want to handle the failure a little more gracefully.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Experimental.XR.MagicLeap;

public class GestureDetection : MonoBehaviour 
{
    // Use this for initialization
    void Start () 
    {
        if(!MLHands.Start())
        {
            Debug.Log("MLHands didn't start...bail.");
            return;
        }
    }
    
    // Update is called once per frame
    void Update () {}
}

We need to tell the Gesture Manager which gestures we want to look for. To keep things tidy we’ll create a new function to handle that and call it from the Start function.

I’m calling my new function setGesturesToTrack().

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Experimental.XR.MagicLeap;
public class GestureDetection : MonoBehaviour 
{
    // Use this for initialization
    void Start () 
    {
        if(!MLHands.Start())
        {
            Debug.Log("MLHands didn't start...bail.");
            return;
        }
        //set gestures to track
        setGesturesToTrack();
    }
    
    // Update is called once per frame
    void Update () {}

    //set the gestures to track
    void setGesturesToTrack()
    {
        List<MLStaticGestureType> gestures = new List<MLStaticGestureType>();
        //add the gestures we want to track
        gestures.Add(MLStaticGestureType.Finger);
        gestures.Add(MLStaticGestureType.L);
        //add the gestures to the gesture manager
        MLHands.GestureManager.EnableGestures(gestures.ToArray(), true, true);
    }
}

Line 16: Call setGesturesToTrack from the Start function after MLHands has started successfully.

Line 25: Create a list of the gestures we want to track.

Lines 27 – 28: Add the two gestures to the list.

Line 30: Enable the gestures in the gesture manager. The first parameter to EnableGestures is the array of static gestures we created. The second parameter is the state to set those gestures to, with true meaning the gesture will be tracked. If the second and third parameters are both true, this enables all the gestures in your array and disables every other gesture. If you set the third parameter to false, enabling gestures has no effect on the state of the other gestures.
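As a sketch of that difference, this call would re-enable just the Finger gesture while leaving every other gesture’s state untouched, because the third parameter is false:

```csharp
// Enable only the Finger gesture; other gestures keep their current state
MLHands.GestureManager.EnableGestures(
    new MLStaticGestureType[] { MLStaticGestureType.Finger },
    true,   // second parameter: track these gestures
    false); // third parameter: don't disable gestures outside the array
```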

Now we need some code to check if the user is making either of those gestures. Create a new function called gestureTracker and call it from the Update function.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Experimental.XR.MagicLeap;
public class GestureDetection : MonoBehaviour 
{
    // Use this for initialization
    void Start () 
    {
        if(!MLHands.Start())
        {
            Debug.Log("MLHands didn't start...bail.");
            return;
        }
        //set gestures to track
        setGesturesToTrack();
    }
    
    // Update is called once per frame
    void Update () 
    {
         //if MLHands is running start tracking gestures
         if(MLHands.IsStarted)
             gestureTracker();
    }

    //set the gestures to track
    void setGesturesToTrack()
    {
        List<MLStaticGestureType> gestures = new List<MLStaticGestureType>();
        //add the gestures we want to track
        gestures.Add(MLStaticGestureType.Finger);
        gestures.Add(MLStaticGestureType.L);
        //add the gestures to the gesture manager
        MLHands.GestureManager.EnableGestures(gestures.ToArray(), true, true);
    }

    //track the gestures
    void gestureTracker()
    {
        //check for the l or finger gesture
        //and that we've got some keypoints
        if((MLHands.Left.StaticGesture == MLStaticGestureType.Finger || 
               MLHands.Left.StaticGesture == MLStaticGestureType.L) &&
               MLHands.Left.KeyPoints.Length > 0)
        {
             Debug.Log(MLHands.Left.StaticGesture);    
        }
        //check for the l or finger gesture
        //and that we've got some keypoints
        if ((MLHands.Right.StaticGesture == MLStaticGestureType.Finger ||
                MLHands.Right.StaticGesture == MLStaticGestureType.L) &&
                MLHands.Right.KeyPoints.Length > 0)
        {
            Debug.Log(MLHands.Right.StaticGesture);      
        }
    }
}

Line 23: Check that MLHands has started.

Line 24: Call the new gestureTracker function.

Lines 43 – 46: This if statement checks whether the user’s left hand is making either of the enabled static gestures (L & Finger). We also check that we have some key points for the gesture, as we need those to know where in the view the gesture is being made.

Line 47: Print out a debug line so that when a gesture is detected we know which gesture it is.

Lines 51 – 56: We repeat the same process as lines 43 – 47 for the right hand, as we’re accepting gestures from either hand.

5. Collisions

Now we need to move the sphere we created to the position of the outstretched finger, so that when we perform the gesture and touch the cube, the collider fires.

We’re going to replace the debug lines in gestureTracker to call a new function called positionHand that’ll deal with repositioning the Finger game object.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Experimental.XR.MagicLeap;
public class GestureDetection : MonoBehaviour 
{
    //game object representing the finger position
    public Transform finger;
    // Use this for initialization
    void Start () 
    {
        if(!MLHands.Start())
        {
            Debug.Log("MLHands didn't start...bail.");
            return;
        }
        //set gestures to track
        setGesturesToTrack();
    }
    
    // Update is called once per frame
    void Update () 
    {
         //if MLHands is running start tracking gestures
         if(MLHands.IsStarted)
             gestureTracker();
    }

    //set the gestures to track
    void setGesturesToTrack()
    {
        List<MLStaticGestureType> gestures = new List<MLStaticGestureType>();
        //add the gestures we want to track
        gestures.Add(MLStaticGestureType.Finger);
        gestures.Add(MLStaticGestureType.L);
        //add the gestures to the gesture manager
        MLHands.GestureManager.EnableGestures(gestures.ToArray(), true, true);
    }

    //track the gestures
    void gestureTracker()
    {
        //check for the l or finger gesture
        //and that we've got some keypoints
        if((MLHands.Left.StaticGesture == MLStaticGestureType.Finger || 
               MLHands.Left.StaticGesture == MLStaticGestureType.L) &&
               MLHands.Left.KeyPoints.Length > 0)
        {
             positionHand(MLHands.Left);    
        }
        //check for the l or finger gesture
        //and that we've got some keypoints
        if ((MLHands.Right.StaticGesture == MLStaticGestureType.Finger ||
                MLHands.Right.StaticGesture == MLStaticGestureType.L) &&
                MLHands.Right.KeyPoints.Length > 0)
        {
            positionHand(MLHands.Right);    
        }
    }

    //position the game object to the tracked finger
    void positionHand(MLHand hand)
    {
        //in Lumin SDK 0.13 only the center hand position
        //updates so we need to use that until it's fixed
        finger.position = hand.Center;
        //set the finger position to the tip of the finger
        //finger.position = hand[1].position;
    }
}

Line 8: Set up a variable to hold the Finger game object’s transform.

Lines 49 & 57: Call the new positionHand function, passing in the hand that is gesturing.

Line 66: There’s a bug in Lumin SDK 0.13.0 where not all the gesture keypoints update. Currently the only keypoint that updates is the center of the hand, so for now we’ll use that so we can see things working. Once it’s fixed, delete this line and uncomment line 68.

Line 68: Set the position of the finger game object to the position of the tip of the finger keypoint.

Make sure you set the Finger variable to the Finger game object in Unity.

Linking a Game Object to a variable in Unity

If you run this on the Magic Leap Simulator then you should be able to touch the cube and it’ll move.

Before we move on, we need an OnDestroy function to stop MLHands and clean up when the app closes.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Experimental.XR.MagicLeap;
public class GestureDetection : MonoBehaviour 
{
    //game object representing the finger position
    public Transform finger;
    // Use this for initialization
    void Start () 
    {
        if(!MLHands.Start())
        {
            Debug.Log("MLHands didn't start...bail.");
            return;
        }
        //set gestures to track
        setGesturesToTrack();
    }
    
    // Update is called once per frame
    void Update () 
    {
         //if MLHands is running start tracking gestures
         if(MLHands.IsStarted)
             gestureTracker();
    }

    //set the gestures to track
    void setGesturesToTrack()
    {
        List<MLStaticGestureType> gestures = new List<MLStaticGestureType>();
        //add the gestures we want to track
        gestures.Add(MLStaticGestureType.Finger);
        gestures.Add(MLStaticGestureType.L);
        //add the gestures to the gesture manager
        MLHands.GestureManager.EnableGestures(gestures.ToArray(), true, true);
    }

    //track the gestures
    void gestureTracker()
    {
        //check for the l or finger gesture
        //and that we've got some keypoints
        if((MLHands.Left.StaticGesture == MLStaticGestureType.Finger || 
               MLHands.Left.StaticGesture == MLStaticGestureType.L) &&
               MLHands.Left.KeyPoints.Length > 0)
        {
             positionHand(MLHands.Left);    
        }
        //check for the l or finger gesture
        //and that we've got some keypoints
        if ((MLHands.Right.StaticGesture == MLStaticGestureType.Finger ||
                MLHands.Right.StaticGesture == MLStaticGestureType.L) &&
                MLHands.Right.KeyPoints.Length > 0)
        {
            positionHand(MLHands.Right);    
        }
    }

    //position the game object to the tracked finger
    void positionHand(MLHand hand)
    {
        //in Lumin SDK 0.13 only the center hand position
        //updates so we need to use that until it's fixed
        finger.position = hand.Center;
        //set the finger position to the tip of the finger
        //finger.position = hand[1].position;
    }

    //cleanup on destroy
    private void OnDestroy()
    {
        //stop MLHands
        MLHands.Stop();
    }
}

Line 75: Stop MLHands.

6. Detect the Collision

Finally, we need to write some code to detect the collision between the finger and the cube. Let’s change the colour of the cube when we touch it.

Select the Cube game object in the Hierarchy and add a new script to it. I’m calling mine TouchCubeCollider.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class TouchCubeCollider : MonoBehaviour 
{
    // Use this for initialization
    void Start () {}
    
    // Update is called once per frame
    void Update () {}

    void OnCollisionEnter(Collision collision)
    {
        //change the colour of the cube
        GetComponent<Renderer>().material.color = Color.blue;  
    }
}

Line 13: The function called when a collision with the cube is detected.

Line 16: Set the colour of the cube to blue.
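If you want the cube to change back when the finger stops touching it, you could add a matching OnCollisionExit handler to the same script. A sketch, assuming the cube’s original material colour is white:

```csharp
void OnCollisionExit(Collision collision)
{
    // Restore the cube's colour when the finger moves away
    GetComponent<Renderer>().material.color = Color.white;
}
```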

7. Running the Code

Make sure the Magic Leap Simulator is running, your project’s build settings are correct, and Zero Iteration mode is enabled, then press Play.

In the Gesture panel, change one of the hands to either the Finger or L gesture. Update the position (I use 0, 2, -2.4) so the sphere representing the finger touches the cube.

Detecting Gestures in the Magic Leap Remote Simulator

The cube should turn blue and begin to move away from the sphere.

Interacting with 3d objects in the Magic Leap Simulator
