Unity Next Steps: Interaction

This is a Work in Progress Article based on materials available on TutsPlus.

Interaction Scripts

So far, we've developed a basic AR application that recognizes and tracks our target image and displays the designated 3D graphics. However, for a complete AR application, we also need to be able to interact with the objects, augmenting the reality.

For this purpose, we need to detect where we clicked (or touched, in the case of a mobile device). We'll do this by casting a ray from the camera through the screen point into the scene, implemented in a script we'll call "rayTracer".

First, create a folder named "scripts" under Assets to keep everything organized; we'll store our script files in this folder. Then create a C# Script file in this folder and name it "rayTracer". The name matters: the class name in the code below must match the file name, so if you prefer a different name for your script file, change the provided code accordingly.

Ray-Tracer Script

Copy and paste the following code into the C# Script file you have just created and named "rayTracer".

using UnityEngine;
using System.Collections;
using System.Collections.Generic;
 
public class rayTracer : MonoBehaviour {
 
    private List<GameObject> touchList = new List<GameObject>();
    private GameObject[] touchPrev;
    private RaycastHit hit;
 
     
    void Update () {
 
        #if UNITY_EDITOR

        // Mouse input stands in for touch while testing in the Unity editor.
        if (Input.GetMouseButton(0) || Input.GetMouseButtonDown(0) || Input.GetMouseButtonUp(0)) {
 
            // Remember the objects hit last frame so that any object no
            // longer under the pointer can be sent a touchExit message.
            touchPrev = new GameObject[touchList.Count];
            touchList.CopyTo (touchPrev);
            touchList.Clear ();
 
            Ray ray = Camera.main.ScreenPointToRay (Input.mousePosition);
            //Debug.DrawRay(ray.origin, ray.direction*10000, Color.green, 10, false);
 
            if (Physics.Raycast (ray, out hit)) {
 
                GameObject recipient = hit.transform.gameObject;
                touchList.Add (recipient);
 
                if (Input.GetMouseButtonDown(0)) {
                    recipient.SendMessage ("touchBegan", hit.point, SendMessageOptions.DontRequireReceiver);
 
                }
                if (Input.GetMouseButtonUp(0)) {
                    recipient.SendMessage ("touchEnded", hit.point, SendMessageOptions.DontRequireReceiver);
 
                }
                if (Input.GetMouseButton(0)) {
                    recipient.SendMessage ("touchStay", hit.point, SendMessageOptions.DontRequireReceiver);
 
                }
            }
 
            foreach (GameObject g in touchPrev) {
                if(!touchList.Contains(g)){
                    g.SendMessage ("touchExit", hit.point, SendMessageOptions.DontRequireReceiver);
                }
            }
        }
 
        #endif
 
        // Touch input: runs on a device with a touch screen.
        if (Input.touchCount > 0) {
 
            touchPrev = new GameObject[touchList.Count];
            touchList.CopyTo (touchPrev);
            touchList.Clear ();
     
            foreach (Touch touch in Input.touches) {
 
                Ray ray = Camera.main.ScreenPointToRay (touch.position);
 
                if (Physics.Raycast (ray, out hit)) {
 
                    GameObject recipient = hit.transform.gameObject;
                    touchList.Add (recipient);
 
                    if (touch.phase == TouchPhase.Began) {
                        recipient.SendMessage ("touchBegan", hit.point, SendMessageOptions.DontRequireReceiver);
 
                    }
                    if (touch.phase == TouchPhase.Ended) {
                        recipient.SendMessage ("touchEnded", hit.point, SendMessageOptions.DontRequireReceiver);
 
                    }
                    if (touch.phase == TouchPhase.Stationary || touch.phase == TouchPhase.Moved) {
                        recipient.SendMessage ("touchStay", hit.point, SendMessageOptions.DontRequireReceiver);
 
                    }
                    if (touch.phase == TouchPhase.Canceled) {
                        recipient.SendMessage ("touchExit", hit.point, SendMessageOptions.DontRequireReceiver);
 
                    }
                }
            }
 
            foreach (GameObject g in touchPrev) {
                if(!touchList.Contains(g)){
                    g.SendMessage ("touchExit", hit.point, SendMessageOptions.DontRequireReceiver);
                }
            }
        }
    }
}

This script detects mouse clicks when you are working in the Unity editor, and touch input once you have deployed your application to a mobile device with a touch screen.
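The platform split relies on Unity's conditional compilation: code wrapped in #if UNITY_EDITOR is compiled only when running inside the editor, while everything else compiles for all targets. Below is a minimal sketch of the same pattern in isolation; the class name inputSwitchExample is a hypothetical one chosen for illustration:

```
using UnityEngine;

// Sketch of the conditional-compilation pattern used in rayTracer:
// the mouse branch only exists in editor builds, while the touch branch
// compiles everywhere (Input.touchCount is simply 0 in the editor).
public class inputSwitchExample : MonoBehaviour {

    void Update () {
        #if UNITY_EDITOR
        if (Input.GetMouseButtonDown(0)) {
            Debug.Log("Editor click at " + Input.mousePosition);
        }
        #endif

        if (Input.touchCount > 0) {
            Debug.Log("Device touch at " + Input.GetTouch(0).position);
        }
    }
}
```

Because both branches coexist in the compiled script, the same rayTracer component works unchanged in the editor and on a device.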

Once you've created your rayTracer script, you need to activate it by assigning it to one of the objects in the scene. I selected the ARCamera object and added the rayTracer script as a component using the Add Component button under the Inspector tab.

Object Material

Now we are going to assign a material to our Cube object and change the color of the material upon interaction with the cube.

Under Assets, create a material and name it as you wish.

Now assign this material to the cube by dragging and dropping it onto the cube object.

Interaction Script

Create a new C# Script under the scripts folder and name it "interaction".

Copy the following C# code into your "interaction" script file, then add this script file to the cube object as a component, just as we did with the "rayTracer" script file. This time, though, the script must be attached to the cube object itself: this is what makes it possible to interact with only the cube object.

using UnityEngine;
using System.Collections;
 
public class interaction : MonoBehaviour {

    public static Color defaultColor;
    public static Color selectedColor;
    public static Material mat;
 
    void Start(){
       
        mat = GetComponent<Renderer>().material;
 
        // Configure the Standard shader for transparent "Fade" rendering (_Mode 2).
        mat.SetFloat("_Mode", 2);
        mat.SetInt("_SrcBlend", (int)UnityEngine.Rendering.BlendMode.SrcAlpha);
        mat.SetInt("_DstBlend", (int)UnityEngine.Rendering.BlendMode.OneMinusSrcAlpha);
        mat.SetInt("_ZWrite", 0);
        mat.DisableKeyword("_ALPHATEST_ON");
        mat.EnableKeyword("_ALPHABLEND_ON");
        mat.DisableKeyword("_ALPHAPREMULTIPLY_ON");
        mat.renderQueue = 3000;
 
        defaultColor = new Color32 (255, 255, 255, 255);
        selectedColor = new Color32 (255, 0, 0, 255);
 
        mat.color = defaultColor;
    }
 
    void touchBegan(){
        mat.color = selectedColor;
        //Add your own functionality here
    }
 
    void touchEnded(){
        mat.color = defaultColor;
        //Add your own functionality here
    }
 
    void touchStay(){
        mat.color = selectedColor;
        //Add your own functionality here
    }
 
    void touchExit(){
        mat.color = defaultColor;
        //Add your own functionality here
    }
}

In this "interaction" script, we refer to the material of the cube object as mat.

We created two Color objects named defaultColor and selectedColor. defaultColor is white, as its RGBA values (255, 255, 255, 255) indicate, while selectedColor is red (255, 0, 0, 255).

We initialize the cube object's material color as defaultColor by the following line:

mat.color = defaultColor;

We have four different functions for four different states:

  • touchBegan() is called at the instant you touch the object.

  • touchEnded() is called when you release your finger.

  • touchStay() is called on every frame after the first while your finger remains on the object; it follows touchBegan(). If you assign different colors to your material in these two functions, you are unlikely to see the color assigned in touchBegan(), since it is visible for only a single frame before touchStay() replaces it.

  • touchExit() is called when you drag your finger off the cube object's surface instead of releasing it; releasing calls the touchEnded() function, as explained above.
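To see the order of these calls in practice, you can temporarily assign a distinct color to each state and watch the Console. The sketch below is an illustrative variant of the "interaction" script (the class name touchDebug and the color choices are arbitrary; it assumes the cube's material accepts a color, as the Standard shader does):

```
using UnityEngine;

// Illustrative only: attach to the cube in place of "interaction"
// to visualize the order in which the four touch events fire.
public class touchDebug : MonoBehaviour {

    private Material mat;

    void Start () {
        mat = GetComponent<Renderer>().material;
        mat.color = Color.white;
    }

    void touchBegan () {
        mat.color = Color.yellow;   // visible for a single frame only
        Debug.Log("touchBegan");
    }

    void touchStay () {
        mat.color = Color.red;      // replaces yellow on the next frame
        Debug.Log("touchStay");
    }

    void touchEnded () {
        mat.color = Color.white;
        Debug.Log("touchEnded");
    }

    void touchExit () {
        mat.color = Color.cyan;     // finger dragged off the cube
        Debug.Log("touchExit");
    }
}
```

Running this, you should see one touchBegan log, a stream of touchStay logs while you hold, and finally either touchEnded or touchExit depending on how the touch ends.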

In our code, when we touch on the cube object, we assign the selectedColor object to mat.color, which is the color of our cube object's material.

By assigning the selectedColor within the touchStay() function, we make sure that the color of the cube object will be equal to selectedColor as long as we keep our finger on the cube object. If we release our finger or drag it out of the cube object, we assign defaultColor to the material's color parameter by calling the touchEnded() or touchExit() functions in accordance with the action we took.

Now run the project, and once the target image is recognized and the cube object has appeared, click on the cube. It should turn red, and turn white again when you release the click or drag the pointer off the cube object's surface.

You can experiment with different colors for the four different actions to comprehend them thoroughly.

Extension

In the C# script above, you can add your own functionality to each of these events at the places marked by the comment //Add your own functionality here.
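For example, you could replace touchBegan() in the "interaction" script with a version that also nudges the cube's scale on each tap. The snippet below is one possible sketch; the growth factor and the scale cap are arbitrary choices:

```
    void touchBegan () {
        mat.color = selectedColor;
        // Example extension: grow the cube slightly on every tap,
        // capping the scale so repeated taps do not grow it forever.
        if (transform.localScale.x < 2.0f) {
            transform.localScale *= 1.1f;
        }
    }
```

Because the ray-casting script delivers the events via SendMessage, any method with a matching name on the cube's components will be invoked, so extensions like this need no changes to rayTracer.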

Conclusion

In this tutorial, we've gone through an introduction to the Vuforia SDK for Unity along with its developer portal, and we've seen how to generate a target image and an appropriate license key.

On top of that, we generated custom script files in order to be able to interact with the augmented graphics. This tutorial is just an introduction to enable you to start using Vuforia with Unity and creating your own AR applications.
