From Unity docs:
A Graphic Raycaster is used to raycast against a Canvas. The Raycaster looks at all Graphics on the Canvas and determines whether any of them have been hit.
You can also use EventSystem.RaycastAll, which raycasts through every raycaster in the scene, to hit-test graphical user interface (UI) elements.
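If you only want to test one specific Canvas rather than every raycaster, you can call its GraphicRaycaster directly. Here is a minimal sketch (untested; the `canvas` and `eventSystem` fields and the `CanvasHitTest` class name are my own placeholders, assigned in the Inspector):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

public class CanvasHitTest : MonoBehaviour
{
    public Canvas canvas;           // assign in the Inspector
    public EventSystem eventSystem; // assign in the Inspector

    // Returns true if any Graphic on this canvas is under the given screen position.
    public bool IsPointerOverCanvas(Vector2 screenPosition)
    {
        GraphicRaycaster raycaster = canvas.GetComponent<GraphicRaycaster>();
        PointerEventData pointerData = new PointerEventData(eventSystem);
        pointerData.position = screenPosition;

        List<RaycastResult> results = new List<RaycastResult>();
        raycaster.Raycast(pointerData, results); // only hits graphics on this canvas
        return results.Count > 0;
    }
}
```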
Here is a short example for your case:
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

void Update()
{
    // Example: get the controller's current orientation.
    Quaternion ori = GvrController.Orientation;

    // If you want a vector that points in the direction of the controller,
    // you can multiply this quaternion by Vector3.forward:
    Vector3 vector = ori * Vector3.forward;

    // ...or you can change the rotation of some entity in your scene
    // (e.g. the player arm) to match the controller orientation:
    playerArmObject.transform.localRotation = ori;

    // Example: check if the touchpad was just touched.
    // TouchDown is true for one frame after the touchpad is touched.
    if (GvrController.TouchDown)
    {
        PointerEventData pointerData = new PointerEventData(EventSystem.current);
        // Use a position derived from the controller as the start of the
        // raycast instead of the mouse position.
        pointerData.position = Input.mousePosition;

        List<RaycastResult> results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(pointerData, results);

        if (results.Count > 0)
        {
            // "WorldUI" is my layer name.
            if (results[0].gameObject.layer == LayerMask.NameToLayer("WorldUI"))
            {
                string dbg = "Root Element: {0}\nGrandChild Element: {1}";
                Debug.Log(string.Format(dbg,
                    results[results.Count - 1].gameObject.name,
                    results[0].gameObject.name));
            }
            results.Clear();
        }
    }
}
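As the comment above notes, Input.mousePosition should be replaced with a position derived from the controller. One way to get such a position (an untested sketch; `pointerTransform` is a hypothetical Transform that you drive with the controller orientation, and it assumes the UI is rendered by Camera.main) is to physics-raycast along the controller's forward direction and project the hit point back into screen space:

```csharp
using UnityEngine;

public class ControllerPointer : MonoBehaviour
{
    public Transform pointerTransform; // driven by GvrController.Orientation

    // Returns a screen-space position for the controller ray's hit point,
    // or null if the ray hits nothing within maxDistance.
    public Vector2? GetScreenPosition(float maxDistance = 100f)
    {
        RaycastHit hit;
        if (Physics.Raycast(pointerTransform.position, pointerTransform.forward,
                            out hit, maxDistance))
        {
            // Project the world-space hit point back into screen space so it
            // can be assigned to pointerData.position before RaycastAll.
            return (Vector2)Camera.main.WorldToScreenPoint(hit.point);
        }
        return null;
    }
}
```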
I have not tested the above script myself, so there may be some errors.
Hope this helps.