I want to detect click/touch event on my gameObject 2D.
And this is my code:
void Update()
{
    if (Input.touchCount > 0)
    {
        Debug.Log("Touch");
    }
}
Debug.Log("Touch");
does not show when I click on screen or my gameObject.
Short answer: yes, touch can be handled with Input.GetMouseButtonDown().
Input.GetMouseButtonDown(), Input.mousePosition, and associated functions work as a tap on the touch screen (which is kind of odd, but welcome). If you don't have a multi-touch game, this is a good way to keep the in-editor game functioning well while still keeping touch input for devices. (source: Unity Community)
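For the 2D gameObject in the question, a minimal sketch of that approach could look like the following. It assumes the object has a Collider2D and is rendered by the main camera; the script name ClickDetect2D is just illustrative, and you would attach it to the object itself:

// ClickDetect2D.cs (illustrative name) - attach to a GameObject that has a Collider2D.
using UnityEngine;

public class ClickDetect2D : MonoBehaviour
{
    void Update()
    {
        // GetMouseButtonDown(0) also fires for the first touch on a device
        // while mouse simulation with touches is enabled.
        if (Input.GetMouseButtonDown(0))
        {
            // Convert the click/tap position to world space and check which
            // 2D collider (if any) is under it.
            Vector2 worldPoint = Camera.main.ScreenToWorldPoint(Input.mousePosition);
            Collider2D hit = Physics2D.OverlapPoint(worldPoint);

            if (hit != null && hit.gameObject == gameObject)
            {
                Debug.Log("Touched " + gameObject.name);
            }
        }
    }
}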
Mouse simulation with touches can be enabled/disabled with the Input.simulateMouseWithTouches option. By default this option is enabled.
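If you want touches to stop generating simulated mouse events (for example, while testing real multi-touch handling), you can turn the flag off yourself. A small sketch:

void Awake()
{
    // Disable mouse simulation so touches are only reported through the Touch API.
    Input.simulateMouseWithTouches = false;
}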
Though it is good for testing, I believe Input.GetTouch() should be used in production code.
An interesting approach is to add touch handling to the OnMouseUp()/OnMouseDown() events:
// OnTouchDown.cs
// Allows "OnMouseDown()" events to work on the iPhone.
// Attach to the main camera.
using UnityEngine;

public class OnTouchDown : MonoBehaviour {

    void Update () {
        // Forward OnMouseDown to the object whose collider a beginning touch hits.
        RaycastHit hit = new RaycastHit();
        for (int i = 0; i < Input.touchCount; ++i) {
            if (Input.GetTouch(i).phase == TouchPhase.Began) {
                // Construct a ray from the current touch coordinates.
                Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(i).position);
                if (Physics.Raycast(ray, out hit)) {
                    hit.transform.gameObject.SendMessage("OnMouseDown");
                }
            }
        }
    }
}
(source: Unity Answers)
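Note that Physics.Raycast only hits 3D colliders. Since the question is about a 2D gameObject (presumably with a Collider2D), the same idea would need the 2D physics API instead. A hedged sketch of that variant (OnTouchDown2D is just an illustrative name, attached to the main camera):

// OnTouchDown2D.cs (illustrative name) - same idea as above, but for Collider2D objects.
// Attach to the main camera.
using UnityEngine;

public class OnTouchDown2D : MonoBehaviour {

    void Update () {
        for (int i = 0; i < Input.touchCount; ++i) {
            if (Input.GetTouch(i).phase == TouchPhase.Began) {
                // GetRayIntersection casts the 3D ray against 2D colliders.
                Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(i).position);
                RaycastHit2D hit = Physics2D.GetRayIntersection(ray);
                if (hit.collider != null) {
                    hit.collider.gameObject.SendMessage("OnMouseDown");
                }
            }
        }
    }
}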
UPD.: There is the Unity Remote mobile app for simulating touch input in editor mode (works with Unity Editor 4 and Unity Editor 5).
From what I understand, the Unity player does not let you trigger touch events in the editor, only mouse events.
But you can simulate fake touch events based on the mouse events, as explained in this blog post: http://2sa-studio.blogspot.com/2015/01/simulating-touch-events-from-mouse.html
void Update () {
    // Handle native touch events
    foreach (Touch touch in Input.touches) {
        HandleTouch(touch.fingerId, Camera.main.ScreenToWorldPoint(touch.position), touch.phase);
    }

    // Simulate touch events from mouse events
    if (Input.touchCount == 0) {
        if (Input.GetMouseButtonDown(0)) {
            HandleTouch(10, Camera.main.ScreenToWorldPoint(Input.mousePosition), TouchPhase.Began);
        }
        if (Input.GetMouseButton(0)) {
            HandleTouch(10, Camera.main.ScreenToWorldPoint(Input.mousePosition), TouchPhase.Moved);
        }
        if (Input.GetMouseButtonUp(0)) {
            HandleTouch(10, Camera.main.ScreenToWorldPoint(Input.mousePosition), TouchPhase.Ended);
        }
    }
}

private void HandleTouch(int touchFingerId, Vector3 touchPosition, TouchPhase touchPhase) {
    switch (touchPhase) {
        case TouchPhase.Began:
            // TODO
            break;
        case TouchPhase.Moved:
            // TODO
            break;
        case TouchPhase.Ended:
            // TODO
            break;
    }
}
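As one hypothetical way to fill in the TODOs for the original question (detecting a press on a specific 2D gameObject), HandleTouch could test the unified touch position against a Collider2D. This assumes an orthographic 2D setup as in the code above; the targetObject field is purely illustrative and would be assigned in the Inspector:

// Illustrative fill-in: reports when the (real or simulated) touch begins on a specific 2D object.
public GameObject targetObject;

private void HandleTouch(int touchFingerId, Vector3 touchPosition, TouchPhase touchPhase) {
    switch (touchPhase) {
        case TouchPhase.Began: {
            // touchPosition is already in world space (ScreenToWorldPoint was applied by the caller).
            Collider2D hit = Physics2D.OverlapPoint(touchPosition);
            if (hit != null && hit.gameObject == targetObject) {
                Debug.Log("Touch began on " + targetObject.name);
            }
            break;
        }
        case TouchPhase.Moved:
            // e.g. drag handling
            break;
        case TouchPhase.Ended:
            // e.g. release handling
            break;
    }
}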
The answer is no; however, there is the Unity Remote Android app (on the Play Store) for simulating touch input in editor mode. I think this may be helpful.