My game uses Unity 5's new UI system with a Canvas. The game itself receives touches to shoot ammo via OnMouseDown() on several game objects with 2D colliders marking touchable areas, and I adjust the priority of overlapping touchable areas by changing the objects' position.z.
However, with the UI added, a touch on a UI element (button, panel, etc.) not only triggers the UI element (where applicable) but also passes through it and triggers the touchable area behind it. It feels very wrong to press a button and also fire the shooting action behind (visually) the "UI layer".
One way I can think of is to add a collider to each UI element, convert its position and size into world space at runtime, and adjust its position.z so it swallows all touches that land on the UI. However, this seems very ugly and fragile.
Is there an elegant way to let all UI elements (basically panels) swallow touches? Thanks!
You can just add a Physics 2D Raycaster (or Physics Raycaster, for 3D colliders) to the camera and handle clicks through the EventSystem, so UI elements block anything behind them. BoredMormon has a great 5-minute video on how to do this; I'd recommend checking it out: https://www.youtube.com/watch?v=EVZiv7DLU6E
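A minimal sketch of that setup, assuming your touchable areas keep their 2D colliders: add a Physics2DRaycaster component to the camera (the EventSystem your Canvas already created will use it), then replace OnMouseDown() with the IPointerDownHandler interface. The TouchableArea class name and Shoot() method below are hypothetical placeholders for your own shooting code.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical example: attach to each game object with a 2D collider.
// Requires a Physics2DRaycaster on the camera and an EventSystem in the scene.
// Because the click now arrives via the EventSystem, any UI Graphic drawn on
// top of this collider receives the pointer event first and blocks it.
public class TouchableArea : MonoBehaviour, IPointerDownHandler
{
    public void OnPointerDown(PointerEventData eventData)
    {
        // eventData.position is the screen-space touch/click position.
        Shoot(Camera.main.ScreenToWorldPoint(eventData.position));
    }

    void Shoot(Vector3 worldPos)
    {
        // Placeholder for your shooting logic.
        Debug.Log("Shoot at " + worldPos);
    }
}
```

With this, a touch on a button or on any panel whose Image/Graphic has "Raycast Target" enabled never reaches the colliders behind it, so nothing extra is needed to swallow the touches.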