Hi all! I'm working with Unity UI elements. Until recently, my canvas was set to Render Mode: Screen Space - Overlay. I wrote some code that uses touch.position to trigger specific events on my screen: I want certain objects to appear whenever I touch an object on my smartphone (or touch within a specific radius of it).
I'm tracking my touch position with touch.position, and each relevant GameObject's position with a list of positions I build like this:
Vector2 thisPosition = new Vector2(child.position.x, child.position.y);
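For reference, here's a stripped-down sketch of what I'm doing (class and field names like uiRoot and touchRadius are simplified placeholders, not my real code):

```csharp
using System.Collections.Generic;
using UnityEngine;

public class TouchChecker : MonoBehaviour
{
    public Transform uiRoot;        // parent holding the elements I want to hit-test
    public float touchRadius = 50f; // the "specific radius" I mentioned

    private List<Vector2> positions = new List<Vector2>();

    void Start()
    {
        // Collect each child's position into the list
        foreach (Transform child in uiRoot)
        {
            Vector2 thisPosition = new Vector2(child.position.x, child.position.y);
            positions.Add(thisPosition);
        }
    }

    void Update()
    {
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);
            foreach (Vector2 pos in positions)
            {
                // This distance check is where things go wrong after switching render modes
                if (Vector2.Distance(touch.position, pos) <= touchRadius)
                {
                    Debug.Log("Touch position is" + touch.position);
                    Debug.Log("Button's at" + pos);
                    // ...trigger the event here
                }
            }
        }
    }
}
```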
When I was working with Render Mode: Screen Space - Overlay, everything seemed to be working fine in the Inspector; but my elements, as they are not part of the UI, weren't displaying.
When I switched to Render Mode: Screen Space - Camera or World Space, though, my distances started being all over the place.
For example, when touching a button, I would log:
Touch position is (341.7, 372.7)
While my button's position would log:
Button's at (0.0, -1.5)
How could I fix this?