java - libGDX coordinate system differences between rendering and touch input


I have a screen (BaseScreen implements the Screen interface) that renders a PNG image. On a click of the screen, the character is moved to the position touched (for testing purposes).

public class DrawingSpriteScreen extends BaseScreen {
    private Texture _sourceTexture = null;
    float x = 0, y = 0;

    @Override
    public void create() {
        _sourceTexture = new Texture(Gdx.files.internal("data/character.png"));
    }

    .
    .
}

During rendering of the screen, if the user touched the screen, I grab the coordinates of the touch and use them to render the character image.

@Override
public void render(float delta) {
    if (Gdx.input.justTouched()) {
        x = Gdx.input.getX();
        y = Gdx.input.getY();
    }

    super.getGame().batch.draw(_sourceTexture, x, y);
}

The issue is that the coordinates for drawing the image start from the bottom-left position (as noted in the libGDX wiki), while the coordinates for touch input start from the upper-left corner. So the problem I'm having is that a click on the bottom right moves the image to the top right. The coordinates may be x 675, y 13 on a touch near the top of the screen, yet the character shows up at the bottom, since drawing coordinates start from the bottom left.
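For reference, the mismatch can be compensated for by flipping the y value against the screen height. This is a minimal sketch of that alternative, assuming the drawing coordinates map 1:1 to screen pixels (no camera or viewport scaling); the unproject approach described in the answer below is more general:

@Override
public void render(float delta) {
    if (Gdx.input.justTouched()) {
        x = Gdx.input.getX();
        // Touch y is measured from the top of the screen, drawing y from the
        // bottom, so flip it against the screen height.
        y = Gdx.graphics.getHeight() - Gdx.input.getY();
    }

    super.getGame().batch.draw(_sourceTexture, x, y);
}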

Why is this? Why are the coordinate systems reversed? Am I using the wrong objects to determine this?

To detect collision I use camera.unproject(Vector3). Set the Vector3 as:

x = Gdx.input.getX();
y = Gdx.input.getY();
z = 0;

Now pass this vector into camera.unproject(Vector3). Then use the x and y of the vector to draw the character.
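For illustration, here is a minimal sketch of how that could look inside the question's render() method. It assumes an OrthographicCamera field named camera (set up with setToOrtho(false) in create()) and a reusable Vector3 field named touchPoint; both names are assumptions for this sketch, not part of the original code:

// Assumed fields, not in the original code:
// private final OrthographicCamera camera = new OrthographicCamera();
// private final Vector3 touchPoint = new Vector3();
// ... in create(): camera.setToOrtho(false);  // y-up world, origin at bottom-left

@Override
public void render(float delta) {
    camera.update();

    if (Gdx.input.justTouched()) {
        // Screen coordinates: origin at the upper left, y grows downward.
        touchPoint.set(Gdx.input.getX(), Gdx.input.getY(), 0);

        // unproject converts screen coordinates to world coordinates,
        // whose origin is at the bottom left, matching batch.draw().
        camera.unproject(touchPoint);

        x = touchPoint.x;
        y = touchPoint.y;
    }

    super.getGame().batch.setProjectionMatrix(camera.combined);
    super.getGame().batch.draw(_sourceTexture, x, y);
}

With this, a touch near the top of the screen yields a large world y, so the character is drawn near the top as expected.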

