This is probably linked to another unsolved mystery of mine.
I'm drawing orthographic 2D on iPhone, on both a real device and the simulator. I'm trying to color each pixel based on how far it is from an arbitrary point 'A' in pixel space, which I pass in (hard-coded). I'm doing everything at the Retina resolution of 960x640. In the fragment shader I calculate the distance from A to gl_FragCoord, and I color by lerping between two colors, with the 'max' at a distance of 300px.
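For reference, the fragment shader logic is roughly this (a minimal sketch; the uniform names uCenter, uMaxDist, uColorA and uColorB are placeholders for mine):

precision mediump float;

uniform vec2 uCenter;    // point A, in pixels
uniform float uMaxDist;  // e.g. 300.0
uniform vec4 uColorA;    // color at point A
uniform vec4 uColorB;    // color at/beyond uMaxDist

void main()
{
    float d = distance(gl_FragCoord.xy, uCenter);
    float t = clamp(d / uMaxDist, 0.0, 1.0);
    gl_FragColor = mix(uColorA, uColorB, t);
}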
On the simulator (with Retina display) I have to pass a center point of X = 460 and Y = 160, with a distance of 300px, to mark the screen's midpoint. To get the same effect on the device I need a center X of 960 and a distance of 150. (Interestingly, 80px doesn't give the result I want, though 160 could have been an overshoot on the original.)
Obviously a Retina scaling issue is at play. But where and how does it creep in, and how do I find and fix it?
I'm using:
glViewport(0, 0, 960, 640);
and:
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &framebufferWidth);
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &framebufferHeight);
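As a sanity check I could log the queried size and drive the viewport from it instead of the hard-coded values (a sketch, assuming framebufferWidth and framebufferHeight are GLint as above):

NSLog(@"renderbuffer size: %d x %d", framebufferWidth, framebufferHeight);
glViewport(0, 0, framebufferWidth, framebufferHeight);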
And:
[self setView:[[EAGLView alloc] initWithFrame:[UIScreen mainScreen].bounds]];
[(EAGLView *)[self view] setContentScaleFactor:2.0f];
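My understanding is that the scale factor only takes effect if it's set before the renderbuffer storage is allocated from the layer; otherwise the backing buffer stays at the non-Retina 480x320. A sketch of the ordering as I understand it ('context' is my EAGLContext, and I'm not certain this matches what EAGLView does internally):

EAGLView *glView = [[EAGLView alloc] initWithFrame:[UIScreen mainScreen].bounds];
[glView setContentScaleFactor:[UIScreen mainScreen].scale]; // 2.0 on Retina, 1.0 elsewhere
[self setView:glView];

// Only after the scale factor is set should the storage be created:
[context renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer *)[glView layer]];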