`glLineStipple` has been deprecated in the latest OpenGL APIs.
What is it replaced with?
If not replaced, how can I get a similar effect?
(I don't want to use a compatibility profile of course...)
Sorry, it hasn't been replaced with anything. The first idea coming to my mind for emulating it would be the geometry shader: you feed it a line, compute the line's screen-space length and, based on that, generate a variable number of sub-lines between its start and end vertex.
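A rough, untested sketch of that idea (the `viewportSize` and `dashSize` uniforms are names I made up, and spacing the dashes by interpolating in clip space is only an approximation of even screen-space spacing for perspective lines):

```glsl
#version 150

layout(lines) in;
layout(line_strip, max_vertices = 64) out;  // GL requires a fixed upper bound

uniform vec2  viewportSize;  // viewport size in pixels
uniform float dashSize;      // dash (and gap) length in pixels

void main()
{
    vec4 p0 = gl_in[0].gl_Position;
    vec4 p1 = gl_in[1].gl_Position;

    // approximate screen-space length of the line
    vec2 w0 = 0.5 * viewportSize * p0.xy / p0.w;
    vec2 w1 = 0.5 * viewportSize * p1.xy / p1.w;
    float len = max(length(w1 - w0), 1e-3);  // avoid division by zero

    // number of dash/gap pairs, clamped so we stay below max_vertices
    int dashes = int(min(len / (2.0 * dashSize), 31.0));

    for(int i = 0; i <= dashes; ++i)
    {
        float t0 = 2.0 * float(i) * dashSize / len;
        float t1 = min(t0 + dashSize / len, 1.0);

        gl_Position = mix(p0, p1, t0);  // start of the sub-line
        EmitVertex();
        gl_Position = mix(p0, p1, t1);  // end of the sub-line
        EmitVertex();
        EndPrimitive();
    }
}
```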
EDIT: Perhaps you could also use a 1D texture with the alpha (or red) channel encoding the pattern as 0.0 (no line) or 1.0 (line), have the line's texture coordinate run from 0 to 1, and then perform a simple alpha test in the fragment shader, discarding fragments with alpha below some threshold. You can use the geometry shader to generate the line texCoords, as otherwise you would need different vertices for every line. This way you can also make the texCoord dependent on the screen-space length of the line.
The whole thing gets more difficult if you draw triangles (using polygon mode `GL_LINE`). Then you have to do the triangle-to-line transformation yourself in the geometry shader, putting in triangles and putting out lines (that could also be a reason for deprecating polygon mode in the future, if it hasn't happened already).

EDIT: Although I believe this question is abandoned, I have made a simple shader triple for the second approach. It's just a minimal solution; feel free to add custom features yourself. I haven't tested it because I lack the necessary hardware, but you should get the point:
The vertex shader is a simple pass-through.
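Something like this, assuming a single `modelViewProj` matrix uniform (the names are just my choice):

```glsl
#version 150

uniform mat4 modelViewProj;

in vec4 inPosition;

void main()
{
    // just transform the vertex; the real work happens later
    gl_Position = modelViewProj * inPosition;
}
```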
In the geometry shader we take a line and compute its screen-space length in pixels. We then divide this by the size of the stipple pattern texture, which would be `factor*16` when emulating a call to `glLineStipple(factor, pattern)`. This is taken as the 1D texture coordinate of the second line end point. Note that this texture coordinate has to be interpolated linearly (the `noperspective` interpolation specifier): the usual perspective-correct interpolation would cause the stipple pattern to "squeeze together" on farther-away parts of the line, whereas we are explicitly working with screen-space values.
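A possible version of that geometry shader (untested; `viewportSize` holds the viewport dimensions in pixels and `patternSize` would be `factor*16`):

```glsl
#version 150

layout(lines) in;
layout(line_strip, max_vertices = 2) out;

uniform vec2  viewportSize;  // viewport size in pixels
uniform float patternSize;   // stipple pattern length in pixels (factor*16)

noperspective out float texCoord;  // 1D stipple texture coordinate

void main()
{
    vec4 p0 = gl_in[0].gl_Position;
    vec4 p1 = gl_in[1].gl_Position;

    // window-space positions of the two end points
    vec2 w0 = 0.5 * viewportSize * p0.xy / p0.w;
    vec2 w1 = 0.5 * viewportSize * p1.xy / p1.w;

    // the first end point starts the pattern
    gl_Position = p0;
    texCoord = 0.0;
    EmitVertex();

    // the second end point gets the screen-space length
    // divided by the pattern size as texture coordinate
    gl_Position = p1;
    texCoord = length(w1 - w0) / patternSize;
    EmitVertex();

    EndPrimitive();
}
```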
The fragment shader now just performs a simple alpha test using the value from the pattern texture, which contains a 1 for line and a 0 for no line. So to emulate the fixed-function stipple you would have a 16-pixel single-component 1D texture instead of a 16-bit pattern. Don't forget to set the pattern's wrapping mode to `GL_REPEAT`; about the filtering mode I'm not that sure, but I suppose `GL_NEAREST` would be a good idea.
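The corresponding fragment shader could look like this (assuming the pattern is bound as a single-channel 1D texture; `lineColor` is a uniform of my own invention):

```glsl
#version 150

uniform sampler1D pattern;  // 16-pixel 1D texture: 1 = line, 0 = gap
uniform vec4 lineColor;

noperspective in float texCoord;

out vec4 fragColor;

void main()
{
    // simple alpha test: throw away fragments inside the gaps
    if(texture(pattern, texCoord).r < 0.5)
        discard;
    fragColor = lineColor;
}
```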
But as said earlier, if you want to render triangles using `glPolygonMode`, it won't work this way. Instead you have to adapt the geometry shader to accept triangles and generate 3 lines for each triangle.

EDIT: In fact, OpenGL 3's direct support for integer operations in shaders allows us to drop this whole 1D-texture approach and work straightforwardly with an actual bit pattern. Thus the geometry shader is slightly changed to put out the actual screen-space pattern coordinate, without normalization:
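Only the second end point's texCoord changes compared to the version above; the rest of the geometry shader stays the same:

```glsl
#version 150

layout(lines) in;
layout(line_strip, max_vertices = 2) out;

uniform vec2 viewportSize;  // viewport size in pixels

noperspective out float texCoord;  // unnormalized screen-space coordinate

void main()
{
    vec4 p0 = gl_in[0].gl_Position;
    vec4 p1 = gl_in[1].gl_Position;

    vec2 w0 = 0.5 * viewportSize * p0.xy / p0.w;
    vec2 w1 = 0.5 * viewportSize * p1.xy / p1.w;

    gl_Position = p0;
    texCoord = 0.0;
    EmitVertex();

    // plain screen-space length in pixels, no normalization
    gl_Position = p1;
    texCoord = length(w1 - w0);
    EmitVertex();

    EndPrimitive();
}
```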
In the fragment shader we then just take the bit pattern as an unsigned integer (though 32-bit, in contrast to `glLineStipple`'s 16-bit value) together with the stretch factor of the pattern, and take the texture coordinate (well, no texture anymore actually, but never mind) modulo 32 to get its position in the pattern (those explicit `uint`s are annoying, but my GLSL compiler says implicit conversions between `int` and `uint` are evil):
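A sketch of that fragment shader (again untested; `pattern` and `factor` are the uniforms emulating `glLineStipple`'s two arguments, and rounding to the nearest bit is just one possible choice):

```glsl
#version 150

uniform uint  pattern;  // 32-bit stipple pattern
uniform float factor;   // stretch factor of the pattern
uniform vec4  lineColor;

noperspective in float texCoord;

out vec4 fragColor;

void main()
{
    // position of this fragment inside the (stretched) 32-bit pattern
    uint bit = uint(round(texCoord / factor)) & 31u;

    // discard the fragment if the corresponding pattern bit is 0
    if((pattern & (1u << bit)) == 0u)
        discard;

    fragColor = lineColor;
}
```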