Making shader takes vertexArray as canvas instead

Published 2019-08-02 21:01

Question:

I'm currently developing an open-source, SFML-based GUI library, but I'm struggling a bit with shaders. So far I've managed to render with a shader without problems, but instead of being applied to the object itself, the shader is applied to the whole window and merely clipped to the object's bounds.

Here's my simple code:

#include <SFML/Window.hpp>
#include <SFML/Graphics.hpp>

#include <iostream>

const char *glsl = R"(
    #version 330 core
    uniform vec2 u_resolution;

    void main() {
        vec2 pos = gl_FragCoord.xy / u_resolution;
        float radius = 0.4;
        float d = distance(pos, vec2(0.5));

        gl_FragColor = d <= radius ? vec4(pos.y,pos.y/2.,pos.y, 1.0) : vec4(d*0.8, d*0.8, d*0.8, 1.0);


        //gl_FragColor = vec4(0.0, 0.5, 1.0, 1.0);
    }
)";

const char *pos = R"(
    // some default vertex shader I've found
    // adding #version 330 core makes the compiler complain about the deprecated built-in variables

    void main() {
        gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;

        gl_FrontColor = gl_Color;
    }
)";

int main() {
    sf::RenderWindow window(sf::VideoMode(800, 600), "GLSL", sf::Style::Default);

    // create a quad
    sf::VertexArray quad(sf::Quads, 4);
    // define it as a rectangle, located at (100, 100) and with size 310x310
    quad[0].position = sf::Vector2f(100, 100);
    quad[1].position = sf::Vector2f(410, 100);
    quad[2].position = sf::Vector2f(410, 410);
    quad[3].position = sf::Vector2f(100, 410);


    sf::Shader shader;
    // loading only the fragment shader; behaves the same as shader.loadFromMemory(pos, glsl)
    shader.loadFromMemory(glsl, sf::Shader::Fragment);
    shader.setUniform("u_resolution", sf::Vector2f(800, 600));

    while (window.isOpen()) {
        sf::Event event;
        while (window.pollEvent(event)) {
            if (event.type == sf::Event::Closed)
                window.close();
        }

        window.clear(sf::Color(244,244,244));

        window.draw(quad, &shader);

        window.display();
    }

    return 0;
}

Not sure what to do for now, but I guess the pos vertex shader should affect how the rendering is done. Thanks!

Answer 1:

Just some minor changes are needed. I'm not exactly sure what the best way to do this is in a more modern shader version, but if you omit any version directive (i.e. stick to legacy GLSL), it's a quick change:

First of all, your individual vertices (or, more specifically, the rendered fragments) don't know where they are relative to the quad being drawn. Luckily, you can abuse texture coordinates for this: define the top-left corner as (0, 0) and the bottom-right corner as (1, 1), and the interpolated values will range from 0 to 1 based on where the fragment sits inside the quad. This is essentially what your existing shader code calculates as pos.
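As a side note, the mapping the interpolator performs is just a linear normalization. Here's a minimal sketch of the idea in plain C++ (no SFML; `Vec2f` and `toTexCoords` are made-up names for illustration only):

```cpp
#include <cassert>

// Illustrative only: this mimics how the GPU maps a point inside the quad
// spanning (100,100)-(410,410) onto the (0,0)-(1,1) texture-coordinate
// range that the fragment shader sees after interpolation.
struct Vec2f { float x, y; };

Vec2f toTexCoords(Vec2f p, Vec2f topLeft, Vec2f bottomRight) {
    return { (p.x - topLeft.x) / (bottomRight.x - topLeft.x),
             (p.y - topLeft.y) / (bottomRight.y - topLeft.y) };
}
```

The quad's center (255, 255) maps to (0.5, 0.5), which is exactly the point the fragment shader measures its distance from.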

So let's start by adding the texture coordinates:

quad[0].position = sf::Vector2f(100, 100);
quad[0].texCoords = sf::Vector2f(0, 0);
quad[1].position = sf::Vector2f(410, 100);
quad[1].texCoords = sf::Vector2f(1, 0);
quad[2].position = sf::Vector2f(410, 410);
quad[2].texCoords = sf::Vector2f(1, 1);
quad[3].position = sf::Vector2f(100, 410);
quad[3].texCoords = sf::Vector2f(0, 1);

Now all we have to do is assign the texture coordinates to pos (or better, get rid of pos entirely). For simplicity, I'll just go with the assignment:

vec2 pos = gl_TexCoord[0].xy;

However, this will make the shader compiler complain, since gl_TexCoord was removed in GLSL 3.30 core. So for testing I just removed the preprocessor line (#version 330 core), and the result looks like what you had in mind, I guess:
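Putting the pieces together, the adjusted fragment shader would look roughly like this (a sketch assembled from the snippets above, legacy GLSL with no #version line; note that u_resolution is no longer needed for the position calculation):

```glsl
uniform vec2 u_resolution; // no longer used here; keep only if needed elsewhere

void main() {
    // interpolated texture coordinates: (0,0) top-left to (1,1) bottom-right
    vec2 pos = gl_TexCoord[0].xy;
    float radius = 0.4;
    float d = distance(pos, vec2(0.5));

    gl_FragColor = d <= radius ? vec4(pos.y, pos.y / 2., pos.y, 1.0)
                               : vec4(d * 0.8, d * 0.8, d * 0.8, 1.0);
}
```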