I just started using OpenGL. This is my first program:
// OpenGL hello program
#include <iostream>
#include <GL/glut.h>
#include <cstring>

void display() {
    glClear(GL_COLOR_BUFFER_BIT);
    char message[] = "Hello, world!";
    glRasterPos2d(0, 0);
    for (int i = 0; i < sizeof(message) / sizeof(message[0]); i++) {
        glutBitmapCharacter(GLUT_BITMAP_HELVETICA_12, message[i]);
    }
}

int main(int argc, char *argv[]) {
    glutInit(&argc, argv);
    glutInitWindowSize(500, 500);
    glutCreateWindow("OpenGL hello program");
    glutDisplayFunc(display);
    glutMainLoop();
}
The warning I am getting is: warning: comparison between signed and unsigned integer expressions (line 9). I then wrote a second program to see what's causing the problem:
#include <iostream>
#include <cstring>

void display1() {
    char message[] = "Hello, world!";
    for (int i = 0; i < sizeof(message) / sizeof(message[0]); i++)
        std::cout << message[i];
}

int main() {
    display1();
}
This code works perfectly fine. Why doesn't the first program work?
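As far as I can tell, the warning itself comes from sizeof, which has the unsigned type std::size_t, while my loop counter i is a signed int. This minimal snippet is just my guess at isolating the comparison:

#include <iostream>

int main() {
    char message[] = "Hello, world!";
    int i = 0;
    // sizeof(message) has type std::size_t, which is unsigned, while i
    // is a signed int, so the comparison mixes signedness; note also that
    // sizeof(message) is 14 here because it counts the terminating '\0'
    if (i < sizeof(message)) // the same warning appears on this line
        std::cout << "sizeof(message) = " << sizeof(message) << '\n';
}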
EDIT: Following up on Cyber's answer, I changed the loop to:
for (unsigned int i = 0; i < sizeof(message) / sizeof(message[0]); i++)
But the OpenGL code still does not do what I expect, i.e. show the "Hello, world!" message in the window. It just creates a window with "OpenGL hello program" in the title bar and nothing else.
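For reference, here is the complete display() function as it now stands; the rest of the program is unchanged from the first listing:

void display() {
    glClear(GL_COLOR_BUFFER_BIT);
    char message[] = "Hello, world!";
    glRasterPos2d(0, 0);
    // counter is now unsigned, so the warning is gone,
    // but the text still does not appear in the window
    for (unsigned int i = 0; i < sizeof(message) / sizeof(message[0]); i++) {
        glutBitmapCharacter(GLUT_BITMAP_HELVETICA_12, message[i]);
    }
}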