glGenTextures(1, &texid);
glBindTexture(GL_TEXTURE_2D, texid);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 32, 32, 0, GL_RGB, GL_UNSIGNED_BYTE, texval);
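For completeness, texval gets filled before the upload along these lines (just a sketch; the base color and variation here are placeholders, not my actual values):

/* 32 x 32 texels, 3 bytes (R,G,B) each = 3072 bytes */
unsigned char texval[32 * 32 * 3];
int i;
for (i = 0; i < 32 * 32; i++) {
    unsigned char v = (unsigned char)(i & 31);   /* small per-texel variation */
    texval[i*3 + 0] = 100 + v;   /* R */
    texval[i*3 + 1] = 60 + v;    /* G */
    texval[i*3 + 2] = 20 + v;    /* B */
}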
So texval is an unsigned char array 3,072 bytes long (32 × 32 texels × 3 bytes each), filled with variations of a color in RGB order. The display loop works something like this:
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();
glBindTexture(GL_TEXTURE_2D, texid);
glBegin(GL_QUADS);
glTexCoord2f(0.0, 0.0); glVertex3f(x,y,z);
glTexCoord2f(0.0, 1.0); glVertex3f(x,y+1.0f,z);
glTexCoord2f(1.0, 1.0); glVertex3f(x+1.0f,y+1.0f,z);
glTexCoord2f(1.0, 0.0); glVertex3f(x+1.0f,y,z);
glEnd();
All this is kosher. Problem is, no texture displays: only a white square. However--and this is the weird part--if I call glTexImage2D() as follows, I can get a solid color to appear:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 1, 1, 0, GL_RGB, GL_UNSIGNED_BYTE, texval);
Obviously I have no use for a 1×1 texture, but I don't see why it works while any larger size, like 2×1 or 32×32, fails. And it's not a proper failure either: as far as I can tell the data is being accepted into the texture, and no error code is generated when this happens--it simply does not draw a texture.
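(For what it's worth, I'm checking for errors right after the glTexImage2D() call, roughly like this:)

GLenum err = glGetError();
if (err != GL_NO_ERROR)
    fprintf(stderr, "glTexImage2D error: 0x%04X\n", err);   /* never prints */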
Note that I'm using no lighting settings, and the projection is orthographic.
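The projection setup is along these lines (the extents here are placeholders, not my actual window size):

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, 640.0, 0.0, 480.0, -1.0, 1.0);   /* left, right, bottom, top, near, far (placeholder values) */
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();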
So, any idea why OpenGL would simply choke on any 2D texture bigger than 1×1?
Lummox JR
I modified a NeHe example and placed a macro
#define ORTHO
in there so that it will run in ortho mode and display the texture. There are three textures: 256, 128, and 32. You can check your code against this code if you want. I hope it helps, man.
http://www.ethereal-studios.com/night/ogl/
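Roughly, the macro just switches the projection setup, something like this (not the exact code from that page, just the idea; the perspective branch is the usual NeHe-style setup):

#ifdef ORTHO
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0, 640.0, 0.0, 480.0, -1.0, 1.0);        /* flat 2D view, placeholder extents */
#else
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(45.0, 640.0 / 480.0, 0.1, 100.0);   /* typical NeHe perspective */
#endif
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();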
Good luck! :) If this doesn't turn out to be the problem, just tell me and I can try to help some other way.