Weird OpenGL errors?
category: general [glöplog]
So I dug up a game I was already posting about months ago (timing issues), and it looks like something must be wrong with my OpenGL code, but I fail to see *what* is going wrong. First of all, I programmed the game on a Radeon HD 2600 XT (desktop PC). Now I'm on a laptop which has an HD 3750 Mobile or something like that and an Intel chip. On the left side you can see the Radeon graphics (fucked up), and on the right side the Intel chip (which is also what it looked like on the old Radeon):
A less notable thing (not too important) is that the textures are a bit too dark on the Radeon card (just compare the tree colours).
The more severe issue is the cloud and the pie chart. The pie chart is built from a few display lists which are basically just vertex coordinates:
Code:
' f is the centre angle of slice i; pie_piece_width is half the angular width
glNewList(glJetpackPie + i, GL_COMPILE)
glBegin GL_TRIANGLE_STRIP
' alternate outer rim and inner rim (radius / 10) vertices around the slice
glVertex2f(Sin(f - pie_piece_width), Cos(f - pie_piece_width))
glVertex2f(Sin(f - pie_piece_width) / 10, Cos(f - pie_piece_width) / 10)
glVertex2f(Sin(f), Cos(f))
glVertex2f(Sin(f + pie_piece_width) / 10, Cos(f + pie_piece_width) / 10)
glVertex2f(Sin(f + pie_piece_width), Cos(f + pie_piece_width))
glEnd
glEndList
Those are called in the program like this:
Code:
' set the current colour, then replay the precompiled vertex list
glColor4f(jetcolor.r, jetcolor.g, jetcolor.b, jetcolor.a)
glCallList(glJetpackPie + i)
I would now expect the vertices to get the proper RGBA values, but even if I just use one specific colour there, they stay completely white on the Radeon card. What the heck is going wrong there?
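Side note: the current colour set by glColor4f is ordinary GL state, so it should apply to every glVertex replayed from the list. If it doesn't, some other state (lighting enabled without GL_COLOR_MATERIAL, or a leftover bound texture) can override or modulate it. A minimal C sketch of the draw path with that state forced explicitly -- the function name and colour value are made up for illustration:
Code:
/* Hedged sketch: rule out state that can override the current colour.
   The function name and the colour value are placeholders. */
#include <GL/gl.h>

void draw_pie_slice(GLuint list_base, int i)
{
    glDisable(GL_LIGHTING);    /* with lighting on and no GL_COLOR_MATERIAL,
                                  glColor is ignored in favour of the material */
    glDisable(GL_TEXTURE_2D);  /* a stale bound texture would modulate the colour */
    glColor4f(1.0f, 0.5f, 0.0f, 1.0f);
    glCallList(list_base + i); /* the current colour applies to every glVertex */
}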
Try using glIntercept.
http://glintercept.nutty.org/
Very strange. By using the opengl32.dll from glIntercept, the graphics actually look correct on the ATI card. O_o
Looks to me like some uninitialized "something" which might be different on the two machines... As for OpenGL with current drivers, one should never expect any default value; everything should be set up explicitly for every context.
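Along those lines, here's a rough C sketch of explicitly initialising every piece of state a 2D game like this relies on, right after context creation, instead of trusting driver defaults (the exact set of calls is illustrative, not taken from the game):
Code:
/* Sketch: set all relied-upon state explicitly at startup; values are examples. */
#include <GL/gl.h>

void init_gl_state(void)
{
    glDisable(GL_LIGHTING);
    glDisable(GL_DEPTH_TEST);            /* 2D drawing, no depth needed */
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glColor4f(1.0f, 1.0f, 1.0f, 1.0f);   /* don't assume a default current colour */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}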
If it suddenly works with glIntercept, it might be a problem with the graphics driver's multi-threading optimizations. I had that a while ago on nVidia, and it's absolutely possible that ATI has similar bugs.
To find out where the critical spots are, put a few glGet() calls in your code -- this forces the graphics driver to flush all pending commands and wait for the graphics chip until they're actually done. (And no, glFlush() is not sufficient.) If this still doesn't help, try glReadPixels() :)
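A quick way to do that (C sketch; the macro name is made up) is a check you can sprinkle after each suspicious call -- wherever the garbage first shows up, the call right before it is your suspect:
Code:
/* Debug sketch following the glGet() advice above. GL_SYNC_CHECK is a
   made-up name; glGetFloatv forces a round trip to the driver and
   glGetError reports anything invalid. */
#include <GL/gl.h>
#include <stdio.h>

#define GL_SYNC_CHECK() do { \
        GLfloat c[4]; \
        glGetFloatv(GL_CURRENT_COLOR, c);  /* forces pending commands through */ \
        GLenum e = glGetError(); \
        if (e != GL_NO_ERROR) \
            printf("GL error 0x%04x at %s:%d\n", e, __FILE__, __LINE__); \
    } while (0)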
If you use more than one context, check that wglMakeCurrent() has selected the right context before any GL operation on your thread.
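For completeness, roughly what that looks like on Win32 (hdc and hglrc are whatever your setup code created; only one thread can have a context current at a time):
Code:
/* Sketch: bind the context to the calling thread before any GL work. */
#include <windows.h>
#include <GL/gl.h>

void render_frame(HDC hdc, HGLRC hglrc)
{
    wglMakeCurrent(hdc, hglrc);   /* make the context current on THIS thread */
    /* ... all GL calls for the frame ... */
    SwapBuffers(hdc);
    wglMakeCurrent(NULL, NULL);   /* release so another thread could take it */
}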
Quote:
If it suddenly works with glIntercept, it might be a problem related to multi-threading attempts of the graphics driver. I had that a while ago on nVidia, and it's absolutely possible that ATI has similar bugs.
Actually, the drivers for this card are optimized for OpenGL (it's from the FireGL series) - do you think that could be an issue?