Author Topic: glDrawArrays failing silently...halp?
Cheezmeister
« on: November 16, 2014, 01:08:11 PM »

I decided to take another stab at GL, and I'm starting to remember why I gave up the first time around. I've been loosely following the arcsynthesis tutorial, using SDL instead of GLUT. I've gotten as far as opening a window and filling it with blue, and now I'm trying to get a triangle on the screen without cheating with immediate mode, setting a flat color instead of a shader for the time being.

Nothing's being drawn, there's no error reported, and I'm not sure how to diagnose further. I'm working on a macbook, which I understand has pitiful gl support, but challenge accepted.

The whole code is here:
https://gist.github.com/anonymous/5316e620c0881f625c59

The snippet in question is
Code:
        glColor3f(0, 0, 1.0);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glEnableVertexAttribArray(0);
        glVertexAttribPointer(0, 4, GL_FLOAT, GL_FALSE, 0, 0);
        glDrawArrays(GL_TRIANGLES, 0, 3);
Reported at runtime:
Code:
        SDL version: 2.03
        runtime version: 2.03
        OpenGL vendor: 'Intel Inc.'
        OpenGL renderer: 'Intel HD Graphics 3000 OpenGL Engine'
        OpenGL version: '3.3 INTEL-8.24.16'
        GLSL version: '3.30'

What am I doing wrong here?

Whiskas
« Reply #1 on: November 16, 2014, 01:33:58 PM »

You may want to have basic shaders since you're using Intel drivers (see there).

Other than that, nothing looks wrong (by the way, why do you have a w for each point?).

Also don't use glColor, use buffers or shader uniforms.

Edit: As pointed out later in the topic by nox, the rest of the code has more errors; I was just referring to the piece of code in the original post.
« Last Edit: November 18, 2014, 06:51:10 AM by Whiskas »

Cheezmeister
« Reply #2 on: November 16, 2014, 01:44:52 PM »

Thanks for taking a look. I'm trying to avoid shaders (at first) to cut down on confounding factors.

There's no particular reason for the w coord, I think that's how the arcsynthesis tut does it but I can try without and see what happens.

Are you recommending I use the color buffer when drawing triangles? Can I do that?

Cheezmeister
« Reply #3 on: November 16, 2014, 01:56:09 PM »

Well, I sprinkled  `glGetError` checks everywhere, and there are 1282 (invalid operation) codes getting thrown all over the place. glDrawArrays reports 1282, glColor does also, and interestingly, the innocuous line `glMatrixMode(GL_MODELVIEW);` throws it as well.

So at least there's something to go on.

motorherp
« Reply #4 on: November 16, 2014, 02:02:06 PM »

Quote
Well, I sprinkled `glGetError` checks everywhere, and there are 1282 (invalid operation) codes getting thrown all over the place. glDrawArrays reports 1282, glColor does also, and interestingly, the innocuous line `glMatrixMode(GL_MODELVIEW);` throws it as well.

So at least there's something to go on.

Sounds like you need to include and initialise GLEW

Cheezmeister
« Reply #5 on: November 16, 2014, 03:05:09 PM »

Hi motorherp!

Quote
Sounds like you need to include and initialise GLEW

Sorry, I don't follow. Am I using extensions here without realizing it? Are random, copious 1282 errors an indication of that? Or does glew provide some other diagnostics that'll help me figure out what's going on?

motorherp
« Reply #6 on: November 16, 2014, 04:24:12 PM »

Quote
Hi motorherp!

Sounds like you need to include and initialise GLEW

Sorry, I don't follow. Am I using extensions here without realizing it? Are random, copious 1282 errors an indication of that? Or does glew provide some other diagnostics that'll help me figure out what's going on?


Quoting knowledge folders:

Quote
OpenGL allows extensions by hardware vendors. OpenGL has a protocol and low level API to discover what these extensions are and the supported functions. Using extensions in your C++ code is - unfortunately - platform specific. The address of the function (function pointer) must be retrieved from the OpenGL implementation (e.g. hardware driver). Under Windows this can be done using "wglGetProcAddress".

GLEW is a common API to write portable code which internally hides the platform specific detail. So you will be downloading GLEW for each platform and the client programs can stay portable as long as the target platform supports a GLEW implementation.

The way I understand it is that functions such as glDrawArrays etc. are actually function pointers which need to be assigned to their implementations, which depend on the platform and the hardware installed. That's why you get an "invalid operation" error when trying to use these functions without having assigned them: OpenGL believes they are not supported by the hardware. There are ways you can do this manually, but it's a massive headache. GLEW is a free lib which handles setting up all these extensions for you automatically. Just for the record, alternatives are available, but GLEW seems to be one of the main ones and it works for me.
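
For reference, a minimal sketch of what that GLEW setup usually looks like (assuming SDL has already created the GL context; the error handling here is illustrative, not taken from the gist):
Code:
#include <cstdio>
#include <GL/glew.h>    // must be included before other GL headers

// After SDL_GL_CreateContext() has succeeded:
glewExperimental = GL_TRUE;   // some drivers need this to load core-profile entry points
GLenum status = glewInit();
if (status != GLEW_OK) {
    fprintf(stderr, "glewInit failed: %s\n", (const char*)glewGetErrorString(status));
}
glGetError();   // glewInit can leave a spurious GL_INVALID_ENUM in the error queue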

ThemsAllTook
« Reply #7 on: November 16, 2014, 05:08:30 PM »

Not having GLEW won't cause glMatrixMode to fail, but the matrix stack no longer exists in the core profile (it was removed from the core specification in 3.1), and you're creating a GL 3.3 Core context (lines 125-127), so that's probably where you're going wrong. You could either not make those SDL_GL_SetAttribute calls and let it create a GL 2 context, or avoid using functions that no longer exist in 3.3 Core.
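
A rough sketch of that first option, with SDL2 and window parameters chosen purely for illustration (not taken from the gist):
Code:
// Ask for a 2.1 context (or simply omit these attributes entirely) so that
// glMatrixMode, glColor, and friends still exist.
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 2);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);
SDL_Window* window = SDL_CreateWindow("triangle",
    SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480,
    SDL_WINDOW_OPENGL);
SDL_GLContext context = SDL_GL_CreateContext(window);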

Polly
« Reply #8 on: November 17, 2014, 09:00:48 AM »

As ThemsAllTook mentioned, you're using a Core context .. so functions such as glColor and glMatrixMode aren't available. But more importantly, you have to use a shader program ( glUseProgram ). Without a program being "installed" you can't perform any draw commands.
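
A rough sketch of what that minimal program setup might look like in a 3.3 core context; the shader sources and names below are placeholders rather than anything from the gist, and compile/link error checks are omitted:
Code:
const char* vertSrc =
    "#version 330 core\n"
    "layout(location = 0) in vec4 position;\n"
    "void main() { gl_Position = position; }\n";
const char* fragSrc =
    "#version 330 core\n"
    "out vec4 fragColor;\n"
    "void main() { fragColor = vec4(0.0, 0.0, 1.0, 1.0); }\n";

GLuint vs = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vs, 1, &vertSrc, NULL);
glCompileShader(vs);
GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fs, 1, &fragSrc, NULL);
glCompileShader(fs);

GLuint program = glCreateProgram();
glAttachShader(program, vs);
glAttachShader(program, fs);
glLinkProgram(program);
glUseProgram(program);   // "install" the program before any draw call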

Why are you using OpenGL Core anyway ( there are valid reasons for this, i'm just wondering )?

nox
« Reply #9 on: November 17, 2014, 02:07:22 PM »

Quote
Trying to get a triangle on the screen without cheating with immediate mode, setting a flat color instead of a shader for the time being.

For one thing, you can't draw without a shader if you're not in immediate mode. In fact, the first argument to glVertexAttribPointer is the index of the relevant vertex attribute from the shader. It should usually be retrieved with glGetAttribLocation. Apart from that, Whiskas is incorrect in saying that nothing looks wrong with your code. TONS looks wrong. You're using a weird mishmash of immediate mode functions and retained mode functions. glColor3f makes no sense in this context, nor do any of the matrix stack operations (glLoadIdentity).

As to your GL_INVALID_OPERATIONs... step through your code and print glGetError after every GL operation. Use the OpenGL API docs to find out why something might throw the error that you're getting. Fix each error as you come across it, because things aren't likely to work after one is thrown.
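
A hedged sketch of what that per-call checking and attribute lookup could look like; the CHECK_GL macro and the "position" attribute name are hypothetical, not from the gist:
Code:
#include <cstdio>

// Hypothetical helper: report which call set an error and where.
#define CHECK_GL(call) do { \
        call; \
        GLenum err = glGetError(); \
        if (err != GL_NO_ERROR) \
            fprintf(stderr, "%s:%d %s -> 0x%x\n", __FILE__, __LINE__, #call, err); \
    } while (0)

// Look the attribute up by name instead of hard-coding index 0.
GLint posAttrib = glGetAttribLocation(program, "position");   // -1 if it doesn't exist
CHECK_GL(glEnableVertexAttribArray(posAttrib));
CHECK_GL(glVertexAttribPointer(posAttrib, 4, GL_FLOAT, GL_FALSE, 0, 0));
CHECK_GL(glDrawArrays(GL_TRIANGLES, 0, 3));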
« Last Edit: November 17, 2014, 02:12:37 PM by nox »

jgrams
« Reply #10 on: November 17, 2014, 02:20:47 PM »

Quote
step through your code and print glGetError after every GL operation.

Yeah, pretty much all GL functions fail "silently", so unless you call glGetError you won't know something is wrong until (unless?) it goes wrong enough to crash your program or something.

And a successful operation does not clear the error code, so unless you check it after every GL function, you won't know which one actually caused the error: it could have been set much earlier.
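
As a tiny illustrative sketch (not from the gist), one way to deal with that is to drain any stale errors at a known point so the next check is meaningful:
Code:
// Discard error flags left over from earlier calls so that the next
// glGetError() reflects only what happens after this point.
while (glGetError() != GL_NO_ERROR) {
    // stale error from an earlier call; ignore it
}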

ThemsAllTook
« Reply #11 on: November 17, 2014, 02:38:53 PM »

Quote
For one thing, you can't draw without a shader if you're not in immediate mode. In fact, the first argument to glVertexAttribPointer is the index of the relevant vertex attribute from the shader. It should usually be retrieved with glGetAttribLocation. Apart from that, Whiskas is incorrect in saying that nothing looks wrong with your code. TONS looks wrong. You're using a weird mishmash of immediate mode functions and retained mode functions. glColor3f makes no sense in this context, nor do any of the matrix stack operations (glLoadIdentity).

Two corrections to this:
  • Whether you use the fixed function pipeline or shaders isn't related to whether you submit your geometry with immediate mode or glDrawArrays/glDrawElements. glEnableVertexAttribArray and glVertexAttribPointer do require a shader to know what to do with those attributes, but you can use glEnableClientState(GL_VERTEX_ARRAY) and glVertexPointer to draw using glDrawArrays with fixed function (see the sketch after this list).
  • glColor doesn't imply immediate mode; only glBegin/glEnd/glVertex do. You can use glColor to set a constant color to be applied to all geometry drawn with an array or VBO if you're not submitting a color array with your vertex data.
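
A quick sketch of that fixed-function client-state path; it assumes a non-core context, per the replies above, and that vbo holds three 4-component float positions as in the gist:
Code:
glColor3f(0.0f, 0.0f, 1.0f);            // constant color applied to all vertices
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(4, GL_FLOAT, 0, 0);     // size, type, stride, offset into the bound VBO
glDrawArrays(GL_TRIANGLES, 0, 3);
glDisableClientState(GL_VERTEX_ARRAY);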

Cheezmeister
« Reply #12 on: November 17, 2014, 11:10:02 PM »

Hi nox, jgrams, thanks for the input. Interspersing glGetError() calls with each line is exactly what I've been doing. None of the documented reasons apply, hence my confuzzlement. I didn't realize the matrix stacks and friends were removed too, but it makes sense, given shaders do all this plus wait there's more. Still, it'd be nice if they were present for dev purposes, even via some wonky degraded emulation, or even a GL_NOT_HERE_ANYMORE error code, throw me a bone here. Moot point, though.

After nixing the old moldy api calls, calling glewInit() and finally wiring up some shaders (yes they're compiling successfully), all the calls to /gl.*Array.*/ are still failing. I have yet to try SDL_GL_GetProcAddress(), but tomorrow's another day. This is why we can't have nice things.

Cheezmeister
« Reply #13 on: November 17, 2014, 11:21:47 PM »

Quote
Why are you using OpenGL Core anyway ( there are valid reasons for this, i'm just wondering )?

Missed this post. My understanding is that it's the lowest common denominator of new/recommended api, and I generally try to stick to the same--no platform left behind. If that's not the case, then you're right and I have no reason to use it, really.

Cheesegrater
« Reply #14 on: November 18, 2014, 05:41:02 AM »

Quote
Still, it'd be nice if they were present for dev purposes, even via some wonky degraded emulation

That's what you get if you ask for SDL_GL_CONTEXT_PROFILE_COMPATIBILITY instead of SDL_GL_CONTEXT_PROFILE_CORE.
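
For reference, a sketch of how that could be requested with SDL2 (window creation omitted; whether the driver actually grants a 3.3 compatibility context is another matter, as it turns out later in the thread):
Code:
// Set these before creating the GL window/context.
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK,
                    SDL_GL_CONTEXT_PROFILE_COMPATIBILITY);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
SDL_GLContext context = SDL_GL_CreateContext(window);
if (context == NULL) {
    SDL_Log("context creation failed: %s", SDL_GetError());
}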

Polly
« Reply #15 on: November 18, 2014, 07:59:18 AM »

Quote
My understanding is that it's the lowest common denominator of new/recommended api, and I generally try to stick to the same--no platform left behind.

Different people will recommend different things. Personally, I always target the lowest API version that makes sense for a specific game so that it runs on as many systems as possible.

For instance, if you're doing a Super Mario World type of game, OpenGL 1.4 or even 1.2 will suffice just fine. Sure it might run slightly faster using modern OpenGL .. but when a laptop from 2004 that only supports up to OpenGL 1.4 is able to run it using the potentially ( somewhat ) slower fixed-function pipeline, newer systems won't have any problems running it either.



Another example .. something like Proteus might be a bit taxing for old systems, but it doesn't require anything that's only available in OpenGL 3.3 and will run just fine on most systems that support OpenGL 2.1. So why cut those people off?



Now if you're doing something like Assassin's Creed Unity, systems that are stuck with OpenGL 2.1 ( or perhaps even 3.2 ) probably will have trouble reaching an acceptable framerate to begin with, so in those cases 3.3* makes sense.

*It actually requires DirectX 11, which is primarily supported by GPUs that also support OpenGL 4.0 ( and up ).



But that's just my opinion ;) Others will surely think differently.

+ Keep in mind that, for instance, Vertex Buffer Objects have been available since 1.5 and shaders since 2.0, so ( parts of ) "modern" OpenGL was available long before 3.3 was introduced.
« Last Edit: March 07, 2015, 08:29:44 AM by Polly »

Cheezmeister
« Reply #16 on: November 22, 2014, 10:27:11 PM »

That's a great explanation, thanks Polly. Perhaps I could/should drop down to 2.1, but in any case I want to solve this because now it's personal =P

I also found the GL wiki article on compatibility mode illuminating. I haven't managed to get a compat context, though. Here's what I'm currently working with: https://gist.github.com/Cheezmeister/70cd71c6c6359f412b1e

The output includes everything in the OP plus the following. Interestingly, if you remove the early return, the screen clears to cyan just fine even though glClearColor complains.
Code:
GLEW version: 1.11.0
clearcolor reported error: 1280
taking raw addresses:
glda is 1
gleva is 1
gldva is 1
SDL_GL_GetProcAddress:
glda is 1
gleva is 1
gldva is 1
enabling vaa reported error: 1282
drawing arrays reported error: 1282
disabling vaa reported error: 1282

I also noticed the following error if I try explicitly taking the address of the /VertexAttrib/ functions. What exactly is going on here?
Code:
gl.cpp:170:16: error: cannot initialize a variable of type 'void (*)(GLuint)' with an rvalue of type
      'PFNGLDISABLEVERTEXATTRIBARRAYPROC *' (aka 'void (**)(GLuint)')
        void (*gldva)(GLuint) = &glDisableVertexAttribArray;

Cheezmeister
« Reply #17 on: December 03, 2014, 09:58:54 PM »

Man, this is frustrating. It seems as though a 3.3 context with a compatibility profile is simply not a thing. Or perhaps just not with my broken implementation. 3.0, 3.1, 3.2, no dice. SDL reports that it can't create the context without further detail. I finally managed to get a compat profile with 2.1 and render a triangle with #version 120 shaders, but it sure would be nice to start learning with the latest and greatest API. I cannot for the life of me get glDrawArrays() to succeed under any circumstances with a core profile. WTF?

It's not listed at all under glewinfo, although I'm not quite sure whether I should even expect it to be there...

Code:
 % glewinfo | grep glDrawArrays
  glDrawArraysInstanced:                                       OK
  glDrawArraysInstancedANGLE:                                  MISSING
  glDrawArraysInstancedBaseInstance:                           MISSING
  glDrawArraysIndirect:                                        OK
  glDrawArraysInstancedARB:                                    OK
  glDrawArraysInstancedEXT:                                    MISSING
  glDrawArraysEXT:                                             MISSING

Call me jaded, but ain't it a little absurd to have to explicitly request a very specific context and utilize a 3rd-party (?) library, just to get a triangle on the screen? Yeesh.

Whatever. I can haz shaderz. Time to move on with my life.

Polly
« Reply #18 on: December 04, 2014, 06:58:06 AM »

Quote
I cannot for the life of me get glDrawArrays() to succeed under any circumstances with a core profile. WTF?

It's not listed at all under glewinfo, although I'm not quite sure whether I should even expect it to be there.

I don't use GLEW ( or any extension wrangler for that matter ) myself, but glDrawArrays has been available since OpenGL 1.1, so it's not considered an extension ( which is probably why GLEW doesn't list it ).

Quote
Call me jaded, but ain't it a little absurd to have to explicitly request a very specific context and utilize a 3rd-party (?) library, just to get a triangle on the screen? Yeesh.

You don't ;)

Average Software
« Reply #19 on: December 04, 2014, 04:51:49 PM »

Quote
Man, this is frustrating. It seems as though a 3.3 context with a compatibility profile is simply not a thing.

Are you on a Mac?  I may be wrong about this, but I believe there are no compatibility contexts for 3.3+ on Macs.