Author Topic: How much should I worry about deprecated OpenGL API versions?  (Read 783 times)
Orz
« on: March 21, 2013, 08:42:17 AM »

I have heard bits and pieces of advice about deprecated OpenGL functions.  As I understand it, OpenGL at one time supported only fixed-function rendering, and then, some time around version 2.0, started to migrate to programmable shaders.  It was around this point that they added a whole bunch of stuff to the API, like having to create vertex buffer objects instead of just calling draw functions.

I'd like to use the old (OpenGL 1.x) API because it's more intuitive to me, it's supported on netbooks and older laptops, and my game doesn't need programmable shaders.  Will this prevent my game from running on newer video cards and drivers?  Or is GL 1.x going to become one of those archaic features like the <blink> tag that are supported long after they've fallen out of fashion?
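To make the contrast concrete, here is a small sketch (mine, not from the thread) of the same triangle submitted the GL 1.x way and the buffer-object way. The GL calls need a live context and post-1.1 headers (e.g. via GLEW), so they are guarded behind a hypothetical HAVE_GL_CONTEXT define; only the vertex data is plain C.

```c
/* Sketch: one triangle, two submission styles. Assumes a live context
 * and modern headers (e.g. GLEW) when HAVE_GL_CONTEXT is defined. */

/* Three 2D vertices, shared by both paths. */
static const float tri[6] = {
    -0.5f, -0.5f,
     0.5f, -0.5f,
     0.0f,  0.5f,
};
enum { TRI_VERTS = 3 };

#ifdef HAVE_GL_CONTEXT
#include <GL/glew.h>

/* GL 1.x immediate mode: one function call per vertex, every frame. */
static void draw_immediate(void) {
    glBegin(GL_TRIANGLES);
    glVertex2f(tri[0], tri[1]);
    glVertex2f(tri[2], tri[3]);
    glVertex2f(tri[4], tri[5]);
    glEnd();
}

/* Buffer-object style (GL 1.5+): upload the data once, then issue a
 * single draw call per frame. */
static void draw_buffered(void) {
    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof tri, tri, GL_STATIC_DRAW);
    glEnableVertexAttribArray(0);                         /* GL 2.0+ */
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, (const void *)0);
    glDrawArrays(GL_TRIANGLES, 0, TRI_VERTS);
}
#endif
```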
Polly
« Reply #1 on: March 21, 2013, 09:06:15 AM »

Quote from: Orz
I'd like to use the old (OpenGL 1.x) API because it's more intuitive to me, it's supported on netbooks and older laptops, and my game doesn't need programmable shaders.  Will this prevent my game from running on newer video cards and drivers?

Nope... and discontinuation of support for older OpenGL versions isn't even on the horizon yet. So going with 1.x because you prefer the API + better compatibility + it's-all-you-need is perfectly fine.

Side note: the benefit of using, for example, 3.3 is that you can port to mobile (iOS/Android) very easily. Going from, for example, 1.4 to ES 1.1 takes a little bit of work.
ThemsAllTook (Alex Diener), Moderator
« Reply #2 on: March 21, 2013, 09:10:54 AM »

I think you'll be in good shape for a while yet. As far as I know, everywhere desktop GL is available, GL 1.x stuff is still in it. There might be some features that aren't hardware accelerated anymore (the selection buffer being one I discovered recently), but pretty much everything should still work in some form.

OpenGL ES is a different matter. Quite a lot of features of desktop GL are unavailable in GL ES1, and a lot of features from ES1 are gone from ES2. I don't know if there's any chance ES will eventually replace desktop GL, but I wouldn't rule it out as a possibility.

That said, programmable pipelines are amazing, and can benefit you even if you don't need any features beyond what fixed function provides. They're well worth learning, and once you demystify them a bit and get comfortable with their way of doing things, shaders are a huge amount of fun to write. I'd recommend this book.
Orz
« Reply #3 on: March 21, 2013, 11:58:23 AM »

Quote from: ThemsAllTook
shaders are a huge amount of fun to write. I'd recommend this book.

Coming from you, I'll believe it!  Thanks for the info and I'll check it out.
JakobProgsch
« Reply #4 on: March 21, 2013, 02:54:26 PM »

I would personally always prefer the programmable pipeline. It may look more complicated at first, but it is actually way more transparent... There are shaders, buffers, and draw calls that you need to know about. Way less of the "oh, you forgot to call glOccultRitual(GL_SEEMINGLY_TOTALLY_UNRELATED_TO_WHAT_I_WANTED), of course it doesn't work" that you constantly get with the black box that is fixed function.

Basically, if you learn to do OGL 3+ properly, it's easy to also write 2.0 in a very similar style (if you need that compatibility), as well as ES 2. Learning 1.1-1.5 at this point is, IMHO, mostly setting yourself up to "unlearn" a lot of stuff later on.
Orz
« Reply #5 on: March 23, 2013, 01:39:35 PM »

Yeah, ordinarily I'm kind of a control freak, but I want to be able to play it on a GMA450, so the shaders are out of my hands for now.  FWIW I'm trying to keep things as standardized as possible by using, for example, GLM for vector math (more on that can of worms later).
kamac
« Reply #6 on: March 23, 2013, 02:05:59 PM »

Quote from: Orz
Yeah, ordinarily I'm kind of a control freak, but I want to be able to play it on a GMA450, so the shaders are out of my hands for now.  FWIW I'm trying to keep things as standardized as possible by using, for example, GLM for vector math (more on that can of worms later).

First learn OGL 3.0/2.1, then make an abstraction layer for 1.0. This way you don't have to 'unlearn' things. I'd do it this way.
(Or do you only have access to a GMA450?)

Good to know!
OpenGL 2.x is roughly equivalent to DirectX 9.
OpenGL 3.x is roughly equivalent to DirectX 10 (requires the same hardware).
OpenGL 4.x is roughly equivalent to DirectX 11.
Orz
« Reply #7 on: March 23, 2013, 02:26:08 PM »


Quote from: kamac
First learn OGL 3.0/2.1, then make an abstraction layer for 1.0. This way you don't have to 'unlearn' things.


That sounds like a great idea - any examples?  Basically, I want the game to run if someone starts it on an old/cheap computer.  If it checks for programmable shaders, finds none, and falls back to an older rendering pipeline, that's fine too.  There are still lots of laptops out there with Intel integrated video that doesn't support shaders.
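One way to implement that check, sketched in plain C: parse the version string and extension list that glGetString(GL_VERSION) and glGetString(GL_EXTENSIONS) would return at runtime, and pick a pipeline accordingly. GL_ARB_shader_objects is the real pre-2.0 shader extension; the helper function names here are my own.

```c
/* Sketch: deciding at startup whether the shader path is usable.
 * The strings would come from glGetString() on real hardware. */

#include <stdio.h>
#include <string.h>

/* Parse the leading "major.minor" out of a GL version string
 * (e.g. "1.4 Mesa 8.0" or "2.1 NVIDIA-8.24"). */
static int gl_version_at_least(const char *version, int major, int minor) {
    int vmaj = 0, vmin = 0;
    if (version == NULL || sscanf(version, "%d.%d", &vmaj, &vmin) != 2)
        return 0;
    return vmaj > major || (vmaj == major && vmin >= minor);
}

/* Shaders are usable if GL >= 2.0, or via the older ARB extension.
 * (strstr is a substring match; fine here because no other common
 * extension name contains "GL_ARB_shader_objects".) */
static int shaders_available(const char *version, const char *extensions) {
    if (gl_version_at_least(version, 2, 0))
        return 1;
    return extensions != NULL
        && strstr(extensions, "GL_ARB_shader_objects") != NULL;
}
```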
powly
« Reply #8 on: March 23, 2013, 03:31:54 PM »

No, it doesn't. You're already working with slow hardware; there's no reason to jump through hoops to make it slower. And 2.1 is pretty much a horrible mess where you can mix everything together without it making much sense. Just keep in mind that (core) OGL 3.0 onwards (the stuff you can safely assume any decent desktop has) and 1.4 (what most netbooks support) are completely different beasts, and use them for separate projects as needed. And even with 1.4, please use vertex arrays instead of immediate mode. Please.
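For reference, a sketch of the vertex-array path being recommended, which has been available since GL 1.1: the geometry lives in one client-side array and is drawn with a single call, instead of a glVertex* call per vertex. As before, the GL calls need a live context, so they sit behind a hypothetical HAVE_GL_CONTEXT define.

```c
/* Sketch: a quad drawn via client-side vertex arrays (GL 1.1+),
 * the middle ground between immediate mode and buffer objects. */

static const float quad[8] = {   /* 4 vertices, 2 floats each */
    0.0f, 0.0f,
    1.0f, 0.0f,
    1.0f, 1.0f,
    0.0f, 1.0f,
};
enum { QUAD_VERTS = 4 };

#ifdef HAVE_GL_CONTEXT
#include <GL/gl.h>

static void draw_quad(void) {
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, quad);   /* 2 floats per vertex */
    glDrawArrays(GL_QUADS, 0, QUAD_VERTS);   /* one call for all 4 */
    glDisableClientState(GL_VERTEX_ARRAY);
}
#endif
```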
kamac
« Reply #9 on: March 24, 2013, 10:50:50 PM »

Quote from: powly
No, it doesn't.

Well, other than adding an abstraction layer, I don't see any way to make it work on older devices while still not losing quality/speed for everyone else.

Quote from: powly
And 2.1 is pretty much a horrible mess where you can use everything together without making too much sense

Still, that doesn't mean you have to, and using some 3.0 extensions won't work for somebody who's running OGL 2.1 on their device. But I guess going with 3.0, with 1.4 behind an abstraction layer, is a good idea.

Although, that might not be the best approach if you're just starting with OGL and mainly want to learn it. In that case, everybody here (at least the overwhelming majority) would recommend starting off with the core version.

An abstraction layer is simply making (for example) two libraries; then, from the main application, you decide which one to pick and use it.
Here's what I've found on that topic:

http://stackoverflow.com/questions/3697001/opengl-directx-abstraction-layer
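The pick-a-library-at-startup idea can be sketched in C with a small table of function pointers. The Renderer interface, backend names, and the draw_sprite operation below are invented for illustration; real backends would wrap GL 3.x and GL 1.4 calls respectively.

```c
/* Sketch: the game only ever calls through a small Renderer struct,
 * and one backend is picked once at startup. */

#include <stdio.h>

typedef struct Renderer {
    const char *name;
    void (*draw_sprite)(float x, float y);
} Renderer;

static void gl3_draw_sprite(float x, float y) {
    /* shader-based draw calls would go here */
    printf("[gl3] sprite at %.1f,%.1f\n", x, y);
}

static void gl14_draw_sprite(float x, float y) {
    /* fixed-function fallback calls would go here */
    printf("[gl14] sprite at %.1f,%.1f\n", x, y);
}

static const Renderer gl3_backend  = { "gl3",  gl3_draw_sprite  };
static const Renderer gl14_backend = { "gl14", gl14_draw_sprite };

/* Chosen once at startup, e.g. from the result of a capability check. */
static const Renderer *pick_backend(int has_shaders) {
    return has_shaders ? &gl3_backend : &gl14_backend;
}
```

The rest of the game holds only a `const Renderer *` and never branches on the GL version again.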
JakobProgsch
« Reply #10 on: March 25, 2013, 02:52:50 AM »

You can code in a "modern style" in 2.1, though (the VAO extension is usually available). Just consistently avoid built-in shader variables as far as possible, use generic attributes, etc.

Also, even my crummy netbook supports 2.1 (well, technically 1.4 with lots of extensions; it's lacking full multisample support for proper 2.1). By this point, most people who are really stuck with 1.4 probably don't have proper drivers installed, rather than incapable hardware.
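A minimal GLSL 1.20 (GL 2.1) illustration of the built-ins-vs-generic-attributes point; the attribute and uniform names below are arbitrary, not from the thread.

```glsl
// Old style -- relies on gl_Vertex and the fixed-function matrix stack,
// neither of which exists in GL 3+ core or ES 2:
//
//   void main() {
//       gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
//   }

// Generic style -- attributes and uniforms are supplied by your own
// code, so the shader ports to 3.x/ES with little more than keyword
// changes (attribute -> in, etc.):
attribute vec3 position;  // fed via glVertexAttribPointer
uniform mat4 mvp;         // uploaded with glUniformMatrix4fv

void main() {
    gl_Position = mvp * vec4(position, 1.0);
}
```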
nikki
« Reply #11 on: March 25, 2013, 06:47:32 AM »

my IBM/Lenovo ThinkPad (from 2007) only does OGL 1.4.
That's not a driver issue; it's just an Intel GMA 950.

It's representative of old/crappy machines like that, and I feel neglected whenever some (non-AAA) game doesn't run on it (especially without telling me which shader or whatnot I should have).

It's sad because I really like the feel of that laptop.
Luckily I program more than I play games.
kamac
« Reply #12 on: March 25, 2013, 07:08:26 AM »

I guess the best idea is to go with DX9 as your abstraction layer (instead of OGL 1.4). Then you have both OGL 3.0 and DX9: the first for Linux and Mac, the second for Windows (it's no secret that DX9 usually works better than OGL on Windows).

That requires more work, though.
Gregg Williams
« Reply #13 on: March 25, 2013, 08:42:30 AM »

Realize that only the latest versions of Mac OS X (Lion / Mountain Lion) support OGL 3.0.
kamac
« Reply #14 on: March 25, 2013, 08:52:44 AM »

Quote from: Gregg Williams
Realize that only the latest versions of Mac OS X (Lion / Mountain Lion) support OGL 3.0.

Didn't know that. Might be a good idea to go for 2.1 if it's not an advanced 3D game, then.