DrDerekDoctors
« on: June 05, 2017, 07:22:46 AM »
Hullo! I'm currently writing my game and using old-style OpenGL (i.e. no shaders, just the old commands) - is there any real risk in doing this from a porting perspective or future-proofing perspective? Say I wanted to port to a console (hahahahahahaha - unlikely, I know), if I was using manky old OpenGL 1.3 texture combiners and stuff like that, would I have to rip all that out and learn shader language anyhoo?
Me, David Williamson and Mark Foster do an Indie Games podcast. Give it a listen. And then I'll send you an apology. http://pigignorant.com/
JWki
« Reply #1 on: June 05, 2017, 08:22:45 AM »
> Hullo! I'm currently writing my game and using old-style OpenGL (i.e. no shaders, just the old commands) - is there any real risk in doing this from a porting perspective or future-proofing perspective? Say I wanted to port to a console (hahahahahahaha - unlikely, I know), if I was using manky old OpenGL 1.3 texture combiners and stuff like that, would I have to rip all that out and learn shader language anyhoo?
Yes, you would. In reality, graphics without shaders don't exist anymore - the deprecated OpenGL APIs you're using only exist for backwards compatibility and are implemented on top of shaders behind the scenes. On consoles you usually use entirely different, device-specific graphics APIs that nobody can really talk about too much because of NDAs. OpenGL is kinda supported on some consoles, but usually the "native" APIs are used.
ferreiradaselva
« Reply #2 on: June 05, 2017, 04:20:23 PM »
Go with OpenGL 3.3 and you will even support computers from some years ago. But nowadays even average computers/laptops (just with an embedded graphics card, like Intel HD, in my case) support OpenGL 4.0. I always had the same issue you are having, wanting to support older computers, but knowing that even the average computer today supports OpenGL 4.0, I decided to just go with something a bit below it.
> On consoles you usually use entirely different, device-specific graphics APIs that nobody can really talk about too much because of NDAs. OpenGL is kinda supported on some consoles but usually the "native" APIs are used.
I thought the rule was DirectX for Xbox and OpenGL for the others, since they are usually Unix-based. But I never developed for consoles.
DrDerekDoctors
« Reply #3 on: June 06, 2017, 12:26:57 AM »
Reet, ta' all. Well, I guess I'm learning shaders. *places gun in mouth*
JWki
« Reply #4 on: June 06, 2017, 12:41:07 AM »
> Reet, ta' all. Well, I guess I'm learning shaders. *places gun in mouth*
Shaders are great! And they're not very complicated, so don't worry about that.
Polly
Level 6
« Reply #5 on: June 06, 2017, 03:12:04 AM »
I'd stick with what you know & have. Re-doing code for a hypothetical chance of a future port ( for which you might not even want to use OpenGL at all ) just seems silly.
DrDerekDoctors
« Reply #6 on: June 06, 2017, 03:28:36 AM »
> Shaders are great! And they're not very complicated, so don't worry about that.
Dammit, I'd already blown my brains out!
DrDerekDoctors
« Reply #7 on: June 06, 2017, 03:29:50 AM »
> I'd stick with what you know & have. Re-doing code for a hypothetical chance of a future port ( for which you might not even want to use OpenGL at all ) just seems silly.
But... the joy of refactoring! Also, achieving the effect I want (basically drawing a sprite but using both its own mask and another dither mask) means using some really shonky-feeling texture-combiner stuff, so there'd be a buncha work there anyhoo to allow my system to use multiple textures.
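If it helps, that two-mask combine shrinks to a few lines once it's a fragment shader. A rough, untested sketch (GLSL 1.10, so it runs on OpenGL 2.0; the uniform names u_sprite, u_dither, u_coverage, and u_dither_scale are made up for illustration):

```glsl
#version 110
uniform sampler2D u_sprite;       // sprite texture, carrying its own alpha mask
uniform sampler2D u_dither;       // small tiling dither pattern, e.g. a 4x4 Bayer matrix
uniform float     u_coverage;     // how much of the sprite survives the dither, 0..1
uniform vec2      u_dither_scale; // 1.0 / dither texture size, to tile it in screen space

void main()
{
    vec4  sprite    = texture2D(u_sprite, gl_TexCoord[0].st);
    float threshold = texture2D(u_dither, gl_FragCoord.xy * u_dither_scale).r;

    // drop the pixel if either the sprite's own mask rejects it,
    // or the requested coverage doesn't beat the dither threshold here
    if (sprite.a < 0.5 || u_coverage <= threshold)
        discard;

    gl_FragColor = vec4(sprite.rgb, 1.0);
}
```

The fixed-function route would need two texture units plus a GL_COMBINE environment setup to get anything close; in a shader the whole effect is one comparison.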
ThemsAllTook
« Reply #8 on: June 06, 2017, 11:56:34 AM »
Shaders really are fantastic. You don't necessarily have to go all out and jump straight to the newest version to use them - OpenGL 2.0 is still pretty great. You can even mix and match if you only need a shader for one thing.
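For anyone curious what the jump actually involves: here's a minimal GLSL 1.10 pair (the version OpenGL 2.0 shipped with) that just reproduces plain fixed-function textured, vertex-coloured drawing. A sketch only, not tested against a real context; u_texture is a made-up uniform name:

```glsl
// vertex shader
#version 110
void main()
{
    gl_TexCoord[0] = gl_MultiTexCoord0;                       // pass the texcoord through
    gl_FrontColor  = gl_Color;                                // pass the vertex colour through
    gl_Position    = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// fragment shader
#version 110
uniform sampler2D u_texture;
void main()
{
    // modulate texture by vertex colour, like GL_MODULATE did
    gl_FragColor = texture2D(u_texture, gl_TexCoord[0].st) * gl_Color;
}
```

All the old matrix and vertex state still feeds in through the gl_* built-ins, so this slots into existing GL 1.x-style drawing code without touching the vertex submission side.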
Whiteclaws
« Reply #9 on: June 06, 2017, 12:55:57 PM »
Just copy-paste shaders from tutorials; it works 100% of the time, 20% of the time.
InfiniteStateMachine
« Reply #10 on: June 06, 2017, 04:52:31 PM »
Yar, shaders aren't so bad these days. You can even get (good) compile-time errors now. I was surprised to see that when I was building a MonoGame project with a post-process effect and it was able to tell me exactly where my issue was.
> But... the joy of refactoring! Also, achieving the effect I want (basically drawing a sprite but using both its own mask and another dither mask) means using some really shonky-feeling texture-combiner stuff, so there'd be a buncha work there anyhoo to allow my system to use multiple textures.
I'm not sure I completely understand what you're trying to do, but you managed to achieve this with OpenGL 1? It shouldn't be too hard to convert to a shader in that case. We can always help.
EDIT: OpenGL 1 on the desktop isn't at risk, right? I imagine it will be here for a long time. At the very least I would expect it to be around on Windows (if there's one thing Microsoft does well, it's support for extremely old APIs).
Polly
Level 6
« Reply #11 on: June 07, 2017, 02:40:03 AM »
> OpenGL1 on the desktop isn't at risk right?
I doubt they are willing to kill off Minecraft anytime soon.
powly
« Reply #12 on: June 07, 2017, 07:09:01 AM »
> OpenGL1 on the desktop isn't at risk right?
Well, all the drivers already have a GL1-to-modern path, so it's likely relatively simple to keep support now and forever. As said, porting won't happen completely trivially, but I think you'd want to succeed on one platform first and then consider it (and if you only use GL1 features, the reimplementation should be quite a breeze). I do suggest checking out GL4+ though; no sense in not using the actually neat stuff GPUs can do.
J-Snake
« Reply #13 on: June 08, 2017, 07:10:57 AM »
If you worry about backwards compatibility, I recommend going with OpenGL 2.1 (similar feature set to DX9) rather than 3.2+, as there are still plenty of notebooks with integrated Intel HD 3000 graphics around, and most of them don't support "modern OpenGL". Not sure what the situation looks like on Macs as of now, though; I would guess OpenGL 2.1 software should still be able to run.
DrDerekDoctors
« Reply #14 on: June 08, 2017, 11:04:00 PM »
Cheers for all your answers!
oahda
« Reply #15 on: June 13, 2017, 01:33:24 PM »
> Not sure what the situation looks like on Macs as of now, though; I would guess OpenGL 2.1 software should still be able to run.
Yeah, you asked me to test that some time ago and I got it working.
Raptor85
« Reply #16 on: June 25, 2017, 03:56:32 AM »
Just FYI, if you're just starting to learn now, you might want to skip OpenGL entirely: instead of further upgrading the old OpenGL API, the Khronos Group is pushing forward with Vulkan from here on out on all platforms (Android/Linux/Windows/Mac), and there's no more OpenGL vs. OpenGL ES split; it's one unified API now. All current NVIDIA/ATI/Intel/etc. drivers support Vulkan as far back as the Radeon 69xx and NVIDIA 600+ cards (about 2010).
It just makes more sense, if starting fresh from GL 1.1, to skip 3.3 and go straight to the new API, which also has official C++ bindings and is IMHO far easier to learn as well. The shading language is the same, though you now precompile shaders to SPIR-V instead of the video driver compiling them (similar to how HLSL works for DX).
-Fuzzy Spider
Polly
Level 6
« Reply #17 on: June 25, 2017, 04:43:18 AM »
> Just fyi if you're just starting to learn now, you might want to skip opengl entirely [...]
I suspect this is a joke / troll post ( or worse ) ... but Windows support for Vulkan is only available for 6th generation ( and up ) Intel cores, which were first released in September 2015. Might be a bit soon to cut off people with, for instance, Broadwell systems, the last of which were released in 2016.
ferreiradaselva
« Reply #18 on: June 25, 2017, 09:20:43 AM »
> I suspect this is a joke / troll post ( or worse ) ... but Windows support for Vulkan is only available for 6th generation ( and up ) Intel cores [...]
Yeah. And Vulkan isn't an OpenGL replacement. And OpenGL and OpenGL ES aren't unified.
J-Snake
« Reply #19 on: June 25, 2017, 11:21:14 AM »
> Just fyi if you're just starting to learn now, you might want to skip opengl entirely [...]
I wouldn't be so sure about Vulkan. Give it 10 more years and we will have a better picture of its place. What is sure, though, is that OpenGL software has to keep being supported in order to prevent the death of huge quantities of existing OpenGL software, so OpenGL has to be a safe bet at this point in time. Plus there is still a lot of pre-2010 hardware out there. So unless your game concept requires pushing the absolute latest hardware to its limits (which would be a very risky undertaking anyway), suggesting Vulkan is practically suicide.