Author Topic: OpenGL: BGRA on Mac, RGBA on PC???  (Read 3719 times)
SheridanR
« on: March 21, 2015, 11:58:38 AM »

Hi all,

Recently my programmer and I have been doing some last-minute porting work on our game (going to Mac) and we've run into a bit of a roadblock. On Linux and Windows the game works great; we use SDL 1.2 in conjunction with OpenGL to render our graphics. On Mac, however, we've hit a strange glitch: the color channels for the entire window are swapped from the usual RGBA to BGRA.

As an example, here's a picture of the game on a Mac with messed-up colors: http://i.imgur.com/1xZrFk0.png

And here's a picture of the game with normal colors: http://i.imgur.com/OYuo52t.png

We have no idea why the game is doing this. At first we tried changing our glTexImage2D calls to use the different channel arrangement, so:
Code:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image->w, image->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, image->pixels);

Would become something like
Code:
glTexImage2D(GL_TEXTURE_2D, 0, GL_BGRA, image->w, image->h, 0, GL_BGRA, GL_UNSIGNED_BYTE, image->pixels);

But this didn't completely solve the problem. Most graphics came out the right colors, except for some hardcoded GUI elements, which switched from a reddish-brown to a mossy green.

To draw those elements we feed glColor directly with a 32-bit unsigned int from which 8-bit color channels are extracted, i.e.:
Code:
glColor3ub( (Uint8)(color>>16), (Uint8)(color>>8), (Uint8)(color) ); /* color packed as 0x00RRGGBB */

We don't know why these colors would still be wrong, since according to the official OpenGL docs the first argument is always red, the second green, and the third blue.

Can anyone with more experience in OpenGL, SDL, or Mac development help us with this?

happymonster
« Reply #1 on: March 21, 2015, 12:41:42 PM »

On the Mac, the surfaces aren't being loaded with a different RGBA ordering, are they?
surt
« Reply #2 on: March 21, 2015, 12:48:25 PM »

If the component byte ordering reads reversed for image->pixels, then maybe it's doing the same for color? Are you reversing the byte order for glTexImage2D but not for glColor3ub?
Code:
glColor3ub( (Uint8)(color>>8), (Uint8)(color>>16), (Uint8)(color>>24) );

SheridanR
« Reply #3 on: March 21, 2015, 01:26:45 PM »

Quote from: happymonster
On the Mac, the surfaces aren't being loaded with a different RGBA ordering, are they?
I have no idea; we're using SDL_image to load 32-bit .png files, and the function that loads the images (and therefore sets up the surfaces) doesn't take an argument to specify how the channels are ordered.
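
If it comes to that, maybe we could force every loaded surface into a known channel order right after loading. A rough, untested sketch (here 'image' is just whatever IMG_Load returned):
Code:
/* sketch: normalize a loaded surface to RGBA byte order (SDL 1.2) */
#include <SDL_endian.h>

#if SDL_BYTEORDER == SDL_BIG_ENDIAN
Uint32 rmask = 0xFF000000, gmask = 0x00FF0000,
       bmask = 0x0000FF00, amask = 0x000000FF;
#else
Uint32 rmask = 0x000000FF, gmask = 0x0000FF00,
       bmask = 0x00FF0000, amask = 0xFF000000;
#endif

/* create a 1x1 template surface whose format has the masks we want,
   then let SDL convert the loaded image into that format */
SDL_Surface *tmpl = SDL_CreateRGBSurface(SDL_SWSURFACE, 1, 1, 32,
                                         rmask, gmask, bmask, amask);
SDL_Surface *converted = SDL_ConvertSurface(image, tmpl->format, SDL_SWSURFACE);
SDL_FreeSurface(tmpl);
/* 'converted' should now be safe to upload with GL_RGBA / GL_UNSIGNED_BYTE */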

Quote from: surt
If the component byte ordering reads reversed for image->pixels, then maybe it's doing the same for color? Are you reversing the byte order for glTexImage2D but not for glColor3ub?
Code:
glColor3ub( (Uint8)(color>>8), (Uint8)(color>>16), (Uint8)(color>>24) );
It's definitely reading reversed for color too. We could just go through all the code and do something like this:
Code:
#ifdef __APPLE__
glColor3ub( (Uint8)(color>>8), (Uint8)(color>>16), (Uint8)(color>>24) );
#else
glColor3ub( (Uint8)(color>>16), (Uint8)(color>>8), (Uint8)(color) );
#endif
But I'd like to figure out why the channels are being switched around in the first place, and whether I can control it. After all, I'm worried that somebody will play the game on a strange setup we didn't test and get weird colors just because we didn't understand something simple about the API.

surt
« Reply #4 on: March 21, 2015, 01:38:59 PM »

All Macs are Intel now, aren't they? So endianness shouldn't be an issue.

If somehow it is related to endianness, then there is SDL_BYTEORDER:
Code:
#include <SDL_endian.h>

if (SDL_BYTEORDER == SDL_LIL_ENDIAN) {
  glColor3ub( (Uint8)(color>>8), (Uint8)(color>>16), (Uint8)(color>>24) );
}
else { // SDL_BYTEORDER == SDL_BIG_ENDIAN
  glColor3ub( (Uint8)(color>>16), (Uint8)(color>>8), (Uint8)(color) );
}

SheridanR
« Reply #5 on: March 21, 2015, 02:02:58 PM »

Good call; I just integrated that into the code. It didn't fix the issue, however, so I don't think endianness is the problem.

To be honest, I'm beginning to think we should just throw preprocessor guards around the calls and hope for the best. I still don't know why the issue exists in the first place, though. As I said, we've never had this problem on Windows or Linux...


happymonster
« Reply #6 on: March 21, 2015, 02:13:51 PM »

You could add this to your log file output and see if there is a difference between the Windows and Mac versions:

Code:
SDL_DisplayMode mode;
SDL_GetCurrentDisplayMode(0, &mode);  /* SDL2 */
printf("%s\n", SDL_GetPixelFormatName(mode.format));
oahda
« Reply #7 on: March 21, 2015, 02:28:35 PM »

If it really has to do with the image loading, perhaps you could try what I switched to from SDL_image: Tom Dalling's bitmap class, so long as its license works out for you. There's a zip with source code for his tutorial. It has loads of code to handle colour profiles under its hood.

SheridanR
« Reply #8 on: March 21, 2015, 03:03:55 PM »

I don't think the image loading is really the core of the problem, since the glitch also affects graphics that are drawn with nothing but glColor.

Quote from: happymonster
You could add this to your log file output and see if there is a difference between the Windows and Mac versions:

Code:
SDL_DisplayMode mode;
SDL_GetCurrentDisplayMode(0, &mode);  /* SDL2 */
printf("%s\n", SDL_GetPixelFormatName(mode.format));
Good idea, will try that and report back soon.

SheridanR
« Reply #9 on: March 21, 2015, 09:30:01 PM »

As it turns out, SDL_GetPixelFormatName is an SDL2-only function, and there is no analogue in SDL 1.2. However, I did find something quite useful while digging around this evening. The SDL_PixelFormat type has some useful fields for bitshifting called Rshift, Gshift, Bshift, and Ashift, which are exactly what they sound like. So rather than hardcoding shift amounts to get 8-bit channel components from our 32-bit colors, we're using these fields to do the shifting instead, and so far our changes seem promising.
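
Roughly, the glColor calls now look something like this (just a sketch; 'screen' stands in for our main SDL_Surface, whatever it's actually called in the code):
Code:
/* sketch: extract channels using the surface's own shift values
   instead of hardcoded shift amounts */
SDL_PixelFormat *fmt = screen->format;
glColor3ub( (Uint8)(color >> fmt->Rshift),
            (Uint8)(color >> fmt->Gshift),
            (Uint8)(color >> fmt->Bshift) );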

This doesn't answer why the format differs in the first place, but I think it's half the battle in coming up with a portable solution.

jgrams
« Reply #10 on: March 25, 2015, 04:30:11 AM »

So you have determined (from looking at the shift fields in the SDL_PixelFormat) that SDL_image is loading the same PNG into memory in BGRA order on the Mac but RGBA order on the PC? That is very strange. Although... it looks like SDL_image 1.2 can be compiled with a native ImageIO backend on the Mac rather than using libpng as on the other platforms... I wonder if that's what's going on?

I also wonder what would happen if you left your textures' internalformat (in GPU memory) as GL_RGBA and only switched the format (of the data in CPU memory) to GL_BGRA:

Code:
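/* internalformat (3rd argument) stays GL_RGBA on the GPU side;
   only the 7th argument describes the byte order of the data passed in */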
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image->w, image->h, 0, GL_BGRA, GL_UNSIGNED_BYTE, image->pixels);
oahda
« Reply #11 on: March 25, 2015, 05:47:22 AM »

I just got BGRA text out of SDL_ttf (for SDL2), with no file loaded, on a Mac. I can't check whether it's any different on Windows, though, since I don't have it.

why is bgra even a thing
« Last Edit: March 25, 2015, 07:11:09 AM by Prinsessa »

Cheesegrater
« Reply #12 on: March 25, 2015, 09:20:33 AM »

Quote from: oahda
why is bgra even a thing

Because ARGB was common on older big-endian graphics workstations (for example, SGI's IRIX machines, or PowerPC-based Macs). A packed 0xAARRGGBB word stored on a big-endian machine sits in memory as A, R, G, B; store the same word on a little-endian CPU and the bytes land as B, G, R, A. So when you load that data on little-endian CPUs, you get BGRA, and on some platforms it stuck as the standard.
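
To make that concrete, a tiny standalone example (made-up byte values):
Code:
#include <stdio.h>
#include <string.h>
#include <stdint.h>

int main(void) {
    /* an ARGB pixel as a packed 32-bit word: A in the high byte */
    uint32_t argb = 0xAA112233;  /* A=0xAA, R=0x11, G=0x22, B=0x33 */
    uint8_t bytes[4];
    memcpy(bytes, &argb, sizeof(bytes));
    /* on a little-endian CPU this prints "33 22 11 aa":
       the very same word sits in memory as B, G, R, A */
    printf("%02x %02x %02x %02x\n", bytes[0], bytes[1], bytes[2], bytes[3]);
    return 0;
}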