c++ - OpenGL - Why are pixel values in ABGR and how to use them in RGBA
I have code that loads textures. I'm using DevIL to load the images, and then OpenGL creates a texture from the pixels. This code works fine and the texture shows up properly.

Besides that, I can also make an array within my program to create a texture, or change a texture's pixels directly. My problem is here: when handling the pixels, their format seems to be ABGR rather than the RGBA I would have liked.
I stumbled upon this SO question, which refers to the format that's passed to the glTexImage2D function:
(...) if you have GL_RGBA and GL_UNSIGNED_INT_8_8_8_8, that means that the pixels are stored in 32-bit integers, and the colors are in the logical order RGBA within such an integer, e.g. red is in the high-order byte and alpha is in the low-order byte. If the machine is little-endian (as Intel CPUs are), it follows that the actual order in memory is ABGR. Whereas GL_RGBA with GL_UNSIGNED_BYTE will store the bytes in RGBA order regardless of whether the computer is little-endian or big-endian. (...)
I do indeed have an Intel CPU. The images are loaded fine the way things currently are, and I use GL_RGBA as the format and GL_UNSIGNED_BYTE as the type.
```cpp
GLuint makeTexture( const GLuint* pixels, GLuint width, GLuint height )
{
    GLuint texture = 0;
    glGenTextures( 1, &texture );
    glBindTexture( GL_TEXTURE_2D, texture );

    glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                  GL_RGBA, GL_UNSIGNED_BYTE, pixels );

    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );

    glBindTexture( GL_TEXTURE_2D, 0 );

    GLenum error = glGetError();
    if ( error != GL_NO_ERROR )
    {
        return 0;
    }
    return texture;
}
```
This function is used by my two texture-loading methods: one loads an image file and the other creates a texture from an array.
Let's say I want to create an array of pixels and make a texture out of it:
```cpp
GLuint pixels[ 128 * 128 ];
for ( int i = 0; i < 128 * 128; ++i )
{
    pixels[ i ] = 0x800000FF;
}
texture.loadImageArray( pixels, 128, 128 );
```
By filling the pixels with this value I expected to see a dark red color:

red = 0x80, green = 0x00, blue = 0x00, alpha = 0xFF

But instead I get a transparent red:

alpha = 0x80, blue = 0x00, green = 0x00, red = 0xFF
Rather than using raw unsigned ints, I made a structure to help me handle the individual channels:
```cpp
struct Color4C
{
    unsigned char alpha;
    unsigned char blue;
    unsigned char green;
    unsigned char red;
    ...
};
```
I can replace the array of unsigned ints with an array of Color4C and the result is the same. If I invert the order of the channels (red first, alpha last), I can pass 0xRRGGBBAA and it works.
The easy solution is to simply handle these values in ABGR format. But I find it easier to work with RGBA values. If I want to use hardcoded color values, I'd prefer to write them as 0xRRGGBBAA, not 0xAABBGGRR.
But let's say I start using the ABGR format. If I run my code on another machine, will I see strange colors wherever I changed pixels/channels directly? Is there a better solution?