Dear All,
I'm trying to load an HDR cube map. For loading the Radiance/HDR file I'm using VampyreImagingOpenGL from trunk (the rest of the package is the latest release, because I couldn't get trunk working for me, and the release version of VampyreImagingOpenGL converts HDR images to 8-bit before loading).
I want to load it into a cube map, but it does not seem to work (I get a black image).
Here is the relevant "pseudo" code:
//For each file...
PX:= LoadGLTextureFromFile( path + 'PX.hdr' );
//...
glGenTextures( 1, @GLTexture );
glBindTexture( GL_TEXTURE_CUBE_MAP, GLTexture );
//For each side
glBindTexture( GL_TEXTURE_CUBE_MAP_POSITIVE_X, PX );
//...
glBindTexture( GL_TEXTURE_CUBE_MAP, 0 );
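For comparison, the usual way to fill a cube map in OpenGL 2.1 is to upload each image's pixel data directly to the face targets of a single cube-map texture with glTexImage2D, rather than binding separate texture objects to the face targets. A rough sketch of that approach (untested; the Vampyre conversion format, the channel order, and GL_RGBA32F_ARB from ARB_texture_float are assumptions on my part):

```pascal
var
  Img: TImageData;
  GLTexture: GLuint;
begin
  glGenTextures( 1, @GLTexture );
  glBindTexture( GL_TEXTURE_CUBE_MAP, GLTexture );

  // For each side: load the image and upload it to the matching face target.
  InitImage( Img );
  if LoadImageFromFile( path + 'PX.hdr', Img ) then
  begin
    // Assumed float format; the channel order may need adjusting.
    ConvertImage( Img, ifA32B32G32R32F );
    glTexImage2D( GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0, GL_RGBA32F_ARB,
      Img.Width, Img.Height, 0, GL_RGBA, GL_FLOAT, Img.Bits );
    FreeImage( Img );
  end;
  // ...repeat for GL_TEXTURE_CUBE_MAP_NEGATIVE_X and the other four faces...

  glTexParameteri( GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
  glTexParameteri( GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
end;
```

This sidesteps the target-binding problem entirely, since only the GL_TEXTURE_CUBE_MAP target is ever bound.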
The OpenGL documentation says:
"When a texture is first bound, it assumes the specified target: A texture first bound to GL_TEXTURE_1D becomes one-dimensional texture, ..."
(https://www.khronos.org/registry/OpenGL-Refpages/gl4/html/glBindTexture.xhtml)
So my guess is that ImagingOpenGL initializes the texture as GL_TEXTURE_2D.
Is there a simple way to copy/convert a texture to another? I'm of course also open to other solutions. I'm trying to target OpenGL 2.1. Maybe glCopyTexImage2D is the solution but I'm not sure on how to use it. What I tried is this:
//For each file...
PX:= LoadGLTextureFromFile( path + 'PX.hdr' );
//...
glGenTextures( 1, @GLTexture );
glBindTexture( GL_TEXTURE_CUBE_MAP, GLTexture );
//For each side...
glBindTexture( GL_TEXTURE_2D, PX );
glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, @Width );
glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, @Height );
glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, @Internal );
glCopyTexImage2D( GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0, Internal, 0, 0, Width, Height, 0 );
glDeleteTextures( 1, @PX ); //count must be 1, not 0, or nothing is deleted
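Note that glCopyTexImage2D copies pixels from the current read framebuffer, not from the currently bound texture, so the snippet above cannot work as written. On OpenGL 2.1, a texture-to-texture copy can be sketched by reading the source texture back to client memory with glGetTexImage and re-uploading it to the cube-map face (the GL_RGBA/GL_FLOAT transfer format and GL_RGBA32F_ARB internal format are assumptions; they should match the source texture's actual format):

```pascal
var
  Pixels: array of Single;
begin
  glBindTexture( GL_TEXTURE_2D, PX );
  glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, @Width );
  glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, @Height );

  // Read the source texture back to client memory...
  SetLength( Pixels, Width * Height * 4 );
  glGetTexImage( GL_TEXTURE_2D, 0, GL_RGBA, GL_FLOAT, @Pixels[0] );

  // ...and upload it to the cube-map face (the cube map must be bound).
  glBindTexture( GL_TEXTURE_CUBE_MAP, GLTexture );
  glTexImage2D( GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0, GL_RGBA32F_ARB,
    Width, Height, 0, GL_RGBA, GL_FLOAT, @Pixels[0] );

  glDeleteTextures( 1, @PX );
end;
```

The round trip through client memory is wasteful, so uploading the image data straight to the face targets in the first place would still be the cleaner route.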