mirror of
https://github.com/RGBCube/serenity
synced 2025-07-25 11:37:44 +00:00
LibGL+LibGPU+LibSoftGPU: Remove concept of layer in favor of depth
Looking at how Khronos defines layers (https://www.khronos.org/opengl/wiki/Array_Texture), we have both 3D textures and layers of 2D textures, and both can be encoded in our existing `Typed3DBuffer` as depth. Since the GPU API already supports depth, remove the layer concept everywhere. Also pass in `Texture2D::LOG2_MAX_TEXTURE_SIZE` as the maximum number of mipmap levels, so we do not allocate 999 levels on each Image instantiation.
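The layer-as-depth encoding described above can be sketched as follows. This is a minimal illustration, not the actual SerenityOS code: `Typed3DBufferSketch`, its members, and the `max_mip_levels` helper are hypothetical names; only the idea (a 2D array texture's layer index and a 3D texture's depth index address the same slice of one 3D buffer) comes from the commit message.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical sketch: a 2D array texture with N layers is stored as a
// 3D volume with depth N, so "layer i" and "depth slice i" are the same
// index and no separate layer concept is needed.
struct Typed3DBufferSketch {
    size_t width, height, depth;
    std::vector<float> texels; // width * height * depth values

    Typed3DBufferSketch(size_t w, size_t h, size_t d)
        : width(w), height(h), depth(d), texels(w * h * d, 0.f)
    {
    }

    // z serves as either the depth coordinate of a 3D texture or the
    // layer index of a 2D array texture.
    float& at(size_t x, size_t y, size_t z)
    {
        return texels[(z * height + y) * width + x];
    }
};

// With a log2-of-max-size cap on mipmap levels, a full mip chain for a
// 2^n-sized texture has n + 1 levels (down to 1x1), instead of a fixed
// 999-level allocation per image.
constexpr size_t max_mip_levels(size_t log2_max_texture_size)
{
    return log2_max_texture_size + 1;
}
```

For example, a 2048-wide texture (2^11) needs at most 12 mip levels under this scheme.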
This commit is contained in:
parent
44953a4301
commit
dda5987684
9 changed files with 54 additions and 65 deletions
@@ -14,7 +14,7 @@ namespace GL {
 void Texture2D::download_texture_data(GLuint lod, GPU::ImageDataLayout output_layout, GLvoid* pixels)
 {
     VERIFY(!device_image().is_null());
-    device_image()->read_texels(0, lod, { 0, 0, 0 }, pixels, output_layout);
+    device_image()->read_texels(lod, { 0, 0, 0 }, pixels, output_layout);
 }

 void Texture2D::upload_texture_data(GLuint lod, GLenum internal_format, GPU::ImageDataLayout input_layout, GLvoid const* pixels)