Mirror of https://github.com/RGBCube/serenity, synced 2025-07-27 06:17:35 +00:00
LibGL+LibGPU+LibSoftGPU: Remove concept of layer in favor of depth
Looking at how Khronos defines layers (https://www.khronos.org/opengl/wiki/Array_Texture), we have both 3D textures and layers of 2D textures, and both can be encoded in our existing `Typed3DBuffer` as depth. Since the GPU API already supports depth, remove the layer concept everywhere. Also pass in `Texture2D::LOG2_MAX_TEXTURE_SIZE` as the maximum number of mipmap levels, so we do not allocate 999 levels on each Image instantiation.
Parent: 44953a4301
Commit: dda5987684
9 changed files with 54 additions and 65 deletions
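To make the shape of the new call concrete, here is a minimal, self-contained C++ sketch. The stand-in `PixelFormat`, `Image`, and `LOG2_MAX_TEXTURE_SIZE` definitions below are simplified assumptions for illustration, not the real LibGPU/LibGL declarations; only the parameter list mirrors the interface change in the hunk further down.

```cpp
// Minimal sketch, not the actual LibGPU implementation: it only models how a
// 2D array texture's layer count now travels through the `depth` parameter and
// how `max_levels` caps the mipmap chain instead of a fixed 999 levels.
#include <algorithm>
#include <cstdint>
#include <cstdio>

using u32 = uint32_t;

struct PixelFormat { }; // stand-in; the real GPU::PixelFormat carries format/packing details

// Stand-in for Texture2D::LOG2_MAX_TEXTURE_SIZE: log2 of the largest supported
// texture dimension, which also bounds a full mipmap chain. The value here is
// an assumed placeholder, not the real constant.
static constexpr u32 LOG2_MAX_TEXTURE_SIZE = 11;

struct Image {
    u32 width { 0 };
    u32 height { 0 };
    u32 depth { 0 };  // 3D depth or number of 2D array layers
    u32 levels { 0 }; // allocated mipmap levels
};

// Mirrors the shape of the new create_image() from the hunk below: no separate
// `layers` argument; array layers are encoded as depth, and `max_levels` clamps
// how many mipmap levels get allocated.
static Image create_image(PixelFormat const&, u32 width, u32 height, u32 depth, u32 max_levels)
{
    u32 full_chain = 1;
    for (u32 size = std::max(width, height); size > 1; size /= 2)
        ++full_chain;
    return { width, height, depth, std::min(full_chain, max_levels) };
}

int main()
{
    // A GL_TEXTURE_2D_ARRAY-style texture with 6 layers becomes a 256x256x6 image;
    // previously the 6 would have gone into the separate `layers` argument.
    auto array_texture = create_image(PixelFormat {}, 256, 256, 6, LOG2_MAX_TEXTURE_SIZE);
    std::printf("%ux%ux%u, %u mip levels\n",
        array_texture.width, array_texture.height, array_texture.depth, array_texture.levels);
    return 0;
}
```

The actual interface change from the commit is shown in the hunk below.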
@@ -55,7 +55,7 @@ public:
     virtual RasterizerOptions options() const = 0;
     virtual LightModelParameters light_model() const = 0;
 
-    virtual NonnullRefPtr<Image> create_image(PixelFormat const&, u32 width, u32 height, u32 depth, u32 levels, u32 layers) = 0;
+    virtual NonnullRefPtr<Image> create_image(PixelFormat const&, u32 width, u32 height, u32 depth, u32 max_levels) = 0;
 
     virtual void set_sampler_config(unsigned, SamplerConfig const&) = 0;
     virtual void set_light_state(unsigned, Light const&) = 0;