When filling in some missing part of a window (as typically happens during
an interactive window resize) we now use ColorRole::Background from
the system theme palette instead of expecting clients to send us that
color when creating their windows.
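In other words, the fill is now roughly this (just a sketch; the palette
accessor shown here is an assumption, not the actual WindowServer code):

// Cover the not-yet-painted area with the theme's Background color.
painter.fill_rect(missing_rect, palette.color(ColorRole::Background));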
GApplication now has a palette. This palette contains all the system
theme colors by default, and is inherited by a new top-level GWidget.
New child widgets inherit their parent's palette.
It is possible to override the GApplication palette, and the palette
of any GWidget.
The Palette object contains a bunch of colors, each corresponding to
a ColorRole. Each role has a convenience getter as well.
Each GWidget now has a background_role() and a foreground_role(), which
are looked up in its current palette when painting. This means that you
no longer change a widget's background color by setting it directly;
instead, you change either its background role or the widget's palette.
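For example, tinting a single widget could look something like this (only
a sketch; the set_background_role()/set_palette()/set_color() setters are
assumed for illustration):

// Paint the widget's background from a different palette entry:
widget.set_background_role(ColorRole::Base);

// Or give just this widget a customized palette:
auto palette = widget.palette();
palette.set_color(ColorRole::Base, Color(255, 255, 224));
widget.set_palette(palette);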
Color themes are loaded from .ini files in /res/themes/
The theme can be switched from the "Themes" section in the system menu.
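A theme file is just an .ini mapping color roles to colors, roughly like
this (the section and key names are illustrative guesses, not necessarily
the exact format used in /res/themes/):

[Colors]
Window=#c0c0c0
WindowText=#000000
Base=#ffffff
HoverHighlight=#e0e0ff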
The basic mechanism is that WindowServer broadcasts a SharedBuffer with
all of the color values of the current theme. Clients receive this with
the response to their initial WindowServer::Greet handshake.
When the theme is changed, WindowServer tells everyone by sending out
an UpdateSystemTheme message with a new SharedBuffer to use.
This does feel somewhat bloated, but I'm sure we can iterate on it over
time and improve things.
To get one of the theme colors, use the Color(SystemColor) constructor:
painter.fill_rect(rect, SystemColor::HoverHighlight);
Some things don't work 100% right without a reboot. Specifically, when
constructing a GWidget, it will set its own background and foreground
colors based on the current SystemColor::Window and SystemColor::Text.
The widget is then stuck with these values, and they don't update on
system theme change, only on app restart.
All in all though, this is pretty cool. Merry Christmas! :^)
Windows that are being moved around by the user are now called "moving"
windows instead of "dragging" windows, to avoid confusion with the
drag and drop stuff.
The bitmap attached to a drag operation is displayed alongside the dragged
text underneath the mouse cursor while dragging.
This will be a perfect fit for dragging e.g. files around. :^)
This patch enables basic drag&drop between applications.
You initiate a drag by creating a GDragOperation object and calling
exec() on it. This creates a nested event loop in the calling program
that only returns once the drag operation has ended.
On the receiving side, you get a call to GWidget::drop_event() with
a GDropEvent containing information about the dropped data.
The only data passed right now is a piece of text that's also used
to visually indicate that a drag is happening (by showing the text in
a little box that follows the mouse cursor around.)
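Roughly, both sides end up looking like this (a sketch only; set_text(),
the exec() return, and GDropEvent::text() are assumptions based on the
description above):

// Sender: exec() spins a nested event loop until the drag ends.
GDragOperation drag_operation;
drag_operation.set_text("some draggable text");
drag_operation.exec();

// Receiver: a widget accepts drops by overriding drop_event().
void MyWidget::drop_event(GDropEvent& event)
{
    dbgprintf("Dropped text: %s\n", event.text().characters());
}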
There are things to fix here, but we're off to a nice start. :^)
Move the animation stepping from WSCompositor::compose() to a separate
run_animations() function to keep compose() readable. We might want to
add some more animations later.
We now show a quick window outline animation when going in/out of the
minimized state. It's a simple 10-frame animation at 60fps, just to
give a visual cue of what's happening with the window.
The Taskbar sends over the corresponding button rect for each window
to the WindowServer using a new WM_SetWindowTaskbarRect message.
Note that when unminimizing, we still *show* the window right away,
and don't hold off until the animation has finished. This avoids
making the desktop feel slow/sluggish. :^)
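Conceptually, each animation frame just draws an outline rect interpolated
between the window rect and its taskbar button rect, something like this
(purely illustrative, not the actual WSCompositor code):

// progress runs from 0.0 to 1.0 across the 10 frames.
float progress = (float)frame / 9.0f;
Rect outline_rect {
    from_rect.x() + (int)((to_rect.x() - from_rect.x()) * progress),
    from_rect.y() + (int)((to_rect.y() - from_rect.y()) * progress),
    from_rect.width() + (int)((to_rect.width() - from_rect.width()) * progress),
    from_rect.height() + (int)((to_rect.height() - from_rect.height()) * progress),
};
painter.draw_rect(outline_rect, Color::White);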
When a window is being resized, its size may differ from the size of its backing
store. In this case, peek at the direction the window is being resized in, and
render the backing store at the same place as it was previously.
https://github.com/SerenityOS/serenity/issues/52
Also change the API to take a destination rect instead of a source rect
since internally it was basically creating a destination rect from the
source rect anyway. It was a little confusing.
Booting without a wallpaper is significantly faster in QEMU when the
host machine is under load (e.g. while recording the screen).
Using a wallpaper is now optional. :^)
An interactive application to modify the current display settings, such as
the current wallpaper and the screen resolution. For now we're adding the
resolutions ourselves, because there's currently no way to detect which
resolutions the current display adapter supports (or at least I can't see
one... Maybe VBE does and I'm stupid). It even comes with a very nice
templated `ItemList` that can hold a vector of any type, which makes life
much simpler.
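The idea behind `ItemList` is basically a thin wrapper around a Vector of
arbitrary items for the settings UI to display; a rough sketch (names and
members here are guesses, not the actual DisplaySettings code):

template<typename T>
class ItemList {
public:
    explicit ItemList(Vector<T> items)
        : m_items(move(items))
    {
    }

    int size() const { return (int)m_items.size(); }
    const T& at(int index) const { return m_items.at(index); }

private:
    Vector<T> m_items;
};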
We shouldn't assume that the pitch of some arbitrary bitmap memory that
we're wrapping is going to be 16-byte aligned. Instead, just take the
pitch as a parameter.
Also update WindowServer to pass the pitch to the framebuffer bitmaps.
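In practice that means the wrapping call takes the pitch explicitly, e.g.
(illustrative only; the real wrapper signature may differ):

// pitch is the actual bytes-per-scanline of this memory, not
// width * sizeof(RGBA32) rounded up to a 16-byte boundary.
auto bitmap = GraphicsBitmap::create_wrapper(size, pitch, framebuffer_data);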
The main changes are twofold:
* Buffer flipping is now controlled by the m_screen_can_set_buffer flag
in WSCompositor. This flag, in turn, is derived from the m_can_set_buffer
flag in WSScreen. m_can_set_buffer is set in the WSScreen constructor by
checking the return value of fb_set_buffer: if the framebuffer supports
the operation, the call succeeds and we record that fact, which
WSCompositor then uses to set its own m_screen_can_set_buffer flag
(see the sketch after this list).
* WSScreen now only requests a resolution change of the framebuffer. The
driver itself is ultimately responsible for what resolution or mode is
actually set, so WSScreen has to read the response from that request,
and has no choice but to accept the answer. This allows the driver to
choose a "close enough" value to what was requested, or simply ignore
it.
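A rough sketch of the capability probe in the WSScreen constructor
(illustrative; fb_set_buffer's exact arguments are simplified here):

// Probe once: if the driver supports buffer flipping, the call
// succeeds and we remember that fact for WSCompositor to pick up.
m_can_set_buffer = (fb_set_buffer(m_framebuffer_fd, 0) == 0);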
The result of this is that there is no special configuration necessary
for WindowServer to work with reduced-capability framebuffer devices.
We can't rely on all hardware to give us a way to flip between the back
and front buffer.
This mode (rendering without buffer flipping) should actually perform
slightly better, but may show some tearing as we don't have a way to know
when we're in vertical retrace.
Instead of LibGUI and WindowServer building their own copies of the drawing
and graphics code, let's put it in a separate LibDraw library.
This avoids building the code twice, and will encourage better separation
of concerns. :^)
Previously we were rendering the whole menubar on every compose(),
even if nothing changed about it. Now it's in its own window and can
be invalidated and painted separately.
d66fa60fcf introduced the use of a timer
to coalesce screen updates. This is OK, but it does introduce update
latency.
To help mitigate the impact of this, we now have a second (immediate)
timer. When a compose pass is first triggered, the immediate timer will
allow the compose to happen on the next spin of the event loop (so, only
coalescing updates across a single event loop pass). Any updates that
trigger while the delayed timer is running, though, will be delayed to
that (~60fps) timer.
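The invalidation path ends up shaped roughly like this (the member and
function names are illustrative, not the exact WSCompositor code):

void WSCompositor::invalidate()
{
    if (m_delayed_compose_timer.is_active()) {
        // A ~60fps compose is already pending; coalesce this update into it.
        return;
    }
    // First update in a while: compose on the next event loop spin (0 ms,
    // single-shot) and arm the ~60fps timer to coalesce any follow-ups.
    m_immediate_compose_timer.start(0);
    m_delayed_compose_timer.start(1000 / 60);
}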
This fixes #103.