Of course, the single most important widget in Odysseus is the one that embeds the (sandboxed) WebKit browser engine into GTK's widget hierarchy. Today I want to describe how this surface layer works.

It is implemented in two layers: the basic infrastructure (gtk/WebViewBase) and the public API (glib/WebView).

WebViewBase has logic for tracking whether the view is visible and for preprocessing GDK input events; both pieces of information are forwarded into the sandbox.
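
To make that concrete, here's a minimal sketch (not WebKit's actual code — every name here is hypothetical) of the pattern: the embedder-side widget turns toolkit input events into plain, serializable messages before forwarding them across the sandbox boundary, dropping events the web process never needs.

```cpp
#include <optional>
#include <string>

// Stand-ins for GDK event types; names are illustrative.
enum class EventType { MotionNotify, ButtonPress, ButtonRelease, KeyPress };

struct ToolkitEvent {
    EventType type;
    double x = 0, y = 0;   // widget-relative coordinates
    unsigned button = 0;   // 1 = left, 2 = middle, 3 = right
};

// The wire format sent to the sandboxed web process (hypothetical).
struct WebMouseMessage {
    std::string kind;      // "move", "down", "up"
    double x, y;
    unsigned button;
};

// Preprocess: normalize mouse events into messages, filter out the rest.
std::optional<WebMouseMessage> preprocessEvent(const ToolkitEvent& ev) {
    switch (ev.type) {
    case EventType::MotionNotify:
        return WebMouseMessage{"move", ev.x, ev.y, 0};
    case EventType::ButtonPress:
        return WebMouseMessage{"down", ev.x, ev.y, ev.button};
    case EventType::ButtonRelease:
        return WebMouseMessage{"up", ev.x, ev.y, ev.button};
    default:
        return std::nullopt; // handled entirely on the embedder side
    }
}
```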

The rendering hook is directly forwarded to WebKit, possibly via a shared-memory optimization.

Tooltips are stored aside until GTK asks for them.
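
The "store it aside until asked" idea can be sketched like this (again, hypothetical names, not WebKit's): the web process pushes tooltip text whenever it changes, and the widget's query-tooltip handler reads the cached value on demand.

```cpp
#include <optional>
#include <string>

class TooltipCache {
public:
    // Called when the web process reports new tooltip text.
    void setTooltipText(std::string text) {
        if (text.empty())
            m_text.reset();          // empty text clears the tooltip
        else
            m_text = std::move(text);
    }

    // Called from the widget's query-tooltip handler when GTK asks.
    // Returns false when there is nothing to show.
    bool queryTooltip(std::string& outText) const {
        if (!m_text)
            return false;
        outText = *m_text;
        return true;
    }

private:
    std::optional<std::string> m_text;
};
```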

It can contain a “dialog” and the WebInspector as child views.

In case it’s relevant: there’s a suite of classes (in UIProcess/gtk/GestureController) that WebViewBase dispatches to, which interpret touch gestures outside the sandbox and lower them to mouse events.
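
Real gesture recognition is far more involved, but here's a toy version of that "lowering" step for a single tap, under the assumption (mine, not the source's) that a tap is a touch-down and touch-up that barely move, synthesized into a left-button press and release:

```cpp
#include <cmath>
#include <vector>

struct TouchPoint { double x, y; bool down; };
struct MouseEvent { const char* kind; double x, y; };

// If the sequence looks like a tap (down then up, barely moving),
// synthesize the equivalent mouse press and release events.
std::vector<MouseEvent> lowerTapToMouse(const TouchPoint& down,
                                        const TouchPoint& up,
                                        double slop = 8.0) {
    if (!down.down || up.down)
        return {};
    if (std::hypot(up.x - down.x, up.y - down.y) > slop)
        return {}; // moved too far: a drag/scroll, not a tap
    return { {"press", down.x, down.y}, {"release", up.x, up.y} };
}
```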

And there’s a DragAndDropHandler to convert between GTK’s drag-and-drop API and WebKit’s, which is similarly integrated into WebViewBase.
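
The core of any such handler is translating between the two sides' vocabularies. As a sketch (the enums and their values below are illustrative, not the real GDK or WebKit ones):

```cpp
// Toolkit-side and engine-side drag vocabularies, as hypothetical enums.
enum class ToolkitDragAction { Copy, Move, Link };
enum class EngineDragOperation { None, Copy, Move, Link };

// The translation step a DragAndDropHandler performs for each action.
EngineDragOperation toEngineOperation(ToolkitDragAction action) {
    switch (action) {
    case ToolkitDragAction::Copy: return EngineDragOperation::Copy;
    case ToolkitDragAction::Move: return EngineDragOperation::Move;
    case ToolkitDragAction::Link: return EngineDragOperation::Link;
    }
    return EngineDragOperation::None;
}
```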

And finally it integrates WebViewBaseAccessible to communicate its state and refer the screenreader to its in-sandbox Accessibility ToolKit “children”.

As I said, the WebView subclass of WebViewBase provides the public API, and for the most part it just translates the C++ API of PageProxy to GObject/GIO. PageProxy in turn sends RPC calls into the sandbox, to be handled by a Page class there.
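
A minimal sketch of that proxy pattern, with the IPC channel collapsed into a direct function call (all names and the message format are illustrative, not WebKit's):

```cpp
#include <string>

// Sandbox side: the class that actually does the work.
class Page {
public:
    void handleMessage(const std::string& message) {
        const std::string verb = "LoadURL ";
        if (message.compare(0, verb.size(), verb) == 0)
            m_currentURL = message.substr(verb.size());
    }
    const std::string& currentURL() const { return m_currentURL; }
private:
    std::string m_currentURL;
};

// Embedder side: translates API calls into messages. In a real engine this
// crosses an IPC channel; here the "channel" is just a reference.
class PageProxy {
public:
    explicit PageProxy(Page& page) : m_page(page) {}
    void loadURL(const std::string& url) {
        m_page.handleMessage("LoadURL " + url); // serialized RPC, sketched
    }
private:
    Page& m_page;
};
```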

The trickier bit is getting messages (rather than simple responses) back out of the Page, and there are dedicated classes routing those state changes back to the WebView.
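
One common shape for that routing (a sketch under my own naming, not WebKit's) is a client interface the public view fills in, through which decoded messages from the web process are dispatched:

```cpp
#include <functional>
#include <string>

// Callbacks the public WebView supplies to observe the page.
struct PageClient {
    std::function<void(const std::string&)> didChangeTitle;
    std::function<void()> didFinishLoad;
};

// Routes decoded messages from the web process to whichever client is set.
class PageEventRouter {
public:
    void setClient(PageClient* client) { m_client = client; }

    void dispatch(const std::string& kind, const std::string& payload) {
        if (!m_client)
            return;
        if (kind == "TitleChanged" && m_client->didChangeTitle)
            m_client->didChangeTitle(payload);
        else if (kind == "LoadFinished" && m_client->didFinishLoad)
            m_client->didFinishLoad();
    }

private:
    PageClient* m_client = nullptr;
};
```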

WebView has dedicated classes routing each category of this data back to it.

You can find more details about these topics elsewhere on this site.