July 30, 2019
In the early days of VR on Linux, plugging an HMD into a system completely unaware of what it was meant the display was initialized as a generic desktop display, and the window manager extended the desktop to it. This was obviously undesirable, but you were still able to see cropped and perspectively incorrect desktop windows on your HMD. Only the bravest of us were keen enough to find the cursor on the display and move windows out of the way to use it for extended mode rendering. While the situation was far from perfect, the goal was clear: desktop window manipulation in an accurately rendered, stereo 3D environment, controlled with VR controllers.
The first step in this direction was to make the window manager stop extending the desktop. With his work on DRM leasing, Keith Packard introduced a non-desktop property for X11 displays. A Vulkan VK_EXT_acquire_xlib_display extension was specified, intended to be used by VR compositors to enable rendering to HMDs directly. An equivalent extension is currently being introduced for Wayland and implemented in the graphics stack.
With the first step completed, the Linux graphics stack was aware of HMDs, and VR runtimes like SteamVR or Monado were able to render in direct mode, without needing to open a window visible to the window manager and without uncertainty about whether the HMD's refresh rate was synchronized with the desktop displays.
But this also meant desktop windows were no longer showing up on the HMD. It was time to bring them back, fully rendered, and in 3D.
Today, we are very excited to announce a new open source project which enables interaction with traditional desktop environments, such as GNOME and KDE, in VR. Sponsored by Valve, xrdesktop makes window managers aware of VR and is able to use VR runtimes to render desktop windows in 3D space, with the ability to manipulate them with VR controllers and generate mouse and keyboard input from VR.
Open Source virtual reality desktops have been around since the dawn of consumer VR HMDs. Since the Oculus Rift DK1 in 2012, several notable implementations of VR desktops have been published.
Hesham Wahba created the Ibex Desktop for the Oculus Rift DK1 as early as 2012, in a time before direct mode and OpenXR. Its VR integration was limited to mirroring an entire X11 desktop onto one surface in VR.
In 2014, Forrest Reiling published motorcar, the first 3DUI Wayland compositor for VR with support for controllers and 3D widgets. Its VR integration allowed showing individual application windows in VR, placing them freely in a 3D environment, as well as emulating mouse input with VR controllers. Motorcar was implemented for the Oculus Rift DK2 and the Razer Hydra, using these vendors' proprietary SDKs. Being rather scientific in its approach to usability and distribution, it was hard to set up and did not see wide usage.
What these projects have in common is that they were either limited to a VR-only compositor, where desktop windows were shown in VR but not on the desktop displays, or mirrored an entire 2D desktop onto one surface in VR.
In contrast to these approaches, xrdesktop aims to integrate into existing Linux desktop environments, eliminating the need to run a dedicated VR-only compositor and thus making it usable in current setups. For our initial release, we focused on integration with the most popular Linux desktops, GNOME and KDE, but xrdesktop is designed to be integrated into any desktop. This can be done with Compiz-like plugins, as with KWin, or with patches to the compositor, as in the case of GNOME Shell.
This integration of xrdesktop into the window manager makes it possible to mirror existing windows into XR and to synthesize desktop input through XR actions. xrdesktop can be run as a dedicated scene application, but it also features an overlay mode, where desktop windows are overlaid on top of any other running VR application.
xrdesktop is a set of several GLib-based libraries, written in C. This includes providing window textures to the VR runtime via OpenVR overlays to display desktop windows over VR applications, as well as our own Vulkan renderer for a full 3D desktop experience. xrdesktop does not provide a standalone window manager; it requires integration into an existing one, and can be ported to any X11 or Wayland based window manager. On the graphics driver side, we require a Vulkan implementation providing the VK_KHR_external_memory extension for interoperability with window manager graphics memory. In contrast to VR-only window managers, we aim to provide a pragmatic solution for integration with the classical 2D desktop, and we do not use a game engine for simulation and rendering, in order to stay as lightweight as possible.
gulkan A GLib wrapper for Vulkan. It provides classes for handling Vulkan instances, devices and shaders, and for initializing textures from CPU memory and DMA buffers.
gxr An XR API abstraction. Currently it only supports OpenVR, but OpenXR support will be added soon.
libinputsynth A library for synthesizing desktop input such as moving and clicking the mouse, as well as keyboard key presses. In contrast to similar libraries, the synthesis is implemented in loadable backends like xdo, xi2 and Clutter for GNOME on Wayland. In the future, more Wayland backends will be added.
xrdesktop A 3DUI window management library with several widgets, containing overlay and scene renderer backends. It implements features required for 3D window management, including a 3D pointer, intersection testing and objects attached to the field of view.
kwin-effect-xrdesktop A KWin effects plugin for integration into KDE.
kdeplasma-applets-xrdesktop A plasma applet that utilizes D-Bus for toggling XR mode in KWin.
gnome-shell Our patchset on gnome-shell to integrate xrdesktop support.
gnome-shell-extension-xrdesktop An extension that utilizes D-Bus to toggle XR mode in gnome-shell.
Here is a small overview of our user interaction model on how to manipulate desktop windows in 3D.
When holding the Trigger button on the controller you can grab windows. The grab transformation is completely free and bound only to the controller. This means that the relative transformation to the hand is maintained during this operation, including distance and rotation. With this, windows can be placed and rotated freely in 3D space, allowing accurate interaction.
Due to the unrestricted movement of the grab transformation, resetting the window orientation can sometimes be handy. When pressing the Trigger Click, the window resets its orientation and realigns with the controller, so that the window normal matches the pointer ray and up vector.
For translating windows along the pointer ray, you can move the Analog Stick up and down during grabbing to push and pull the window. This allows moving windows to distances you are not keen to walk to, and bringing them back.
Windows are initialized in 3D space with a global Pixel per Meter setting. If you wish to enlarge or shrink windows in 3D space during grabbing, you can use the Analog Stick's left and right axis. A large window has benefits if you want a finer granularity in input, which is useful for clicking small objects or drawing lines.
None of the above interactions are restricted to one hand at a time, meaning you can transform multiple windows at once, depending on how many controllers are connected. It's also up to you which hand to use, since our interaction model does not define a dominant hand. This is especially convenient for left-handed people like myself.
Our main menu can be toggled with the B Button and features widgets that can be activated by pressing the Trigger. The main menu is either head tracked, in the case of only one controller being connected, or hand tracked, in the case of two controllers. We plan to reuse this tracking for other things in the future, like head-tracked modal dialogs or desktop windows attached to a controller.
Similar to the GNOME Shell Activities overview, or Exposé modes in other window managers, xrdesktop provides a sphere alignment mode, where all windows are aligned spherically. This alignment mode adjusts to the current viewer's head azimuth. The window positions can be reset with the reset button to the right. The sphere alignment algorithm is currently very primitive and is subject to improvement in terms of smarter positioning in the future.
In order to hide unwanted desktop windows you can pin certain windows to the view and hide all others. This is especially useful when you want to just display certain windows over another VR application in the overlay mode.
Another feature useful in the overlay mode is the disable input button. When input is disabled, VR desktop windows won't be receiving input and the pointer rays of xrdesktop will disappear. This makes the overlay mode more usable when running over another VR application. A demo for this can be seen in our podcast video.
Our default controls for the Valve Index and the VIVE Wand controller can be seen in these graphics. The controls are separated into actions executed in 3D and for synthesized desktop input in 2D. All actions can be remapped to other buttons using SteamVR's key binding UI.
The list of supported controllers will grow, as we intend to add controllers supported by Monado.
For a full demo, commented by the developers, you can watch our release podcast. It includes showing xrdesktop in use with GNOME Shell and KWin with actual desktop applications. We also show how the overlay mode works when running over a VR game.
For other distributions, you need to install xrdesktop from source. You can help us by improving the instructions and creating packages for your distro.
In addition to OpenVR API support, we will implement OpenXR support to enable running xrdesktop on FOSS runtimes like Monado.
While interacting with a window that is overlaid over a VR application, input should be restricted to this window, and the VR application should be paused or have its own input handling disabled.
Since we developed the full scene renderer after the overlay one, we are still facing performance-related issues when running the scene app through the window manager. Improving this is one of our next goals. Currently, the overlay backend is used by default in the window managers. The scene application can already be tested in our example.
For now, xrdesktop focuses on X11, though the GNOME Shell patches should work on a GNOME Wayland session. libinputsynth already comes with a Clutter based input synthesis backend, which enables mouse interactions in a GNOME Wayland session. We would love to see xrdesktop implemented in Weston and other Wayland compositors.
We currently render our own widgets with Cairo, and also have example code showing how to access the rendering of gtk3 and gtk4 widgets. A method of synthesizing input on these, plus their integration as VR-only widgets in xrdesktop, would make it easier to build 2D UI elements and to introduce application launchers and settings menus.
Improvements to our 3D layouts and widgets should enable us to extract them into a separate library, allowing reuse in other projects that require 3D UI.
We want to expand the current set of actions to interact with windows. Our wish list includes gestures based on finger tracking, pinch zoom with two controllers, and scrolling by grabbing and dragging window contents, as seen in touch screen interfaces.
To submit issues, please use our bug tracker.