Setting up XR

Introduction to the XR system in Godot

Godot provides a modular XR system that abstracts many of the different XR platform specifics away from the user. At its core sits the XRServer, which acts as a central interface to the XR system, allowing users to discover interfaces and interact with the components of the XR system.

Each supported XR platform is implemented as an XRInterface. Supported interfaces register themselves with the XRServer and can be queried with the find_interface method on the XRServer. When the desired interface is found, it can be initialized by calling initialize on the interface.
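Before looking up a specific interface, you can inspect what has registered itself. A minimal sketch, assuming a Godot 4 project; the node type and print formatting are illustrative, only the XRServer call is part of the API:

```gdscript
extends Node3D

func _ready():
	# XRServer.get_interfaces() returns a list of dictionaries
	# describing each registered interface.
	for interface_info in XRServer.get_interfaces():
		print("Found XR interface: %s" % interface_info["name"])
```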

Warning

A registered interface means nothing more than that the interface is available. If the interface is not supported by the host system, initialization may fail and return false. This can happen for many reasons, and sadly the reasons differ from platform to platform: the user may not have installed the required software, or may simply not have plugged in their headset. As a developer you must therefore react properly to an interface failing to initialize.

Due to the special requirements for output in XR, especially for head-mounted devices that supply different images to each eye, the XRServer in Godot will override various features in the rendering system. For standalone devices this means the final output is handled by the XRInterface and Godot's usual output system is disabled. For desktop XR devices that work as a second screen it is possible to dedicate a separate Viewport to handle the XR output, leaving the main Godot window available for displaying alternative content.
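For the second-screen case, the dedicated viewport only needs its use_xr flag set. A minimal sketch, assuming a SubViewport child named XRViewport (a hypothetical name for this example):

```gdscript
extends Node3D

func _ready():
	# Route XR output through a dedicated viewport instead of the
	# main window. "XRViewport" is a hypothetical node name.
	var xr_viewport: SubViewport = $XRViewport
	xr_viewport.use_xr = true
```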

Note

Note that only one interface can be responsible for handling the output to an XR device. This is known as the primary interface, and by default it will be the first interface that is initialized. Godot thus currently only supports implementations with a single headset. It is possible, but increasingly uncommon, to have a secondary interface, for example to add tracking to an otherwise 3DOF-only device.

There are three XR specific node types that you will find in nearly all XR applications:

  • XROrigin3D represents, for all intents and purposes, the center point of your play space. That is an oversimplified statement but we'll go into more detail later. All objects tracked in physical space by the XR platform are positioned in relation to this point.

  • XRCamera3D represents the (stereo) camera that is used when rendering output for the XR device. The positioning of this node is controlled by the XR system and updated automatically using the tracking information provided by the XR platform.

  • XRController3D represents a controller used by the player; commonly there are two, one held in each hand. These nodes give access to various states on these controllers and send out signals when the player presses buttons on them. The positioning of this node is controlled by the XR system and updated automatically using the tracking information provided by the XR platform.

There are other XR related nodes and there is much more to say about these three nodes, but we'll get into that later on.
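To give a taste of what these nodes offer, here is a minimal sketch that reacts to controller buttons. It assumes controller nodes named LeftHand and RightHand (the names used later in this tutorial); the action name trigger_click comes from the default OpenXR action map:

```gdscript
extends XROrigin3D

func _ready():
	# Bind an extra argument so we know which hand fired the signal.
	$LeftHand.button_pressed.connect(_on_button_pressed.bind("left"))
	$RightHand.button_pressed.connect(_on_button_pressed.bind("right"))

func _on_button_pressed(action_name: String, hand: String):
	# "trigger_click" is part of the default OpenXR action map.
	if action_name == "trigger_click":
		print("Trigger pressed on %s hand" % hand)
```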

Which renderer to use

Godot has 3 renderer options for projects: Compatibility, Mobile, and Forward+. The current recommendation is to use the Mobile renderer for any desktop VR project, and the Compatibility renderer for any project running on a standalone headset like the Meta Quest 3. XR projects will run with the Forward+ renderer, but it isn't well optimized for XR right now compared to the other two.

OpenXR

OpenXR is a new industry standard that allows different XR platforms to present themselves through a standardised API to XR applications. It is an open standard maintained by the Khronos Group and thus aligns very well with Godot's interests.

The Vulkan implementation of OpenXR is closely integrated with Vulkan, taking over part of the Vulkan system. This requires tight integration of certain core graphics features in the Vulkan renderer which are needed before the XR system is set up. This was one of the main deciding factors to include OpenXR as a core interface.

This also means OpenXR needs to be enabled when Godot starts in order to set things up correctly. Check the Enabled setting in your project settings under XR > OpenXR.

../../_images/openxr_settings.webp

You can find several other settings related to OpenXR here as well. These can't be changed while your application is running. The default settings will get us started, but for more information on what's here see OpenXR Settings.

You'll also need to go to XR > Shaders in the project settings and check the Enabled box to enable them. Once you've done that, click the Save & Restart button.

../../_images/xr_shaders.webp

Warning

Many post process effects have not yet been updated to support stereoscopic rendering. Using these will have adverse effects.

Setting up the XR scene

Every XR application needs at least an XROrigin3D and an XRCamera3D node. Most will have two XRController3D nodes, one for the left hand and one for the right. Keep in mind that the camera and controller nodes should be children of the origin node. Add these nodes to a new scene and rename the controller nodes to LeftHand and RightHand; your scene should look something like this:

../../_images/xr_basic_scene.webp

The warning icons are expected and should go away after you configure the controllers. Select the left hand and set it up as follows:

../../_images/xr_left_hand.webp

And the right hand:

../../_images/xr_right_hand.webp

Right now all these nodes are on the floor; they will be positioned correctly at runtime. To help during development, it can be helpful to move the camera upwards so its y is set to 1.7, and to move the controller nodes to -0.5, 1.0, -0.5 and 0.5, 1.0, -0.5 for the left and right hand respectively.
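These development-time positions can also be applied from a script rather than the editor. A minimal sketch, assuming the node names used above; at runtime the tracking data will overwrite these values:

```gdscript
extends XROrigin3D

func _ready():
	# Rough standing-height defaults for working in the editor preview;
	# overwritten by tracking data once an XR runtime is active.
	$XRCamera3D.position = Vector3(0.0, 1.7, 0.0)
	$LeftHand.position = Vector3(-0.5, 1.0, -0.5)
	$RightHand.position = Vector3(0.5, 1.0, -0.5)
```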

Next we need to add a script to our root node. Add the following code into this script:

```gdscript
extends Node3D

var xr_interface: XRInterface

func _ready():
	xr_interface = XRServer.find_interface("OpenXR")
	if xr_interface and xr_interface.is_initialized():
		print("OpenXR initialized successfully")

		# Turn off v-sync!
		DisplayServer.window_set_vsync_mode(DisplayServer.VSYNC_DISABLED)

		# Change our main viewport to output to the HMD
		get_viewport().use_xr = true
	else:
		print("OpenXR not initialized, please check if your headset is connected")
```

This code fragment assumes we are using OpenXR; if you wish to use any of the other interfaces, you can change the find_interface call.

Warning

As you can see in the code snippet above, we turn off v-sync. When using OpenXR you are outputting the rendering results to an HMD that often requires us to run at 90 Hz or higher. If your monitor is a 60 Hz monitor and v-sync is turned on, you will limit the output to 60 frames per second.

XR interfaces like OpenXR perform their own sync.

Also note that by default the physics engine runs at 60 Hz as well, and this can result in choppy physics. You should set Engine.physics_ticks_per_second to a higher value.
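A minimal sketch of raising the physics tick rate; 90 here is an assumption, chosen to match a common HMD refresh rate, and you should use whatever rate your target headset runs at:

```gdscript
extends Node3D

func _ready():
	# Match the physics tick rate to the headset's refresh rate.
	# 90 Hz is an assumed value for this sketch; adjust for your hardware.
	Engine.physics_ticks_per_second = 90
```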

If you run your project at this point in time, everything will work, but you will be in a dark world. So to finish off our starting point, add a DirectionalLight3D and a WorldEnvironment node to your scene. You may also wish to add a mesh instance as a child of each controller node just to temporarily visualise them. Make sure you configure a sky in your world environment.

Now run your project, you should be floating somewhere in space and be able to look around.

Note

While traditional level switching can definitely be used with XR applications, where this scene setup is repeated in each level, most find it easier to set this up once and load levels as subscenes. If you do switch scenes and replicate the XR setup in each one, make sure you do not run initialize multiple times. The effect can be unpredictable depending on the XR interface used.
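One way to guard against this is to check whether the interface is already initialized before calling initialize. A minimal sketch, assuming OpenXR; note that with the Enabled project setting OpenXR is typically initialized at startup, so this guard mainly matters for other interfaces or manual setups:

```gdscript
extends Node3D

var xr_interface: XRInterface

func _ready():
	xr_interface = XRServer.find_interface("OpenXR")
	if xr_interface and not xr_interface.is_initialized():
		# Only initialize once, even if this scene is loaded again.
		if xr_interface.initialize():
			get_viewport().use_xr = true
```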

For the rest of this basic tutorial series we will create a game that uses a single scene.
