Hello all, hopefully I'm just confused about the layering of things, but I'm seeing some unexpected behaviour. I expected `Xvfb` to use a software renderer (like `llvmpipe`), but it seems to be using the d3d12 driver instead. Because `Xvfb` is not using the software renderer, some automated testing in my OpenGL project isn't consistent between WSL and 'real' Ubuntu instances. I can work around it for now with `LIBGL_ALWAYS_SOFTWARE`, but I want to better understand why this happens. Is it a bug in WSLg?
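For reference, the workaround I'm using is just to export the Mesa override before starting the virtual framebuffer session; the variable is inherited by everything `xvfb-run` launches (the test binary name below is a placeholder for my actual test runner):

```shell
# Force Mesa's software rasterizer (llvmpipe) for all GL clients
# started in this shell, including those under xvfb-run.
export LIBGL_ALWAYS_SOFTWARE=true

# xvfb-run ./run_gl_tests   # placeholder for the actual test runner
echo "$LIBGL_ALWAYS_SOFTWARE"
```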
My methodology: when I run `glxinfo -B` in a fresh WSL terminal I see, as expected, that the D3D12 'device' is used:

```
$ glxinfo -B
name of display: :0
display: :0  screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
    Vendor: Microsoft Corporation (0xffffffff)
    Device: D3D12 (NVIDIA GeForce RTX 3080 Ti) (0xffffffff)
    Version: 24.2.8
    Accelerated: yes
...
```
When I run `LIBGL_ALWAYS_SOFTWARE=true glxinfo -B` I see, as expected, that the llvmpipe 'device' is used:

```
$ LIBGL_ALWAYS_SOFTWARE=true glxinfo -B
name of display: :0
display: :0  screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
    Vendor: Mesa (0xffffffff)
    Device: llvmpipe (LLVM 19.1.1, 256 bits) (0xffffffff)
    Version: 24.2.8
    Accelerated: no
...
```
When I run `xvfb-run glxinfo -B` I see:

```
$ xvfb-run glxinfo -B
name of display: :99
display: :99  screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
    Vendor: Microsoft Corporation (0xffffffff)
    Device: D3D12 (NVIDIA GeForce RTX 3080 Ti) (0xffffffff)
    Version: 24.2.8
    Accelerated: yes
...
```
which, to me, is weird; I wouldn't expect the Xvfb server to expose what looks like "real hardware" to its X clients.