hardware stereo on computers is annoying. you need specific hardware, with the appropriate drivers, and even then, not all software is compatible with it.
hardware-wise, all of this has been written for and tested on machines with workstation NVIDIA cards (Quadro RTX 5000, RTX A5000). your mileage may vary, but if you’ve got an “Enable 3D” switch in your Windows display settings, it should work?
typically this is done with OpenGL, though it can also be done with Vulkan with a little more work.
quick note: i am not a graphics programmer. feel free to contact me on my socials if you spot something to correct!
openGL: quad-buffer
for OpenGL, the technique used is generally referred to as “quad-buffer,” though some libraries can also use “stereo” in their constant names.
monoscopic applications only have a single double-buffered swapchain, with a back buffer which is invisible, and a front buffer which is visible on the screen. apps draw on the back buffer, then swap the front and back buffers at the end of the frame. that way, the user doesn’t see incomplete images drawn on screen.
stereoscopic applications using quad-buffer are essentially double that: they basically are a “double double buffer,” in that each eye gets a double-buffered swapchain. just like in a monoscopic app, applications draw on the back buffers, and at the end of the frame, they swap the back and front buffers. the GPU driver then handles presenting the two front buffers however it sees fit.
with GLFW, it doesn’t take that much effort to turn a mono app into a stereo app:
// enable quad-buffer (set before glfwCreateWindow)
glfwWindowHint(GLFW_STEREO, GLFW_TRUE);
// select which side to draw on
glDrawBuffer(GL_BACK_LEFT); // left eye
glDrawBuffer(GL_BACK_RIGHT); // right eye
// swap buffers
glfwSwapBuffers(window); // one call for both eyes; window is your GLFWwindow*
and that’s about it. everything else you would do in a mono app, you do the same way in a stereo app! each buffer is completely independent though, so you need to clear each one manually.
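in other words, a frame could look roughly like this (drawScene() and the eye view matrices stand in for your own rendering code):
while (!glfwWindowShouldClose(window)) {
    // left eye
    glDrawBuffer(GL_BACK_LEFT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // each back buffer gets its own clear
    drawScene(leftEyeView);
    // right eye
    glDrawBuffer(GL_BACK_RIGHT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawScene(rightEyeView);
    glfwSwapBuffers(window); // swaps both pairs at once
    glfwPollEvents();
}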
vulkan: imageArrayLayers
it felt impossible for a long time to make a stereoscopic app with just vulkan. nearly every implementation i had seen online relied on interop extensions between vulkan and openGL (vulkan draws on a shared image, openGL presents that image).
it turns out, there’s one line in the VkSwapchainCreateInfoKHR section of the 5000-page vulkan specification that suggests stereoscopic apps can be done:
imageArrayLayers is the number of views in a multiview/stereo surface. For non-stereoscopic-3D applications, this value is 1.
that sentence is the key: setting that value to 2 when creating a swapchain, if supported, creates one that the GPU driver (at least on NVIDIA) will interpret as a stereo swapchain.
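support isn’t guaranteed on every GPU/driver, but it can be checked before creating the swapchain: the surface capabilities report how many layers swapchain images can have. a minimal sketch, with physicalDevice and surface being your existing handles:
VkSurfaceCapabilitiesKHR capabilities{};
vkGetPhysicalDeviceSurfaceCapabilitiesKHR(physicalDevice, surface, &capabilities);
// stereo needs at least 2 array layers per swapchain image
bool stereoSupported = capabilities.maxImageArrayLayers >= 2;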
the changes
naturally, the first change is on VkSwapchainCreateInfoKHR:
VkSwapchainCreateInfoKHR createInfo{};
createInfo.sType = VK_STRUCTURE_TYPE_SWAPCHAIN_CREATE_INFO_KHR;
// ...
createInfo.imageArrayLayers = 2; // 👀
// ...
with this, the swapchain images will have two layers: one for the left eye, one for the right eye. you’ll need to duplicate a few structures, with some changed settings:
- the VkImageView: the baseArrayLayer changes to select the correct layer.

  VkImageViewCreateInfo createInfo{};
  createInfo.sType = VK_STRUCTURE_TYPE_IMAGE_VIEW_CREATE_INFO;
  createInfo.image = swapChainImages[i]; // they take the same image
  // ...
  createInfo.subresourceRange.baseMipLevel = 0;
  createInfo.subresourceRange.levelCount = 1;
  createInfo.subresourceRange.baseArrayLayer = 0; // 0: left, 1: right
  createInfo.subresourceRange.layerCount = 1;
- the VkFramebuffer: change the attachment to select the correct image view.

  VkImageView attachments[] = {
      swapChainImageViewsLeft[i] // use the relevant image view
  };

  VkFramebufferCreateInfo framebufferInfo{};
  framebufferInfo.sType = VK_STRUCTURE_TYPE_FRAMEBUFFER_CREATE_INFO;
  // rest of the framebuffer config doesn't change.
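in practice that means creating each of these twice per swapchain image. a rough sketch of the image view side (swapChainImageFormat, device, and the …Right arrays are assumed names mirroring the usual tutorial setup; the framebuffers follow the same pattern):
for (size_t i = 0; i < swapChainImages.size(); i++) {
    for (uint32_t eye = 0; eye < 2; eye++) { // 0: left, 1: right
        VkImageViewCreateInfo viewInfo{};
        viewInfo.sType = VK_STRUCTURE_TYPE_IMAGE_VIEW_CREATE_INFO;
        viewInfo.image = swapChainImages[i];
        viewInfo.viewType = VK_IMAGE_VIEW_TYPE_2D;
        viewInfo.format = swapChainImageFormat; // same format as the swapchain
        viewInfo.subresourceRange.aspectMask = VK_IMAGE_ASPECT_COLOR_BIT;
        viewInfo.subresourceRange.baseMipLevel = 0;
        viewInfo.subresourceRange.levelCount = 1;
        viewInfo.subresourceRange.baseArrayLayer = eye; // the only per-eye difference
        viewInfo.subresourceRange.layerCount = 1;
        VkImageView* target = (eye == 0) ? &swapChainImageViewsLeft[i] : &swapChainImageViewsRight[i];
        vkCreateImageView(device, &viewInfo, nullptr, target);
    }
}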
don’t forget to use the correct framebuffer when recording your command buffer, and you should be good:
VkRenderPassBeginInfo renderPassInfo{};
renderPassInfo.sType = VK_STRUCTURE_TYPE_RENDER_PASS_BEGIN_INFO;
renderPassInfo.renderPass = renderPass;
renderPassInfo.framebuffer = left ? swapChainFramebuffersLeft[imageIndex] : swapChainFramebuffersRight[imageIndex];
// ...
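putting it all together, a frame could be recorded roughly like this: one render pass per eye, but still a single acquire and a single present (commandBuffer, swapChainExtent, and clearColor being whatever you already use in your frame loop):
for (int eye = 0; eye < 2; eye++) {
    VkRenderPassBeginInfo renderPassInfo{};
    renderPassInfo.sType = VK_STRUCTURE_TYPE_RENDER_PASS_BEGIN_INFO;
    renderPassInfo.renderPass = renderPass;
    renderPassInfo.framebuffer = (eye == 0) ? swapChainFramebuffersLeft[imageIndex]
                                            : swapChainFramebuffersRight[imageIndex];
    renderPassInfo.renderArea.offset = {0, 0};
    renderPassInfo.renderArea.extent = swapChainExtent;
    renderPassInfo.clearValueCount = 1;
    renderPassInfo.pClearValues = &clearColor;
    vkCmdBeginRenderPass(commandBuffer, &renderPassInfo, VK_SUBPASS_CONTENTS_INLINE);
    // bind the pipeline, set the per-eye view matrix, draw...
    vkCmdEndRenderPass(commandBuffer);
}
// both layers are presented together with one vkQueuePresentKHR call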
wrapping up
there might be some extra things to note that i didn’t mention here. this mostly serves as a starting point for other people to work from, one that i didn’t have over a year ago when i first looked into this problem.
sample code for mono and stereo vulkan apps can be found on my github. as far as i could test, neither throws validation errors with the API layer, so you should be able to compare both files for a better view of the differences. pull requests are also welcome if there are relevant additional concepts/features you think are worth showing!