OpenGL source code for Oculus Rift

If you are at all interested in gadgets, graphics, or games, you have surely picked up news about the up-and-coming disruptive technology that is the Oculus Rift. If you haven't, you should definitely check it out.

I have already ordered my developer kit, and I am looking forward to using the head-mounted display as a way to display the UI for my robot.

Since there isn't much to go on yet in the way of code examples, I decided to create example source code for the Oculus Rift using OpenGL. A little tutorial, if you will, for rendering a scene in OpenGL in a way that will be adaptable to the VR gadget when it arrives.

First, some explanation. When you render in 3D, you define what is called a frustum: the view volume of the camera. It is usually shaped as a truncated pyramid extending from your viewpoint into the scene, in the direction you are viewing.

Regular perspective frustum in OpenGL

This is fine when you are rendering to a regular monitor. However, when you want to display the result on any form of stereoscopic display, such as a 3D TV or VR goggles, you have to render TWO frustums, one for each eye.

Stereoscopic perspective frustum in OpenGL

Since most stereoscopic displays today have a moderate field of view, the image is not very immersive. The Oculus Rift changes this by boosting the field of view (also known as the view angle) to 110 degrees. This fills most of your peripheral vision and, together with the stereoscopic 3D effect, gives a very immersive result.

Wide angled stereoscopic perspective frustum (Oculus Rift style) in OpenGL

So how is this done in OpenGL? This entry in the OpenGL FAQ sums it up really nicely.

What are the pros and cons of using glFrustum() versus gluPerspective()? Why would I want to use one over the other?
glFrustum() and gluPerspective() both produce perspective projection matrices that you can use to transform from eye coordinate space to clip coordinate space. The primary difference between the two is that glFrustum() is more general and allows off-axis projections, while gluPerspective() only produces symmetrical (on-axis) projections. Indeed, you can use glFrustum() to implement gluPerspective(). However, aside from the layering of function calls that is a natural part of the GLU interface, there is no performance advantage to using matrices generated by glFrustum() over gluPerspective().
Since glFrustum() is more general than gluPerspective(), you can use it in cases when gluPerspective() can't be used. Some examples include projection shadows, tiled renderings, and stereo views.
Tiled rendering uses multiple off-axis projections to render different sections of a scene. The results are assembled into one large image array to produce the final image. This is often necessary when the desired dimensions of the final rendering exceed the OpenGL implementation's maximum viewport size.
In a stereo view, two renderings of the same scene are done with the view location slightly shifted. Since the view axis is right between the “eyes”, each view must use a slightly off-axis projection to either side to achieve correct visual results.

In other words, the glFrustum call lets you set up a projection matrix with the necessary offset. But how should we go about rendering the scene? The Oculus Rift expects the images for the two eyes to be rendered side by side, so we simply render the scene twice, using the proper viewport each time. Again, from the OpenGL FAQ:

9.060 How can I draw more than one view of the same scene?
You can draw two views into the same window by using the glViewport() call. Set glViewport() to the area that you want the first view, set your scene’s view, and render. Then set glViewport() to the area for the second view, again set your scene’s view, and render.
You need to be aware that some operations don't pay attention to the glViewport, such as SwapBuffers and glClear(). SwapBuffers always swaps the entire window. However, you can restrain glClear() to a rectangular window by using the scissor rectangle.
Your application might only allow different views in separate windows. If so, you need to perform a MakeCurrent operation between the two renderings. If the two windows share a context, you need to change the scene’s view as described above. This might not be necessary if your application uses separate contexts for each window.
Without further ado, here is my working code for a stereoscopic view, which I think will work pretty well with the Oculus Rift from what I have gathered. It might need some tweaking of the projection, since they have been talking about "adjusting for fisheye effect". However, I assume that will be easy to do with a custom projection matrix.

/*
 * StereoView.hpp
 *
 *  Created on: Feb 7, 2013
 *      Author: Lennart Rolland
 */

#ifndef STEREO_VIEW_HPP_
#define STEREO_VIEW_HPP_

#include "GLStuff.hpp"
#include "View.hpp"

using namespace std;
// Degrees-to-radians conversion factor
const float DTR = 0.0174532925f;
// Intraocular distance (distance between eyes, should match the real distance between the eyes of the viewer when realism is a goal)
const float IOD = 0.5f;

class StereoView: public View {

 class Eye {
 public:
  float left;
  float right;
  float bot;
  float top;
  float translation;
  float near;
  float far;

  Eye(float lf, float rf, float bf, float tf, float mt, float near, float far) :
    left(lf), right(rf), bot(bf), top(tf), translation(mt), near(near), far(far) {
  }

  void apply() {
   glMatrixMode(GL_PROJECTION);
   glLoadIdentity();
   //Set view frustum
   glFrustum(left, right, bot, top, near, far);
   //Translate to cancel parallax
   glTranslatef(translation, 0.0, 0.0);
   glMatrixMode(GL_MODELVIEW);
  }
 };

 int w, h;
 float aspect, top, right, shift, distance;
 Eye eyeLeft, eyeRight;
 bool useViewports;

 void init(void) {
  glMatrixMode(GL_PROJECTION);
  glLoadIdentity();
  glMatrixMode(GL_MODELVIEW);
  glLoadIdentity();
 }

 void drawSceneInstance(Scene &scene, Engine &e) {
  glPushMatrix();
  //Translate to screen plane
  glTranslatef(0.0, 0.0, distance);
  //Draw the actual scene (the exact call depends on your Scene/Engine API)
  scene.render(e);
  glPopMatrix();
 }

 void selectEye(bool left) {
  if (useViewports) {
   //Use viewports: each eye gets one half of the window
   const int w2 = w / 2;
   glViewport(left ? 0 : w2, 0, w2, h);
   glScissor(left ? 0 : w2, 0, w2, h);
   glEnable(GL_SCISSOR_TEST);
   //Tint the clear color per eye (red=left, blue=right) to ease debugging
   glClearColor(left ? 1.0 : 0, 0, left ? 0 : 1.0, 1.0);
  } else {
   //Use native OpenGL stereo back buffers
   glDrawBuffer(left ? GL_BACK_LEFT : GL_BACK_RIGHT);
  }
 }

public:
 StereoView(int w = 1280, int h = 720, bool useViewports = true, float near = 3.0, float far = 30.0, float fov = 110, float screenZ = 10.0, float distance = -10.0) :
   w(w), h(h), aspect(float(w) / float(h)), top(near * tan(DTR * fov / 2)), right(aspect * top), shift((IOD / 2) * near / screenZ), distance(distance),
   eyeLeft(-right + shift, right + shift, -top, top, IOD / 2, near, far),
   eyeRight(-right - shift, right - shift, -top, top, -IOD / 2, near, far),
   useViewports(useViewports) {
 }

 virtual ~StereoView() {
 }

 void resize(int w, int h) {
  float fAspect, fHalfWorldSize = (float) (1.4142135623730950488016887242097 / 2);
  glViewport(0, 0, w, h);
  glMatrixMode(GL_PROJECTION);
  glLoadIdentity();
  if (w <= h) {
   fAspect = (GLfloat) h / (GLfloat) w;
   glOrtho(-fHalfWorldSize, fHalfWorldSize, -fHalfWorldSize * fAspect, fHalfWorldSize * fAspect, -10 * fHalfWorldSize, 10 * fHalfWorldSize);
  } else {
   fAspect = (GLfloat) w / (GLfloat) h;
   glOrtho(-fHalfWorldSize * fAspect, fHalfWorldSize * fAspect, -fHalfWorldSize, fHalfWorldSize, -10 * fHalfWorldSize, 10 * fHalfWorldSize);
  }
  glMatrixMode(GL_MODELVIEW);
 }

 void renderView(Scene &scene, Engine &e) {
  gluLookAt(pos.x, pos.y, pos.z, dir.x, dir.y, dir.z, up.x, up.y, up.z);
  //Clear color and depth for all buffers
  glDrawBuffer(GL_BACK);
  glViewport(0, 0, w, h);
  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
  //Left eye
  selectEye(true);
  eyeLeft.apply();
  drawSceneInstance(scene, e);
  //Right eye
  selectEye(false);
  eyeRight.apply();
  drawSceneInstance(scene, e);
  //Restore full-window state
  glViewport(0, 0, w, h);
  glDisable(GL_SCISSOR_TEST);
 }
};

#endif /* STEREO_VIEW_HPP_ */


  2. Good work! neat! Have you tried lens distort compensate too?

  3. Hi, and thanks for the reply!

    I just received my rift developer edition myself, and I haven't had the time yet to see this through. I hope to get time for it soon though. PS: Please let me know if you get anywhere with the distortion!