This answer explains how this task can be accomplished using OpenGL, Qt, and GStreamer. But before I start, there are two points that need to be addressed right away:
- HTML5 streaming video is still problematic. I suggest encoding to Ogg/Theora, as it is better supported by modern browsers than H.264;
- Encoding video and streaming it over HTTP is a rather difficult task without third-party libraries. Take a look at GStreamer (a cross-platform library for processing multimedia). This is what I use here to encode and stream a frame from an OpenGL framebuffer;
What would a roadmap for implementing something like this look like?
Start by capturing frames from the framebuffer. There are various methods that can be used for this purpose, and googling for opengl offscreen rendering will return some interesting posts and documents. I will not go into technical details, as this topic has been widely covered, but for educational purposes I share the code below to demonstrate how to grab a frame and save it as a jpg on disk:
```cpp
// GLWidget is a class based on QGLWidget.
void GLWidget::paintGL()
{
    /* Setup FBO and RBO */
    glGenFramebuffersEXT(1, &_fb);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, _fb);

    glGenRenderbuffersEXT(1, &_color_rb);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, _color_rb);

    GLint viewport[4];
    glGetIntegerv(GL_VIEWPORT, viewport);

    // The renderbuffer needs a valid color internal format such as GL_RGBA8
    // (GL_BGRA is only a pixel transfer format, used below with glReadPixels)
    glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGBA8, viewport[2], viewport[3]);
    glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                                 GL_RENDERBUFFER_EXT, _color_rb);

    /* Draw the scene (with transparency) */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);

    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glLoadIdentity();
    glTranslatef(-2.0f, 0.0f, -7.0f);
    glRotatef(45, 1.0f, 1.0f, 0.0f);
    _draw_cube();

    glLoadIdentity();
    glTranslatef(2.0f, 0.0f, -7.0f);
    glRotatef(30, 0.5f, 1.0f, 0.5f);
    _draw_cube();
    glFlush();

    /* Retrieve pixels from the framebuffer */
    int imgsize = viewport[2] * viewport[3];
    std::cout << "* Viewport size: " << viewport[2] << "x" << viewport[3] << std::endl;

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glReadBuffer(GL_COLOR_ATTACHMENT0);
    unsigned char* pixels = new unsigned char[imgsize * 4];
    glReadPixels(0, 0, viewport[2], viewport[3], GL_BGRA, GL_UNSIGNED_BYTE, pixels);

    // Use fwrite to dump the raw data ("wb" matters on Windows):
    FILE* fp = fopen("dumped.bin", "wb");
    fwrite(pixels, imgsize * 4, 1, fp);
    fclose(fp);

    // or use QImage to encode the raw data to jpg:
    QImage image((const unsigned char*)pixels, viewport[2], viewport[3], QImage::Format_RGB32);
    QImage flipped = image.mirrored(); // glReadPixels returns rows bottom-up
    flipped.save("output.jpg");

    // Disable FBO and RBO
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, 0);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);

    // Delete resources
    glDeleteRenderbuffersEXT(1, &_color_rb);
    glDeleteFramebuffersEXT(1, &_fb);
    delete[] pixels;
}
```
A QImage is used to convert the raw GL_BGRA frame to a jpg file. The _draw_cube() method simply draws a colored cube with transparency.

The next step is to encode the frame and serve it over HTTP. However, you probably won't want to save every frame from the framebuffer to disk before transferring it. And you're right, you don't have to! GStreamer provides an API that you can use in your application to perform the same operations done by gst-launch (presented below). There's even a Qt wrapper for this library, called QtGStreamer, to make things even easier.
GStreamer 1.0 provides a command-line application called gst-launch-1.0, which can be used to test its features before writing any encoding code. Developers usually play with it to assemble a pipeline of instructions that does the magic before committing anything to code.
The following command shows how it can be used to decode a jpg, encode it to Ogg/Theora, and stream this single image over HTTP so that an HTML5 page can play it:
gst-launch-1.0.exe -v filesrc location=output.jpg ! decodebin ! imagefreeze ! clockoverlay shaded-background=true font-desc="Sans 38" ! theoraenc ! oggmux ! tcpserversink host=127.0.0.1 port=8080
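As a side note, once a pipeline works in gst-launch-1.0, it can be moved into your application almost verbatim via gst_parse_launch() from the GStreamer C API. A minimal sketch, assuming GStreamer 1.0 development headers are available (error handling trimmed to the essentials):

```cpp
#include <gst/gst.h>

int main(int argc, char* argv[])
{
    gst_init(&argc, &argv);

    // Same pipeline description used with gst-launch-1.0 above
    GError* error = NULL;
    GstElement* pipeline = gst_parse_launch(
        "filesrc location=output.jpg ! decodebin ! imagefreeze ! "
        "clockoverlay shaded-background=true font-desc=\"Sans 38\" ! "
        "theoraenc ! oggmux ! tcpserversink host=127.0.0.1 port=8080",
        &error);
    if (!pipeline) {
        g_printerr("Failed to build pipeline: %s\n", error->message);
        g_error_free(error);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Block until an error or end-of-stream is posted on the bus
    GstBus* bus = gst_element_get_bus(pipeline);
    GstMessage* msg = gst_bus_timed_pop_filtered(
        bus, GST_CLOCK_TIME_NONE,
        (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));

    if (msg) gst_message_unref(msg);
    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
```

In a real application you would feed frames through an appsrc element instead of filesrc, but the parse-launch approach is the quickest way to reuse a tested pipeline.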
The third and final step is to open the HTML5 page created to display the stream. This step must be done while gst-launch is still running, so copy and paste the code below into a file and open that page in your browser (I tested it in Chrome). The page connects to localhost, port 8080, and starts receiving the stream. You may have noticed that the gst-launch pipeline overlays a clock on the original image:
```html
<html>
  <head>
    <title>A simple HTML5 video test</title>
  </head>
  <body>
    <video autoplay controls width=320 height=240>
      <source src="http://localhost:8080" type="video/ogg">
      Your browser doesn't support the <code>video</code> element.
    </video>
  </body>
</html>
```

I'm just trying to figure out exactly how GStreamer can convert a raw BGRA frame to jpg (or other formats) before streaming it.
Update:
The problem is solved! It is possible to encode a raw BGRA frame to jpg or Ogg and stream it directly without writing intermediate files to disk. I took the liberty of capping the FPS at 15 and reducing theoraenc's default quality by 50%:
gst-launch-1.0.exe -v filesrc location=dumped.bin blocksize=1920000 ! video/x-raw,format=BGRA,width=800,height=600,framerate=1/1 ! videoconvert ! video/x-raw,format=RGB,framerate=1/1 ! videoflip method=vertical-flip ! imagefreeze ! videorate ! video/x-raw,format=RGB,framerate=30/2 ! videoconvert ! clockoverlay shaded-background=true font-desc="Sans 38" ! theoraenc quality=24 ! oggmux ! queue ! tcpserversink host=127.0.0.1 port=8080 sync-method=2
There are several operations in this pipeline that you may not really need. However, if you want to save bandwidth, there are a few things you can do: scale the frame down to a smaller size (400x300), set a lower FPS limit, reduce the quality of the encoded frame, and so on:
gst-launch-1.0.exe -v filesrc location=dumped.bin blocksize=1920000 ! video/x-raw,format=BGRA,width=800,height=600,framerate=1/1 ! videoconvert ! video/x-raw,format=RGB,framerate=1/1 ! videoflip method=vertical-flip ! videoscale ! video/x-raw,width=400,height=300 ! imagefreeze ! videorate ! video/x-raw,format=RGB,framerate=30/2 ! videoconvert ! clockoverlay shaded-background=true font-desc="Sans 38" ! theoraenc quality=24 ! oggmux ! tcpserversink host=127.0.0.1 port=8080 sync-method=2