I experimented with Qt WebKit to render a video element in a browser. All my frames are decoded in a C++ application (the original video packets are received over the network), and I can display the video in the element using Qt, as described here: QtWebEngine displaying Qt's own widgets in the DOM? However, Qt WebKit's z-index problems are a blocker for me :(
Instead, I was wondering whether either of the following is possible, and whether anyone has achieved anything like this before.
I have a C++ application that launches a Chromium Embedded Framework window (essentially a browser window). Is there a way my C++ application can render video directly into this browser window, using WebGL or a similar library? Perhaps I could use OpenGL in the C++ application to write into the graphics card's memory and have WebGL read that data back; it would be great if such a method existed, but I'm afraid the sandbox will not allow it.
e.g.
Browser WebGL surface/object <--- C++ application <--- file/network data
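For what it's worth, the WebGL end of such a pipeline is the easy part once pixel data is somehow available to the page; the hard (and probably sandbox-blocked) part is getting the decoded frame into the page without a copy. A minimal sketch of the consumption end, assuming a raw RGBA frame has already arrived as a Uint8Array (the function name and frame format are illustrative assumptions, not anything CEF provides):

```typescript
// Sketch: render one raw RGBA frame to a WebGL canvas as a full-screen quad.
// `frame` is assumed to be width*height*4 bytes of RGBA pixels; how it
// reaches the page from the native decoder is exactly the open question.
function drawFrame(canvas: HTMLCanvasElement, frame: Uint8Array,
                   width: number, height: number): void {
  const gl = canvas.getContext("webgl") as WebGLRenderingContext;

  const vsSrc = `
    attribute vec2 pos;
    varying vec2 uv;
    void main() {
      uv = pos * 0.5 + 0.5;              // clip space [-1,1] -> tex coords [0,1]
      gl_Position = vec4(pos, 0.0, 1.0);
    }`;
  const fsSrc = `
    precision mediump float;
    varying vec2 uv;
    uniform sampler2D frameTex;
    void main() {
      gl_FragColor = texture2D(frameTex, vec2(uv.x, 1.0 - uv.y)); // flip rows
    }`;

  const compile = (type: number, src: string): WebGLShader => {
    const s = gl.createShader(type)!;
    gl.shaderSource(s, src);
    gl.compileShader(s);
    return s;
  };
  const prog = gl.createProgram()!;
  gl.attachShader(prog, compile(gl.VERTEX_SHADER, vsSrc));
  gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, fsSrc));
  gl.linkProgram(prog);
  gl.useProgram(prog);

  // Two triangles covering the whole canvas.
  const buf = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, buf);
  gl.bufferData(gl.ARRAY_BUFFER,
    new Float32Array([-1, -1, 1, -1, -1, 1, 1, 1]), gl.STATIC_DRAW);
  const loc = gl.getAttribLocation(prog, "pos");
  gl.enableVertexAttribArray(loc);
  gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

  // Upload the decoded pixels into a texture and draw.
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                gl.RGBA, gl.UNSIGNED_BYTE, frame);
  gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
}
```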
A (very!) naive approach would be for the browser window to connect to the main C++ application via a WebSocket on the loopback address, and to stream the decoded frames to the browser over that socket. The frames could then be drawn onto an HTML5 canvas or via WebGL; this would probably be terrible in terms of latency.
e.g.
Browser <--- WebSocket <--- C++ decode <--- file/network data
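To make the WebSocket idea concrete, here is a minimal sketch of the browser end, assuming the C++ application runs a WebSocket server on loopback and sends each decoded frame as one binary message of raw RGBA bytes. The port, the fixed frame size, and the one-message-per-frame framing are all assumptions for illustration; a real protocol would need a small header carrying dimensions and timestamps:

```typescript
// Sketch of approach 2's browser side: receive raw RGBA frames over a
// loopback WebSocket and blit them onto an HTML5 canvas.
// Assumed: port 9002, fixed 640x480 frames, one binary message per frame.
const WIDTH = 640;
const HEIGHT = 480;

const canvas = document.getElementById("video") as HTMLCanvasElement;
const ctx = canvas.getContext("2d") as CanvasRenderingContext2D;

const ws = new WebSocket("ws://127.0.0.1:9002");
ws.binaryType = "arraybuffer";             // avoid per-message Blob overhead

ws.onmessage = (ev: MessageEvent) => {
  const pixels = new Uint8ClampedArray(ev.data as ArrayBuffer);
  if (pixels.length !== WIDTH * HEIGHT * 4) return;   // drop malformed frames
  const img = new ImageData(pixels, WIDTH, HEIGHT);
  ctx.putImageData(img, 0, 0);             // one full-frame copy per message
};
```

As a rough sanity check on the bandwidth: uncompressed 640x480 RGBA at 30 fps is about 35 MB/s, which loopback can carry easily; the latency cost is more likely to come from the extra copies and the work landing on the browser's main thread.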
Thank you very much, and I would be grateful for any other suggestions of alternative libraries I could use for this :)