I'm trying to read raw video frames from a video capture device using JMF (Java Media Framework).
I've got the "capture" part working: using the Player object created by the Manager, I can display live video from a webcam. What I don't know is how to write a custom component that gets at the actual frames; this is probably because, so far, the Manager has created every class instance I needed for me.
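For reference, here's roughly the capture setup I have working so far (a sketch only; I'm just grabbing the first device that reports an RGB video format):

```java
import java.util.Vector;
import javax.media.CaptureDeviceInfo;
import javax.media.CaptureDeviceManager;
import javax.media.Manager;
import javax.media.Player;
import javax.media.format.VideoFormat;
import javax.swing.JFrame;

public class LiveView {
    public static void main(String[] args) throws Exception {
        // Ask JMF for devices that deliver raw RGB video; take the first one.
        Vector devices = CaptureDeviceManager.getDeviceList(new VideoFormat(VideoFormat.RGB));
        CaptureDeviceInfo device = (CaptureDeviceInfo) devices.firstElement();

        // The Manager builds a realized Player straight from the device's locator.
        Player player = Manager.createRealizedPlayer(device.getLocator());

        JFrame frame = new JFrame("Live capture");
        frame.add(player.getVisualComponent()); // AWT component the Player renders into
        frame.pack();
        frame.setVisible(true);

        player.start();
    }
}
```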
I'd like to start by writing a GUI component that displays the video. (I'm not familiar with AWT/Swing, but based on experience with other GUI toolkits, I'd guess something derived from, say, JPanel, which draws a frame when asked to repaint or when a new frame arrives.) I'd also like to be able to process each new frame, looping over all of its pixels in x/y. My device exposes a raw RGB format, but an automatic conversion from, say, YUV wouldn't hurt either.
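To make the goal concrete, this is the kind of component I have in mind (a hypothetical sketch; VideoPanel, updateFrame, and inspectPixels are my own names, and it assumes frames already arrive as BufferedImages):

```java
import java.awt.Graphics;
import java.awt.image.BufferedImage;
import javax.swing.JPanel;

// Hypothetical panel: repaints whenever a new frame is pushed to it.
public class VideoPanel extends JPanel {
    private volatile BufferedImage frame;

    // Called from the capture side with each new frame.
    public void updateFrame(BufferedImage newFrame) {
        frame = newFrame;
        repaint();
    }

    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        if (frame != null) {
            g.drawImage(frame, 0, 0, null);
        }
    }

    // Example per-pixel pass: loop over x/y and read packed RGB values.
    private void inspectPixels(BufferedImage img) {
        for (int y = 0; y < img.getHeight(); y++) {
            for (int x = 0; x < img.getWidth(); x++) {
                int rgb = img.getRGB(x, y); // 0xAARRGGBB
                // ...process the pixel...
            }
        }
    }
}
```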
I don't know where to start. The JMF documentation recommends, in several different places, deriving my class from Processor or DataSink. The Processor interface seems like overkill: for example, I wouldn't need the playback and synchronization controls, and I wouldn't know how to implement them in the first place. DataSink looks like the simpler option, with fewer abstract methods I don't need. In either case, though, I'm at a complete loss as to how to:
a) connect the component to my video-capture DataSource
b) access the actual frame buffers from within the class (the one lead I've found so far is sketched right after this list)
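That lead is the FrameGrabbingControl that a Player may expose. If I've understood the docs correctly, something like the following should pull a single frame as a Buffer and convert it via javax.media.util.BufferToImage; this is an untested sketch, and not every capture DataSource may support the control:

```java
import java.awt.Image;
import javax.media.Buffer;
import javax.media.Player;
import javax.media.control.FrameGrabbingControl;
import javax.media.format.VideoFormat;
import javax.media.util.BufferToImage;

public class FrameGrab {
    // Grab the current frame from a started Player, or null if unsupported.
    public static Image grab(Player player) {
        FrameGrabbingControl fgc = (FrameGrabbingControl)
                player.getControl("javax.media.control.FrameGrabbingControl");
        if (fgc == null) {
            return null; // this Player doesn't expose frame grabbing
        }
        Buffer buf = fgc.grabFrame();
        BufferToImage converter = new BufferToImage((VideoFormat) buf.getFormat());
        return converter.createImage(buf); // a java.awt.Image the panel above could draw
    }
}
```

If that works, it would at least cover (b); for (a) I assume the Player itself is created from the capture device's locator as in the first sketch, so the control rides along with it.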
Maybe I'm even heading in the wrong direction entirely; I just wanted to document what I've tried so far. The JMF documentation seems sparse and is mostly geared toward building media players and transcoders.
Of course, it would be great if it were as simple as x = new Image(captureDevice.getFrame()), but something tells me that getting an Image out won't be that easy.
Thanks in advance for any pointers on JMF.