I have a very simple question regarding Android and ffmpeg. I got ffmpeg from http://bambuser.com/opensource and was able to compile it for ARM.
The result is an ffmpeg binary, as well as several libsomething.so files.
My question is: is this enough to decode video? How do I actually use ffmpeg then?
To load the library, I do:
static { System.load("/data/data/com.package/lib/libavcodec.so"); }
It loads fine. But then what?
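As far as I understand, loading the .so only maps it into the process; to call any of its functions from Java I would still need a thin JNI wrapper in C, compiled into its own library, whose exported symbols follow the Java_package_Class_method naming convention. A minimal sketch of what I think that looks like (the package, class, and library names com.example, FFmpegBridge, and ffmpegbridge are made up):

    #include <jni.h>
    #include <libavcodec/avcodec.h>

    /* Matching Java side (names are made up):
     *   package com.example;
     *   public class FFmpegBridge {
     *       static {
     *           // dependencies first, wrapper library last
     *           System.loadLibrary("avcodec");
     *           System.loadLibrary("ffmpegbridge");
     *       }
     *       public static native int avcodecVersion();
     *   }
     */
    JNIEXPORT jint JNICALL
    Java_com_example_FFmpegBridge_avcodecVersion(JNIEnv *env, jclass clazz)
    {
        /* Calls into libavcodec.so just to prove the linkage works. */
        return (jint) avcodec_version();
    }

From what I have read, the Android linker (at least on older releases) does not resolve dependencies between app-private libraries automatically, which would explain why each .so has to be loaded explicitly in dependency order before the wrapper.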
More explanation: I have seen other projects where people kept the ffmpeg source inside the project's JNI directory, together with some Android.mk files and some C code. Do I need this? Why would I first create .so files and then copy the ffmpeg source code again?
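My current guess at why: the C wrapper needs ffmpeg's headers to compile, and ndk-build needs an Android.mk that links against the prebuilt .so files rather than rebuilding all of ffmpeg. Roughly like this, where the module names and the prebuilt/ and include/ paths are just my guesses:

    LOCAL_PATH := $(call my-dir)

    # Wrap the already-built libavcodec.so as a prebuilt module
    include $(CLEAR_VARS)
    LOCAL_MODULE := avcodec-prebuilt
    LOCAL_SRC_FILES := prebuilt/libavcodec.so
    include $(PREBUILT_SHARED_LIBRARY)

    # The JNI wrapper, compiled from C and linked against the prebuilt lib
    include $(CLEAR_VARS)
    LOCAL_MODULE := ffmpegbridge
    LOCAL_SRC_FILES := ffmpegbridge.c
    LOCAL_C_INCLUDES := $(LOCAL_PATH)/include
    LOCAL_SHARED_LIBRARIES := avcodec-prebuilt
    include $(BUILD_SHARED_LIBRARY)

If that is right, PREBUILT_SHARED_LIBRARY would be the ndk-build mechanism for shipping already-compiled libraries, so only the headers (not the whole ffmpeg source tree) would be needed in the project.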
I know the NDK and how it is supposed to work, but I have never seen an example of how to call ffmpeg functions through it: people seem to hide their implementations (which is understandable), but they do not even give useful pointers or examples.
Let's say I want to decode a video file. What kind of native methods do I need to implement? How do I structure the project? Which data types need to be passed around? etc. There must be at least a few people who have done this; I know that from searching for hours.
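Concretely, what I imagine the native side would have to do is something like the sketch below, pieced together from the ffmpeg documentation and tutorials. The exact function names depend on the ffmpeg version (the bambuser build is fairly old; on very old builds the equivalents of avformat_open_input()/avcodec_open2() are av_open_input_file()/avcodec_open()), and I have left out cleanup on error paths:

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    /* Rough decode skeleton against the old (pre-4.0) ffmpeg API. */
    static int decode_file(const char *path)
    {
        AVFormatContext *fmt = NULL;
        AVCodecContext *ctx;
        AVCodec *codec;
        AVFrame *frame;
        AVPacket pkt;
        int i, video_stream = -1, got_frame;

        av_register_all();                    /* register all demuxers/decoders */
        if (avformat_open_input(&fmt, path, NULL, NULL) != 0)
            return -1;
        if (avformat_find_stream_info(fmt, NULL) < 0)
            return -1;

        for (i = 0; i < fmt->nb_streams; i++) /* pick the first video stream */
            if (fmt->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
                video_stream = i;
                break;
            }
        if (video_stream < 0)
            return -1;

        ctx = fmt->streams[video_stream]->codec;
        codec = avcodec_find_decoder(ctx->codec_id);
        if (!codec || avcodec_open2(ctx, codec, NULL) < 0)
            return -1;

        frame = avcodec_alloc_frame();        /* av_frame_alloc() on newer builds */
        while (av_read_frame(fmt, &pkt) >= 0) {
            if (pkt.stream_index == video_stream) {
                avcodec_decode_video2(ctx, frame, &got_frame, &pkt);
                if (got_frame) {
                    /* frame->data[] now holds the decoded (usually YUV) planes;
                     * this is what a native method would hand back to Java. */
                }
            }
            av_free_packet(&pkt);
        }

        av_free(frame);
        avcodec_close(ctx);
        avformat_close_input(&fmt);
        return 0;
    }

I suppose the Java-facing question is then which of these steps to wrap as native methods: one call to open, one to pull the next frame into a byte[] or Bitmap, and one to close?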