Porting FFmpeg to Android
Background
If you’re not yet familiar with FFmpeg, take a look at FFmpeg’s commonly used commands. Study its features—such as audio/video merging, decoding and playback, clip extraction, turning multiple images into a video with audio, and format conversion—to see if it can add power and fun to your app.
For example, apps like “voice-over shows” make dubbing or adding a voice track a highly engaging user feature. We decided to develop such a dubbing module and discovered FFmpeg as a strong candidate to provide these capabilities on Android. It took roughly 3–4 days of attempts, with many online guides proving unhelpful—until I stumbled on a tutorial titled “android usage of ffmpeg and how to invoke it,” which finally led me to a successful port. Through experimentation, I found that FFmpeg 1.2 combined with ndk-r9 was the only combination that worked in my case.
Approach
FFmpeg is written in C and includes a main() function. Our plan was:
- Use the Android NDK to compile libffmpeg.so.
- Then compile (and slightly modify) ffmpeg.c, renaming its main() function to video_merge(int argc, char **argv). This allows us to call it directly from JNI in order to perform video merging and other operations.
For instance, to merge videos (akin to running ffmpeg -i src1 -i src2 -y output in a terminal), you might do something like:
video_merge(7, argv); // where argv simulates the seven command-line tokens: ffmpeg, -i, src1, -i, src2, -y, output
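As a sketch of how the JNI side might build those arguments and forward them to FFmpeg (the class name com.example.ffmpeg.MyFFmpeg matches the javah example later in this post, but the method name videoMerge and its String[] signature are my own assumptions, not the original code):

#include <jni.h>
#include <stdlib.h>
#include <string.h>

/* Declared in the modified ffmpeg.c (the renamed main()). */
int video_merge(int argc, char **argv);

/* Hypothetical JNI entry point: receives the ffmpeg arguments as a Java
 * String[] and forwards them to video_merge(). */
JNIEXPORT jint JNICALL
Java_com_example_ffmpeg_MyFFmpeg_videoMerge(JNIEnv *env, jobject thiz, jobjectArray args)
{
    int argc = (*env)->GetArrayLength(env, args);
    char **argv = malloc(sizeof(char *) * argc);
    int i, ret;

    for (i = 0; i < argc; i++) {
        jstring s = (jstring) (*env)->GetObjectArrayElement(env, args, i);
        const char *utf = (*env)->GetStringUTFChars(env, s, NULL);
        argv[i] = strdup(utf);  /* copy so FFmpeg can keep using the string */
        (*env)->ReleaseStringUTFChars(env, s, utf);
        (*env)->DeleteLocalRef(env, s);
    }

    /* e.g. args = {"ffmpeg", "-i", "src1", "-i", "src2", "-y", "output"} */
    ret = video_merge(argc, argv);

    for (i = 0; i < argc; i++)
        free(argv[i]);
    free(argv);
    return ret;
}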
Environment
- Operating System: Ubuntu 12.04
- FFmpeg Version: 1.2
- NDK Version: ndk-r9
Before you start, it’s a good idea to browse other tutorials or guides about FFmpeg on Android. Use this article if you get stuck, since it covers potential pitfalls that might appear under certain circumstances.
Modifying the Interface and Android.mk
When creating your JNI interface for FFmpeg, you will write an Android.mk file to link libraries and produce a usable .so library. Some example Android.mk files from the web might fail in your specific environment. Essentially, the Android.mk file tells the NDK which source files to compile and which libraries to link to.
My method involves a two-step approach:
- First, compile a shared library named myffmpeg.
- In a second module (e.g., ffmpeg-jni), link against myffmpeg to produce the final .so.
Be sure to place the compiled libffmpeg.so in your jni directory so that the linker can find it.
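For reference, a two-module Android.mk along these lines might look like the sketch below; the source file name ffmpeg-jni.c and the header include path are assumptions for illustration, not the original project's layout:

LOCAL_PATH := $(call my-dir)

# Module 1: wrap the prebuilt libffmpeg.so placed in the jni/ directory.
include $(CLEAR_VARS)
LOCAL_MODULE := myffmpeg
LOCAL_SRC_FILES := libffmpeg.so
include $(PREBUILT_SHARED_LIBRARY)

# Module 2: the JNI glue (hypothetical source file ffmpeg-jni.c),
# linked against myffmpeg; -llog is needed for Logcat output.
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg-jni
LOCAL_SRC_FILES := ffmpeg-jni.c
# Adjust this include path to wherever your FFmpeg headers live.
LOCAL_C_INCLUDES := $(LOCAL_PATH)/ffmpeg
LOCAL_SHARED_LIBRARIES := myffmpeg
LOCAL_LDLIBS := -llog
include $(BUILD_SHARED_LIBRARY)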
Debugging FFmpeg
After porting FFmpeg to Android, you’ll likely use JNI to call its functions. You may need to debug the C layer, which is much easier if you can see detailed logs—just like running FFmpeg on the command line.
In Eclipse, if you hold the Ctrl key and click a function like av_log, it should trace you into ffmpeg/libavutil/log.c, where av_log_default_callback calls Android’s __android_log_print to output logs to Logcat. By inspecting these logs, you can troubleshoot failures related to unsupported formats, merging issues, etc.
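If your own FFmpeg tree does not already forward these messages to Logcat (in an unmodified FFmpeg, av_log_default_callback writes to stderr, which is not visible in Logcat), you can install a custom callback instead. A minimal sketch, where ffmpeg_init_logging is a hypothetical helper name:

/* Sketch: route FFmpeg's logging to Logcat. Assumes -llog is in
 * LOCAL_LDLIBS; call ffmpeg_init_logging() once before video_merge(). */
#include <stdarg.h>
#include <android/log.h>
#include "libavutil/log.h"

static void logcat_callback(void *ptr, int level, const char *fmt, va_list vl)
{
    if (level > av_log_get_level())
        return;
    __android_log_vprint(ANDROID_LOG_INFO, "ffmpeg", fmt, vl);
}

void ffmpeg_init_logging(void)
{
    av_log_set_level(AV_LOG_VERBOSE);
    av_log_set_callback(logcat_callback);
}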
If you’re encountering crashes, run:
adb shell logcat | ndk-stack -sym obj/local/armeabi
to see where the exception occurred. Also, note that if FFmpeg’s original main() ends with exit(0), you’ll need to remove or comment it out, or your entire app will close.
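In the renamed entry point, that simply means returning to the JNI caller instead of terminating the process. A minimal sketch of the change (the exact cleanup code depends on the FFmpeg version):

/* End of the renamed entry point in the modified ffmpeg.c. */
int video_merge(int argc, char **argv)
{
    /* ... body of the original main() ... */

    /* was: exit(0);  -- exit() would kill the whole Android app */
    return 0;
}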
Memory Leaks & the Service Approach
Sometimes, calling FFmpeg repeatedly can trigger errors like "INVALID HEAP ADDRESS IN dlfree ffmpeg", often caused by memory leaks within FFmpeg. One workaround is to place the entire merging/encoding process in a dedicated Service, and kill the Service upon completion to clear memory.
<!-- AndroidManifest.xml -->
<service android:name=".FFmpegService" />
Then, register a Receiver to end the Service once the merge is done. This way, you avoid repeated calls without proper cleanup in the same process.
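Note that stopping a Service that lives in the app’s main process does not by itself release FFmpeg’s native heap; for the kill-the-Service trick to free memory, the Service usually has to run in its own process, which is then killed when the broadcast arrives. A manifest sketch (the android:process attribute is my assumption about the setup, and FFmpegFinishReceiver is a hypothetical name):

<!-- AndroidManifest.xml (sketch): the Service runs in a separate process,
     so killing that process frees all of FFmpeg's memory at once -->
<service
    android:name=".FFmpegService"
    android:process=":ffmpeg" />
<receiver android:name=".FFmpegFinishReceiver" />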
Potential Issues
- AAC Playback Failure: Certain devices (like the Xiaomi 2s) may fail to play back AAC audio files with the default MediaPlayer.
- Codec Support: To support formats like AMR-NB or MP3, you must enable them when compiling FFmpeg. If the build script can’t locate the required libraries/headers, the build fails (a configure sketch follows this list).
- Slow Merge Times: Merging a 10-second 1280×720 video clip with audio can take 30–60 seconds on some devices, which may frustrate users. You might let them preview the recording before merging, or offload the task to a server.
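As an illustration of the codec-related flags only (a sketch; the cross-compile details such as --cross-prefix and --sysroot are omitted, and the external library locations depend on your environment), an FFmpeg 1.2 configure line might include:

# Sketch: enable AMR-NB (via opencore-amr) and MP3 (via LAME) support.
# configure fails if the corresponding libraries/headers cannot be found.
./configure \
  --enable-cross-compile --target-os=linux --arch=arm \
  --enable-version3 \
  --enable-libopencore-amrnb \
  --enable-libmp3lame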
For “voice-over shows” specifically:
- The original video, subtitle data, and partial audio (with certain segments removed for dubbing) are usually downloaded ahead of time.
- The user records their voice. To create the final output, you just mix the newly recorded audio with the pre-silenced sections (an example mixing command follows this list).
- If local merges are too slow, you could upload the necessary data to a server, let the server handle merging, and then download the finished file.
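As a sketch of that mixing step (the file names are hypothetical, and muxing the result back with the video is a separate step), the amix filter can combine the two tracks; with FFmpeg 1.2 the native AAC encoder also needs -strict experimental:

ffmpeg -i show_audio.m4a -i user_recording.aac \
       -filter_complex amix=inputs=2:duration=first \
       -strict experimental -y mixed_audio.m4a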
Using the NDK in Eclipse
You don’t have to run ndk-build from the command line each time. In Eclipse, simply right-click the project, choose Android Tools → Add Native Support, and Eclipse will automatically invoke ndk-build when you run the project.
Auto-Generating JNI Header Files
Manually creating JNI headers can be tedious. Instead, you can use the javah command. In Eclipse, you could configure it as an external tool, for example:
javah -jni -classpath bin/classes -d jni com.example.ffmpeg.MyFFmpeg
Run it, and it will generate a file like com_example_ffmpeg_MyFFmpeg.h inside the jni folder. Then, simply include it in your C code and implement the corresponding function signatures.
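For a hypothetical native method such as public native int videoMerge(String[] args) declared in MyFFmpeg (my assumption; the original class’s method names are not shown in this post), the generated header would contain a declaration along these lines, matching the wrapper sketched in the Approach section above:

/* Excerpt of what com_example_ffmpeg_MyFFmpeg.h might declare for the
 * hypothetical videoMerge(String[]) native method. */
#include <jni.h>

JNIEXPORT jint JNICALL
Java_com_example_ffmpeg_MyFFmpeg_videoMerge(JNIEnv *, jobject, jobjectArray);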
Conclusion
Porting FFmpeg to Android involves multiple skill sets: NDK config, C/C++ compilation and linking, JNI bridging, and audio/video codecs. If you face issues like failing merges, missing formats, or linker errors, you’ll need to carefully check configurations and logs. Hopefully, this post helps you avoid some common pitfalls. If you’re also working on FFmpeg for Android or encountering tricky problems, feel free to share your experience or questions in the comments, and let’s learn together!