In general, MediaCodec is the API that would be recommended. The OpenMAX AL API was added as a stopgap measure. Stagefright is the successor to OpenCore on the Android platform, compliant with OpenMAX IL, and shipped in Gingerbread and later Android distributions.

Author: Mazum Vukasa
Country: Iran
Language: English (Spanish)
Genre: Art
Published (Last): 24 April 2014
Pages: 68
PDF File Size: 15.92 Mb
ePub File Size: 12.72 Mb
ISBN: 566-7-47277-210-9
Downloads: 86152
Price: Free* [*Free Registration Required]
Uploader: Mill

I will double check. Keeping your timing in line is thus relatively easy, and it will work. I am open to any other framework, free or commercial, that would accomplish the above. OpenMAX AL does not give you direct access to the decoded data either; it is played back directly. Like I said, there really isn’t one standard here yet.

I will comment on my final approach for others’ benefit. Please note that if you use OpenMAX, you’re tacitly going to have to remember that it’s not an audio renderer; you will have to take the decoded audio and feed it into OpenSL ES to get something working.

Media architecture: at the application framework level is application code that utilizes the android.media APIs to interact with the multimedia hardware. Stagefright comes with built-in software codecs for common media formats, but you can also add your own custom hardware codecs as OpenMAX components.

There was a nice presentation I saw QC give a while ago on how to recompile the Android system to give you QC libraries you can package with your application that give full support for OpenMAX but I can no longer find that presentation.


It is practically deprecated, even though I’m not sure there’s any official statement saying so. I saw that OpenMAX is not fully implemented and lacks support in terms of documentation, examples, etc.

The advantages of using OpenMAX are actually pretty phenomenal. Build your plugin as a shared library with the name libstagefrighthw.so. It allows companies that build platforms, e.g.
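As a sketch of what building that plugin library looks like, here is a minimal Android.mk in the classic AOSP build-system style; the source file name and library dependencies are illustrative assumptions, not taken from any particular vendor tree:

```makefile
# Vendor OMX plugin for Stagefright. The module name must be
# libstagefrighthw so the framework can find and load it at runtime.
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)

LOCAL_MODULE           := libstagefrighthw
LOCAL_SRC_FILES        := MyOMXPlugin.cpp   # hypothetical plugin source
LOCAL_SHARED_LIBRARIES := libutils liblog
LOCAL_MODULE_TAGS      := optional

include $(BUILD_SHARED_LIBRARY)
```

The resulting shared library is installed into the system image, where Stagefright discovers it by its fixed name.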

Android includes Stagefright, a media playback engine at the native level that has built-in software-based codecs for popular media formats.

Forums – MediaCodec vs OpenMAX as implementation interface

Ketan: Unfortunately, Google isn’t providing a complete implementation; so in this case it really falls down. A platform can be compliant with one or both of these profiles by providing all features included in a profile. MediaCodec does not support any container format at all on its own; you as the caller are supposed to take care of that. I thought MediaCodec could stream MP4, WebM, etc. formats, as mentioned online.
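To make the container-versus-codec split concrete, here is a sketch of the usual division of labor: MediaExtractor demuxes the MP4/WebM container, and MediaCodec only ever sees the elementary stream. The class name, method name, and file path are illustrative, not part of the Android API:

```java
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import java.io.IOException;

public final class VideoTrackOpener {
    // Sketch: demux with MediaExtractor, then hand only the selected
    // track's format to MediaCodec. The codec never parses the container.
    static MediaCodec openVideoDecoder(String path) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path); // e.g. a local .mp4 or .webm file

        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            String mime = format.getString(MediaFormat.KEY_MIME);
            if (mime != null && mime.startsWith("video/")) {
                extractor.selectTrack(i);
                // Decoder is chosen from the track's MIME, e.g. "video/avc".
                MediaCodec codec = MediaCodec.createDecoderByType(mime);
                codec.configure(format, null /* surface */, null, 0);
                codec.start();
                return codec;
            }
        }
        throw new IOException("no video track in " + path);
    }
}
```

This only covers local files; for network streaming of formats MediaExtractor does not handle, the demuxing step has to come from elsewhere.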

It is an application-level, C-language multimedia API designed for resource-constrained devices. It provides abstractions for routines that are especially useful for processing of audio, video, and still images. I would be doing some processing on each video frame. To do this, you must create the OMX components and an OMX plugin that hooks your custom codecs into the Stagefright framework.

So if you want to do streaming playback of a format other than MPEG-TS, you need to handle extraction of the packets yourself, or use some other library, such as libavformat, for that task.

Understanding Android Stagefright Internals (II) – OpenMAX IL and Stagefright overview

Architecture: media applications interact with the Android native multimedia framework according to the following architecture. Hope this helps other people.


Components can be sources, sinks, codecs, filters, splitters, mixers, or any other data operator.

Avoid writing your own time sync for audio and video. Hi mstorsjo, thank you for the quick pros-and-cons analysis. MediaCodec does give you direct access to the decoded output data, but to present it, you need to handle sync manually. Is this assumption correct? OpenMAX provides three layers of interfaces: the application layer (AL), the integration layer (IL), and the development layer (DL). If you need to do processing of the decoded frames, MediaCodec is probably the only way to go. Its downsides: it requires you to handle sync manually, and it is quite low-level, requiring you to do a lot of work. For extracting individual packets of data, there’s the MediaExtractor class, which will be useful with some common file formats for static files.
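The packet-by-packet feeding that MediaCodec requires can be sketched with the pre-API-21 synchronous buffer interface, assuming an already-configured MediaExtractor and a started MediaCodec; error handling and output-format changes are omitted, and the class and method names are illustrative:

```java
import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaExtractor;

public final class DecodeLoop {
    // Sketch: pump demuxed packets into MediaCodec and drain the
    // decoded output buffers until the end-of-stream flag comes back.
    static void pump(MediaExtractor extractor, MediaCodec codec) {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false;
        boolean outputDone = false;
        while (!outputDone) {
            if (!inputDone) {
                int inIndex = codec.dequeueInputBuffer(10_000 /* us */);
                if (inIndex >= 0) {
                    ByteBuffer inBuf = codec.getInputBuffers()[inIndex];
                    int size = extractor.readSampleData(inBuf, 0);
                    if (size < 0) { // no more packets: flush the codec
                        codec.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        codec.queueInputBuffer(inIndex, 0, size,
                                extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int outIndex = codec.dequeueOutputBuffer(info, 10_000);
            if (outIndex >= 0) {
                // info.presentationTimeUs is what A/V sync keys off.
                codec.releaseOutputBuffer(outIndex, false /* don't render */);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    outputDone = true;
                }
            }
        }
    }
}
```

Note that nothing in this loop paces playback; presenting each buffer at the right time is entirely the caller's job.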

OpenMAX – Wikipedia

What’s sad is that the different levels of support, even amongst different NDK versions, have created a situation where it’s not easy to create sample code. If you use MediaCodec, you would need to handle sync of audio and video during playback yourself. Hi, I agree completely.
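To make the manual-sync point concrete, here is a minimal, Android-free sketch of the bookkeeping involved: compare how far the stream’s presentation timestamps have advanced against elapsed wall-clock time, and wait out the difference before showing a frame. The class and method names are illustrative, not part of any Android API; a real player would use the audio position as the master clock.

```java
// Minimal sketch of the A/V sync arithmetic MediaCodec leaves to you.
public final class AvClock {
    private final long startRealtimeUs; // wall-clock time when playback began
    private final long firstPtsUs;      // presentation timestamp of first frame

    public AvClock(long startRealtimeUs, long firstPtsUs) {
        this.startRealtimeUs = startRealtimeUs;
        this.firstPtsUs = firstPtsUs;
    }

    /** Microseconds to wait before presenting a frame with the given PTS;
     *  a negative result means the frame is late and should be dropped. */
    public long delayUntilUs(long ptsUs, long nowRealtimeUs) {
        long mediaElapsedUs = ptsUs - firstPtsUs;
        long wallElapsedUs = nowRealtimeUs - startRealtimeUs;
        return mediaElapsedUs - wallElapsedUs;
    }

    public static void main(String[] args) {
        AvClock clock = new AvClock(1_000_000, 0);
        // Frame with PTS 40 ms, asked 10 ms of wall time into playback:
        System.out.println(clock.delayUntilUs(40_000, 1_010_000)); // prints 30000
    }
}
```

On Android the wall clock would typically come from System.nanoTime(), and frames whose delay is far negative would be dropped rather than rendered.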

Everything else: with MediaCodec, you need to provide individual packets of data to decode.