OpenMAX AL is a royalty-free, cross-platform open standard for accelerating the capture and presentation of audio, video, and images in multimedia applications on embedded and mobile devices. The OpenMAX IL (Integration Layer) API defines a standardized media-component interface that lets developers and platform providers integrate and communicate with multimedia codecs implemented in hardware or software. Implementations should conform to the same version of the specification to remain interoperable. Development of multimedia hardware platforms is gathering pace as consumer demand grows for improved functionality from applications such as video, audio, voice, and 3D, on platforms as diverse as smartphones, audio and video media players, and games consoles. In general, this class of product requires high-performance processing and high data-throughput capabilities.
I have written a basic player using ffmpeg, but I have not been able to use hardware decoders with it, so I am not following that route. OpenMAX AL is practically deprecated, even though I'm not sure there is any official statement saying so. The two operate at slightly different levels of abstraction, and for most cases MediaCodec is less work. OpenMAX AL does not support container formats other than MPEG TS.
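The MPEG TS restriction is concrete: TS is a fixed-size packet format, so feeding OpenMAX AL from any other container means remuxing the data into 188-byte packets, each beginning with the sync byte 0x47. A minimal validity check for such packets, as a standalone sketch rather than a full demuxer:

```cpp
#include <cstdint>
#include <cstddef>

constexpr size_t kTsPacketSize = 188;  // MPEG TS packets are always 188 bytes
constexpr uint8_t kTsSyncByte = 0x47;  // every packet starts with 0x47

// Returns true if the buffer looks like a valid sequence of TS packets:
// a whole number of 188-byte packets, each beginning with the sync byte.
bool looksLikeMpegTs(const uint8_t* data, size_t len) {
    if (len == 0 || len % kTsPacketSize != 0) return false;
    for (size_t off = 0; off < len; off += kTsPacketSize) {
        if (data[off] != kTsSyncByte) return false;
    }
    return true;
}
```

A muxer such as libavformat's `mpegts` output can produce this framing from MP4 or other inputs, at the cost of an extra processing step.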
It does not give you direct access to the decoded data either; the output is played back directly. It does, however, take care of synchronizing audio and video. With MediaCodec, you need to provide individual packets of data to decode. It does not support any container format at all on its own; you, as the caller, are expected to handle demuxing.
It does give you direct access to the decoded output data, but to present it, you need to handle synchronization manually. For extracting individual packets of data, there's the MediaExtractor class, which is useful with some common file formats for static files.
I don't think MediaExtractor is usable for streaming input, though. So if you want streaming playback of a format other than MPEG TS, you need to handle packet extraction yourself, or use another library, such as libavformat, for that task. If you use MediaCodec, you need to handle synchronization of audio and video during playback yourself. If you need to process the decoded frames, MediaCodec is probably the only way to go.
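Since MediaCodec leaves A/V sync to the caller, a common approach is to treat the audio clock as the master and delay each video frame until its presentation timestamp comes due. A minimal sketch of that bookkeeping (the names and the 40 ms lateness threshold are illustrative choices, not part of any Android API):

```cpp
#include <cstdint>

// Given the current audio playback position and a decoded video frame's
// presentation timestamp (both in microseconds), return how long the
// renderer should wait before displaying the frame. A negative result
// means the frame is late.
int64_t videoDelayUs(int64_t audioClockUs, int64_t framePtsUs) {
    return framePtsUs - audioClockUs;
}

// Decide what to do with a frame: render now, wait a bit, or drop it
// because it is too far behind the audio clock to be worth showing.
enum class FrameAction { Render, Wait, Drop };

FrameAction classifyFrame(int64_t audioClockUs, int64_t framePtsUs,
                          int64_t lateThresholdUs = 40000) {
    const int64_t delay = videoDelayUs(audioClockUs, framePtsUs);
    if (delay < -lateThresholdUs) return FrameAction::Drop;  // too late
    if (delay <= 0) return FrameAction::Render;              // due now
    return FrameAction::Wait;                                // early
}
```

In a real player the audio position would come from the audio output's playback head, and "Wait" would translate into sleeping or scheduling the frame for a later vsync.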
You can either get the decoded image data as raw YUV, or get it into a GL surface that you can modify using shaders.

MediaCodec NDK vs. OpenMAX AL

My questions are: Is this the best way to use hardware decoders on mobile (Snapdragon) Android devices? I am open to any other framework, free or commercial, that would accomplish the above. — Ketan
In general, MediaCodec is the one that would be recommended.

Pros of MediaCodec:
- Generic and flexible
- Works equally well with any container; doesn't require repacking into MPEG TS

Cons of MediaCodec:
- Requires you to handle audio/video sync manually
- Quite low level; requires you to do a lot of work

Hi mstorsjo, thank you for the quick pros-and-cons analysis.
I thought MediaCodec could stream MP4, WebM, etc. formats, as mentioned online. I will double-check. It appears that MediaCodec is the way to go; thanks again. One more question: does MediaCodec guarantee that it will use hardware decoders and, if that fails, fall back to software mode?
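On the hardware-vs-software question: MediaCodec by itself does not guarantee a hardware decoder; `createDecoderByType` simply selects the first suitable codec, which may be a software one. On most Android builds, software codecs follow a naming convention ("OMX.google.*", or "c2.android.*" on newer releases), which apps have long used as a heuristic; from API 29 on, `MediaCodecInfo.isSoftwareOnly()` is the reliable check. A sketch of the name-based heuristic (the convention is not guaranteed by the API):

```cpp
#include <string>

// Heuristic: on many Android builds, software codecs are named
// "OMX.google.*" or "c2.android.*". This naming convention is not
// guaranteed by the API; prefer MediaCodecInfo.isSoftwareOnly()
// (API 29+) when it is available.
bool isLikelySoftwareCodec(const std::string& name) {
    return name.rfind("OMX.google.", 0) == 0 ||   // rfind(p, 0) == 0: prefix test
           name.rfind("c2.android.", 0) == 0;
}
```

A caller could enumerate codecs via MediaCodecList, skip the likely-software ones when a hardware path is required, and fall back to a software codec only if no other candidate remains.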
Android NDK Native APIs
The includes below are required. The OMX_STATETYPE enumeration reflects the current state of the component. In the Loaded state, the component is not allowed to allocate its resources; the application will send one or more state-transition commands, and when the application sends the command to enter the Idle state, the component begins acquiring them.
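The OpenMAX IL state model behind that enumeration can be summarized as a small transition table. The sketch below mirrors the main states of OMX_STATETYPE (omitting WaitForResources) as a standalone illustration; it is not the real OMX_Core.h definition:

```cpp
// Simplified mirror of the main OpenMAX IL component states.
enum class OmxState { Loaded, Idle, Executing, Pause, Invalid };

// Returns true if the IL state model allows a direct transition:
// Loaded <-> Idle, Idle <-> Executing, Idle <-> Pause, Executing <-> Pause.
// Components enter Invalid on unrecoverable errors and cannot leave it.
bool canTransition(OmxState from, OmxState to) {
    if (from == OmxState::Invalid || to == OmxState::Invalid) return false;
    switch (from) {
        case OmxState::Loaded:    return to == OmxState::Idle;
        case OmxState::Idle:      return to == OmxState::Loaded ||
                                         to == OmxState::Executing ||
                                         to == OmxState::Pause;
        case OmxState::Executing: return to == OmxState::Idle ||
                                         to == OmxState::Pause;
        case OmxState::Pause:     return to == OmxState::Idle ||
                                         to == OmxState::Executing;
        default:                  return false;
    }
}
```

In a real IL client, transitions are requested with OMX_SendCommand and completed asynchronously via the component's event callback; buffers are allocated during the Loaded-to-Idle transition.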
Forums - MediaCodec vs OpenMAX as implementation interface
Does either approach increase the overhead of syncing video and audio? I would be doing some processing on each video frame.

Here are the answers to your questions. Use the right library for your needs. Keeping your timing in line is relatively easy, and it will work. This is unfortunately an area that hasn't received a lot of attention from Google: there is not one officially supported way of playing media within the NDK; there are actually several.
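For the per-frame processing mentioned above, getting raw YUV from the decoder usually means operating on the planes directly. A small, Android-independent illustration that brightens the luma (Y) plane of an I420 frame while leaving chroma untouched:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// Brighten the luma (Y) plane of an I420 frame in place. In I420, the
// first width*height bytes are Y samples; the chroma planes that follow
// are left untouched here, so colors are preserved.
void brightenLuma(std::vector<uint8_t>& frame, size_t width, size_t height,
                  int delta) {
    const size_t lumaSize = width * height;
    for (size_t i = 0; i < lumaSize && i < frame.size(); ++i) {
        const int y = frame[i] + delta;                       // may over/underflow 8 bits
        frame[i] = static_cast<uint8_t>(std::clamp(y, 0, 255));  // clamp to valid range
    }
}
```

The same loop structure extends to any per-pixel operation; for heavier work, the alternative mentioned above (decoding into a GL surface and processing with shaders) avoids copying the frame through the CPU.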