Re-encoding h.264 content at a different bitrate with Android MediaCodec
I am able to decode and play the content using the MediaCodec API (before changing the bitrate), but if I try to re-encode the content at a different bitrate before decoding it, I get garbled output (a green screen with grey, pixelated noise).
The code I am using is based on the Android test case android.media.cts.DecoderTest:
public void encodeDecodeVideoFile(AssetFileDescriptor assetFileDescriptor) {
    int bitRate = 500000;
    int frameRate = 30;
    int width = 480;
    int height = 368;
    String mimeType = "video/avc";

    MediaCodec encoder, decoder = null;
    ByteBuffer[] encoderInputBuffers;
    ByteBuffer[] encoderOutputBuffers;
    ByteBuffer[] decoderInputBuffers = null;
    ByteBuffer[] decoderOutputBuffers = null;

    // Find a codec that supports the mime type
    int numCodecs = MediaCodecList.getCodecCount();
    MediaCodecInfo codecInfo = null;
    for (int i = 0; i < numCodecs && codecInfo == null; i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (!info.isEncoder()) {
            continue;
        }
        String[] types = info.getSupportedTypes();
        boolean found = false;
        for (int j = 0; j < types.length && !found; j++) {
            if (types[j].equals(mimeType))
                found = true;
        }
        if (!found)
            continue;
        codecInfo = info;
    }
    Log.d(tag, "Found " + codecInfo.getName() + " supporting " + mimeType);

    // Find a color profile that the codec supports
    int colorFormat = 0;
    MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(mimeType);
    for (int i = 0; i < capabilities.colorFormats.length && colorFormat == 0; i++) {
        int format = capabilities.colorFormats[i];
        switch (format) {
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
            case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
                colorFormat = format;
                break;
            default:
                Log.d(tag, "Skipping unsupported color format " + format);
                break;
        }
    }
    Log.d(tag, "Using color format " + colorFormat);

    // Determine width, height and slice sizes
    if (codecInfo.getName().equals("OMX.TI.DUCATI1.VIDEO.H264E")) {
        // This codec doesn't support a width that is not a multiple of 16,
        // so round down.
        width &= ~15;
    }
    int stride = width;
    int sliceHeight = height;
    if (codecInfo.getName().startsWith("OMX.Nvidia.")) {
        stride = (stride + 15) / 16 * 16;
        sliceHeight = (sliceHeight + 15) / 16 * 16;
    }

    // Use MediaExtractor to select the first track from the h.264 content
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(assetFileDescriptor.getFileDescriptor(),
            assetFileDescriptor.getStartOffset(), assetFileDescriptor.getLength());
    MediaFormat extractedFormat = extractor.getTrackFormat(0);
    String mime = extractedFormat.getString(MediaFormat.KEY_MIME);
    Log.d(tag, "Extracted mime " + mime);
    extractor.selectTrack(0);

    // Create an encoder
    encoder = MediaCodec.createByCodecName(codecInfo.getName());
    MediaFormat inputFormat = MediaFormat.createVideoFormat(mimeType, width, height);
    inputFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
    inputFormat.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate);
    inputFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
    inputFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
    inputFormat.setInteger("stride", stride);
    inputFormat.setInteger("slice-height", sliceHeight);
    Log.d(tag, "Configuring encoder with input format " + inputFormat);
    encoder.configure(inputFormat, null /* surface */, null /* crypto */,
            MediaCodec.CONFIGURE_FLAG_ENCODE);
    encoder.start();
    encoderInputBuffers = encoder.getInputBuffers();
    encoderOutputBuffers = encoder.getOutputBuffers();

    // Start encoding + decoding
    final long kTimeOutUs = 5000;
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    boolean sawInputEOS = false;
    boolean sawOutputEOS = false;
    MediaFormat oformat = null;
    long startMs = System.currentTimeMillis();
    while (!sawOutputEOS) {
        if (!sawInputEOS) {
            int inputBufIndex = encoder.dequeueInputBuffer(kTimeOutUs);
            if (inputBufIndex >= 0) {
                ByteBuffer dstBuf = encoderInputBuffers[inputBufIndex];
                int sampleSize = extractor.readSampleData(dstBuf, 0 /* offset */);
                long presentationTimeUs = 0;
                if (sampleSize < 0) {
                    Log.d(tag, "Saw input EOS.");
                    sawInputEOS = true;
                    sampleSize = 0;
                } else {
                    presentationTimeUs = extractor.getSampleTime();
                }
                encoder.queueInputBuffer(inputBufIndex, 0 /* offset */, sampleSize,
                        presentationTimeUs,
                        sawInputEOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0);
                if (!sawInputEOS) {
                    extractor.advance();
                }
            }
        }
        int res = encoder.dequeueOutputBuffer(info, kTimeOutUs);
        if (res >= 0) {
            int outputBufIndex = res;
            ByteBuffer buf = encoderOutputBuffers[outputBufIndex];
            buf.position(info.offset);
            buf.limit(info.offset + info.size);
            if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                // Create a decoder
                decoder = MediaCodec.createDecoderByType(mimeType);
                MediaFormat format = MediaFormat.createVideoFormat(mimeType, width, height);
                format.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
                format.setByteBuffer("csd-0", buf);
                decoder.configure(format, surface /* surface */, null /* crypto */, 0 /* flags */);
                decoder.start();
                decoderInputBuffers = decoder.getInputBuffers();
                decoderOutputBuffers = decoder.getOutputBuffers();
            } else {
                int decIndex = decoder.dequeueInputBuffer(-1);
                decoderInputBuffers[decIndex].clear();
                decoderInputBuffers[decIndex].put(buf);
                decoder.queueInputBuffer(decIndex, 0, info.size, info.presentationTimeUs, info.flags);
            }
            encoder.releaseOutputBuffer(outputBufIndex, false /* render */);
        } else if (res == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            encoderOutputBuffers = encoder.getOutputBuffers();
            Log.d(tag, "Encoder output buffers have changed.");
        } else if (res == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            MediaFormat encformat = encoder.getOutputFormat();
            Log.d(tag, "Encoder output format has changed to " + encformat);
        }
        if (decoder == null)
            res = MediaCodec.INFO_TRY_AGAIN_LATER;
        else
            res = decoder.dequeueOutputBuffer(info, kTimeOutUs);
        if (res >= 0) {
            int outputBufIndex = res;
            ByteBuffer buf = decoderOutputBuffers[outputBufIndex];
            buf.position(info.offset);
            buf.limit(info.offset + info.size);
            // The world's simplest FPS implementation
            while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
                try {
                    Thread.sleep(10);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                    break;
                }
            }
            decoder.releaseOutputBuffer(outputBufIndex, true /* render */);
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                Log.d(tag, "Saw output EOS.");
                sawOutputEOS = true;
            }
        } else if (res == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            decoderOutputBuffers = decoder.getOutputBuffers();
            Log.d(tag, "Decoder output buffers have changed.");
        } else if (res == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            oformat = decoder.getOutputFormat();
            Log.d(tag, "Decoder output format has changed to " + oformat);
        }
    }
    encoder.stop();
    encoder.release();
    decoder.stop();
    decoder.release();
}
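The stride/slice-height adjustment in the code above is plain integer arithmetic: `width &= ~15` rounds down to a multiple of 16, and `(x + 15) / 16 * 16` rounds up. A minimal standalone sketch of those two operations:

```java
public class AlignSketch {
    // Round down to the previous multiple of 16
    // (used above for the TI encoder's width).
    static int roundDown16(int x) {
        return x & ~15;
    }

    // Round up to the next multiple of 16
    // (used above for the Nvidia encoder's stride and slice-height).
    static int roundUp16(int x) {
        return (x + 15) / 16 * 16;
    }

    public static void main(String[] args) {
        System.out.println(roundDown16(480)); // 480 (already aligned)
        System.out.println(roundUp16(368));   // 368 (already aligned)
        System.out.println(roundDown16(470)); // 464
        System.out.println(roundUp16(470));   // 480
    }
}
```

Note that 480 and 368 are already multiples of 16, which is why the sample clip's dimensions pass through unchanged.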
The content I am trying to encode comes from the Android CTS test project:
R.raw.video_480x360_mp4_h264_1000kbps_30fps_aac_stereo_128kbps_44100hz
I am guessing the problem is related to the format parameters I specify for the encoder MediaCodec, but I cannot figure out what is incorrect or missing.
int inputBufIndex = encoder.dequeueInputBuffer(kTimeOutUs);
This dequeues a buffer from the encoder's input port.
ByteBuffer dstBuf = encoderInputBuffers[inputBufIndex];
This is a pointer to the dequeued input buffer.
int sampleSize = extractor.readSampleData(dstBuf,0 /* offset */);
The dequeued buffer is filled with data from the extractor. The extractor's output is a compressed bitstream, not uncompressed YUV frames.
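A quick size check makes the mismatch concrete. An encoder configured for a YUV420 color format at 480x368 expects raw frames of width * height * 3 / 2 bytes per frame, whereas the compressed samples returned by readSampleData are typically far smaller and vary from sample to sample. A sketch of the expected raw-frame size:

```java
public class Yuv420Size {
    // For YUV420 (planar or semi-planar), each pixel carries 1 byte of luma
    // plus half a byte of subsampled chroma, so a raw frame occupies
    // width * height * 3 / 2 bytes.
    static int frameSizeBytes(int width, int height) {
        return width * height * 3 / 2;
    }

    public static void main(String[] args) {
        System.out.println(frameSizeBytes(480, 368)); // 264960
    }
}
```

So every input buffer queued to the encoder should hold exactly 264960 bytes of raw pixel data for this clip; feeding it variable-size compressed samples cannot produce meaningful output.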
encoder.queueInputBuffer(inputBufIndex,....)
Here, the compressed bitstream is encoded again. Since the input is not a YUV frame, the encoder does its best to compress it anyway, and as a result you observe green, illegible frames at the encoder's output. I think you see this on screen because the latter part of your code decodes that same output. The same would be observed if the encoder's output were written to a file and played back in a different player or on a PC.
From the program, I assume your intended design is Decode ==> Encode ==> Decode for real-time display. Your pipeline should therefore be:

MediaExtractor ==> MediaCodec (Decoder) ==> MediaCodec (Encoder) ==> MediaCodec (Decoder) ==> Display
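The chain above can be sketched in code. This is only a structural outline under the assumptions of the original snippet (the variable names such as `firstDecoder` are hypothetical, and the `INFO_*` result handling, buffer bookkeeping, and color-format caveats shown earlier are omitted); it is not a complete implementation:

```java
// 1. Compressed samples from the extractor go to a *decoder* first,
//    not to the encoder:
int inIndex = firstDecoder.dequeueInputBuffer(kTimeOutUs);
if (inIndex >= 0) {
    ByteBuffer dst = firstDecoderInputBuffers[inIndex];
    int sampleSize = extractor.readSampleData(dst, 0 /* offset */);
    firstDecoder.queueInputBuffer(inIndex, 0, sampleSize, extractor.getSampleTime(), 0);
    extractor.advance();
}

// 2. The first decoder's output buffers hold raw YUV frames, which is
//    what the encoder expects as input -- this is the step the original
//    code skips by feeding compressed data straight to the encoder:
int outIndex = firstDecoder.dequeueOutputBuffer(info, kTimeOutUs);
if (outIndex >= 0) {
    ByteBuffer yuvFrame = firstDecoderOutputBuffers[outIndex];
    int encIndex = encoder.dequeueInputBuffer(kTimeOutUs);
    if (encIndex >= 0) {
        encoderInputBuffers[encIndex].clear();
        encoderInputBuffers[encIndex].put(yuvFrame);
        encoder.queueInputBuffer(encIndex, 0, info.size, info.presentationTimeUs, info.flags);
    }
    firstDecoder.releaseOutputBuffer(outIndex, false /* do not render */);
}

// 3. The encoder's output (h.264 at the new bitrate) then feeds the
//    second, Surface-backed decoder exactly as in the original code.
```

One caveat worth checking: the color format the first decoder emits must match the one the encoder was configured with, or the frames need converting in between.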
P.S.: Did you observe any memory violations while running this program?