Android · Published 2022-04-28 · Source: 大佬教程 code.js-code.com
I am developing a WebRTC video-call Android application. The call itself works fine, but I need to record the video of the other peer (remoteVideoStream) as well as my own stream (localVideoStream), and convert it to a saveable format such as MP4. I have searched extensively but could not figure out how to accomplish this.

I have read about VideoFileRenderer and tried adding it to my code to save the video, but I could not make use of it: it has no method such as record() or save(), although it does have a release() method that is apparently used to finish saving the video. If anyone has any idea, here is the class:

@JNINamespace("webrtc::jni")
public class VideoFileRenderer implements Callbacks, VideoSink {
private static final String TAG = "VideoFileRenderer";
private final HandlerThread renderThread;
private final Handler renderThreadHandler;
private final FileOutputStream videoOutFile;
private final String outputFilename;
private final int outputFileWidth;
private final int outputFileHeight;
private final int outputFrameSize;
private final ByteBuffer outputFrameBuffer;
private EglBase eglBase;
private YuvConverter yuvConverter;
private ArrayList<ByteBuffer> rawFrames = new ArrayList<>();

public VideoFileRenderer(String outputFile,int outputFileWidth,int outputFileHeight,final Context sharedContext) throws IOException {
    if (outputFileWidth % 2 != 1 && outputFileHeight % 2 != 1) {
        this.outputFilename = outputFile;
        this.outputFileWidth = outputFileWidth;
        this.outputFileHeight = outputFileHeight;
        this.outputFrameSize = outputFileWidth * outputFileHeight * 3 / 2;
        this.outputFrameBuffer = ByteBuffer.allocateDirect(this.outputFrameSize);
        this.videoOutFile = new FileOutputStream(outputFile);
        this.videoOutFile.write(("YUV4MPEG2 C420 W" + outputFileWidth + " H" + outputFileHeight + " Ip F30:1 A1:1\n").getBytes(Charset.forName("US-ASCII")));
        this.renderThread = new HandlerThread("VideoFileRenderer");
        this.renderThread.start();
        this.renderThreadHandler = new Handler(this.renderThread.getLooper());
        ThreadUtils.invokeAtFrontUninterruptibly(this.renderThreadHandler,new Runnable() {
            public void run() {
                VideoFileRenderer.this.eglBase = EglBase.create(sharedContext,EglBase.CONFIG_PIXEL_BUFFER);
                VideoFileRenderer.this.eglBase.createDummyPbufferSurface();
                VideoFileRenderer.this.eglBase.makeCurrent();
                VideoFileRenderer.this.yuvConverter = new YuvConverter();
            }
        });
    } else {
        throw new IllegalArgumentException("Does not support uneven width or height");
    }
}

public void renderFrame(I420Frame i420Frame) {
    VideoFrame frame = i420Frame.toVideoFrame();
    this.onFrame(frame);
    frame.release();
}

public void onFrame(VideoFrame frame) {
    frame.retain();
    this.renderThreadHandler.post(() -> {
        this.renderFrameOnRenderThread(frame);
    });
}

private void renderFrameOnRenderThread(VideoFrame frame) {
    Buffer buffer = frame.getBuffer();
    int targetWidth = frame.getRotation() % 180 == 0 ? this.outputFileWidth : this.outputFileHeight;
    int targetHeight = frame.getRotation() % 180 == 0 ? this.outputFileHeight : this.outputFileWidth;
    float frameAspectRatio = (float)buffer.getWidth() / (float)buffer.getHeight();
    float fileAspectRatio = (float)targetWidth / (float)targetHeight;
    int cropWidth = buffer.getWidth();
    int cropHeight = buffer.getHeight();
    if (fileAspectRatio > frameAspectRatio) {
        cropHeight = (int)((float)cropHeight * (frameAspectRatio / fileAspectRatio));
    } else {
        cropWidth = (int)((float)cropWidth * (fileAspectRatio / frameAspectRatio));
    }

    int cropX = (buffer.getWidth() - cropWidth) / 2;
    int cropY = (buffer.getHeight() - cropHeight) / 2;
    Buffer scaledBuffer = buffer.cropAndScale(cropX,cropY,cropWidth,cropHeight,targetWidth,targetHeight);
    frame.release();
    I420Buffer i420 = scaledBuffer.toI420();
    scaledBuffer.release();
    ByteBuffer byteBuffer = JniCommon.nativeAllocateByteBuffer(this.outputFrameSize);
    YuvHelper.I420Rotate(i420.getDataY(),i420.getStrideY(),i420.getDataU(),i420.getStrideU(),i420.getDataV(),i420.getStrideV(),byteBuffer,i420.getWidth(),i420.getHeight(),frame.getRotation());
    i420.release();
    byteBuffer.rewind();
    this.rawFrames.add(byteBuffer);
}

public void release() {
    CountDownLatch cleanupBarrier = new CountDownLatch(1);
    this.renderThreadHandler.post(() -> {
        this.yuvConverter.release();
        this.eglBase.release();
        this.renderThread.quit();
        cleanupBarrier.countDown();
    });
    ThreadUtils.awaitUninterruptibly(cleanupBarrier);

    try {
        for (ByteBuffer buffer : this.rawFrames) {
            this.videoOutFile.write("FRAME\n".getBytes(Charset.forName("US-ASCII")));
            byte[] data = new byte[this.outputFrameSize];
            buffer.get(data);
            this.videoOutFile.write(data);
            JniCommon.nativeFreeByteBuffer(buffer);
        }

        this.videoOutFile.close();
        Logging.d("VideoFileRenderer","Video written to disk as " + this.outputFilename + ". The number of frames is " + this.rawFrames.size() + " and the dimension of the frames is " + this.outputFileWidth + "x" + this.outputFileHeight + ".");
    } catch (IOException var5) {
        Logging.e("VideoFileRenderer","Error writing video to disk",var5);
    }

}

}
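To see what renderFrameOnRenderThread is actually doing before it hands the frame to cropAndScale, the center-crop computation can be isolated into a plain helper. This is an illustrative standalone sketch, not part of the WebRTC API; the class and method names are made up for demonstration.

```java
// Standalone version of the aspect-ratio center-crop math used in
// renderFrameOnRenderThread above: the source frame is cropped so its
// aspect ratio matches the output file before it is scaled down.
class CropMath {
    // Returns {cropX, cropY, cropWidth, cropHeight} for a srcW x srcH
    // frame being fitted into a dstW x dstH output file.
    static int[] centerCrop(int srcW, int srcH, int dstW, int dstH) {
        float frameAspect = (float) srcW / (float) srcH;
        float fileAspect = (float) dstW / (float) dstH;
        int cropW = srcW;
        int cropH = srcH;
        if (fileAspect > frameAspect) {
            // Output is wider than the frame: trim top and bottom.
            cropH = (int) ((float) cropH * (frameAspect / fileAspect));
        } else {
            // Output is narrower than the frame: trim left and right.
            cropW = (int) ((float) cropW * (fileAspect / frameAspect));
        }
        // Center the crop rectangle inside the source frame.
        int cropX = (srcW - cropW) / 2;
        int cropY = (srcH - cropH) / 2;
        return new int[] {cropX, cropY, cropW, cropH};
    }

    public static void main(String[] args) {
        // A 16:9 source recorded into a 4:3 file loses 240 px on each side:
        int[] c = centerCrop(1920, 1080, 640, 480);
        System.out.println(c[0] + "," + c[1] + "," + c[2] + "," + c[3]); // 240,0,1440,1080
    }
}
```

Note that the class still rejects odd dimensions in its constructor, because the I420 frame size computed as width * height * 3 / 2 assumes even width and height.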

I could not find any useful method in it that would help.

Solution

The VideoFileRenderer class only demonstrates how to access the decoded raw video frames of the remote/local peer.
It does not record a valid video file.
You have to implement the logic of encoding the raw video frames and muxing them into a container, such as MP4, yourself.

The main flow is as follows:

> Switch to the latest WebRTC version (currently v.0.0.25331)
> Create a video container; see, for example, MediaMuxer in the Android SDK
> Implement the VideoSink interface to obtain raw frames from the desired video source; see, for example, the ProxyVideoSink class in apprtc/CallActivity.java
> Encode each frame with MediaCodec and write it into the video container
> Finalize the muxer
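The steps above can be sketched as follows. This is only an outline under stated assumptions, not the exact code of any project: the class name, the raw I420 byte arrays passed to encodeFrame, and the fixed bitrate/framerate are all illustrative, and color-format negotiation, frame rotation, and robust end-of-stream handling are deliberately simplified.

// Sketch of steps 2-5: encode raw I420 frames with MediaCodec (ByteBuffer
// mode) and mux the H.264 output into an MP4 file with MediaMuxer.
class Mp4Recorder {
    private final MediaCodec encoder;
    private final MediaMuxer muxer;
    private final MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    private int trackIndex = -1;
    private boolean muxerStarted = false;

    Mp4Recorder(int width, int height, String outputPath) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        // Real code must query the codec for a supported YUV layout.
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();
        muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    }

    // Call once per raw frame, e.g. from a VideoSink.onFrame implementation.
    void encodeFrame(byte[] i420, long presentationTimeUs) {
        int in = encoder.dequeueInputBuffer(10_000);
        if (in >= 0) {
            ByteBuffer buf = encoder.getInputBuffer(in);
            buf.clear();
            buf.put(i420);
            encoder.queueInputBuffer(in, 0, i420.length, presentationTimeUs, 0);
        }
        drain();
    }

    private void drain() {
        int out;
        while ((out = encoder.dequeueOutputBuffer(info, 0)) >= 0
                || out == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            if (out == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // The muxer track must be added from the encoder's actual
                // output format, which carries the codec config (SPS/PPS).
                trackIndex = muxer.addTrack(encoder.getOutputFormat());
                muxer.start();
                muxerStarted = true;
            } else {
                if (muxerStarted && info.size > 0) {
                    muxer.writeSampleData(trackIndex, encoder.getOutputBuffer(out), info);
                }
                encoder.releaseOutputBuffer(out, false);
            }
        }
    }

    // Step 5: send end-of-stream, drain, and finalize the muxer.
    void stop() {
        int in = encoder.dequeueInputBuffer(10_000);
        if (in >= 0) {
            encoder.queueInputBuffer(in, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
        }
        drain();
        encoder.stop();
        encoder.release();
        if (muxerStarted) muxer.stop();
        muxer.release();
    }
}

Note the ordering constraint: MediaMuxer.start() may only be called after addTrack(), which is why the track is added on the INFO_OUTPUT_FORMAT_CHANGED event rather than up front.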
