android – Draw text or an image over the camera stream (GLSL)
I have a live-broadcasting app based on grafika's examples, in which I send my video feed over RTMP to go live.

I now want to watermark my video by overlaying text or a logo on the video stream. I know this can be done with GLSL filtering, but I have no idea how to implement it based on the sample I linked.

I tried using alpha blending, but it seems the two texture formats are somehow incompatible (one is TEXTURE_EXTERNAL_OES and the other is TEXTURE_2D), and I just get a black frame in return.
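For reference, the two formats really do need different shader programs: a camera frame arrives as a GL_TEXTURE_EXTERNAL_OES texture and must be sampled through a samplerExternalOES uniform, while a bitmap/logo is a plain GL_TEXTURE_2D sampled through sampler2D. The fragment shaders below mirror the ones grafika's Texture2dProgram uses for its TEXTURE_EXT and TEXTURE_2D program types; sampling a texture through the wrong sampler type (or binding it to the wrong target) is a common way to end up with exactly this kind of black frame.

    // External (camera) texture: requires the OES extension and samplerExternalOES.
    private static final String FRAGMENT_SHADER_EXT =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "varying vec2 vTextureCoord;\n" +
            "uniform samplerExternalOES sTexture;\n" +
            "void main() {\n" +
            "    gl_FragColor = texture2D(sTexture, vTextureCoord);\n" +
            "}\n";

    // Ordinary bitmap (watermark) texture: a plain sampler2D.
    private static final String FRAGMENT_SHADER_2D =
            "precision mediump float;\n" +
            "varying vec2 vTextureCoord;\n" +
            "uniform sampler2D sTexture;\n" +
            "void main() {\n" +
            "    gl_FragColor = texture2D(sTexture, vTextureCoord);\n" +
            "}\n";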

Edit:

My code is based on the Kickflip API:

class CameraSurfaceRenderer implements GLSurfaceView.Renderer {
    private static final String TAG = "CameraSurfaceRenderer";
    private static final boolean VERBOSE = false;

    private CameraEncoder mCameraEncoder;

    private FullFrameRect mFullScreenCamera;
    private FullFrameRect mFullScreenOverlay;     // For texture overlay

    private final float[] mSTMatrix = new float[16];
    private int mOverlayTextureId;
    private int mCameraTextureId;

    private boolean mRecordingEnabled;

    private int mFrameCount;

    // Keep track of selected filters + relevant state
    private boolean mIncomingSizeUpdated;
    private int mIncomingWidth;
    private int mIncomingHeight;
    private int mCurrentFilter;
    private int mNewFilter;

    boolean showBox = false;


    /**
     * Constructs CameraSurfaceRenderer.
     * <p>
     * @param recorder video encoder object
     */
    public CameraSurfaceRenderer(CameraEncoder recorder) {
        mCameraEncoder = recorder;

        mCameraTextureId = -1;
        mFrameCount = -1;

        SessionConfig config = recorder.getConfig();
        mIncomingWidth = config.getVideoWidth();
        mIncomingHeight = config.getVideoHeight();
        mIncomingSizeUpdated = true;        // Force texture size update on next onDrawFrame

        mCurrentFilter = -1;
        mNewFilter = Filters.FILTER_NONE;

        mRecordingEnabled = false;
    }


    /**
     * Notifies the renderer that we want to stop or start recording.
     */
    public void changeRecordingState(boolean isRecording) {
        Log.d(TAG, "changeRecordingState: was " + mRecordingEnabled + " now " + isRecording);
        mRecordingEnabled = isRecording;
    }

    @Override
    public void onSurfaceCreated(GL10 unused, EGLConfig config) {
        Log.d(TAG, "onSurfaceCreated");
        // Set up the texture blitter that will be used for on-screen display.  This
        // is *not* applied to the recording, because that uses a separate shader.
        mFullScreenCamera = new FullFrameRect(
                new Texture2dProgram(Texture2dProgram.ProgramType.TEXTURE_EXT));
        // For texture overlay:
        GLES20.glEnable(GLES20.GL_BLEND);
        GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
        mFullScreenOverlay = new FullFrameRect(
                  new Texture2dProgram(Texture2dProgram.ProgramType.TEXTURE_2D));
        mOverlayTextureId = GlUtil.createTextureWithTextContent("Hello!");
        // Note: this second assignment overwrites the text texture created just above.
        mOverlayTextureId = GlUtil.createTextureFromImage(mCameraView.getContext(), R.drawable.red_dot);
        mCameraTextureId = mFullScreenCamera.createTextureObject();

        mCameraEncoder.onSurfaceCreated(mCameraTextureId);
        mFrameCount = 0;
    }

    @Override
    public void onSurfaceChanged(GL10 unused, int width, int height) {
        Log.d(TAG, "onSurfaceChanged " + width + "x" + height);
    }

    @Override
    public void onDrawFrame(GL10 unused) {
        if (VERBOSE){
            if(mFrameCount % 30 == 0){
                Log.d(TAG, "onDrawFrame tex=" + mCameraTextureId);
                mCameraEncoder.logSavedEglState();
            }
        }

        if (mCurrentFilter != mNewFilter) {
            Filters.updateFilter(mFullScreenCamera, mNewFilter);
            mCurrentFilter = mNewFilter;
            mIncomingSizeUpdated = true;
        }

        if (mIncomingSizeUpdated) {
            mFullScreenCamera.getProgram().setTexSize(mIncomingWidth, mIncomingHeight);
            mFullScreenOverlay.getProgram().setTexSize(mIncomingWidth, mIncomingHeight);
            mIncomingSizeUpdated = false;
            Log.i(TAG, "setTexSize on display Texture");
        }

        // Draw the video frame.
        if(mCameraEncoder.isSurfaceTextureReadyForDisplay()){
            mCameraEncoder.getSurfaceTextureForDisplay().updateTexImage();
            mCameraEncoder.getSurfaceTextureForDisplay().getTransformMatrix(mSTMatrix);
            // Drawing texture overlay:
            mFullScreenOverlay.drawFrame(mOverlayTextureId, mSTMatrix);
            mFullScreenCamera.drawFrame(mCameraTextureId, mSTMatrix);
        }
        mFrameCount++;
    }

    public void signalVertialVideo(FullFrameRect.SCREEN_ROTATION isVertical) {
        if (mFullScreenCamera != null) mFullScreenCamera.adjustForVerticalVideo(isVertical, false);
    }

    /**
     * Changes the filter that we're applying to the camera preview.
     */
    public void changeFilterMode(int filter) {
        mNewFilter = filter;
    }

    public void handleTouchEvent(MotionEvent ev){
        mFullScreenCamera.handleTouchEvent(ev);
    }

}

This is the code that renders the image on screen (the GLSurfaceView), but it never actually gets overlaid onto the video. If I'm not mistaken, that part is done in CameraEncoder.

The thing is, copying the code from CameraSurfaceRenderer into CameraEncoder (they have similar code where the filters are concerned) does not produce the overlaid text/image.

Solution

https://developer.android.com/reference/android/graphics/SurfaceTexture.html

Post the code you use for the alpha blending and I can fix it.

I would probably override Texture2dProgram and pass that to the FullFrameRect renderer. It has example code for rendering with the GL_TEXTURE_EXTERNAL_OES extension. Basically, override the draw function, call the base implementation, bind your watermark, and draw.
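A minimal sketch of that idea, assuming grafika's Texture2dProgram / FullFrameRect / GlUtil classes (the draw(...) parameter list follows grafika's Texture2dProgram; the class name, the mWatermarkTextureId field and setWatermark() are made up for this example):

import java.nio.FloatBuffer;
import android.opengl.GLES20;

class WatermarkTexture2dProgram extends Texture2dProgram {
    // Second program for the logo/text bitmap; the base program handles the camera frame.
    private final Texture2dProgram mOverlayProgram;
    private int mWatermarkTextureId = -1;

    public WatermarkTexture2dProgram() {
        super(ProgramType.TEXTURE_EXT);                     // camera frames are EXTERNAL_OES
        mOverlayProgram = new Texture2dProgram(ProgramType.TEXTURE_2D);
    }

    public void setWatermark(int textureId) {
        mWatermarkTextureId = textureId;                    // a GL_TEXTURE_2D holding the logo/text
    }

    @Override
    public void draw(float[] mvpMatrix, FloatBuffer vertexBuffer, int firstVertex,
                     int vertexCount, int coordsPerVertex, int vertexStride,
                     float[] texMatrix, FloatBuffer texBuffer, int textureId, int texStride) {
        // 1. Draw the camera frame exactly as the base implementation would.
        super.draw(mvpMatrix, vertexBuffer, firstVertex, vertexCount, coordsPerVertex,
                vertexStride, texMatrix, texBuffer, textureId, texStride);

        if (mWatermarkTextureId < 0) return;

        // 2. Alpha-blend the watermark on top. The SurfaceTexture transform only applies
        //    to the camera texture, so the overlay gets the identity matrix instead.
        GLES20.glEnable(GLES20.GL_BLEND);
        GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
        mOverlayProgram.draw(mvpMatrix, vertexBuffer, firstVertex, vertexCount,
                coordsPerVertex, vertexStride, GlUtil.IDENTITY_MATRIX, texBuffer,
                mWatermarkTextureId, texStride);
        GLES20.glDisable(GLES20.GL_BLEND);
    }
}

Building the on-screen FullFrameRect (and the encoder-side one) with this program instead of a stock Texture2dProgram leaves the rest of the pipeline untouched.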

That should sit between the camera and the video encoder.
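For the "between the camera and the video encoder" part, the same draw also has to happen on the EGL surface that feeds the encoder, not only on the GLSurfaceView. A hypothetical sketch of an encoder-side frame handler (field and method names here are illustrative, not Kickflip's actual CameraEncoder API; WindowSurface is grafika's EGL wrapper):

private void drawToEncoderSurface() {
    mInputWindowSurface.makeCurrent();                   // EGL window surface wrapping the MediaCodec input
    mSurfaceTexture.updateTexImage();
    mSurfaceTexture.getTransformMatrix(mTransform);
    // If mFullScreen was built with the watermark-aware program above, this single
    // drawFrame() already composites the overlay into the recorded stream.
    mFullScreen.drawFrame(mCameraTextureId, mTransform);
    mInputWindowSurface.setPresentationTime(mSurfaceTexture.getTimestamp());
    mInputWindowSurface.swapBuffers();
}

Whichever route is taken, the watermark texture must be created on an EGL context the encoder surface can see; a texture created only on the display context will not be valid there unless the contexts are shared.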
