OpenGL Texture C++: Previewing Camera Video

OpenGL is a graphics API, not a standalone platform; it consists of a set of functions for manipulating graphics and images. Building on the power of textures, this article implements camera video-stream preview on Android with OpenGL textures driven from C++.

       Project GitHub repository: https://github.com/wangyongyao1989/WyFFmpeg

1. Implementation Steps and Illustrated Preview:

2. Acquiring Camera Data:

        The Android camera delivers its video stream as a sequence of image frames.

       1. After opening the camera, create an ImageReader to read the image data for each video frame.

    /**
     * Opens the camera.
     */
    @SuppressLint({"WrongConstant", "MissingPermission"})
    public void openCamera() {
        if (checkSelfPermission(mContext, Manifest.permission.CAMERA) 
                != PackageManager.PERMISSION_GRANTED) {
            return;
        }

        mImageReader = ImageReader.newInstance(mPreviewSize.getWidth()
                , mPreviewSize.getHeight()
                , ImageFormat.YUV_420_888, IMAGE_BUFFER_SIZE);
        mImageReader.setOnImageAvailableListener(mVideoCapture
                , mBackgroundHandler);

        Log.i(TAG, "openCamera");

        CameraManager manager = (CameraManager) 
                mContext.getSystemService(Context.CAMERA_SERVICE);
        try {
            if (!mCameraOpenCloseLock.tryAcquire(2500
                    , TimeUnit.MILLISECONDS)) {
                throw new RuntimeException("Time out waiting " +
                        "to lock camera opening.");
            }
            manager.openCamera(mCameraId, mStateCallback
                    , mBackgroundHandler);
        } catch (CameraAccessException e) {
            Log.e(TAG, "Cannot " +
                    "access the camera " + e);
        } catch (InterruptedException e) {
            throw new RuntimeException("Interrupted while " +
                    "trying to lock camera opening.", e);
        }
    }
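
        Here mVideoCapture is the ImageReader.OnImageAvailableListener whose onImageAvailable() callback appears in step 2 below, and IMAGE_BUFFER_SIZE caps how many frames the ImageReader can queue before acquire/close must catch up.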

        2. In the ImageReader.OnImageAvailableListener callback, acquire the Image from the ImageReader and repack it into a contiguous YUV_420_888 byte array.

    @Override
    public void onImageAvailable(ImageReader imageReader) {

        Image image = imageReader.acquireLatestImage();
        if (image != null) {
            if (mPreviewFrameHandler != null) {
                mPreviewFrameHandler.onPreviewFrame(YUV_420_888_data(image), image.getWidth(), image.getHeight());
            }

            image.close();
        }
    }

    private static byte[] YUV_420_888_data(Image image) {
        final int imageWidth = image.getWidth();
        final int imageHeight = image.getHeight();
        final Image.Plane[] planes = image.getPlanes();
        byte[] data = new byte[imageWidth * imageHeight *
                ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8];
        int offset = 0;

        for (int plane = 0; plane < planes.length; ++plane) {
            final ByteBuffer buffer = planes[plane].getBuffer();
            final int rowStride = planes[plane].getRowStride();
            // Experimentally, U and V planes have |pixelStride| = 2, which
            // essentially means they are packed.
            final int pixelStride = planes[plane].getPixelStride();
            final int planeWidth = (plane == 0) ? imageWidth : imageWidth / 2;
            final int planeHeight = (plane == 0) ? imageHeight : imageHeight / 2;
            if (pixelStride == 1 && rowStride == planeWidth) {
                // Copy whole plane from buffer into |data| at once.
                buffer.get(data, offset, planeWidth * planeHeight);
                offset += planeWidth * planeHeight;
            } else {
                // Copy pixels one by one respecting pixelStride and rowStride.
                byte[] rowData = new byte[rowStride];
                for (int row = 0; row < planeHeight - 1; ++row) {
                    buffer.get(rowData, 0, rowStride);
                    for (int col = 0; col < planeWidth; ++col) {
                        data[offset++] = rowData[col * pixelStride];
                    }
                }
                // Last row is special in some devices and may not contain the full
                // |rowStride| bytes of data.
                // See http://developer.android.com/reference/android/media/Image.Plane.html#getBuffer()
                buffer.get(rowData, 0, Math.min(rowStride, buffer.remaining()));
                for (int col = 0; col < planeWidth; ++col) {
                    data[offset++] = rowData[col * pixelStride];
                }
            }
        }

        return data;
    }
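
        Two details matter in this callback. First, acquireLatestImage() skips any stale frames in the queue, and the Image must always be close()d, otherwise the ImageReader queue (IMAGE_BUFFER_SIZE deep) fills up and the camera stalls. Second, the repacked array uses the I420 layout: the full-resolution Y plane first, then the quarter-size U and V planes. Since YUV_420_888 is 12 bits per pixel, the buffer size works out to width × height × 3/2 bytes; for a 1280×720 frame, that is 921,600 bytes of Y followed by 230,400 bytes each of U and V, 1,382,400 bytes in total.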

3. Setting Up the OpenGL Environment:

        GLTextureCPlusVideoPlayerView extends GLSurfaceView and implements the GLSurfaceView.Renderer interface:

package com.wangyongyao.glplay.view;

import android.content.Context;
import android.opengl.GLSurfaceView;
import android.util.AttributeSet;
import android.util.Log;


import com.wangyongyao.glplay.OpenGLPlayCallJni;
import com.wangyongyao.glplay.camerahelper.camerahelper.CameraDataHelper;
import com.wangyongyao.glplay.camerahelper.camerahelper.CameraDataListener;
import com.wangyongyao.glplay.utils.OpenGLPlayFileUtils;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

/**
 * author : wangyongyao https://github.com/wangyongyao1989
 * Create Time : 2024/9/3 23:57
 * Describe : MyyFFmpeg com.example.myyffmpeg.utils
 */
public class GLTextureCPlusVideoPlayerView extends GLSurfaceView
        implements GLSurfaceView.Renderer, CameraDataListener {


    private static String TAG = GLTextureCPlusVideoPlayerView.class.getSimpleName();
    private OpenGLPlayCallJni mJniCall;
    private Context mContext;


    private int mWidth;
    private int mHeight;
    private CameraDataHelper mCameraHelper;


    public GLTextureCPlusVideoPlayerView(Context context, OpenGLPlayCallJni jniCall) {
        super(context);
        mContext = context;
        mJniCall = jniCall;
        init();
    }

    public GLTextureCPlusVideoPlayerView(Context context, AttributeSet attrs) {
        super(context, attrs);
        mContext = context;
        init();
    }

    private void init() {
        getHolder().addCallback(this);
        setEGLContextClientVersion(3);
        setEGLConfigChooser(8, 8, 8, 8, 16, 0);
        String fragPath = OpenGLPlayFileUtils.getModelFilePath(mContext
                , "texture_video_play_frament.glsl");
        String vertexPath = OpenGLPlayFileUtils.getModelFilePath(mContext
                , "texture_video_play_vert.glsl");
        String picSrc1 = OpenGLPlayFileUtils.getModelFilePath(mContext
                , "wall.jpg");
        mCameraHelper = new CameraDataHelper(getContext(), this);
        mCameraHelper.startCamera();
        if (mJniCall != null) {
            mJniCall.glTextureVideoPlayCreate(0, vertexPath, fragPath);
        }

        setRenderer(this);
        setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);

    }


    private void stopCameraPreview() {
        mCameraHelper.destroy();
    }


    @Override
    public void onDrawFrame(GL10 gl) {
        if (mJniCall != null) {
            mJniCall.glTextureVideoPlayRender();
        }

    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Log.e(TAG, "onSurfaceChanged width:" + width + ",height:" + height);
        if (mJniCall != null) {
            mJniCall.glTextureVideoPlayInit(null, null, width, height);
        }
        mWidth = width;
        mHeight = height;
        mCameraHelper.initialize(width, height);
    }


    @Override
    public void onSurfaceCreated(GL10 gl10, EGLConfig eglConfig) {
        Log.e(TAG, "onSurfaceCreated:");

    }


    @Override
    public void onPreviewFrame(byte[] yuvData, int width, int height) {
        mJniCall.glTextureVideoPlayDraw(yuvData, width, height, 90);
        requestRender();
    }


    public void destroyRender() {
        mJniCall.glTextureVideoPlayDestroy();
        stopCameraPreview();
    }

}

   Note the call to setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY): the view renders a texture frame only after onPreviewFrame delivers camera data and requestRender() is called.

    /**
     * The renderer only renders
     * when the surface is created, or when {@link #requestRender} is called.
     *
     * @see #getRenderMode()
     * @see #setRenderMode(int)
     * @see #requestRender()
     */
    public final static int RENDERMODE_WHEN_DIRTY = 0;

 4. Passing Java Data to the C++ Layer via JNI:

        1. Java-layer implementation:

        The wrapper methods are executed in the following order: glTextureVideoPlayCreate -> glTextureVideoPlayInit -> glTextureVideoPlayDraw -> glTextureVideoPlayRender.

 /*********************** OpenGL texture video playback ********************/
    public void glTextureVideoPlayCreate(int type, String vertexPath, String fragPath) {
        native_texture_video_play_create(type, vertexPath, fragPath);
    }

    public void glTextureVideoPlayDestroy() {
        native_texture_video_play_destroy();
    }

    public void glTextureVideoPlayInit(Surface surface, AssetManager assetManager
            , int width, int height) {
        native_texture_video_play_init(surface, assetManager, width, height);
    }

    public void glTextureVideoPlayRender() {
        native_texture_video_play_render();
    }

    public void glTextureVideoPlayDraw(byte[] data, int width, int height, int rotation) {
        native_texture_video_play_draw(data, width, height, rotation);
    }

    public void glTextureVideoPlaySetParameters(int params) {
        native_texture_video_play_set_parameters(params);
    }

    public int glTextureVideoPlayGetParameters() {
        return native_texture_video_play_get_parameters();
    }

    private native void native_texture_video_play_create(int type, String vertexPath
            , String fragPath);

    private native void native_texture_video_play_destroy();

    private native void native_texture_video_play_init(Surface surface
            , AssetManager assetManager
            , int width, int height);

    private native void native_texture_video_play_render();

    private native void native_texture_video_play_draw(byte[] data, int width
            , int height, int rotation);

    private native void native_texture_video_play_set_parameters(int params);

    private native int native_texture_video_play_get_parameters();

        2. JNI-layer implementation:

/*********************** OpenGL texture camera video preview ********************/
extern "C"
JNIEXPORT void JNICALL
cpp_texture_video_play_creat(JNIEnv *env, jobject thiz, jint type,
                             jstring vertex,
                             jstring frag) {
    const char *vertexPath = env->GetStringUTFChars(vertex, nullptr);
    const char *fragPath = env->GetStringUTFChars(frag, nullptr);
    if (textureVideoRender == nullptr)
        textureVideoRender = new OpenglesTexureVideoRender();

    textureVideoRender->setSharderPath(vertexPath, fragPath);

    env->ReleaseStringUTFChars(vertex, vertexPath);
    env->ReleaseStringUTFChars(frag, fragPath);
}

extern "C"
JNIEXPORT void JNICALL
cpp_texture_video_play_destroy(JNIEnv *env, jobject thiz) {
    if (textureVideoRender != nullptr) {
        delete textureVideoRender;
        textureVideoRender = nullptr;
    }
}

extern "C"
JNIEXPORT void JNICALL
cpp_texture_video_play_init(JNIEnv *env, jobject thiz,
                            jobject surface,
                            jobject assetManager,
                            jint width,
                            jint height) {
    if (textureVideoRender != nullptr) {
        ANativeWindow *window = surface ? ANativeWindow_fromSurface(env, surface) : nullptr;
        auto *aAssetManager = assetManager ? AAssetManager_fromJava(env, assetManager) : nullptr;
        textureVideoRender->init(window, aAssetManager, (size_t) width, (size_t) height);
    }
}

extern "C"
JNIEXPORT void JNICALL
cpp_texture_video_play_render(JNIEnv *env, jobject thiz) {
    if (textureVideoRender != nullptr) {
        textureVideoRender->render();
    }
}

extern "C"
JNIEXPORT void JNICALL
cpp_texture_video_play_draw(JNIEnv *env, jobject obj, jbyteArray data, jint width, jint height,
                            jint rotation) {
    jbyte *bufferPtr = env->GetByteArrayElements(data, nullptr);
    jsize arrayLength = env->GetArrayLength(data);

    if (textureVideoRender != nullptr) {

        textureVideoRender->draw((uint8_t *) bufferPtr, (size_t) arrayLength, (size_t) width,
                                 (size_t) height,
                                 rotation);
    }

    env->ReleaseByteArrayElements(data, bufferPtr, 0);
}

extern "C"
JNIEXPORT void JNICALL
cpp_texture_video_play_setParameters(JNIEnv *env, jobject thiz, jint p) {

    if (textureVideoRender != nullptr) {
        textureVideoRender->setParameters((uint32_t) p);
    }

}

extern "C"
JNIEXPORT jint JNICALL
cpp_texture_video_play_getParameters(JNIEnv *env, jobject thiz) {
    if (textureVideoRender != nullptr) {
        return textureVideoRender->getParameters();
    }
    return 0;
}

static const JNINativeMethod methods[] = {
        /*********************** OpenGL texture video playback ********************/
        {"native_texture_video_play_create",         "(I"
                                                     "Ljava/lang/String;"
                                                     "Ljava/lang/String;)V",  (void *) cpp_texture_video_play_creat},
        {"native_texture_video_play_destroy",        "()V",                   (void *) cpp_texture_video_play_destroy},
        {"native_texture_video_play_init",           "(Landroid/view/Surface;"
                                                     "Landroid/content/res"
                                                     "/AssetManager;II)V",    (void *) cpp_texture_video_play_init},
        {"native_texture_video_play_render",         "()V",                   (void *) cpp_texture_video_play_render},
        {"native_texture_video_play_draw",           "([BIII)V",              (void *) cpp_texture_video_play_draw},
        {"native_texture_video_play_set_parameters", "(I)V",                  (void *) cpp_texture_video_play_setParameters},
        {"native_texture_video_play_get_parameters", "()I",                   (void *) cpp_texture_video_play_getParameters},
};

// Dynamic registration of the native methods
JNIEXPORT jint JNICALL JNI_OnLoad(JavaVM *vm, void *reserved) {
    LOGD("JNI_OnLoad: dynamic registration");
    JNIEnv *env;
    if ((vm)->GetEnv((void **) &env, JNI_VERSION_1_6) != JNI_OK) {
        LOGD("JNI_OnLoad: GetEnv failed");
        return JNI_ERR;
    }

    // Look up the Java class that declares the native methods
    jclass clazz = env->FindClass(rtmp_class_name);
    if (clazz == nullptr) {
        LOGE("JNI_OnLoad: FindClass failed");
        return JNI_ERR;
    }

    // Register the native methods against that class
    jint regist_result = env->RegisterNatives(clazz, methods,
                                              sizeof(methods) / sizeof(methods[0]));
    if (regist_result) { // non-zero means failure
        LOGE("RegisterNatives failed, regist_result = %d", regist_result);
    } else {
        LOGI("RegisterNatives succeeded, result = %d", regist_result);
    }
    return JNI_VERSION_1_6;
}
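
        RegisterNatives binds each native_texture_video_play_* declaration to its C++ function by name and JNI signature string; the signatures in the methods[] table must match the Java declarations exactly (for example "([BIII)V" for draw(byte[], int, int, int)). A mismatch makes registration fail at library-load time rather than at first call, which is one advantage of dynamic registration over JNI's name-based lookup.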

       

5. The C++ OpenGL Texture Rendering Implementation:

       The OpenGL code here was extracted from my GitHub project GitHub - wangyongyao1989/AndroidLearnOpenGL (OpenGL fundamentals and applications).

If you are interested, you can read my related OpenGL blog series: https://blog.csdn.net/wangyongyao1989/category_6943979.html?spm=1001.2014.3001.5482

  1. The shader programs:

        The shaders are GLSL files; the directory where they are stored is passed into OpenGLShader.cpp in the C++ layer.

  •   The texture_video_play_vert.glsl vertex shader:
#version 320 es

out vec2 v_texcoord;

in vec4 position;
in vec2 texcoord;

void main() {
    v_texcoord = texcoord;
    gl_Position =  position;
}
  •  The texture_video_play_fragment.glsl fragment shader:
#version 320 es

precision mediump float;

in vec2 v_texcoord;

uniform lowp sampler2D s_textureY;
uniform lowp sampler2D s_textureU;
uniform lowp sampler2D s_textureV;

out vec4 outColor;   // "gl_"-prefixed names are reserved in GLSL ES, so use a custom output variable

void main() {
     float y, u, v, r, g, b;
     y = texture(s_textureY, v_texcoord).r;
     u = texture(s_textureU, v_texcoord).r;
     v = texture(s_textureV, v_texcoord).r;
     u = u - 0.5;
     v = v - 0.5;
     r = y + 1.403 * v;
     g = y - 0.344 * u - 0.714 * v;
     b = y + 1.770 * u;
     outColor = vec4(r, g, b, 1.0);

}
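
        Each plane is uploaded as a single-channel GL_LUMINANCE texture, and a luminance texture replicates its one channel across r, g, and b, so sampling .r always yields the plane value. The constants 1.403, 0.344, 0.714, and 1.770 are the standard BT.601 YUV-to-RGB conversion coefficients, applied after re-centering U and V around zero.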

2. Compiling, linking, and using the shader program in OpenGLShader.cpp.

//
// Created by MMM on 2024/8/8.
//

#include "OpenGLShader.h"

GLuint
OpenGLShader::createProgram() {
    vertexShader = loadShader(GL_VERTEX_SHADER, gVertexShaderCode);
    LOGI("=====gVertexShaderCode :%s", gVertexShaderCode);
    LOGI("======gFragmentShaderCode :%s", gFragmentShaderCode);
    if (!vertexShader) {
        checkGlError("loadShader GL_VERTEX_SHADER");
        return 0;
    }

    fraShader = loadShader(GL_FRAGMENT_SHADER, gFragmentShaderCode);

    if (!fraShader) {
        checkGlError("loadShader GL_FRAGMENT_SHADER");
        return 0;
    }

    shaderId = glCreateProgram();      // create a program object
    if (shaderId) {
        glAttachShader(shaderId, vertexShader);        // attach the vertex shader to the program object
        checkGlError("glAttachShader");
        glAttachShader(shaderId, fraShader);
        checkGlError("glAttachShader");
        glLinkProgram(shaderId);   // link the program object
        GLint linkStatus = GL_FALSE;
        glGetProgramiv(shaderId, GL_LINK_STATUS, &linkStatus);  // check whether the link succeeded
        if (linkStatus != GL_TRUE) {
            GLint bufLength = 0;
            glGetProgramiv(shaderId, GL_INFO_LOG_LENGTH, &bufLength);
            if (bufLength) {
                char *buf = (char *) malloc(bufLength);
                if (buf) {
                    glGetProgramInfoLog(shaderId, bufLength, NULL, buf);
                    LOGE("Could not link shaderId:\n%s\n", buf);
                    free(buf);
                }
            }
            glDeleteProgram(shaderId);
            shaderId = 0;
        }
    }
    return shaderId;
}

/**
 * Load and compile a single shader.
 * @param shaderType GL_VERTEX_SHADER or GL_FRAGMENT_SHADER
 * @param pSource    null-terminated GLSL source string
 * @return the shader handle, or 0 on failure
 */
GLuint OpenGLShader::loadShader(GLenum shaderType, const char *pSource) {
    GLuint shader = glCreateShader(shaderType);     // create the shader object
    if (shader) {
        glShaderSource(shader, 1, &pSource, NULL);  // attach the source code to the shader object
        glCompileShader(shader);                    // compile the shader
        GLint compiled = 0;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
        if (!compiled) {
            GLint infoLen = 0;
            glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &infoLen);
            if (infoLen) {
                char *buf = (char *) malloc(infoLen);
                if (buf) {
                    glGetShaderInfoLog(shader, infoLen, NULL, buf);
                    LOGE("Could not compile shader %d:\n%s\n",
                         shaderType, buf);
                    free(buf);
                }
            }
            glDeleteShader(shader);     // delete the failed shader even when no log is available
            shader = 0;
        }
    }
    return shader;
}

bool OpenGLShader::getSharderPath(const char *vertexPath, const char *fragmentPath) {
    ifstream vShaderFile;
    ifstream fShaderFile;

    // ensure ifstream objects can throw exceptions:
    vShaderFile.exceptions(ifstream::failbit | ifstream::badbit);
    fShaderFile.exceptions(ifstream::failbit | ifstream::badbit);
    try {
        // open files
        vShaderFile.open(vertexPath);
        fShaderFile.open(fragmentPath);
        stringstream vShaderStream, fShaderStream;
        // read file's buffer contents into streams
        vShaderStream << vShaderFile.rdbuf();
        fShaderStream << fShaderFile.rdbuf();
        // close file handlers
        vShaderFile.close();
        fShaderFile.close();
        // convert stream into string
        vertexCode = vShaderStream.str();
        fragmentCode = fShaderStream.str();
    }
    catch (ifstream::failure &e) {
        LOGE("Could not getSharderPath error :%s", e.what());
        return false;
    }
    gVertexShaderCode = vertexCode.c_str();
    gFragmentShaderCode = fragmentCode.c_str();

    return true;
}

void OpenGLShader::printGLString(const char *name, GLenum s) {
    const char *v = (const char *) glGetString(s);
    LOGI("OpenGL %s = %s\n", name, v);
}

void OpenGLShader::checkGlError(const char *op) {
    for (GLint error = glGetError(); error; error = glGetError()) {
        LOGI("after %s() glError (0x%x)\n", op, error);
    }
}

OpenGLShader::~OpenGLShader() {
    if (vertexShader) {
        glDeleteShader(vertexShader);
    }
    if (fraShader) {
        glDeleteShader(fraShader);
    }
    vertexCode.clear();
    fragmentCode.clear();

    gVertexShaderCode = nullptr;
    gFragmentShaderCode = nullptr;
}

OpenGLShader::OpenGLShader() {

}
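
        To make the call sequence concrete, here is a minimal usage sketch of this class. The wiring is an assumption for illustration (in the repo it is driven through OpenglesTexureVideoRender), and the file paths are placeholders:

// Hypothetical usage of OpenGLShader: read the GLSL sources from disk first,
// then build the program once a GL context is current (e.g. on the GL thread).
OpenGLShader shader;
bool loaded = shader.getSharderPath(
        "/data/.../texture_video_play_vert.glsl",      // placeholder path
        "/data/.../texture_video_play_frament.glsl");  // placeholder path
if (loaded) {
    GLuint program = shader.createProgram();  // compiles both shaders and links
    if (program) {
        glUseProgram(program);                // program is ready for draw calls
    }
}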

  

3. OpenGLTextureVideoRender.cpp performs the YUV texture rendering:

        The overall flow is: createProgram() -> createTextures() -> draw() -> render().

  •   createProgram() creates the program and looks up the vertex position attribute, the texture-coordinate attribute, and the sampler uniforms from the shaders:
int
OpenglesTexureVideoRender::createProgram() {

    m_program = lightColorShader->createProgram();
    m_vertexShader = lightColorShader->vertexShader;
    m_pixelShader = lightColorShader->fraShader;
    LOGI("OpenglesTexureVideoRender createProgram m_program:%d", m_program);

    if (!m_program) {
        LOGE("Could not create program.");
        return 0;
    }

    // Look up the attribute and uniform locations
    m_vertexPos = (GLuint) glGetAttribLocation(m_program, "position");
    m_textureYLoc = glGetUniformLocation(m_program, "s_textureY");
    m_textureULoc = glGetUniformLocation(m_program, "s_textureU");
    m_textureVLoc = glGetUniformLocation(m_program, "s_textureV");
    m_textureLoc = (GLuint) glGetAttribLocation(m_program, "texcoord");

    return m_program;
}
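
        Note the asymmetry here: position and texcoord are vertex attributes, looked up with glGetAttribLocation, while the three samplers are uniforms, looked up with glGetUniformLocation. glGetAttribLocation returns -1 when a name is not an active attribute, so the casts to GLuint assume both attributes survived compilation; a defensive implementation would check for -1 before casting.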
  •   createTextures() creates one texture for each of the Y, U, and V planes:
bool OpenglesTexureVideoRender::createTextures() {
    auto widthY = (GLsizei) m_width;
    auto heightY = (GLsizei) m_height;

    glActiveTexture(GL_TEXTURE0);
    glGenTextures(1, &m_textureIdY);
    glBindTexture(GL_TEXTURE_2D, m_textureIdY);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, widthY, heightY, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE,
                 nullptr);

    if (!m_textureIdY) {
//        check_gl_error("Create Y texture");
        return false;
    }

    GLsizei widthU = (GLsizei) m_width / 2;
    GLsizei heightU = (GLsizei) m_height / 2;

    glActiveTexture(GL_TEXTURE1);
    glGenTextures(1, &m_textureIdU);
    glBindTexture(GL_TEXTURE_2D, m_textureIdU);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, widthU, heightU, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE,
                 nullptr);

    if (!m_textureIdU) {
//        check_gl_error("Create U texture");
        return false;
    }

    GLsizei widthV = (GLsizei) m_width / 2;
    GLsizei heightV = (GLsizei) m_height / 2;

    glActiveTexture(GL_TEXTURE2);
    glGenTextures(1, &m_textureIdV);
    glBindTexture(GL_TEXTURE_2D, m_textureIdV);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, widthV, heightV, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE,
                 nullptr);

    if (!m_textureIdV) {
//        check_gl_error("Create V texture");
        return false;
    }

    return true;
}
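
        GL_LUMINANCE is a legacy unsized format that OpenGL ES 3.x still accepts for compatibility. The more idiomatic ES 3.x choice for a single-channel plane would be the sized internal format GL_R8 with format GL_RED, which the shader samples identically through .r; a sketch of the alternative call (not what this renderer uses):

glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, widthY, heightY, 0, GL_RED, GL_UNSIGNED_BYTE, nullptr);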
  •   draw() splits the incoming buffer into the separate Y, U, and V planes:

void OpenglesTexureVideoRender::draw(uint8_t *buffer, size_t length
                                    , size_t width, size_t height,
                                     float rotation) {
    m_length = length;
    m_rotation = rotation;

    video_frame frame{};
    frame.width = width;
    frame.height = height;
    frame.stride_y = width;
    frame.stride_uv = width / 2;
    frame.y = buffer;
    frame.u = buffer + width * height;
    frame.v = buffer + width * height * 5 / 4;

    updateFrame(frame);
}


void OpenglesTexureVideoRender::updateFrame(const video_frame &frame) {
    m_sizeY = frame.width * frame.height;
    m_sizeU = frame.width * frame.height / 4;
    m_sizeV = frame.width * frame.height / 4;

    if (m_pDataY == nullptr || m_width != frame.width || m_height != frame.height) {
        m_pDataY = std::make_unique<uint8_t[]>(m_sizeY + m_sizeU + m_sizeV);
        m_pDataU = m_pDataY.get() + m_sizeY;
        m_pDataV = m_pDataU + m_sizeU;
        isProgramChanged = true;
    }

    m_width = frame.width;
    m_height = frame.height;

    if (m_width == frame.stride_y) {
        memcpy(m_pDataY.get(), frame.y, m_sizeY);
    } else {
        uint8_t *pSrcY = frame.y;
        uint8_t *pDstY = m_pDataY.get();

        for (int h = 0; h < m_height; h++) {
            memcpy(pDstY, pSrcY, m_width);

            pSrcY += frame.stride_y;
            pDstY += m_width;
        }
    }

    if (m_width / 2 == frame.stride_uv) {
        memcpy(m_pDataU, frame.u, m_sizeU);
        memcpy(m_pDataV, frame.v, m_sizeV);
    } else {
        uint8_t *pSrcU = frame.u;
        uint8_t *pSrcV = frame.v;
        uint8_t *pDstU = m_pDataU;
        uint8_t *pDstV = m_pDataV;

        for (int h = 0; h < m_height / 2; h++) {
            memcpy(pDstU, pSrcU, m_width / 2);
            memcpy(pDstV, pSrcV, m_width / 2);

            pDstU += m_width / 2;
            pDstV += m_width / 2;

            pSrcU += frame.stride_uv;
            pSrcV += frame.stride_uv;
        }
    }

    isDirty = true;
}
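
        The plane offsets in draw() follow directly from the I420 layout produced on the Java side: for a width × height frame, U starts at byte width * height and V at byte width * height * 5 / 4. One caveat: the rotation value passed in from Java (90 in onPreviewFrame) is stored in m_rotation, but none of the excerpts shown here apply it; the pass-through vertex shader above does not rotate the quad.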
  •   render() re-uploads the three YUV textures each time a frame is rendered:

void OpenglesTexureVideoRender::render() {
//    LOGI("OpenglesTexureVideoRender render");

    // set the clear color before clearing, otherwise the first clear uses the default color
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    if (!updateTextures() || !useProgram()) return;

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}


bool OpenglesTexureVideoRender::updateTextures() {
    if (!m_textureIdY 
            && !m_textureIdU
            && !m_textureIdV 
            && !createTextures()) return false;
                        
//    LOGI("OpenglesTexureVideoRender updateTextures");

    if (isDirty) {
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, m_textureIdY);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, (GLsizei) m_width, (GLsizei) m_height, 0,
                     GL_LUMINANCE, GL_UNSIGNED_BYTE, m_pDataY.get());

        glActiveTexture(GL_TEXTURE1);
        glBindTexture(GL_TEXTURE_2D, m_textureIdU);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, (GLsizei) m_width / 2, (GLsizei) m_height / 2,
                     0,
                     GL_LUMINANCE, GL_UNSIGNED_BYTE, m_pDataU);

        glActiveTexture(GL_TEXTURE2);
        glBindTexture(GL_TEXTURE_2D, m_textureIdV);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, (GLsizei) m_width / 2, (GLsizei) m_height / 2,
                     0,
                     GL_LUMINANCE, GL_UNSIGNED_BYTE, m_pDataV);

        isDirty = false;

        return true;
    }

    return false;
}
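
        render() also depends on useProgram(), which is not shown in this excerpt. Below is a minimal sketch of what such a method plausibly does, assuming it binds a full-screen quad to the attributes fetched in createProgram() and wires each sampler to its texture unit; this is an illustration, not the repo's exact code:

// Hypothetical sketch of useProgram(): bind the program, feed a full-screen
// quad (matching the 4-vertex GL_TRIANGLE_STRIP draw in render()), and point
// each YUV sampler at the texture unit its plane is bound to.
bool OpenglesTexureVideoRender::useProgram() {
    if (!m_program) return false;

    // Full-screen quad positions and texture coordinates for GL_TRIANGLE_STRIP.
    static const GLfloat kVertices[]  = {-1.f, -1.f,  1.f, -1.f,  -1.f, 1.f,  1.f, 1.f};
    static const GLfloat kTexCoords[] = { 0.f,  1.f,  1.f,  1.f,   0.f, 0.f,  1.f, 0.f};

    glUseProgram(m_program);

    // Client-side vertex arrays remain legal in ES 3.x with the default VAO.
    glVertexAttribPointer(m_vertexPos, 2, GL_FLOAT, GL_FALSE, 0, kVertices);
    glEnableVertexAttribArray(m_vertexPos);
    glVertexAttribPointer(m_textureLoc, 2, GL_FLOAT, GL_FALSE, 0, kTexCoords);
    glEnableVertexAttribArray(m_textureLoc);

    // Samplers read from the units used in createTextures()/updateTextures().
    glUniform1i(m_textureYLoc, 0);  // GL_TEXTURE0 -> Y
    glUniform1i(m_textureULoc, 1);  // GL_TEXTURE1 -> U
    glUniform1i(m_textureVLoc, 2);  // GL_TEXTURE2 -> V
    return true;
}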
