Recording audio/video with Camera2 + OpenGL ES + MediaCodec + AudioRecord, and writing H264 SEI data

A record of my learning process: the task was to implement audio/video recording based on Camera2 + OpenGL ES + MediaCodec + AudioRecord.

Requirements:

  1. Write extra SEI data into every video frame, so that the custom per-frame data can be recovered when the stream is decoded later (a decode-side parsing sketch appears right after the VideoRecorder class below).
  2. When the record button is tapped, save the audio/video spanning from N seconds before the tap to N seconds after it, with output files capped at 60 s each.

Up front: the whole write-up touches on the following topics, so you can quickly check whether it covers what you are looking for.

  • Using MediaCodec, with createInputSurface() creating a surface that receives the Camera2 frames through EGL.
  • Using AudioRecord
  • Using Camera2
  • Basic OpenGL usage
  • A simple example of writing H264 SEI data

The overall design is simple: open the camera, set up the OpenGL environment, then start a video thread that records the video data and an audio thread that records the audio data. Both streams are cached in custom lists, and finally an encoding thread muxes the video list and the audio list into an MP4 (see the wiring sketch just before the AudioEncoder class below). I used Android SDK 28, because saving files is more troublesome on 29 and above. The full project has not been uploaded; message me if you need it.
All of the functionality is modularized into separate classes. Let's start with the independent modules.

UI layout

The UI is simple: one GLSurfaceView and two Button controls.


<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">
    <android.opengl.GLSurfaceView
        android:id="@+id/glView"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent" />
    <Button
        android:id="@+id/recordBtn"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_marginBottom="80dp"
        android:text="Record"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintLeft_toLeftOf="parent"
        app:layout_constraintRight_toRightOf="parent" />
    <Button
        android:id="@+id/exit"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_marginTop="20dp"
        android:layout_marginRight="20dp"
        android:text="Eixt"
        app:layout_constraintTop_toTopOf="parent"
        app:layout_constraintRight_toRightOf="parent" />
</androidx.constraintlayout.widget.ConstraintLayout>

Camera2

Using the Camera2 framework is fairly simple. The one thing to note is that the surface passed into startPreview is later handed to mCaptureRequestBuilder.addTarget(surface). That surface is produced in the following basic steps; a brief sketch follows, and the full code is further below.

1. Generate an OpenGL texture: GLES30.glGenTextures(1, mTexture, 0);
2. Wrap the texture in a SurfaceTexture: mSurfaceTexture = new SurfaceTexture(mTexture[0]);
3. Create a Surface from the SurfaceTexture: mSurface = new Surface(mSurfaceTexture);
4. Start the preview: mCamera.startPreview(mSurface);
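
A minimal sketch of that chain, assuming it runs on the GLSurfaceView render thread with a current GL context (it mirrors what Camera2Renderer.onSurfaceChanged does at the end of the article; mCamera is the Camera2 wrapper shown next):

    private Surface createPreviewSurface() {
        int[] tex = new int[1];
        GLES30.glGenTextures(1, tex, 0);                 // 1. generate an OpenGL texture
        SurfaceTexture st = new SurfaceTexture(tex[0]);  // 2. texture -> SurfaceTexture
        st.setDefaultBufferSize(1920, 1080);             //    match the preview size
        Surface surface = new Surface(st);               // 3. SurfaceTexture -> Surface
        mCamera.startPreview(surface);                   // 4. hand it to Camera2
        return surface;
    }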

public class Camera2 {
    private final String TAG = "Abbott Camera2";
    private Context mContext;
    private CameraManager mCameraManager;
    private CameraDevice mCameraDevice;
    private String[] mCamList;
    private String mCameraId;
    private Size mPreviewSize;
    private HandlerThread mBackgroundThread;
    private Handler mBackgroundHandler;
    private CaptureRequest.Builder mCaptureRequestBuilder;
    private CaptureRequest mCaptureRequest;
    private CameraCaptureSession mCameraCaptureSession;

    public Camera2(Context Context) {
        mContext = Context;
        mCameraManager = (CameraManager) mContext.getSystemService(android.content.Context.CAMERA_SERVICE);
        try {
            mCamList = mCameraManager.getCameraIdList();
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }

        mBackgroundThread = new HandlerThread("CameraThread");
        mBackgroundThread.start();
        mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
    }

    public void openCamera(int width, int height, String id) {
        try {
            Log.d(TAG, "openCamera: id:" + id);
            CameraCharacteristics characteristics = mCameraManager.getCameraCharacteristics(id);
            if (characteristics.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT) {
                // front-facing camera; no special handling in this demo
            }
            StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            mPreviewSize = getOptimalSize(map.getOutputSizes(SurfaceTexture.class), width, height);
            mCameraId = id;
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }

        try {
            if (ActivityCompat.checkSelfPermission(mContext, android.Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                return;
            }
            Log.d(TAG, "mCameraManager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);: " + mCameraId);
            mCameraManager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private Size getOptimalSize(Size[] sizeMap, int width, int height) {
        List<Size> sizeList = new ArrayList<>();
        for (Size option : sizeMap) {
            if (width > height) {
                if (option.getWidth() > width && option.getHeight() > height) {
                    sizeList.add(option);
                }
            } else {
                if (option.getWidth() > height && option.getHeight() > width) {
                    sizeList.add(option);
                }
            }
        }
        if (sizeList.size() > 0) {
            return Collections.min(sizeList, new Comparator<Size>() {
                @Override
                public int compare(Size lhs, Size rhs) {
                    return Long.signum((long) lhs.getWidth() * lhs.getHeight() - (long) rhs.getWidth() * rhs.getHeight());
                }
            });
        }
        return sizeMap[0];
    }

    private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            mCameraDevice = camera;
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            camera.close();
            mCameraDevice = null;
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            camera.close();
            mCameraDevice = null;
        }
    };

    public void startPreview(Surface surface) {
        try {
            mCaptureRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            mCaptureRequestBuilder.addTarget(surface);
            mCameraDevice.createCaptureSession(Collections.singletonList(surface), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession session) {
                    try {
                        mCaptureRequest = mCaptureRequestBuilder.build();
                        mCameraCaptureSession = session;
                        mCameraCaptureSession.setRepeatingRequest(mCaptureRequest, null, mBackgroundHandler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                }
            }, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
}

ImageList

This class is the cache used for both video and audio samples. There is not much to explain: it is a capacity-bounded list that evicts its oldest entries, and it can be used as-is.

public class ImageList {
    private static final String TAG = "Abbott ImageList";
    private Object mImageListLock = new Object();
    int kCapacity;
    private List<ImageItem> mImageList = new CopyOnWriteArrayList<>();

    public ImageList(int capacity) {
        kCapacity = capacity;
    }

    public synchronized void addItem(long Timestamp, ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
        synchronized (mImageListLock) {
            ImageItem item = new ImageItem(Timestamp, byteBuffer, bufferInfo);
            mImageList.add(item);
            if (mImageList.size() > kCapacity) {
                int excessItems = mImageList.size() - kCapacity;
                mImageList.subList(0, excessItems).clear();
            }
        }
    }

    public synchronized List<ImageItem> getItemsInTimeRange(long startTimestamp, long endTimestamp) {
        List<ImageItem> itemsInTimeRange = new ArrayList<>();
        synchronized (mImageListLock) {
            for (ImageItem item : mImageList) {
                long itemTimestamp = item.getTimestamp();
                // check whether the timestamp falls within the requested range
                if (itemTimestamp >= startTimestamp && itemTimestamp <= endTimestamp) {
                    itemsInTimeRange.add(item);
                }
            }
        }
        return itemsInTimeRange;
    }

    public synchronized ImageItem getItem() {
        return mImageList.get(0);
    }

    public synchronized void removeItem() {
        mImageList.remove(0);
    }

    public synchronized int getSize() {
        return mImageList.size();
    }

    public static class ImageItem {
        private long mTimestamp;
        private ByteBuffer mVideoBuffer;
        private MediaCodec.BufferInfo mVideoBufferInfo;
        public ImageItem(long first, ByteBuffer second, MediaCodec.BufferInfo bufferInfo) {
            this.mTimestamp = first;
            this.mVideoBuffer = second;
            this.mVideoBufferInfo = bufferInfo;
        }

        public synchronized long getTimestamp() {
            return mTimestamp;
        }

        public synchronized ByteBuffer getVideoByteBuffer() {
            return mVideoBuffer;
        }

        public synchronized MediaCodec.BufferInfo getVideoBufferInfo() {
            return mVideoBufferInfo;
        }
    }
}
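
A quick usage sketch with hypothetical numbers (info, buffer and alarmUs are placeholders; the real capacities below are derived from Param.recordInternal — at roughly 40 ms per video frame, a 10-second window needs 1000 / 40 * 10 = 250 items):

    // Hypothetical usage, assuming a 10 s window at ~40 ms per frame.
    ImageList videoList = new ImageList(1000 / 40 * 10);       // capacity 250; oldest items are evicted
    videoList.addItem(info.presentationTimeUs, buffer, info);
    List<ImageList.ImageItem> window =
            videoList.getItemsInTimeRange(alarmUs - 10_000_000L, alarmUs + 10_000_000L);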

GlProgram

A class for creating the OpenGL program. OpenGL ES 3.0 is used here.

public class GlProgram {
    public static final String mVertexShader =
            "#version 300 es \n" +
            "in vec4 vPosition;" +
            "in vec2 vCoordinate;" +
            "out vec2 vTextureCoordinate;" +
            "void main() {" +
            "   gl_Position = vPosition;" +
            "   vTextureCoordinate = vCoordinate;" +
            "}";
    public static final String mFragmentShader =
            "#version 300 es \n" +
            "#extension GL_OES_EGL_image_external_essl3 : require \n" +
            "precision mediump float;" +
            "in vec2 vTextureCoordinate;" +
            "uniform samplerExternalOES oesTextureSampler;" +
            // ESSL 3.00 reserves the gl_ prefix, so the fragment output needs a custom name
            "out vec4 fragColor;" +
            "void main() {" +
            "    fragColor = texture(oesTextureSampler, vTextureCoordinate);" +
            "}";

    public static int createProgram(String vertexShaderSource, String fragShaderSource) {
        int program = GLES30.glCreateProgram();
        if (0 == program) {
            Log.e("Arc_ShaderManager", "create program error ,error=" + GLES30.glGetError());
            return 0;
        }
        int vertexShader = loadShader(GLES30.GL_VERTEX_SHADER, vertexShaderSource);
        if (0 == vertexShader) {
            return 0;
        }
        int fragShader = loadShader(GLES30.GL_FRAGMENT_SHADER, fragShaderSource);
        if (0 == fragShader) {
            return 0;
        }
        GLES30.glAttachShader(program, vertexShader);
        GLES30.glAttachShader(program, fragShader);
        GLES30.glLinkProgram(program);

        int[] status = new int[1];
        GLES30.glGetProgramiv(program, GLES30.GL_LINK_STATUS, status, 0);
        if (GLES30.GL_FALSE == status[0]) {
            String errorMsg = GLES30.glGetProgramInfoLog(program);
            Log.e("Arc_ShaderManager", "createProgram error : " + errorMsg);
            GLES30.glDeleteShader(vertexShader);
            GLES30.glDeleteShader(fragShader);
            GLES30.glDeleteProgram(program);
            return 0;
        }
        GLES30.glDetachShader(program, vertexShader);
        GLES30.glDetachShader(program, fragShader);
        GLES30.glDeleteShader(vertexShader);
        GLES30.glDeleteShader(fragShader);
        return program;
    }

    private static int loadShader(int type, String shaderSource) {
        int shader = GLES30.glCreateShader(type);
        if (0 == shader) {
            Log.e("Arc_ShaderManager", "create shader error, shader type=" + type + " , error=" + GLES30.glGetError());
            return 0;
        }
        GLES30.glShaderSource(shader, shaderSource);
        GLES30.glCompileShader(shader);

        int[] status = new int[1];
        GLES30.glGetShaderiv(shader, GLES30.GL_COMPILE_STATUS, status, 0);
        if (0 == status[0]) {
            String errorMsg = GLES30.glGetShaderInfoLog(shader);
            Log.e("Arc_ShaderManager", "createShader shader = " + type + "  error: " + errorMsg);
            GLES30.glDeleteShader(shader);
            return 0;
        }
        return shader;
    }
}

OesTexture

This class hooks up to the GlProgram above: it feeds the vertex and texture coordinates to the vertex and fragment shaders and renders the OES texture as a full-screen quad.

public class OesTexture {
    private static final String TAG = "Abbott OesTexture";
    private int mProgram;
    private final FloatBuffer mCordsBuffer;
    private final FloatBuffer mPositionBuffer;
    private int mPositionHandle;
    private int mCordsHandle;
    private int mOESTextureHandle;

    public OesTexture() {
        float[] positions = {
                -1.0f, 1.0f,
                -1.0f, -1.0f,
                1.0f, 1.0f,
                1.0f, -1.0f
        };
        float[] texCords = {
                0.0f, 0.0f,
                0.0f, 1.0f,
                1.0f, 0.0f,
                1.0f, 1.0f,
        };
        mPositionBuffer = ByteBuffer.allocateDirect(positions.length * 4).order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        mPositionBuffer.put(positions).position(0);

        mCordsBuffer = ByteBuffer.allocateDirect(texCords.length * 4).order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        mCordsBuffer.put(texCords).position(0);
    }

    public void init() {
        this.mProgram = GlProgram.createProgram(GlProgram.mVertexShader, GlProgram.mFragmentShader);
        if (0 == this.mProgram) {
            Log.e(TAG, "createProgram failed");
        }
        mPositionHandle = GLES30.glGetAttribLocation(mProgram, "vPosition");
        mCordsHandle = GLES30.glGetAttribLocation(mProgram, "vCoordinate");
        mOESTextureHandle = GLES30.glGetUniformLocation(mProgram, "oesTextureSampler");
        GLES30.glDisable(GLES30.GL_DEPTH_TEST);
    }

    public void PrepareTexture(int OESTextureId) {
        GLES30.glUseProgram(this.mProgram);
        GLES30.glEnableVertexAttribArray(mPositionHandle);
        GLES30.glVertexAttribPointer(mPositionHandle, 2, GLES30.GL_FLOAT, false, 2 * 4, mPositionBuffer);
        GLES30.glEnableVertexAttribArray(mCordsHandle);
        GLES30.glVertexAttribPointer(mCordsHandle, 2, GLES30.GL_FLOAT, false, 2 * 4, mCordsBuffer);
        GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, OESTextureId);
        GLES30.glUniform1i(mOESTextureHandle, 0);
        GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, 4);
        GLES30.glDisableVertexAttribArray(mPositionHandle);
        GLES30.glDisableVertexAttribArray(mCordsHandle);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
    }
}

The next three classes — VideoRecorder, AudioEncoder and EncodingRunnable — need to be used together.
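
Before the code, a minimal wiring sketch of how the three cooperate (this mirrors what MainActivity and Camera2Renderer do at the end of the article; eglContext is assumed to come from EGL14.eglGetCurrentContext() on the GL thread):

    EncodingRunnable encoder = new EncodingRunnable();
    encoder.start();                                // muxer thread, parked until notified
    VideoRecorder video = new VideoRecorder(eglContext, encoder);
    AudioEncoder audio = new AudioEncoder(encoder); // throws IOException
    video.startRecord();                            // EGL setup + H264 encode loop
    audio.start();                                  // PCM capture + AAC encode loop

    // When the user taps Record:
    video.setAlarm();                               // open the pre/post-roll window for video
    audio.setAlarm();                               // same for audio
    encoder.setMediaMuxerSavaPath();                // create the MP4 muxer and start draining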

public class AudioEncoder extends Thread {
    private static final String TAG = "Abbott AudioEncoder";
    private static final int SAVEMP4_INTERNAL = Param.recordInternal * 1000 * 1000; // pre/post-roll length in µs; Param.recordInternal is seconds, defined elsewhere in the project
    private static final int SAMPLE_RATE = 44100;
    private static final int CHANNEL_COUNT = 1;
    private static final int BIT_RATE = 96000;
    private EncodingRunnable mEncodingRunnable;
    private MediaCodec mMediaCodec;
    private AudioRecord mAudioRecord;
    private MediaFormat mFormat;
    private MediaFormat mOutputFormat;

    private long nanoTime;
    int mBufferSizeInBytes = 0;
    boolean mExitThread = true; // true while the capture loop should keep running
    private ImageList mAudioList;
    private MediaCodec.BufferInfo mAudioBufferInfo;
    private boolean mAlarm = false;
    private long mAlarmTime;
    private long mAlarmStartTime;
    private long mAlarmEndTime;
    private List<ImageList.ImageItem> mMuxerImageItem;
    private Object mLock = new Object();
    private MediaCodec.BufferInfo mAlarmBufferInfo;

    public AudioEncoder( EncodingRunnable encodingRunnable) throws IOException {
        mEncodingRunnable = encodingRunnable;
        nanoTime = System.nanoTime();
        createAudio();
        createMediaCodec();
        int kCapacity = 1000 / 20 * Param.recordInternal; // assumes roughly 20 ms per encoded AAC buffer
        mAudioList = new ImageList(kCapacity);
    }


    public void createAudio() {
        mBufferSizeInBytes = AudioRecord.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        mAudioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, mBufferSizeInBytes);
    }

    public void createMediaCodec() throws IOException {
        mFormat = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, SAMPLE_RATE, CHANNEL_COUNT);
        mFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
        mFormat.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
        mFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 8192);
        mMediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_AUDIO_AAC);
        mMediaCodec.configure(mFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    }

    public synchronized void setAlarm() {
        synchronized (mLock) {
            Log.d(TAG, "setAudio Alarm enter");
            mEncodingRunnable.setAudioFormat(mOutputFormat);
            mEncodingRunnable.setAudioAlarmTrue();
            mAlarmTime = mAlarmBufferInfo.presentationTimeUs;
            mAlarmEndTime = mAlarmTime + SAVEMP4_INTERNAL;
            if (!mAlarm) {
                mAlarmStartTime = mAlarmTime - SAVEMP4_INTERNAL;
            }
            mAlarm = true;
            Log.d(TAG, "setAudio Alarm exit");
        }
    }


    @Override
    public void run() {
        super.run();
        mMediaCodec.start();
        mAudioRecord.startRecording();
        while (mExitThread) {
            synchronized (mLock) {
                byte[] inputAudioData = new byte[mBufferSizeInBytes];
                int res = mAudioRecord.read(inputAudioData, 0, inputAudioData.length);
                if (res > 0) {
                    if (mAudioRecord != null) {
                        enCodeAudio(inputAudioData);
                    }
                }
            }
        }
        Log.d(TAG, "AudioRecord run: exit");
    }

    private void enCodeAudio(byte[] inputAudioData) {
        mAudioBufferInfo = new MediaCodec.BufferInfo();
        int index = mMediaCodec.dequeueInputBuffer(-1);
        if (index < 0) {
            return;
        }
        ByteBuffer audioInputBuffer = mMediaCodec.getInputBuffer(index);
        audioInputBuffer.clear();
        audioInputBuffer.put(inputAudioData);
        // presentation time in µs, measured from when the encoder was created
        mMediaCodec.queueInputBuffer(index, 0, inputAudioData.length, (System.nanoTime() - nanoTime) / 1000, 0);

        int status = mMediaCodec.dequeueOutputBuffer(mAudioBufferInfo, 0);
        ByteBuffer outputBuffer;
        if (status == MediaCodec.INFO_TRY_AGAIN_LATER) {
        } else if (status == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            mOutputFormat = mMediaCodec.getOutputFormat();
        } else {
            while (status >= 0) {
                MediaCodec.BufferInfo tmpaudioBufferInfo = new MediaCodec.BufferInfo(); // copy, since mAudioBufferInfo is reused on the next dequeue
                tmpaudioBufferInfo.set(mAudioBufferInfo.offset, mAudioBufferInfo.size, mAudioBufferInfo.presentationTimeUs, mAudioBufferInfo.flags);
                mAlarmBufferInfo = new MediaCodec.BufferInfo();
                mAlarmBufferInfo.set(mAudioBufferInfo.offset, mAudioBufferInfo.size, mAudioBufferInfo.presentationTimeUs, mAudioBufferInfo.flags);
                outputBuffer = mMediaCodec.getOutputBuffer(status);
                ByteBuffer buffer = ByteBuffer.allocate(tmpaudioBufferInfo.size);
                buffer.limit(tmpaudioBufferInfo.size);
                buffer.put(outputBuffer);
                buffer.flip();
                if (tmpaudioBufferInfo.size > 0) {
                    if (mAlarm) {
                        mMuxerImageItem = mAudioList.getItemsInTimeRange(mAlarmStartTime, mAlarmEndTime);
                        for (ImageList.ImageItem item : mMuxerImageItem) {
                            mEncodingRunnable.pushAudio(item);
                        }
                        mAlarmStartTime = tmpaudioBufferInfo.presentationTimeUs;
                        mAudioList.addItem(tmpaudioBufferInfo.presentationTimeUs, buffer, tmpaudioBufferInfo);
                        if (tmpaudioBufferInfo.presentationTimeUs - mAlarmTime > SAVEMP4_INTERNAL) {
                            mAlarm = false;
                            mEncodingRunnable.setAudioAlarmFalse();
                            Log.d(TAG, "mEncodingRunnable.setAudio itemAlarmFalse();");
                        }
                    } else {
                        mAudioList.addItem(tmpaudioBufferInfo.presentationTimeUs, buffer, tmpaudioBufferInfo);
                    }
                }
                mMediaCodec.releaseOutputBuffer(status, false);
                status = mMediaCodec.dequeueOutputBuffer(mAudioBufferInfo, 0);
            }
        }
    }


    public synchronized void stopAudioRecord() throws IllegalStateException {
        synchronized (mLock) {
            mExitThread = false;
        }
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        mMediaCodec.stop();
        mMediaCodec.release();
        mMediaCodec = null;
    }
}

public class VideoRecorder extends Thread {
    private static final String TAG = "Abbott VideoRecorder";
    private static final int SAVE_MP4_Internal = 1000 * 1000 * Param.recordInternal; // pre/post-roll length in µs
    // EGL
    private static final int EGL_RECORDABLE_ANDROID = 0x3142;
    private EGLContext mEGLContext = EGL14.EGL_NO_CONTEXT;
    private EGLDisplay mEGLDisplay = EGL14.EGL_NO_DISPLAY;
    private EGLSurface mEGLSurface = EGL14.EGL_NO_SURFACE;
    private EGLContext mSharedContext = EGL14.EGL_NO_CONTEXT;
    private Surface mSurface;
    private int mOESTextureId;
    private OesTexture mOesTexture;
    private ImageList mImageList;
    private List<ImageList.ImageItem> muxerImageItem;
    // Thread
    private boolean mExitThread;
    private Object mLock = new Object();
    private Object object = new Object();
    private MediaCodec mMediaCodec;
    private MediaFormat mOutputFormat;
    private boolean mAlarm = false;
    private long mAlarmTime;
    private long mAlarmStartTime;
    private long mAlarmEndTime;

    private MediaCodec.BufferInfo mBufferInfo;
    private EncodingRunnable mEncodingRunnable;
    private String mSeiMessage;

    public VideoRecorder(EGLContext eglContext, EncodingRunnable encodingRunnable) {
        mSharedContext = eglContext;
        mEncodingRunnable = encodingRunnable;
        int kCapacity = 1000 / 40 * Param.recordInternal; // ~40 ms per frame at 25 fps
        mImageList = new ImageList(kCapacity);

        try {
            MediaFormat mediaFormat = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1920, 1080);
            mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 1920 * 1080 * 25 / 5);
            mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
            mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            mMediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            mSurface = mMediaCodec.createInputSurface();
        } catch (IOException e) {
            e.printStackTrace();
        }

    }

    @Override
    public void run() {
        super.run();
        try {
            initEgl();
            mOesTexture = new OesTexture();
            mOesTexture.init();
            synchronized (mLock) {
                mLock.wait(33);
            }
            guardedRun();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void guardedRun() throws InterruptedException, RuntimeException {
        mExitThread = false;
        while (true) {
            synchronized (mLock) {
                if (mExitThread) {
                    break;
                }
                mLock.wait(33);
            }
            mOesTexture.PrepareTexture(mOESTextureId);
            swapBuffers();
            enCodeVideo();
        }
        Log.d(TAG, "guardedRun: exit");
        unInitEgl();
    }


    private void enCodeVideo() {
        mBufferInfo = new MediaCodec.BufferInfo();
        int status = mMediaCodec.dequeueOutputBuffer(mBufferInfo, 0);

        ByteBuffer outputBuffer = null;
        if (status == MediaCodec.INFO_TRY_AGAIN_LATER) {
        } else if (status == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            mOutputFormat = mMediaCodec.getOutputFormat();
        } else if (status == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        } else {
            outputBuffer = mMediaCodec.getOutputBuffer(status);
            if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                // SPS/PPS are delivered via INFO_OUTPUT_FORMAT_CHANGED, so drop the config buffer
                mBufferInfo.size = 0;
            }
            if (mBufferInfo.size > 0) {
                outputBuffer.position(mBufferInfo.offset);
                outputBuffer.limit(mBufferInfo.offset + mBufferInfo.size); // valid region is [offset, offset + size)
                mSeiMessage = "avcIndex" + String.format("%05d", 0); // sample payload (always index 0 in this demo)
            }
            mMediaCodec.releaseOutputBuffer(status, false);
        }
        if (mBufferInfo.size > 0) {
            mEncodingRunnable.setTimeUs(mBufferInfo.presentationTimeUs);

            ByteBuffer seiData = buildSEIData(mSeiMessage);
            ByteBuffer frameWithSEI = ByteBuffer.allocate(outputBuffer.remaining() + seiData.remaining());
            frameWithSEI.put(seiData);
            frameWithSEI.put(outputBuffer);
            frameWithSEI.flip();
            mBufferInfo.size = frameWithSEI.remaining();

            MediaCodec.BufferInfo tmpVideoBufferInfo = new MediaCodec.BufferInfo();
            tmpVideoBufferInfo.set(mBufferInfo.offset, mBufferInfo.size, mBufferInfo.presentationTimeUs, mBufferInfo.flags);
            if (mAlarm) {
                muxerImageItem = mImageList.getItemsInTimeRange(mAlarmStartTime, mAlarmEndTime);
                mAlarmStartTime = tmpVideoBufferInfo.presentationTimeUs;
                for (ImageList.ImageItem item : muxerImageItem) {
                    mEncodingRunnable.push(item);
                }
                mImageList.addItem(tmpVideoBufferInfo.presentationTimeUs, frameWithSEI, tmpVideoBufferInfo);
                if (mBufferInfo.presentationTimeUs - mAlarmTime > SAVE_MP4_Internal) {
                    Log.d(TAG, "mEncodingRunnable.setVideoAlarmFalse()");
                    Log.d(TAG, tmpVideoBufferInfo.presentationTimeUs + " " + mAlarmTime);
                    mAlarm = false;
                    mEncodingRunnable.setVideoAlarmFalse();
                }
            } else {
                mImageList.addItem(tmpVideoBufferInfo.presentationTimeUs, frameWithSEI, tmpVideoBufferInfo);
            }

        }
    }

    public synchronized void setAlarm() {
        synchronized (mLock) {
            Log.d(TAG, "setAlarm enter");
            mEncodingRunnable.setMediaFormat(mOutputFormat);
            mEncodingRunnable.setVideoAlarmTrue();
            if (mBufferInfo.presentationTimeUs != 0) {
                mAlarmTime = mBufferInfo.presentationTimeUs;
            }
            mAlarmEndTime = mAlarmTime + SAVE_MP4_Internal;
            if (!mAlarm) {
                mAlarmStartTime = mAlarmTime - SAVE_MP4_Internal;
            }
            mAlarm = true;
            Log.d(TAG, "setAlarm exit");
        }
    }

    public synchronized void startRecord() throws IllegalStateException {
        super.start();
        mMediaCodec.start();
    }

    public synchronized void stopVideoRecord() throws IllegalStateException {
        synchronized (mLock) {
            mExitThread = true;
            mLock.notify();
        }
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }

        mMediaCodec.signalEndOfInputStream();
        mMediaCodec.stop();
        mMediaCodec.release();
        mMediaCodec = null;
    }

    public void requestRender(int i) {
        synchronized (object) {
            mOESTextureId = i;
        }
    }

    private void initEgl() {
        this.mEGLDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
        if (this.mEGLDisplay == EGL14.EGL_NO_DISPLAY) {
            throw new RuntimeException("EGL14.eglGetDisplay fail...");
        }
        int[] major_version = new int[2];
        boolean eglInited = EGL14.eglInitialize(this.mEGLDisplay, major_version, 0, major_version, 1);
        if (!eglInited) {
            this.mEGLDisplay = null;
            throw new RuntimeException("EGL14.eglInitialize fail...");
        }

        // set the attributes of the EGL config (RGBA8888, window surface, recordable)
        int[] attrib_list = new int[]{
                EGL14.EGL_SURFACE_TYPE, EGL14.EGL_WINDOW_BIT,
                EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                EGL14.EGL_RED_SIZE, 8,
                EGL14.EGL_GREEN_SIZE, 8,
                EGL14.EGL_BLUE_SIZE, 8,
                EGL14.EGL_ALPHA_SIZE, 8,
                EGL14.EGL_DEPTH_SIZE, 16,
                EGL_RECORDABLE_ANDROID, 1,
                EGL14.EGL_NONE};

        EGLConfig[] configs = new EGLConfig[1];
        int[] numConfigs = new int[1];
        boolean eglChose = EGL14.eglChooseConfig(this.mEGLDisplay, attrib_list, 0, configs, 0, configs.length, numConfigs, 0);
        if (!eglChose) {
            throw new RuntimeException("eglChooseConfig [RGBA888 + recordable] ES2 EGL_config_fail...");
        }
        int[] attr_list = {EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE};
        this.mEGLContext = EGL14.eglCreateContext(this.mEGLDisplay, configs[0], this.mSharedContext, attr_list, 0);

        checkEglError("eglCreateContext");
        if (this.mEGLContext == EGL14.EGL_NO_CONTEXT) {
            throw new RuntimeException("eglCreateContext == EGL_NO_CONTEXT");
        }
        int[] surface_attr = {EGL14.EGL_NONE};
        this.mEGLSurface = EGL14.eglCreateWindowSurface(this.mEGLDisplay, configs[0], this.mSurface, surface_attr, 0);
        if (this.mEGLSurface == EGL14.EGL_NO_SURFACE) {
            throw new RuntimeException("eglCreateWindowSurface == EGL_NO_SURFACE");
        }

        Log.d(TAG, "initEgl , display=" + this.mEGLDisplay + " ,context=" + this.mEGLContext + " ,sharedContext= " +
                this.mSharedContext + ", surface=" + this.mEGLSurface);
        boolean success = EGL14.eglMakeCurrent(this.mEGLDisplay, this.mEGLSurface, this.mEGLSurface, this.mEGLContext);
        if (!success) {
            checkEglError("makeCurrent");
            throw new RuntimeException("eglMakeCurrent failed");
        }
    }

    private void unInitEgl() {
        boolean success = EGL14.eglMakeCurrent(mEGLDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_CONTEXT);
        if (!success) {
            checkEglError("makeCurrent");
            throw new RuntimeException("eglMakeCurrent failed");
        }
        if (this.mEGLDisplay != EGL14.EGL_NO_DISPLAY) {
            EGL14.eglDestroySurface(this.mEGLDisplay, this.mEGLSurface);
            EGL14.eglDestroyContext(this.mEGLDisplay, this.mEGLContext);
            EGL14.eglTerminate(this.mEGLDisplay);
        }
        this.mEGLDisplay = EGL14.EGL_NO_DISPLAY;
        this.mEGLContext = EGL14.EGL_NO_CONTEXT;
        this.mEGLSurface = EGL14.EGL_NO_SURFACE;
        this.mSharedContext = EGL14.EGL_NO_CONTEXT;
        this.mSurface = null;
    }

    private boolean swapBuffers() {
        if ((null == this.mEGLDisplay) || (null == this.mEGLSurface)) {
            return false;
        }
        boolean success = EGL14.eglSwapBuffers(this.mEGLDisplay, this.mEGLSurface);
        if (!success) {
            checkEglError("eglSwapBuffers");
        }
        return success;
    }

    private void checkEglError(String msg) {
        int error = EGL14.eglGetError();
        if (error != EGL14.EGL_SUCCESS) {
            throw new RuntimeException(msg + ": EGL_ERROR_CODE: 0x" + Integer.toHexString(error));
        }
    }

    private ByteBuffer buildSEIData(String message) {
        // Build a simple SEI NAL unit:
        // start code (00 00 00 01) + NAL type 6 (SEI) + payload type 5 (user_data_unregistered)
        int seiSize = 128;
        ByteBuffer seiBuffer = ByteBuffer.allocate(seiSize);
        seiBuffer.put(new byte[]{0, 0, 0, 1, 6, 5});
        // payload size as a single byte, so the message must stay shorter than 255 bytes
        // (a spec-compliant SEI would also carry a 16-byte UUID and rbsp trailing bits,
        // which this simplified example omits)
        String seiMessage = "h264testdata" + message;
        seiBuffer.put((byte) seiMessage.length());
        // payload bytes
        seiBuffer.put(seiMessage.getBytes());
        seiBuffer.flip();
        return seiBuffer;
    }

}
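
For requirement 1, here is a hedged sketch of reading the custom data back on the decode side. It assumes each stored frame starts with the SEI NAL exactly as buildSEIData() writes it (start code, 0x06, 0x05, one length byte, then the payload string):

    // Assumes the layout produced by buildSEIData() above; returns null when no SEI is found.
    static String readSeiMessage(ByteBuffer frame) {
        byte[] data = new byte[frame.remaining()];
        frame.duplicate().get(data);                 // copy without disturbing the buffer
        boolean hasSei = data.length > 7
                && data[0] == 0 && data[1] == 0 && data[2] == 0 && data[3] == 1
                && (data[4] & 0x1F) == 6             // NAL unit type 6 = SEI
                && data[5] == 5;                     // payload type 5 = user_data_unregistered
        if (!hasSei) {
            return null;
        }
        int size = data[6] & 0xFF;                   // single length byte, as written above
        if (7 + size > data.length) {
            return null;
        }
        return new String(data, 7, size);            // e.g. "h264testdataavcIndex00000"
    }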
public class EncodingRunnable extends Thread {
    private static final String TAG = "Abbott EncodingRunnable";
    private Object mRecordLock = new Object();
    private boolean mExitThread = false;
    private MediaMuxer mMediaMuxer;
    private int avcIndex;
    private int mAudioIndex;
    private MediaFormat mOutputFormat;
    private MediaFormat mAudioOutputFormat;
    private ImageList mImageList;
    private ImageList mAudioImageList;
    private boolean itemAlarm;
    private long mAudioImageListTimeUs = -1;
    private boolean mAudioAlarm;

    private int mVideoCapacity = 1000 / 40 * Param.recordInternal;  // ~40 ms per video frame
    private int mAudioCapacity = 1000 / 20 * Param.recordInternal;  // ~20 ms per audio buffer
    private int recordSecond = 1000 * 1000 * 60;                    // 60 s in µs: max length of one MP4 file
    long Video60sStart = -1;

    public EncodingRunnable() {
        mImageList = new ImageList(mVideoCapacity);
        mAudioImageList = new ImageList(mAudioCapacity);
    }

    private boolean mIsRecoding = false;

    public void setMediaFormat(MediaFormat OutputFormat) {
        if (mOutputFormat == null) {
            mOutputFormat = OutputFormat;
        }
    }

    public void setAudioFormat(MediaFormat OutputFormat) {
        if (mAudioOutputFormat == null) {
            mAudioOutputFormat = OutputFormat;
        }
    }

    public void setMediaMuxerConfig() {
        long currentTimeMillis = System.currentTimeMillis();
        Date currentDate = new Date(currentTimeMillis);
        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.getDefault());
        String fileName = dateFormat.format(currentDate);
        File mFile = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM),
                fileName + ".MP4");
        Log.d(TAG, "setMediaMuxerSavaPath: new MediaMuxer  " + mFile.getPath());
        try {
            mMediaMuxer = new MediaMuxer(mFile.getPath(), MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        } catch (IOException e) {
            e.printStackTrace();
        }
        avcIndex = mMediaMuxer.addTrack(mOutputFormat);
        mAudioIndex = mMediaMuxer.addTrack(mAudioOutputFormat);
        mMediaMuxer.start();

    }

    public void setMediaMuxerSavaPath() {
        if (!mIsRecoding) {
            mExitThread = false;
            setMediaMuxerConfig();
            setRecording();
            notifyStartRecord();
        }
    }

    @Override
    public void run() {
        super.run();
        while (true) {
            synchronized (mRecordLock) {
                try {
                    mRecordLock.wait();
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
            MediaCodec.BufferInfo tmpAudioBufferInfo = new MediaCodec.BufferInfo();
            while (mIsRecoding) {
                if (mAudioImageList.getSize() > 0) {
                    ImageList.ImageItem audioItem = mAudioImageList.getItem();
                    tmpAudioBufferInfo.set(audioItem.getVideoBufferInfo().offset,
                            audioItem.getVideoBufferInfo().size,
                            audioItem.getVideoBufferInfo().presentationTimeUs + mAudioImageListTimeUs,
                            audioItem.getVideoBufferInfo().flags);
                    mMediaMuxer.writeSampleData(mAudioIndex, audioItem.getVideoByteBuffer(), tmpAudioBufferInfo);
                    mAudioImageList.removeItem();
                }
                if (mImageList.getSize() > 0) {
                    ImageList.ImageItem item = mImageList.getItem();
                    if (Video60sStart < 0) {
                        Video60sStart = item.getVideoBufferInfo().presentationTimeUs;
                    }
                    mMediaMuxer.writeSampleData(avcIndex, item.getVideoByteBuffer(), item.getVideoBufferInfo());
                    if (item.getVideoBufferInfo().presentationTimeUs - Video60sStart > recordSecond) {
                        Log.d(TAG, "System.currentTimeMillis() - Video60sStart :" + (item.getVideoBufferInfo().presentationTimeUs - Video60sStart));
                        mMediaMuxer.stop();
                        mMediaMuxer.release();
                        mMediaMuxer = null;
                        setMediaMuxerConfig();
                        Video60sStart = -1;
                    }
                    mImageList.removeItem();
                }
                if (!itemAlarm && !mAudioAlarm) {
                    mIsRecoding = false;
                    Log.d(TAG, "mediaMuxer.stop()");
                    mMediaMuxer.stop();
                    mMediaMuxer.release();
                    mMediaMuxer = null;
                    break;
                }
            }
            if (mExitThread) {
                break;
            }
        }
    }

    public synchronized void setRecording() throws IllegalStateException {
        synchronized (mRecordLock) {
            mIsRecoding = true;
        }
    }

    public synchronized void setAudioAlarmTrue() throws IllegalStateException {
        synchronized (mRecordLock) {
            mAudioAlarm = true;
        }
    }

    public synchronized void setVideoAlarmTrue() throws IllegalStateException {
        synchronized (mRecordLock) {
            itemAlarm = true;
        }
    }

    public synchronized void setAudioAlarmFalse() throws IllegalStateException {
        synchronized (mRecordLock) {
            mAudioAlarm = false;
        }
    }

    public synchronized void setVideoAlarmFalse() throws IllegalStateException {
        synchronized (mRecordLock) {
            itemAlarm = false;
        }
    }


    public synchronized void notifyStartRecord() throws IllegalStateException {
        synchronized (mRecordLock) {
            mRecordLock.notify();
        }
    }

    public synchronized void push(ImageList.ImageItem item) {
        mImageList.addItem(item.getTimestamp(),
                item.getVideoByteBuffer(),
                item.getVideoBufferInfo());
    }

    public synchronized void pushAudio(ImageList.ImageItem item) {
        synchronized (mRecordLock) {
            mAudioImageList.addItem(item.getTimestamp(),
                    item.getVideoByteBuffer(),
                    item.getVideoBufferInfo());
        }
    }

    public synchronized void setTimeUs(long l) {
        if (mAudioImageListTimeUs != -1) {
            return;
        }
        mAudioImageListTimeUs = l;
        Log.d(TAG, "setTimeUs: " + l);
    }

    public synchronized void setExitThread() {
        mExitThread = true;
        mIsRecoding = false;
        notifyStartRecord();
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }

    }

}

Finally, let's look at Camera2Renderer and MainActivity.

Camera2Renderer

Camera2Renderer implements GLSurfaceView.Renderer; this class drives everything introduced above.

public class Camera2Renderer implements GLSurfaceView.Renderer {
    private static final String TAG = "Abbott Camera2Renderer";
    final private Context mContext;
    final private GLSurfaceView mGlSurfaceView;
    private Camera2 mCamera;
    private int[] mTexture = new int[1];
    private SurfaceTexture mSurfaceTexture;
    private Surface mSurface;
    private OesTexture mOesTexture;
    private EGLContext mEglContext = null;
    private VideoRecorder mVideoRecorder;
    private EncodingRunnable mEncodingRunnable;
    private AudioEncoder mAudioEncoder;

    public Camera2Renderer(Context context, GLSurfaceView glSurfaceView, EncodingRunnable encodingRunnable) {
        mContext = context;
        mGlSurfaceView = glSurfaceView;
        mEncodingRunnable = encodingRunnable;
    }


    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        mCamera = new Camera2(mContext);
        mCamera.openCamera(1920, 1080, "0");
        mOesTexture = new OesTexture();
        mOesTexture.init();
        mEglContext = EGL14.eglGetCurrentContext();
        mVideoRecorder = new VideoRecorder(mEglContext, mEncodingRunnable);
        mVideoRecorder.startRecord();
        try {
            mAudioEncoder = new AudioEncoder(mEncodingRunnable);
            mAudioEncoder.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES30.glGenTextures(1, mTexture, 0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTexture[0]);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
        mSurfaceTexture = new SurfaceTexture(mTexture[0]);
        mSurfaceTexture.setDefaultBufferSize(1920, 1080);
        mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
            @Override
            public void onFrameAvailable(SurfaceTexture surfaceTexture) {
                mGlSurfaceView.requestRender();
            }
        });
        mSurface = new Surface(mSurfaceTexture);
        mCamera.startPreview(mSurface);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        mSurfaceTexture.updateTexImage();
        mOesTexture.PrepareTexture(mTexture[0]);
        mVideoRecorder.requestRender(mTexture[0]);
    }

    public VideoRecorder getVideoRecorder() {
        return mVideoRecorder;
    }

    public AudioEncoder getAudioEncoder() {
        return mAudioEncoder;
    }
}

The main activity is fairly simple; it mostly just requests the runtime permissions.

public class MainActivity extends AppCompatActivity {
    private static final String TAG = "Abbott MainActivity";
    private static final String FRAGMENT_DIALOG = "dialog";
    private final Object mLock = new Object();

    private GLSurfaceView mGlSurfaceView;
    private Button mRecordButton;
    private Button mExitButton;

    private Camera2Renderer mCamera2Renderer;
    private VideoRecorder mVideoRecorder;
    private EncodingRunnable mEncodingRunnable;
    private AudioEncoder mAudioEncoder;
    private static final int REQUEST_CAMERA_PERMISSION = 1;


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
            requestCameraPermission();
            return;
        }

        setContentView(R.layout.activity_main);
        mGlSurfaceView = findViewById(R.id.glView);
        mRecordButton = findViewById(R.id.recordBtn);
        mExitButton = findViewById(R.id.exit);

        mGlSurfaceView.setEGLContextClientVersion(3);
        mEncodingRunnable = new EncodingRunnable();
        mEncodingRunnable.start();
        mCamera2Renderer = new Camera2Renderer(this, mGlSurfaceView, mEncodingRunnable);
        mGlSurfaceView.setRenderer(mCamera2Renderer);
        mGlSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);

    }

    @Override
    protected void onResume() {
        super.onResume();
        mRecordButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                synchronized (MainActivity.this) {
                    startRecord();
                }
            }
        });

        mExitButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                stopRecord();
                Log.d(TAG, "onClick: exit program");
                finish();
            }
        });

    }

    private void requestCameraPermission() {
        if (shouldShowRequestPermissionRationale(Manifest.permission.CAMERA) ||
                shouldShowRequestPermissionRationale(Manifest.permission.WRITE_EXTERNAL_STORAGE) ||
                shouldShowRequestPermissionRationale(Manifest.permission.RECORD_AUDIO)) {
            new ConfirmationDialog().show(getSupportFragmentManager(), FRAGMENT_DIALOG);
        } else {
            requestPermissions(new String[]{Manifest.permission.CAMERA,
                    Manifest.permission.WRITE_EXTERNAL_STORAGE,
                    Manifest.permission.RECORD_AUDIO}, REQUEST_CAMERA_PERMISSION);
        }
    }

    public static class ConfirmationDialog extends DialogFragment {
        @NonNull
        @Override
        public Dialog onCreateDialog(Bundle savedInstanceState) {
            return new AlertDialog.Builder(getActivity())
                    .setMessage(R.string.request_permission)
                    .setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
                        @Override
                        public void onClick(DialogInterface dialog, int which) {
                        }
                    })
                    .setNegativeButton(android.R.string.cancel,
                            new DialogInterface.OnClickListener() {
                                @Override
                                public void onClick(DialogInterface dialog, int which) {
                                    Activity activity = getActivity(); // the dialog is shown from the activity, so getParentFragment() would be null
                                    if (activity != null) {
                                        activity.finish();
                                    }
                                }
                            })
                    .create();
        }
    }


    private void startRecord() {
        synchronized (mLock) {
            try {
                if (mVideoRecorder == null) {
                    mVideoRecorder = mCamera2Renderer.getVideoRecorder();
                }
                if (mAudioEncoder == null) {
                    mAudioEncoder = mCamera2Renderer.getAudioEncoder();
                }
                mVideoRecorder.setAlarm();
                mAudioEncoder.setAlarm();
                mEncodingRunnable.setMediaMuxerSavaPath();
                Log.d(TAG, "Start Record ");
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

    private void stopRecord() {
        if (mVideoRecorder == null) {
            mVideoRecorder = mCamera2Renderer.getVideoRecorder();
        }
        if (mAudioEncoder == null) {
            mAudioEncoder = mCamera2Renderer.getAudioEncoder();
        }
        mEncodingRunnable.setExitThread();
        mVideoRecorder.stopVideoRecord();
        mAudioEncoder.stopAudioRecord();
    }


}

