Introduction
Google deprecated the old Camera1 API in Android 5.0 and replaced it with the far more capable Camera2. You can still use Camera1 on newer versions, but both device adaptation and custom capture parameters are painful with it, so keeping up with the newer technology and learning Camera2 is worthwhile. Camera2 is generally available from API 21 (Android 5.0) onward, but some devices (certain Samsung models, for example) on those API levels still do not support Camera2 properly.
Overview of the APIs involved
Because Camera2 is much more powerful, it is also considerably more complex to use than Camera1: there are more classes to call and more callbacks to handle. Here is a brief description of what each API does, to give you a first picture of Camera2.
CameraManager
The camera manager class.
It has four main roles (a minimal sketch of the first two follows the list):
- Get the ids of the available cameras
- Get a camera's characteristics (whether it faces front or back, the supported resolutions, and so on)
- Open the camera with a given id
- Turn the flashlight on and off
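For example, a minimal sketch of the first two roles (it assumes a context variable; the imports are the same ones used in the demo further below):

CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
try {
    // enumerate every camera id and read which way each camera faces
    for (String id : manager.getCameraIdList()) {
        CameraCharacteristics chars = manager.getCameraCharacteristics(id);
        Integer facing = chars.get(CameraCharacteristics.LENS_FACING);
        Log.d("CameraList", "id=" + id + " facing=" + facing);
    }
} catch (CameraAccessException e) {
    e.printStackTrace();
}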
CameraDevice
The camera device class.
It has three main roles (see the sketch after this list):
- Create the capture request builder CaptureRequest.Builder (i.e. a capture request), described below
- Create a capture session (the channel used for preview or still capture)
- Close the camera
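A minimal sketch of these three roles, assuming device was handed to us by CameraDevice.StateCallback.onOpened() and that previewSurface, readerSurface, sessionStateCallback and handler already exist (the full demo below shows where they come from):

CaptureRequest.Builder builder = device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW); // build a capture request
builder.addTarget(previewSurface);
// open the session channel with every surface we plan to use later
device.createCaptureSession(Arrays.asList(previewSurface, readerSurface), sessionStateCallback, handler);
// ...
device.close(); // release the camera when finished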
CameraDevice.StateCallback
The camera state callback.
It reports when the camera is opened, disconnected, hits an error, or is closed. We pass this callback to CameraManager when opening a camera by its id, as sketched below.
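A minimal sketch of opening a camera with this callback (cameraId, manager and handler are assumed to come from the previous steps, and the CAMERA permission is assumed to be granted already):

CameraDevice.StateCallback stateCallback = new CameraDevice.StateCallback() {
    @Override
    public void onOpened(@NonNull CameraDevice camera) {
        // the camera is ready: create the preview request and the capture session here
    }
    @Override
    public void onDisconnected(@NonNull CameraDevice camera) {
        camera.close();
    }
    @Override
    public void onError(@NonNull CameraDevice camera, int error) {
        camera.close();
    }
};
try {
    manager.openCamera(cameraId, stateCallback, handler);
} catch (CameraAccessException e) {
    e.printStackTrace();
}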
CameraCaptureSession.StateCallback
The state callback of the capture session.
Creating a camera preview, taking photos, and recording video all need this callback: it tells us whether the session channel was configured successfully or failed. It also hands us the important CameraCaptureSession object to work with (described below; a small sketch follows).
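A minimal sketch (previewBuilder, captureCallback and handler are assumed from the surrounding steps):

CameraCaptureSession.StateCallback sessionStateCallback = new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(@NonNull CameraCaptureSession session) {
        try {
            // the channel is ready: start the repeating preview request through the session we were handed
            session.setRepeatingRequest(previewBuilder.build(), captureCallback, handler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
    @Override
    public void onConfigureFailed(@NonNull CameraCaptureSession session) {
        Log.e("Camera2", "capture session configuration failed");
    }
};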
CameraCaptureSession.CaptureCallback
The capture callback of the capture session.
It reports the lifecycle of a capture (started, in progress, completed, failed, and so on). If you do not need to react to those events it often does nothing, but it still has to be created, because it is passed in when you start the preview, take a photo, or record. Note that the photo or video data does not come out of this callback (an easy early misconception is that the picture is returned here). Besides the lifecycle, its methods also expose metadata about the capture, such as sizes and other parameters; a tiny example follows.
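A tiny, purely illustrative sketch of reading per-frame metadata in this callback:

CameraCaptureSession.CaptureCallback captureCallback = new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                   @NonNull CaptureRequest request,
                                   @NonNull TotalCaptureResult result) {
        // no image data here, only metadata such as the auto-focus state of this frame
        Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
        Log.d("Camera2", "AF state=" + afState);
    }
};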
CaptureRequest.Builder
The capture request configuration class.
Very important, and one of the classes you will work with most often. It is created by CameraDevice. Its main jobs are:
setting the output surface for the data (the surface of the preview view, e.g. a TextureView's surface, or the ImageReader's surface for photos);
configuring the preview/photo/recording parameters: auto focus, auto exposure, auto flash, the frame rate, color correction, and everything else you can see in the system camera app.
Once configured, the request is handed to the CameraCaptureSession, which uses it to produce the data we want: the live preview, a photo, or a video recording (a minimal sketch follows).
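A minimal sketch of configuring a preview request (device and previewSurface are assumed; the list of available CaptureRequest keys is much longer):

CaptureRequest.Builder previewBuilder = device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewBuilder.addTarget(previewSurface); // where the frames should go
previewBuilder.set(CaptureRequest.CONTROL_AF_MODE,
        CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE); // continuous auto focus
previewBuilder.set(CaptureRequest.CONTROL_AE_MODE,
        CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH); // auto exposure with auto flash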
CameraCaptureSession
The capture session class.
Very important and used constantly: starting the preview, stopping it, taking photos, and recording are all done through it. It is handed to us inside the CameraCaptureSession.StateCallback methods.
ImageReader
The image reader class.
It is not part of the camera2 package, but it is essential for taking photos: it buffers the photo data, which we then pull out and either save as a local image file or show in an ImageView. A minimal sketch follows.
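A minimal sketch of a JPEG ImageReader (handler is assumed; the demo below shows the complete version that writes the bytes to a file):

ImageReader reader = ImageReader.newInstance(1920, 1080, ImageFormat.JPEG, 2);
reader.setOnImageAvailableListener(r -> {
    Image image = r.acquireNextImage();
    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
    byte[] bytes = new byte[buffer.remaining()];
    buffer.get(bytes); // the raw JPEG bytes, ready to be written to a file
    image.close();     // always close the Image to return its buffer to the reader
}, handler);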
The Camera2 workflow
With so many configuration classes, session classes, and callbacks, the API overview above probably made your head spin. Yes, Camera2 is dizzying at first, but if you follow one thread from top to bottom it becomes clear what is going on. Here is a simple outline of the Camera2 workflow:
Initialization steps:
- Request the runtime permissions; this is the basic prerequisite.
- Create a Handler bound to a background thread (sketched after this list). Camera2 callbacks can run on the main thread or a background thread; by convention a background thread is used, and all Camera2 needs from us is a Handler bound to it.
- Initialize the ImageReader. There is no required order for this, but it has its own data callback, and we save the image data it delivers straight to storage, so we create it early for later use.
- Initialize the TextureView and set its SurfaceTextureListener.
- When the TextureView callback reports that the surface is available, initialize the camera manager (initCameraManager).
- Then initialize CameraDevice.StateCallback, the camera state callback, ahead of time. (In its onOpened method we build the preview request configuration and create the capture session.)
- Then initialize CameraCaptureSession.StateCallback, the capture session state callback, ahead of time. (In its onConfigured method we start the preview or take the photo.)
- Then initialize CameraCaptureSession.CaptureCallback, the capture callback of the session, ahead of time. (It does nothing in this demo.)
- Decide between the front and back camera and pick the corresponding id.
- Open the camera with that id.
- Implement taking the photo.
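A minimal sketch of that background Handler (the thread name "camera2-background" is arbitrary):

HandlerThread handlerThread = new HandlerThread("camera2-background");
handlerThread.start();
Handler cameraHandler = new Handler(handlerThread.getLooper());
// pass cameraHandler to openCamera(), createCaptureSession(), the ImageReader listener, etc.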
Logical flow:
Request the camera permission at runtime >> set the TextureView callback >> the TextureView "available" callback fires >> select a camera >> open the camera >> the camera onOpened callback fires >> create a CaptureRequest.Builder >> set it to preview mode >> add the surface of the TextureView that will show the preview >> create the capture session >> the session onConfigured callback fires >> start the preview >> the preview is showing >> the button is tapped to take a photo >> create a new CaptureRequest.Builder with the still-capture template >> add the ImageReader's surface to it >> have the session capture with this request >> the ImageReader's "image available" callback fires >> save the image
Code
A simple photo-capture demo
package demo.yt.com.demo;
import android.Manifest;
import android.content.Context;
import android.content.pm.PackageManager;
import android.graphics.ImageFormat;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureFailure;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.os.Bundle;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;
import android.util.Size;
import android.util.SparseIntArray;
import android.view.Surface;
import android.view.TextureView;
import android.view.View;
import android.widget.Button;
import android.widget.Toast;
import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Arrays;
public class Demo2Activity extends AppCompatActivity {
private static final String TAG = Demo2Activity.class.getName();
private String[] permission = {Manifest.permission.CAMERA};
private TextureView mTextureView; //Note: TextureView needs hardware acceleration. It is easy to enable: in AndroidManifest.xml add android:hardwareAccelerated="true" to the activity that uses the TextureView
private Button mBtnPhotograph;
private HandlerThread mHandlerThread;
private Handler mChildHandler = null;
private CameraManager mCameraManager; //camera manager, used to query the system cameras and get their ids
private CameraDevice mCameraDevice; //camera device
private CameraCaptureSession.StateCallback mSessionStateCallback; //state callback of the capture session
private CameraCaptureSession.CaptureCallback mSessionCaptureCallback; //capture callback of the capture session
private CaptureRequest.Builder mCaptureRequest; //capture request configuration
private CameraDevice.StateCallback mStateCallback; //camera state callback
private CameraCaptureSession mCameraCaptureSession; //capture session
private ImageReader mImageReader; //image reader
private Surface mSurface;
private SurfaceTexture mSurfaceTexture;
private String mCurrentCameraId;
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
static {// to keep the photo upright
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_camera2);
mTextureView = findViewById(R.id.textureview);
mBtnPhotograph = findViewById(R.id.btn_Photograph);
mBtnPhotograph.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
try {
mCameraCaptureSession.stopRepeating();//stop repeating: cancels any ongoing repeating capture set; here that means stopping the preview
/* mCameraCaptureSession.abortCaptures(); //abort captures: discards all pending and in-progress captures as fast as possible.
 * A pitfall: this should not be called casually (I copied it from another demo and it turned out to be wrong, so I note it here).
 * Call it only in the Activity's onDestroy; aborting is expensive and the session channel needs time to be reopened.
 * This demo does not resume the preview, but if you call it to close the session and then resume the preview after taking a photo,
 * the session is opened and closed over and over, and you can end up closing it while the photo is still being buffered,
 * which leaves the cached image incomplete and the capture fails.
 * So do not use this method lightly; once the session is open it does not need to be closed or refreshed,
 * and later photo/preview/recording operations can simply reuse the same session.
 */
takePicture();//take the photo
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
});
initPermission();
initChildThread();
initImageReader();
initTextureView();
}
/**
 * Request the runtime permission
 */
private void initPermission() {
if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, permission, 1);
}
}
private void initTextureView() {
mTextureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
Log.e(TAG, "TextureView 启用成功");
initCameraManager();
initCameraCallback();
initCameraCaptureSessionStateCallback();
initCameraCaptureSessionCaptureCallback();
selectCamera();
openCamera();
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
Log.e(TAG, "SurfaceTexture 变化");
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
Log.e(TAG, "SurfaceTexture 的销毁");
//这里返回true则是交由系统执行释放,如果是false则需要自己调用surface.release();
return true;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
}
});
}
/**
 * Initialize the background thread
 */
private void initChildThread() {
mHandlerThread = new HandlerThread("camera2");
mHandlerThread.start();
mChildHandler = new Handler(mHandlerThread.getLooper());
}
/**
 * Initialize the camera manager
 */
private void initCameraManager() {
mCameraManager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
}
/**
 * Find the output size that best matches the TextureView
 *
 * @return
 */
private Size getMatchingSize() {
Size selectSize = null;
float selectProportion = 0;
try {
float viewProportion = (float) mTextureView.getWidth() / (float) mTextureView.getHeight();
CameraCharacteristics cameraCharacteristics = mCameraManager.getCameraCharacteristics(mCurrentCameraId);
StreamConfigurationMap streamConfigurationMap = cameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size[] sizes = streamConfigurationMap.getOutputSizes(ImageFormat.JPEG);
for (int i = 0; i < sizes.length; i++) {
Size itemSize = sizes[i];
float itemSizeProportion = (float) itemSize.getHeight() / (float) itemSize.getWidth();
float differenceProportion = Math.abs(viewProportion - itemSizeProportion);
Log.e(TAG, "相减差值比例=" + differenceProportion);
if (i == 0) {
selectSize = itemSize;
selectProportion = differenceProportion;
continue;
}
if (differenceProportion <= selectProportion) {
if (differenceProportion == selectProportion) {
if (selectSize.getWidth() + selectSize.getHeight() < itemSize.getWidth() + itemSize.getHeight()) {
selectSize = itemSize;
selectProportion = differenceProportion;
}
} else {
selectSize = itemSize;
selectProportion = differenceProportion;
}
}
}
} catch (CameraAccessException e) {
e.printStackTrace();
}
Log.e(TAG, "getMatchingSize: 选择的比例是=" + selectProportion);
Log.e(TAG, "getMatchingSize: 选择的尺寸是 宽度=" + selectSize.getWidth() + "高度=" + selectSize.getHeight());
return selectSize;
}
/**
 * Select a camera
 */
private void selectCamera() {
try {
String[] cameraIdList = mCameraManager.getCameraIdList();//get the list of camera ids
if (cameraIdList.length == 0) {
return;
}
for (String cameraId : cameraIdList) {
Log.e(TAG, "selectCamera: cameraId=" + cameraId);
//获取相机特征,包含前后摄像头信息,分辨率等
CameraCharacteristics cameraCharacteristics = mCameraManager.getCameraCharacteristics(cameraId);
Integer facing = cameraCharacteristics.get(CameraCharacteristics.LENS_FACING);//获取这个摄像头的面向
//CameraCharacteristics.LENS_FACING_BACK 后摄像头
//CameraCharacteristics.LENS_FACING_FRONT 前摄像头
//CameraCharacteristics.LENS_FACING_EXTERNAL 外部摄像头,比如OTG插入的摄像头
if (facing == CameraCharacteristics.LENS_FACING_FRONT) {
mCurrentCameraId = cameraId;
}
}
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
 * Initialize the camera state callback
 */
private void initCameraCallback() {
mStateCallback = new CameraDevice.StateCallback() {
/**
 * Called when the camera has been opened
 * @param camera
 */
@Override
public void onOpened(@NonNull CameraDevice camera) {
Log.e(TAG, "camera opened");
mCameraDevice = camera;
try {
mSurfaceTexture = mTextureView.getSurfaceTexture(); //the SurfaceTexture has to be released manually
Size matchingSize = getMatchingSize();
mSurfaceTexture.setDefaultBufferSize(matchingSize.getWidth(), matchingSize.getHeight());//set the preview buffer size
mSurface = new Surface(mSurfaceTexture);//the surface should be released on destroy with surface.release();
// CaptureRequest lets you configure every capture parameter yourself, but there are far too many of them,
// so Camera2 provides quick template presets:
// TEMPLATE_PREVIEW: preview
// TEMPLATE_RECORD: video recording
// TEMPLATE_STILL_CAPTURE: still photo
// TEMPLATE_VIDEO_SNAPSHOT: a still taken while recording video
// TEMPLATE_ZERO_SHUTTER_LAG: a zero-shutter-lag request that maximizes image quality without hurting the preview frame rate
// TEMPLATE_MANUAL: a basic request with all automatic controls disabled (auto exposure, auto white balance, auto focus)
mCaptureRequest = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);//create the preview request
mCaptureRequest.addTarget(mSurface); //add the surface; in real use keep it as a field and call mCaptureRequest.removeTarget(mSurface) in onDestroy, otherwise it leaks
mCaptureRequest.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);//continuous auto focus
/**
 * Create the capture session.
 * An easy-to-forget pitfall: Arrays.asList(surface, mImageReader.getSurface()) must contain every surface
 * you plan to use later. If you want both preview and photos, pass both surfaces; otherwise the one you
 * left out fails later with a "surface is not prepared" error. That is also why the ImageReader is
 * initialized first, so its surface is already available here.
 */
mCameraDevice.createCaptureSession(Arrays.asList(mSurface, mImageReader.getSurface()), mSessionStateCallback, mChildHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
 * Called when the camera is disconnected
 * @param camera
 */
@Override
public void onDisconnected(@NonNull CameraDevice camera) {
}
/**
 * Called when an error occurs
 * @param camera
 * @param error
 */
@Override
public void onError(@NonNull CameraDevice camera, int error) {
}
/**
 * Called when the camera has been closed
 * @param camera
 */
@Override
public void onClosed(@NonNull CameraDevice camera) {
super.onClosed(camera);
}
};
}
/**
 * State callback of the capture session
 */
private void initCameraCaptureSessionStateCallback() {
mSessionStateCallback = new CameraCaptureSession.StateCallback() {
//The session is configured and can handle capture requests now.
@Override
public void onConfigured(@NonNull CameraCaptureSession session) {
try {
mCameraCaptureSession = session;
//Note that setRepeatingRequest() is used here: it repeats the capture endlessly through this session, which is what keeps the preview running
mCameraCaptureSession.setRepeatingRequest(mCaptureRequest.build(), mSessionCaptureCallback, mChildHandler);
// mCameraCaptureSession.stopRepeating();//stop repeating: cancel any ongoing repeating capture set
// mCameraCaptureSession.abortCaptures();//abort captures: discard pending and in-progress captures as fast as possible; only call this when the activity is being destroyed
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
//Session configuration failed
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession session) {
}
};
}
/**
 * Capture callback of the capture session
 */
private void initCameraCaptureSessionCaptureCallback() {
mSessionCaptureCallback = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureStarted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, long timestamp, long frameNumber) {
super.onCaptureStarted(session, request, timestamp, frameNumber);
}
@Override
public void onCaptureProgressed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureResult partialResult) {
super.onCaptureProgressed(session, request, partialResult);
}
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
// Log.e(TAG, "onCaptureCompleted: 触发接收数据");
// Size size = request.get(CaptureRequest.JPEG_THUMBNAIL_SIZE);
}
@Override
public void onCaptureFailed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureFailure failure) {
super.onCaptureFailed(session, request, failure);
}
@Override
public void onCaptureSequenceCompleted(@NonNull CameraCaptureSession session, int sequenceId, long frameNumber) {
super.onCaptureSequenceCompleted(session, sequenceId, frameNumber);
}
@Override
public void onCaptureSequenceAborted(@NonNull CameraCaptureSession session, int sequenceId) {
super.onCaptureSequenceAborted(session, sequenceId);
}
@Override
public void onCaptureBufferLost(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull Surface target, long frameNumber) {
super.onCaptureBufferLost(session, request, target, frameNumber);
}
};
}
/**
 * Open the camera
 */
private void openCamera() {
try {
if (checkSelfPermission(Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED) {
mCameraManager.openCamera(mCurrentCameraId, mStateCallback, mChildHandler);
return;
}
Toast.makeText(this, "没有授权", Toast.LENGTH_SHORT).show();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
 * Initialize the image reader
 */
private void initImageReader() {
//Create the image reader; the parameters are the width and height, the image format, and how many images it may buffer (2 here means up to two images)
mImageReader = ImageReader.newInstance(1080, 1920, ImageFormat.JPEG, 2);
mImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
// reader.acquireLatestImage();//take the latest image from the ImageReader queue, discarding older ones
// reader.acquireNextImage();//take the next image from the ImageReader queue; returns null if no new image is available
Image image = reader.acquireNextImage();
try {
File path = new File(Demo2Activity.this.getExternalCacheDir().getPath());
if (!path.exists()) {
Log.e(TAG, "onImageAvailable: 路径不存在");
path.mkdirs();
} else {
Log.e(TAG, "onImageAvailable: 路径存在");
}
File file = new File(path, "demo.jpg");
FileOutputStream fileOutputStream = new FileOutputStream(file);
// image.getPlanes()[0] is the first image plane; JPEG images have a single plane, so getPlanes()[0] is all we need. Formats with several planes (YUV_420_888, for example) let you read each plane's data separately
ByteBuffer byteBuffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[byteBuffer.remaining()];
byteBuffer.get(bytes);
fileOutputStream.write(bytes);
fileOutputStream.flush();
fileOutputStream.close();
image.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
}, mChildHandler);
}
private void takePicture() {
CaptureRequest.Builder captureRequestBuilder = null;
try {
captureRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);//auto focus
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);//auto exposure with auto flash
// // Get the device rotation; if your app supports both portrait and landscape you need the lines below to keep the photo upright
// int rotation = getWindowManager().getDefaultDisplay().getRotation();
// Log.e(TAG, "takePicture: device rotation="+rotation);
// Log.e(TAG, "takePicture: photo orientation="+ORIENTATIONS.get(rotation));
captureRequestBuilder.set(CaptureRequest.JPEG_ORIENTATION, 270);//my project does not need that, so 270 degrees is hard-coded to keep the photo upright
Surface surface = mImageReader.getSurface();
captureRequestBuilder.addTarget(surface);
CaptureRequest request = captureRequestBuilder.build();
mCameraCaptureSession.capture(request, null, mChildHandler); //perform the still capture
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
switch (requestCode) {
case 1:
if (permissions.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
Toast.makeText(this, "授权成功", Toast.LENGTH_SHORT).show();
} else {
Toast.makeText(this, "授权失败", Toast.LENGTH_SHORT).show();
finish();
}
break;
default:
}
}
@Override
protected void onDestroy() {
super.onDestroy();
if (mCaptureRequest != null) {
mCaptureRequest.removeTarget(mSurface);
mCaptureRequest = null;
}
if (mSurface != null) {
mSurface.release();
mSurface = null;
}
if (mSurfaceTexture != null){
mSurfaceTexture.release();
mSurfaceTexture = null;
}
if (mCameraCaptureSession != null) {
try {
mCameraCaptureSession.stopRepeating();
mCameraCaptureSession.abortCaptures();
mCameraCaptureSession.close();
} catch (CameraAccessException e) {
e.printStackTrace();
}
mCameraCaptureSession = null;
}
if (mCameraDevice != null) {
mCameraDevice.close();
mCameraDevice = null;
}
if (null != mImageReader) {
mImageReader.close();
mImageReader = null;
}
if (mChildHandler != null) {
mChildHandler.removeCallbacksAndMessages(null);
mChildHandler = null;
}
if (mHandlerThread != null) {
mHandlerThread.quitSafely();
mHandlerThread = null;
}
mCameraManager = null;
mSessionStateCallback = null;
mSessionCaptureCallback = null;
mStateCallback = null;
}
}
Once the photo is saved, the next problem you are likely to face is memory leaks. Pay attention to the resources that must be released; holding on to them leaks memory.
The leaks usually come from the following places:
1. The CaptureRequest keeps a reference to the Surface; be sure to release it with mCaptureRequest.removeTarget(mSurface);
2. The SurfaceTexture must be released. Besides mSurfaceTexture.release();
you can also release it with mTextureView.getSurfaceTextureListener().onSurfaceTextureDestroyed(mTextureView.getSurfaceTexture());,
but then the onSurfaceTextureDestroyed(SurfaceTexture surface) callback shown above has to either return true or call surface.release() itself for the release to actually happen.
3. The Surface must be released.
4. Release the resources one by one, in order (a condensed sketch follows this list).
5. If you want to wrap the camera capture in a builder-style utility class, do not pass the TextureView into it: you have to call setSurfaceTextureListener, and that listener cannot be released from inside the utility class, only from the activity. Keep that in mind!
It is best to pass only the SurfaceTexture to the utility class.
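A condensed sketch of that release order, using the field names from the demo above (run it in onDestroy()):

if (mCaptureRequest != null) { mCaptureRequest.removeTarget(mSurface); mCaptureRequest = null; }
if (mSurface != null) { mSurface.release(); mSurface = null; }
if (mSurfaceTexture != null) { mSurfaceTexture.release(); mSurfaceTexture = null; }
if (mCameraCaptureSession != null) { mCameraCaptureSession.close(); mCameraCaptureSession = null; }
if (mCameraDevice != null) { mCameraDevice.close(); mCameraDevice = null; }
if (mImageReader != null) { mImageReader.close(); mImageReader = null; }
if (mHandlerThread != null) { mHandlerThread.quitSafely(); mHandlerThread = null; }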
One more note:
For understanding Camera2 resolutions and computing the best preview and photo sizes, see my other post: Android开发 Camera2开发_2_预览分辨率或拍照分辨率的计算 - 观心静 - 博客园
A more production-ready photo demo
The demo above is the bare minimum. Below is what it looks like in real use; it is mostly the same, but it gives you one more reference.
demo1
public class FaceCameraActivity extends BaseActivity implements View.OnClickListener {
private TextureView mTextureView;
private Button mBtnCamera;
private ImageView mBack;
private MaterialDialog mHandlerImageWaitDialog;
private CameraManager mCameraManager;
private CameraDevice mCameraDevice;
private ImageReader mImageReader;
private CaptureRequest.Builder mCaptureRequest;
private CameraDevice.StateCallback mCameraDeviceStateCallback;
private CameraCaptureSession.StateCallback mCameraCaptureSessionStateCallback;
private CameraCaptureSession.CaptureCallback mCameraCaptureSessionCaptureCallback;
private CameraCaptureSession mCameraCaptureSession;
private String mCurrentCameraId;
private Size mCurrentSelectSize;
private Handler mChildHandler;
private Surface mSurface;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
initChildThread();
initCameraManager();
initSelectCamera();
initHandlerMatchingSize();
initImageReader();
initTextureViewListener();
initCameraDeviceStateCallbackListener();
initCameraCaptureSessionStateCallbackListener();
initCameraCaptureSessionCaptureCallbackListener();
}
@Override
public int getLayout() {
return R.layout.activity_face_camera;
}
@Override
public void initView() {
mBack = findViewById(R.id.back);
mTextureView = findViewById(R.id.texture_view);
mBtnCamera = findViewById(R.id.btn_camera);
mBack.setOnClickListener(this);
mBtnCamera.setOnClickListener(this);
}
@Override
public void onClick(View v) {
switch (v.getId()) {
case R.id.btn_camera:
if (ButtonDelayUtil.isFastClick()){
handlerImageWaitDialog().show();
stopPreview();
takePicture();
}
break;
case R.id.back:
finish();
break;
default:
break;
}
}
private void initChildThread() {
HandlerThread handlerThread = new HandlerThread("faceCamera");
handlerThread.start();
mChildHandler = new Handler(handlerThread.getLooper());
}
/**
 * Initialize the camera manager
 */
private void initCameraManager() {
mCameraManager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
}
/**
 * Initialize camera selection
 */
private void initSelectCamera() {
try {
String[] cameraIdArray = mCameraManager.getCameraIdList();
for (String itemId : cameraIdArray) {
CameraCharacteristics itemCharacteristics = mCameraManager.getCameraCharacteristics(itemId);
Integer facing = itemCharacteristics.get(CameraCharacteristics.LENS_FACING);
if (facing == CameraCharacteristics.LENS_FACING_FRONT) {
mCurrentCameraId = itemId;
break;
}
}
} catch (CameraAccessException e) {
e.printStackTrace();
}
if (mCurrentCameraId == null) {
finish();
Toast.makeText(this, "此设备不支持前摄像头", Toast.LENGTH_SHORT).show();
}
}
/**
 * Compute a capture resolution that matches the current screen resolution
 * @return
 */
private void initHandlerMatchingSize() {
try {
CameraCharacteristics cameraCharacteristics = mCameraManager.getCameraCharacteristics(mCurrentCameraId);
StreamConfigurationMap streamConfigurationMap = cameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size[] sizes = streamConfigurationMap.getOutputSizes(ImageFormat.JPEG);
DisplayMetrics displayMetrics = getResources().getDisplayMetrics();
int deviceWidth = displayMetrics.widthPixels;
int deviceHeigh = displayMetrics.heightPixels;
L.e("当前屏幕密度宽度="+deviceWidth+"高度="+deviceHeigh);
for (int j = 1; j < 81; j++) {
for (int i = 0; i < sizes.length; i++) {
Size itemSize = sizes[i];
if (itemSize.getHeight() < (deviceWidth + j * 5) && itemSize.getHeight() > (deviceWidth - j * 5)) {
if (mCurrentSelectSize != null) { //a matching width was already found earlier
if (Math.abs(deviceHeigh-itemSize.getWidth()) < Math.abs(deviceHeigh - mCurrentSelectSize.getWidth())){ //use the absolute difference to find the size closest to the device height
mCurrentSelectSize = itemSize;
continue;
}
}else {
mCurrentSelectSize = itemSize;
}
}
}
}
} catch (CameraAccessException e) {
e.printStackTrace();
}
L.e("当前预览宽度="+mCurrentSelectSize.getWidth()+"高度="+mCurrentSelectSize.getHeight());
}
private void initImageReader() {
L.e("初始化图片ImageReader的宽="+mCurrentSelectSize.getWidth()+"高="+mCurrentSelectSize.getHeight());
mImageReader = ImageReader.newInstance(mCurrentSelectSize.getWidth()
, mCurrentSelectSize.getHeight()
, ImageFormat.JPEG
, 2);
mImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
FilePathSession.deleteFaceImageFile();
Image image = reader.acquireLatestImage();
ByteBuffer byteBuffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[byteBuffer.remaining()];
byteBuffer.get(bytes);
try {
FileOutputStream fileOutputStream = new FileOutputStream(FilePathSession.getFaceImagePath());
fileOutputStream.write(bytes);
fileOutputStream.flush();
fileOutputStream.close();
image.close();
startPreview();
handlerImageWaitDialog().dismiss();
runOnUiThread(new Runnable() {
@Override
public void run() {
Intent startFaceConfirm = new Intent(FaceCameraActivity.this, FaceConfirmActivity.class);
startActivity(startFaceConfirm);
FaceCameraActivity.this.finish();
}
});
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
}, mChildHandler);
}
private void initTextureViewListener() {
mTextureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
openCamera();
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
return true;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
}
});
}
private void initCameraDeviceStateCallbackListener() {
mCameraDeviceStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice camera) {
//camera opened
mCameraDevice = camera;
try {
SurfaceTexture surfaceTexture = mTextureView.getSurfaceTexture();
surfaceTexture.setDefaultBufferSize(mCurrentSelectSize.getWidth(),mCurrentSelectSize.getHeight());
mSurface = new Surface(surfaceTexture);
mCaptureRequest = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mCaptureRequest.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
mCaptureRequest.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);//auto exposure with auto flash
mCaptureRequest.addTarget(mSurface);
mCameraDevice.createCaptureSession(Arrays.asList(mSurface, mImageReader.getSurface())
, mCameraCaptureSessionStateCallback
, mChildHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onDisconnected(@NonNull CameraDevice camera) {
}
@Override
public void onError(@NonNull CameraDevice camera, int error) {
finish();
Toast.makeText(FaceCameraActivity.this, "相机打开失败", Toast.LENGTH_SHORT).show();
L.e("CameraDevice.StateCallback onError : 相机异常 error code="+error);
}
};
}
private void initCameraCaptureSessionStateCallbackListener() {
mCameraCaptureSessionStateCallback = new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession session) {
mCameraCaptureSession = session;
startPreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession session) {
finish();
Toast.makeText(FaceCameraActivity.this, "相机打开失败", Toast.LENGTH_SHORT).show();
L.e("CameraCaptureSession.StateCallback onConfigureFailed : CameraCaptureSession会话通道创建失败");
}
};
}
private void initCameraCaptureSessionCaptureCallbackListener() {
mCameraCaptureSessionCaptureCallback = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureStarted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, long timestamp, long frameNumber) {
super.onCaptureStarted(session, request, timestamp, frameNumber);
//capture started
}
@Override
public void onCaptureProgressed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureResult partialResult) {
super.onCaptureProgressed(session, request, partialResult);
//capture in progress
}
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
//capture completed
}
@Override
public void onCaptureFailed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureFailure failure) {
super.onCaptureFailed(session, request, failure);
//capture failed
Toast.makeText(FaceCameraActivity.this, "Failed to take the photo", Toast.LENGTH_SHORT).show();
L.e("failure reason="+failure.getReason());
}
};
}
@SuppressLint("MissingPermission")
private void openCamera() {
try {
mCameraManager.openCamera(mCurrentCameraId, mCameraDeviceStateCallback, mChildHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private MaterialDialog handlerImageWaitDialog(){
if (mHandlerImageWaitDialog == null){
mHandlerImageWaitDialog = new MaterialDialog.Builder(this)
.content("正在处理图像中...")
.progress(true,-1)
.cancelable(false)
.build();
}
return mHandlerImageWaitDialog;
}
/**
 * Start the preview
 */
private void startPreview(){
try {
mCameraCaptureSession.setRepeatingRequest(mCaptureRequest.build(), mCameraCaptureSessionCaptureCallback, mChildHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
 * Stop the preview
 */
private void stopPreview(){
try {
mCameraCaptureSession.stopRepeating();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
 * Take a photo
 */
private void takePicture(){
try {
CaptureRequest.Builder takePictureRequest = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
takePictureRequest.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);//auto focus
takePictureRequest.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);//auto exposure with auto flash
int rotation = getWindowManager().getDefaultDisplay().getRotation();
int angle = getJpegOrientation(mCameraManager.getCameraCharacteristics(mCurrentCameraId), rotation);
L.i("人脸拍照 照片角度="+angle);
takePictureRequest.set(CaptureRequest.JPEG_ORIENTATION, angle);
Surface surface = mImageReader.getSurface();
takePictureRequest.addTarget(surface);
CaptureRequest request = takePictureRequest.build();
mCameraCaptureSession.capture(request, null, mChildHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
 * The JPEG orientation algorithm from the official documentation
 * @param c
 * @param deviceOrientation
 * @return
 */
private int getJpegOrientation(CameraCharacteristics c, int deviceOrientation) {
if (deviceOrientation == OrientationEventListener.ORIENTATION_UNKNOWN){
return 0;
}
int sensorOrientation = c.get(CameraCharacteristics.SENSOR_ORIENTATION);//get the sensor orientation
// Round device orientation to a multiple of 90
deviceOrientation = (deviceOrientation + 45) / 90 * 90;
// Reverse device orientation for front-facing cameras
boolean facingFront = c.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT;//check whether the camera is front-facing
if (facingFront) {
deviceOrientation = -deviceOrientation;
}
// Calculate desired JPEG orientation relative to camera orientation to make
// the image upright relative to the device orientation
int jpegOrientation = (sensorOrientation + deviceOrientation + 360) % 360;
return jpegOrientation;
}
@Override
protected void onDestroy() {
super.onDestroy();
if (mImageReader != null){
mImageReader.close();
mImageReader = null;
}
if (mCameraCaptureSession != null){
stopPreview();
try {
mCameraCaptureSession.abortCaptures();
} catch (CameraAccessException e) {
e.printStackTrace();
}
mCameraCaptureSession.close();
mCameraCaptureSession = null;
}
if (mCaptureRequest != null){
mCaptureRequest.removeTarget(mSurface);//remember to release mSurface
mCaptureRequest = null;
}
if (mSurface != null){
mSurface.release();//remember to release mSurface
mSurface = null;
}
//The SurfaceTexture can also be released through onSurfaceTextureDestroyed, but then the onSurfaceTextureDestroyed(SurfaceTexture surface) callback above must return true or call surface.release() itself; this release step matters
mTextureView.getSurfaceTextureListener().onSurfaceTextureDestroyed(mTextureView.getSurfaceTexture());
mCameraDeviceStateCallback = null;
mCameraCaptureSessionStateCallback = null;
mCameraCaptureSessionCaptureCallback = null;
mCameraManager = null;
if (mCameraDevice != null){
mCameraDevice.close();
mCameraDevice = null;
}
mCameraManager = null;
if (mChildHandler != null){
mChildHandler.removeCallbacksAndMessages(null);
mChildHandler = null;
}
}
}
Demo2
import android.annotation.SuppressLint;
import android.content.Context;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureFailure;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.net.Uri;
import android.os.Bundle;
import android.os.Handler;
import android.os.HandlerThread;
import android.text.TextUtils;
import android.util.DisplayMetrics;
import android.util.Log;
import android.util.Range;
import android.util.Size;
import android.view.OrientationEventListener;
import android.view.Surface;
import android.view.TextureView;
import android.view.View;
import android.view.ViewGroup;
import android.widget.ImageView;
import android.widget.Toast;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import com.afollestad.materialdialogs.MaterialDialog;
import com.soundcloud.android.crop.Crop;
import net.wt.gate.dev.R;
import net.wt.gate.dev.base.BaseActivity;
import net.wt.gate.dev.constant.FileConstant;
import net.wt.gate.dev.libs.imageHandle.FileImageHandleListener;
import net.wt.gate.dev.libs.imageHandle.ImageHandle;
import net.wt.gate.dev.libs.log.L;
import net.wt.gate.dev.service.state.State;
import net.wt.gate.dev.service.state.StateConfig;
import net.wt.gate.dev.service.state.StateMachine;
import net.wt.gate.dev.util.ButtonDelayUtil;
import java.io.File;
import java.io.FileOutputStream;
import java.nio.ByteBuffer;
import java.util.Arrays;
/**
 * @content: photo-capture activity
 * @time: 2019-9-24
 * @build:
 */
public class CameraTakeActivity extends BaseActivity implements View.OnClickListener {
private static final String TAG = CameraTakeActivity.class.getSimpleName();
private File mTempImageSavePath = null; //temporary path of the captured photo
private File mTempCropImageSavePath = null; //temporary path of the cropped image
public static final String IMAGE_SAVE_PATH_KEY = "imageSavePath"; //key: where to save the image
public static final String CAMERA_FACING_KEY = "cameraFacing"; //key: which camera to use
public static final String CAMERA_TAKE_RESULT_KEY = "cameraTakeResult"; //result key, boolean: true=success false=failure
public static final String CAMERA_TAKE_RESULT_PATH_KEY = "cameraTakeResultPath"; //result key, String: path of the result
public static final int FACING_BACK = 1; //back camera
public static final int FACING_FRONT = 2; //front camera
public static final int FACING_EXTERNAL = 3; //external camera
private File mImageSavePath = null;
private Integer mCameraFacing = null;
private TextureView mTextureView;
private ImageView mBack //back
, mTake //take photo
, mDelete //delete
, mSubmit //confirm
, mCropFinishImage; //shows the cropped image
private TextureView.SurfaceTextureListener mSurfaceTextureListener;
private Surface mSurface;
private MaterialDialog mHandlerImageWaitDialog
, mCompressionImageWaitDialog;
private CameraManager mCameraManager;
private CameraDevice mCameraDevice;
private ImageReader mImageReader;
private CaptureRequest.Builder mCaptureRequest;
private CameraDevice.StateCallback mCameraDeviceStateCallback;
private CameraCaptureSession.StateCallback mCameraCaptureSessionStateCallback;
private CameraCaptureSession.CaptureCallback mCameraCaptureSessionCaptureCallback;
private CameraCaptureSession mCameraCaptureSession;
private String mCurrentCameraId;
private Size mCurrentSelectSize;
private HandlerThread mHandlerThread;
private Handler mChildHandler;
private Bitmap mCropBitmap;
private boolean isRelease = false;
private boolean isStopPreview = false;
private boolean isCroping = false;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
getIntentData();
initPath();
initChildThread();
initCameraManager();
if (!initSelectCamera()) {
Toast.makeText(CameraTakeActivity.this, R.string.this_device_camera_is_unavailable, Toast.LENGTH_SHORT).show();
Intent intent = new Intent();
intent.putExtra(CAMERA_TAKE_RESULT_KEY, false);
setResult(RESULT_OK, intent);
finish();
return;
}
initHandlerMatchingSize();
initImageReader();
initTextureViewListener();
initCameraDeviceStateCallbackListener();
initCameraCaptureSessionStateCallbackListener();
initCameraCaptureSessionCaptureCallbackListener();
}
@Override
protected int getLayoutID() {
return R.layout.activity_camera_take;
}
@Override
protected void initViews() {
mBack = findViewById(R.id.back);
mTextureView = findViewById(R.id.texture_view);
mTake = findViewById(R.id.take);
mDelete = findViewById(R.id.delete);
mSubmit = findViewById(R.id.submit);
mCropFinishImage = findViewById(R.id.crop_finish_image);
mDelete.setOnClickListener(this);
mSubmit.setOnClickListener(this);
mBack.setOnClickListener(this);
mTake.setOnClickListener(this);
}
@Override
public void onClick(View v) {
switch (v.getId()) {
case R.id.take:
if (ButtonDelayUtil.isFastClick()) {
handlerImageWaitDialog().show();
takePicture();
stopPreview();
}
break;
case R.id.back:
finish();
break;
case R.id.delete:
if (ButtonDelayUtil.isFastClick()) {
if (mTempImageSavePath.exists()) {
mTempImageSavePath.delete();
}
mTake.setVisibility(View.VISIBLE);
mDelete.setVisibility(View.GONE);
mSubmit.setVisibility(View.GONE);
startPreview();
}
break;
case R.id.submit:
if (ButtonDelayUtil.isFastClick()) {
isCroping = true;
Uri inputFileUri = Uri.fromFile(mTempImageSavePath);
Uri outputFileUri = Uri.fromFile(mTempCropImageSavePath);
mTake.setVisibility(View.GONE);
mSubmit.setVisibility(View.GONE);
mDelete.setVisibility(View.GONE);
mTextureView.setVisibility(View.GONE);
mCropFinishImage.setVisibility(View.VISIBLE);
Crop.of(inputFileUri, outputFileUri).asSquare().start(CameraTakeActivity.this);
}
break;
default:
break;
}
}
private void getIntentData() {
Intent intentData = getIntent();
String path = intentData.getStringExtra(IMAGE_SAVE_PATH_KEY);
if (TextUtils.isEmpty(path)) {
L.ee(TAG, "您没有传入图片保存地址");
Intent intent = new Intent();
intent.putExtra(CAMERA_TAKE_RESULT_KEY, false);
setResult(RESULT_OK, intent);
finish();
}
mImageSavePath = new File(path);
mCameraFacing = intentData.getIntExtra(CAMERA_FACING_KEY, 0);
if (mCameraFacing == 0) {
L.ee(TAG, "您没有传入需要使用的摄像头");
Intent intent = new Intent();
intent.putExtra(CAMERA_TAKE_RESULT_KEY, false);
setResult(RESULT_OK, intent);
finish();
}
}
/**
 * Initialize the paths
 */
private void initPath() {
mTempImageSavePath = new File(FileConstant.CAMERA_TAKE_TEMP_IMAGE_FILE);
mTempCropImageSavePath = new File(FileConstant.TEMP_CROP_IMAGE_FILE);
if (!mTempImageSavePath.getParentFile().isDirectory()) {
mTempImageSavePath.getParentFile().mkdirs();
}
if (!mTempCropImageSavePath.getParentFile().isDirectory()) {
mTempCropImageSavePath.getParentFile().mkdirs();
}
if (!mImageSavePath.getParentFile().isDirectory()) {
mImageSavePath.getParentFile().mkdirs();
}
if (mTempImageSavePath.exists()) {
mTempImageSavePath.delete();
}
if (mTempCropImageSavePath.exists()) {
mTempCropImageSavePath.delete();
}
if (mImageSavePath.exists()){
mImageSavePath.delete();
}
}
private void initChildThread() {
mHandlerThread = new HandlerThread("faceCamera");
mHandlerThread.start();
mChildHandler = new Handler(mHandlerThread.getLooper());
}
/**
 * Initialize the camera manager
 */
private void initCameraManager() {
mCameraManager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
}
/**
 * Initialize camera selection
 *
 * @return true=success false=failure
 */
private boolean initSelectCamera() {
try {
String[] cameraIdArray = mCameraManager.getCameraIdList();
Loop:for (String itemId : cameraIdArray) {
CameraCharacteristics itemCharacteristics = mCameraManager.getCameraCharacteristics(itemId);
Integer facing = itemCharacteristics.get(CameraCharacteristics.LENS_FACING);
switch (mCameraFacing){
case FACING_BACK:
if (facing == CameraCharacteristics.LENS_FACING_BACK) {
mCurrentCameraId = itemId;
break Loop;
}
break;
case FACING_FRONT:
if (facing == CameraCharacteristics.LENS_FACING_FRONT) {
mCurrentCameraId = itemId;
break Loop;
}
break;
case FACING_EXTERNAL:
if (facing == CameraCharacteristics.LENS_FACING_EXTERNAL) {
mCurrentCameraId = itemId;
break Loop;
}
break;
default:
break;
}
}
} catch (CameraAccessException e) {
e.printStackTrace();
}
if (mCurrentCameraId == null) {
return false;
}
return true;
}
/**
 * Compute a capture resolution that matches the current screen resolution
 *
 * @return
 */
private void initHandlerMatchingSize() {
try {
CameraCharacteristics cameraCharacteristics = mCameraManager.getCameraCharacteristics(mCurrentCameraId);
StreamConfigurationMap streamConfigurationMap = cameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size[] sizes = streamConfigurationMap.getOutputSizes(ImageFormat.JPEG);
DisplayMetrics displayMetrics = getResources().getDisplayMetrics();
int deviceWidth = displayMetrics.widthPixels;
int deviceHeigh = displayMetrics.heightPixels;
for (int j = 1; j < 81; j++) {
for (int i = 0; i < sizes.length; i++) {
Size itemSize = sizes[i];
if (itemSize.getHeight() < (deviceWidth + j * 5) && itemSize.getHeight() > (deviceWidth - j * 5)) {
if (mCurrentSelectSize != null) { //a matching width was already found earlier
if (Math.abs(deviceHeigh - itemSize.getWidth()) < Math.abs(deviceHeigh - mCurrentSelectSize.getWidth())) { //use the absolute difference to find the size closest to the device height
mCurrentSelectSize = itemSize;
}
} else {
mCurrentSelectSize = itemSize;
}
}
}
}
} catch (CameraAccessException e) {
e.printStackTrace();
}
ViewGroup.LayoutParams layoutParams = mTextureView.getLayoutParams();
layoutParams.height = mCurrentSelectSize.getWidth();
mTextureView.setLayoutParams(layoutParams);
}
private void initImageReader() {
mImageReader = ImageReader.newInstance(mCurrentSelectSize.getWidth()
, mCurrentSelectSize.getHeight()
, ImageFormat.JPEG
, 1);
mImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = reader.acquireLatestImage();
ByteBuffer byteBuffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[byteBuffer.remaining()];
byteBuffer.get(bytes);
try {
FileOutputStream fileOutputStream = new FileOutputStream(mTempImageSavePath);
fileOutputStream.write(bytes);
fileOutputStream.flush();
fileOutputStream.close();
image.close();
runOnUiThread(new Runnable() {
@Override
public void run() {
mTake.setVisibility(View.GONE);
mDelete.setVisibility(View.VISIBLE);
mSubmit.setVisibility(View.VISIBLE);
handlerImageWaitDialog().dismiss();
}
});
} catch (Exception e) {
e.printStackTrace();
}
}
}, mChildHandler);
}
private void initTextureViewListener() {
mSurfaceTextureListener = new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
openCamera();
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
surface.release();
return true;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
}
};
mTextureView.setSurfaceTextureListener(mSurfaceTextureListener);
}
private void initCameraDeviceStateCallbackListener() {
mCameraDeviceStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice camera) {
//camera opened
mCameraDevice = camera;
try {
SurfaceTexture surfaceTexture = mTextureView.getSurfaceTexture();
surfaceTexture.setDefaultBufferSize(mCurrentSelectSize.getWidth(), mCurrentSelectSize.getHeight());
mSurface = new Surface(surfaceTexture);
mCaptureRequest = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mCaptureRequest.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, getRange());
mCaptureRequest.set(CaptureRequest.CONTROL_AE_LOCK, false);
mCaptureRequest.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
mCaptureRequest.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_OFF);//disable auto focus
mCaptureRequest.addTarget(mSurface);
mCameraDevice.createCaptureSession(Arrays.asList(mSurface, mImageReader.getSurface())
, mCameraCaptureSessionStateCallback
, mChildHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onDisconnected(@NonNull CameraDevice camera) {
}
@Override
public void onError(@NonNull CameraDevice camera, int error) {
Intent intent = new Intent();
intent.putExtra(CAMERA_TAKE_RESULT_KEY, false);
setResult(RESULT_OK, intent);
finish();
Toast.makeText(CameraTakeActivity.this, "相机打开失败", Toast.LENGTH_SHORT).show();
}
};
}
private void initCameraCaptureSessionStateCallbackListener() {
mCameraCaptureSessionStateCallback = new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession session) {
mCameraCaptureSession = session;
startPreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession session) {
Intent intent = new Intent();
intent.putExtra(CAMERA_TAKE_RESULT_KEY, false);
setResult(RESULT_OK, intent);
finish();
Toast.makeText(CameraTakeActivity.this, "相机打开失败", Toast.LENGTH_SHORT).show();
}
};
}
private void initCameraCaptureSessionCaptureCallbackListener() {
mCameraCaptureSessionCaptureCallback = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureStarted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, long timestamp, long frameNumber) {
super.onCaptureStarted(session, request, timestamp, frameNumber);
//capture started
}
@Override
public void onCaptureProgressed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureResult partialResult) {
super.onCaptureProgressed(session, request, partialResult);
//capture in progress
}
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
//capture completed
}
@Override
public void onCaptureFailed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureFailure failure) {
super.onCaptureFailed(session, request, failure);
//capture failed
}
};
}
@SuppressLint("MissingPermission")
private void openCamera() {
try {
mCameraManager.openCamera(mCurrentCameraId, mCameraDeviceStateCallback, mChildHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private MaterialDialog handlerImageWaitDialog() {
if (mHandlerImageWaitDialog == null) {
mHandlerImageWaitDialog = new MaterialDialog.Builder(this)
.content("正在处理图像中...")
.progress(true, -1)
.cancelable(false)
.build();
}
return mHandlerImageWaitDialog;
}
private MaterialDialog compressionImageWaitDialog() {
if (mCompressionImageWaitDialog == null) {
mCompressionImageWaitDialog = new MaterialDialog.Builder(this)
.content("压缩图像中...")
.progress(true, -1)
.cancelable(false)
.build();
}
return mCompressionImageWaitDialog;
}
/**
 * Start the preview
 */
private void startPreview() {
try {
mCameraCaptureSession.setRepeatingRequest(mCaptureRequest.build(), mCameraCaptureSessionCaptureCallback, mChildHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
 * Stop the preview
 */
private void stopPreview() {
try {
mCameraCaptureSession.stopRepeating();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
 * Take a photo
 */
private void takePicture() {
try {
CaptureRequest.Builder takePictureRequest = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
takePictureRequest.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, getRange());//adjust the fps range; also fixes a dark preview
takePictureRequest.set(CaptureRequest.CONTROL_AE_LOCK, false);
takePictureRequest.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
takePictureRequest.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_OFF);//disable auto focus
int rotation = getWindowManager().getDefaultDisplay().getRotation();
int angle = getJpegOrientation(mCameraManager.getCameraCharacteristics(mCurrentCameraId), rotation);
takePictureRequest.set(CaptureRequest.JPEG_ORIENTATION, angle);
Surface surface = mImageReader.getSurface();
takePictureRequest.addTarget(surface);
CaptureRequest request = takePictureRequest.build();
if (mCameraCaptureSession != null) {
mCameraCaptureSession.capture(request, null, mChildHandler);
}else {
Toast.makeText(context, "拍照异常", Toast.LENGTH_SHORT).show();
finish();
}
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
 * The JPEG orientation algorithm from the official documentation
 *
 * @param c
 * @param deviceOrientation
 * @return
 */
private int getJpegOrientation(CameraCharacteristics c, int deviceOrientation) {
if (deviceOrientation == OrientationEventListener.ORIENTATION_UNKNOWN) {
return 0;
}
int sensorOrientation = c.get(CameraCharacteristics.SENSOR_ORIENTATION);//get the sensor orientation
// Round device orientation to a multiple of 90
deviceOrientation = (deviceOrientation + 45) / 90 * 90;
// Reverse device orientation for front-facing cameras
boolean facingFront = c.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT;//check whether the camera is front-facing
if (facingFront) {
deviceOrientation = -deviceOrientation;
}
// Calculate desired JPEG orientation relative to camera orientation to make
// the image upright relative to the device orientation
return (sensorOrientation + deviceOrientation + 360) % 360;
}
/**
 * Get the AE target FPS range
 *
 * @return
 */
private Range<Integer> getRange() {
CameraCharacteristics chars = null;
try {
chars = mCameraManager.getCameraCharacteristics(mCurrentCameraId);
} catch (CameraAccessException e) {
e.printStackTrace();
}
Range<Integer>[] ranges = chars.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
Range<Integer> result = null;
for (Range<Integer> range : ranges) {
//the frame rate must not be too low: require a lower bound of at least 10
if (range.getLower() < 10)
continue;
if (result == null)
result = range;
//A lower bound of at most 15 guarantees enough exposure time in low light, making the image brighter. The wider the range the better: with enough light the FPS is high and the preview is smooth; in low light the FPS drops and the image is brighter.
else if (range.getLower() <= 15 && (range.getUpper() - range.getLower()) > (result.getUpper() - result.getLower()))
result = range;
}
return result;
}
/**
 * Compress the image
 */
private void compressionImage() {
ImageHandle.fileImageConfig(mTempCropImageSavePath, mImageSavePath)
.setTargetKB(30)
.setHandleListener(new FileImageHandleListener() {
@Override
public boolean onReady(File inpFile, File outFile) {
return true;
}
@Override
public void onSuccess() {
Log.e(TAG, "onFailure: 图片压缩成功");
runOnUiThread(new Runnable() {
@Override
public void run() {
compressionImageWaitDialog().dismiss();
Intent intent = new Intent();
intent.putExtra(CAMERA_TAKE_RESULT_KEY, true);
intent.putExtra(CAMERA_TAKE_RESULT_PATH_KEY, mImageSavePath.toString());
setResult(RESULT_OK, intent);
CameraTakeActivity.this.finish();
}
});
}
@Override
public void onFailure(String text) {
Log.e(TAG, "onFailure: 失败原因=" + text);
runOnUiThread(new Runnable() {
@Override
public void run() {
compressionImageWaitDialog().dismiss();
Toast.makeText(CameraTakeActivity.this, "压缩失败", Toast.LENGTH_SHORT).show();
Intent intent = new Intent();
intent.putExtra(CAMERA_TAKE_RESULT_KEY, false);
setResult(RESULT_OK, intent);
CameraTakeActivity.this.finish();
}
});
}
@Override
public void onError(Exception e) {
Log.e(TAG, "onFailure: 失败原因=" + e);
runOnUiThread(new Runnable() {
@Override
public void run() {
compressionImageWaitDialog().dismiss();
Toast.makeText(CameraTakeActivity.this, "压缩失败", Toast.LENGTH_SHORT).show();
Intent intent = new Intent();
intent.putExtra(CAMERA_TAKE_RESULT_KEY, false);
setResult(RESULT_OK, intent);
CameraTakeActivity.this.finish();
}
});
}
}).build();
}
@Override
protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == Crop.REQUEST_CROP && resultCode == RESULT_OK) {
compressionImageWaitDialog().show();
mTake.setVisibility(View.GONE);
mSubmit.setVisibility(View.GONE);
mDelete.setVisibility(View.GONE);
mTextureView.setVisibility(View.GONE);
mCropFinishImage.setVisibility(View.VISIBLE);
mCropBitmap = BitmapFactory.decodeFile(mTempCropImageSavePath.getAbsolutePath());
mCropFinishImage.setImageBitmap(mCropBitmap);
compressionImage();
return;
}
if (requestCode == Crop.REQUEST_CROP && resultCode == RESULT_CANCELED){
Toast.makeText(this, R.string.cancel_cropping, Toast.LENGTH_SHORT).show();
}else {
Toast.makeText(this, R.string.unknown_exception_cropping_failed, Toast.LENGTH_SHORT).show();
}
Intent intent = new Intent();
intent.putExtra(CAMERA_TAKE_RESULT_KEY, false);
setResult(RESULT_OK, intent);
CameraTakeActivity.this.finish();
}
private void release(){
if (isRelease){
return;
}
if (isFinishing()){
if (mTempImageSavePath != null && mTempImageSavePath.exists()) {
mTempImageSavePath.delete();
}
if (mTempCropImageSavePath != null && mTempCropImageSavePath.exists()) {
mTempCropImageSavePath.delete();
}
if (mImageReader != null) {
mImageReader.close();
mImageReader = null;
}
if (mCameraCaptureSession != null) {
mCameraCaptureSession.close();
mCameraCaptureSession = null;
}
if (mCameraDevice != null) {
mCameraDevice.close();
mCameraDevice = null;
}
if (mCaptureRequest != null){
mCaptureRequest.removeTarget(mSurface);
mCaptureRequest = null;
}
if (mSurface != null){
mSurface.release();
mSurface = null;
}
if (mTextureView.getSurfaceTextureListener() != null) {
mTextureView.getSurfaceTextureListener().onSurfaceTextureDestroyed(mTextureView.getSurfaceTexture());
}
mCameraDeviceStateCallback = null;
mCameraCaptureSessionStateCallback = null;
mCameraCaptureSessionCaptureCallback = null;
mCameraManager = null;
mSurfaceTextureListener = null;
if (mChildHandler != null) {
mChildHandler.removeCallbacksAndMessages(null);
mChildHandler = null;
}
if (mHandlerThread != null){
mHandlerThread.quitSafely();
mHandlerThread = null;
}
if (mCompressionImageWaitDialog != null) {
mCompressionImageWaitDialog.dismiss();
mCompressionImageWaitDialog = null;
}
if (mHandlerImageWaitDialog != null) {
mHandlerImageWaitDialog.dismiss();
mHandlerImageWaitDialog = null;
}
mCropFinishImage.setImageDrawable(null);
if (mCropBitmap != null) {
mCropBitmap.recycle();
mCropBitmap = null;
}
StateMachine.instance().finishExit(StateConfig.TAKE_PICTURE_KEY);
isRelease = true;
}
}
@Override
protected void onResume() {
super.onResume();
if (isCroping){
return;
}
if (isStopPreview){
if (mCameraCaptureSession == null || !mCameraCaptureSession.isReprocessable()){
finish();
return;
}
startPreview();
}
}
@Override
protected void onPause() {
super.onPause();
if(isFinishing()){
release();
}else {
isStopPreview = true;
stopPreview();
}
}
@Override
protected void onDestroy() {
super.onDestroy();
release();
}
}
Original article: Android开发 Camera2开发_1_拍照功能开发-CSDN博客