WebRTC Audio and Video Calls - A Custom RTCVideoCapturer Camera for WebRTC Video

In a previous post we implemented WebRTC against an ossrs server to provide live video calling. In practice, however, the API exposed by RTCCameraVideoCapturer does not let you modify or adjust camera settings such as the light (torch), so we need a custom RTCVideoCapturer and have to capture the frames ourselves.
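
To see why direct access to the AVCaptureDevice matters, here is a minimal, illustrative sketch (not part of the original project) of the kind of adjustment RTCCameraVideoCapturer does not expose, for example turning the torch on at a given level:

// Illustrative only: with the AVCaptureDevice in hand you can change settings
// that RTCCameraVideoCapturer keeps hidden, such as the torch level.
static void SetTorchLevel(float level) {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    if (device.hasTorch && [device lockForConfiguration:&error]) {
        // level must be in (0.0, 1.0]; validate or clamp it in real code
        [device setTorchModeOnWithLevel:level error:&error];
        [device unlockForConfiguration];
    }
}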

For the iOS-side WebRTC + ossrs setup that implements live video calling, see:
https://blog.csdn.net/gloryFlow/article/details/132262724

Here we build a custom RTCVideoCapturer.

1. Classes needed for the custom camera

The classes to be familiar with are described below; a minimal wiring sketch follows the list.

  • AVCaptureSession
    AVCaptureSession is the iOS object that manages and coordinates the flow of data from input devices to outputs.

  • AVCaptureDevice
    AVCaptureDevice represents a physical capture device, such as a camera.

  • AVCaptureDeviceInput
    AVCaptureDeviceInput captures input data from an AVCaptureDevice and feeds it into the session.

  • AVCaptureMetadataOutput
    AVCaptureMetadataOutput is a capture output that processes timed metadata (for example detected faces) produced by the AVCaptureSession.

  • AVCaptureVideoDataOutput
    AVCaptureVideoDataOutput is a capture output that delivers video frames for processing.

  • AVCaptureVideoPreviewLayer
    AVCaptureVideoPreviewLayer is a preview layer for the captured video, used to display it on screen.
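
Roughly, these classes fit together as follows. This is a minimal, illustrative sketch (not the project code; it assumes self implements AVCaptureVideoDataOutputSampleBufferDelegate) showing the data flow from device to delegate:

// Wire device -> input -> session -> video data output -> delegate callback.
AVCaptureSession *session = [[AVCaptureSession alloc] init];

NSError *error = nil;
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input]) {
    [session addInput:input];
}

AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
// Frames arrive in captureOutput:didOutputSampleBuffer:fromConnection: on this queue.
[output setSampleBufferDelegate:self queue:dispatch_queue_create("camera.buffer", DISPATCH_QUEUE_SERIAL)];
if ([session canAddOutput:output]) {
    [session addOutput:output];
}

// Optional on-screen preview of what the camera sees.
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];

[session startRunning];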

2. Capturing video with a custom RTCVideoCapturer

For the custom camera we need to add an AVCaptureVideoDataOutput to the AVCaptureSession:

self.dataOutput = [[AVCaptureVideoDataOutput alloc] init];
self.dataOutput.alwaysDiscardsLateVideoFrames = YES;
self.dataOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(needYuvOutput ? kCVPixelFormatType_420YpCbCr8BiPlanarFullRange : kCVPixelFormatType_32BGRA)};
[self.dataOutput setSampleBufferDelegate:self queue:self.bufferQueue];

if ([self.session canAddOutput:self.dataOutput]) {
    [self.session addOutput:self.dataOutput];
} else {
    NSLog(@"Could not add video data output to the session");
    return nil;
}

We must also implement the AVCaptureVideoDataOutputSampleBufferDelegate callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;

In that callback we extract the CVPixelBufferRef from the CMSampleBufferRef, wrap the (optionally processed) pixel buffer in an RTCVideoFrame, and feed the frame into WebRTC by calling the capturer:didCaptureVideoFrame: method on the localVideoSource.
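
A minimal sketch of that hand-off (assuming self.localVideoSource holds the RTCVideoSource created from the peer connection factory; the full delegate implementation appears in section 3):

CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
// Convert the presentation timestamp to nanoseconds for WebRTC.
int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
// RTCVideoRotation_0 because this capturer already sets portrait orientation on its connection.
RTCVideoFrame *frame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                    rotation:RTCVideoRotation_0
                                                 timeStampNs:timeStampNs];
[self.localVideoSource capturer:capturer didCaptureVideoFrame:frame];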

The complete code is as follows.

SDCustomRTCCameraCapturer.h

#import <AVFoundation/AVFoundation.h>
#import <Foundation/Foundation.h>
#import <WebRTC/WebRTC.h>

@protocol SDCustomRTCCameraCapturerDelegate <NSObject>

- (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer;

@end

@interface SDCustomRTCCameraCapturer : RTCVideoCapturer

@property(nonatomic, weak) id<SDCustomRTCCameraCapturerDelegate>delegate;

// Capture session that is used for capturing. Valid from initialization to dealloc.
@property(nonatomic, strong) AVCaptureSession *captureSession;

// Returns list of available capture devices that support video capture.
+ (NSArray<AVCaptureDevice *> *)captureDevices;
// Returns list of formats that are supported by this class for this device.
+ (NSArray<AVCaptureDeviceFormat *> *)supportedFormatsForDevice:(AVCaptureDevice *)device;

// Returns the most efficient supported output pixel format for this capturer.
- (FourCharCode)preferredOutputPixelFormat;

// Starts the capture session asynchronously and notifies callback on completion.
// The device will capture video in the format given in the `format` parameter. If the pixel format
// in `format` is supported by the WebRTC pipeline, the same pixel format will be used for the
// output. Otherwise, the format returned by `preferredOutputPixelFormat` will be used.
- (void)startCaptureWithDevice:(AVCaptureDevice *)device
                        format:(AVCaptureDeviceFormat *)format
                           fps:(NSInteger)fps
             completionHandler:(nullable void (^)(NSError *))completionHandler;
// Stops the capture session asynchronously and notifies callback on completion.
- (void)stopCaptureWithCompletionHandler:(nullable void (^)(void))completionHandler;

// Starts the capture session asynchronously.
- (void)startCaptureWithDevice:(AVCaptureDevice *)device
                        format:(AVCaptureDeviceFormat *)format
                           fps:(NSInteger)fps;
// Stops the capture session asynchronously.
- (void)stopCapture;

#pragma mark - 自定义相机
@property (nonatomic, readonly) dispatch_queue_t bufferQueue;

@property (nonatomic, assign) AVCaptureDevicePosition devicePosition; // default AVCaptureDevicePositionFront

@property (nonatomic, assign) AVCaptureVideoOrientation videoOrientation;

@property (nonatomic, assign) BOOL needVideoMirrored;

@property (nonatomic, strong , readonly) AVCaptureConnection *videoConnection;

@property (nonatomic, copy) NSString *sessionPreset;  // default 640x480

@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;

@property (nonatomic, assign) BOOL bSessionPause;

@property (nonatomic, assign) int iExpectedFPS;

@property (nonatomic, readwrite, strong) NSDictionary *videoCompressingSettings;


- (instancetype)initWithDevicePosition:(AVCaptureDevicePosition)iDevicePosition
                        sessionPresset:(AVCaptureSessionPreset)sessionPreset
                                   fps:(int)iFPS
                         needYuvOutput:(BOOL)needYuvOutput;

- (void)setExposurePoint:(CGPoint)point inPreviewFrame:(CGRect)frame;

- (void)setISOValue:(float)value;

- (void)startRunning;

- (void)stopRunning;

- (void)rotateCamera;

- (void)rotateCamera:(BOOL)isUseFrontCamera;

- (void)setWhiteBalance;

- (CGRect)getZoomedRectWithRect:(CGRect)rect scaleToFit:(BOOL)bScaleToFit;

@end

SDCustomRTCCameraCapturer.m

#import "SDCustomRTCCameraCapturer.h"
#import <UIKit/UIKit.h>
#import "EFMachineVersion.h"

//#import "base/RTCLogging.h"
//#import "base/RTCVideoFrameBuffer.h"
//#import "components/video_frame_buffer/RTCCVPixelBuffer.h"
//
//#if TARGET_OS_IPHONE
//#import "helpers/UIDevice+RTCDevice.h"
//#endif
//
//#import "helpers/AVCaptureSession+DevicePosition.h"
//#import "helpers/RTCDispatcher+Private.h"

typedef NS_ENUM(NSUInteger, STExposureModel) {
    STExposureModelPositive5,
    STExposureModelPositive4,
    STExposureModelPositive3,
    STExposureModelPositive2,
    STExposureModelPositive1,
    STExposureModel0,
    STExposureModelNegative1,
    STExposureModelNegative2,
    STExposureModelNegative3,
    STExposureModelNegative4,
    STExposureModelNegative5,
    STExposureModelNegative6,
    STExposureModelNegative7,
    STExposureModelNegative8,
};

static char * kEffectsCamera = "EffectsCamera";

static STExposureModel currentExposureMode;

@interface SDCustomRTCCameraCapturer ()<AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate, AVCaptureMetadataOutputObjectsDelegate>
@property(nonatomic, readonly) dispatch_queue_t frameQueue;
@property(nonatomic, strong) AVCaptureDevice *currentDevice;
@property(nonatomic, assign) BOOL hasRetriedOnFatalError;
@property(nonatomic, assign) BOOL isRunning;
// Will the session be running once all asynchronous operations have been completed?
@property(nonatomic, assign) BOOL willBeRunning;

@property (nonatomic, strong) AVCaptureDeviceInput *deviceInput;
@property (nonatomic, strong) AVCaptureVideoDataOutput *dataOutput;
@property (nonatomic, strong) AVCaptureMetadataOutput *metaOutput;
@property (nonatomic, strong) AVCaptureStillImageOutput *stillImageOutput;

@property (nonatomic, readwrite) dispatch_queue_t bufferQueue;

@property (nonatomic, strong, readwrite) AVCaptureConnection *videoConnection;

@property (nonatomic, strong) AVCaptureDevice *videoDevice;
@property (nonatomic, strong) AVCaptureSession *session;

@end

@implementation SDCustomRTCCameraCapturer {
  AVCaptureVideoDataOutput *_videoDataOutput;
  AVCaptureSession *_captureSession;
  FourCharCode _preferredOutputPixelFormat;
  FourCharCode _outputPixelFormat;
  RTCVideoRotation _rotation;
    float _autoISOValue;
#if TARGET_OS_IPHONE
  UIDeviceOrientation _orientation;
  BOOL _generatingOrientationNotifications;
#endif
}

@synthesize frameQueue = _frameQueue;
@synthesize captureSession = _captureSession;
@synthesize currentDevice = _currentDevice;
@synthesize hasRetriedOnFatalError = _hasRetriedOnFatalError;
@synthesize isRunning = _isRunning;
@synthesize willBeRunning = _willBeRunning;

- (instancetype)init {
  return [self initWithDelegate:nil captureSession:[[AVCaptureSession alloc] init]];
}

- (instancetype)initWithDelegate:(__weak id<RTCVideoCapturerDelegate>)delegate {
  return [self initWithDelegate:delegate captureSession:[[AVCaptureSession alloc] init]];
}

// This initializer is used for testing.
- (instancetype)initWithDelegate:(__weak id<RTCVideoCapturerDelegate>)delegate
                  captureSession:(AVCaptureSession *)captureSession {
  if (self = [super initWithDelegate:delegate]) {
    // Create the capture session and all relevant inputs and outputs. We need
    // to do this in init because the application may want the capture session
    // before we start the capturer for e.g. AVCapturePreviewLayer. All objects
    // created here are retained until dealloc and never recreated.
    if (![self setupCaptureSession:captureSession]) {
      return nil;
    }
    NSNotificationCenter *center = [NSNotificationCenter defaultCenter];
#if TARGET_OS_IPHONE
    _orientation = UIDeviceOrientationPortrait;
    _rotation = RTCVideoRotation_90;
    [center addObserver:self
               selector:@selector(deviceOrientationDidChange:)
                   name:UIDeviceOrientationDidChangeNotification
                 object:nil];
    [center addObserver:self
               selector:@selector(handleCaptureSessionInterruption:)
                   name:AVCaptureSessionWasInterruptedNotification
                 object:_captureSession];
    [center addObserver:self
               selector:@selector(handleCaptureSessionInterruptionEnded:)
                   name:AVCaptureSessionInterruptionEndedNotification
                 object:_captureSession];
    [center addObserver:self
               selector:@selector(handleApplicationDidBecomeActive:)
                   name:UIApplicationDidBecomeActiveNotification
                 object:[UIApplication sharedApplication]];
#endif
    [center addObserver:self
               selector:@selector(handleCaptureSessionRuntimeError:)
                   name:AVCaptureSessionRuntimeErrorNotification
                 object:_captureSession];
    [center addObserver:self
               selector:@selector(handleCaptureSessionDidStartRunning:)
                   name:AVCaptureSessionDidStartRunningNotification
                 object:_captureSession];
    [center addObserver:self
               selector:@selector(handleCaptureSessionDidStopRunning:)
                   name:AVCaptureSessionDidStopRunningNotification
                 object:_captureSession];
  }
  return self;
}

//- (void)dealloc {
//  NSAssert(
//      !_willBeRunning,
//      @"Session was still running in RTCCameraVideoCapturer dealloc. Forgot to call stopCapture?");
//  [[NSNotificationCenter defaultCenter] removeObserver:self];
//}

+ (NSArray<AVCaptureDevice *> *)captureDevices {
#if defined(WEBRTC_IOS) && defined(__IPHONE_10_0) && \
    __IPHONE_OS_VERSION_MIN_REQUIRED >= __IPHONE_10_0
  AVCaptureDeviceDiscoverySession *session = [AVCaptureDeviceDiscoverySession
      discoverySessionWithDeviceTypes:@[ AVCaptureDeviceTypeBuiltInWideAngleCamera ]
                            mediaType:AVMediaTypeVideo
                             position:AVCaptureDevicePositionUnspecified];
  return session.devices;
#else
  return [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#endif
}

+ (NSArray<AVCaptureDeviceFormat *> *)supportedFormatsForDevice:(AVCaptureDevice *)device {
  // Support opening the device in any format. We make sure it's converted to a format we
  // can handle, if needed, in the method `-setupVideoDataOutput`.
  return device.formats;
}

- (FourCharCode)preferredOutputPixelFormat {
  return _preferredOutputPixelFormat;
}

- (void)startCaptureWithDevice:(AVCaptureDevice *)device
                        format:(AVCaptureDeviceFormat *)format
                           fps:(NSInteger)fps {
  [self startCaptureWithDevice:device format:format fps:fps completionHandler:nil];
}

- (void)stopCapture {
  [self stopCaptureWithCompletionHandler:nil];
}

- (void)startCaptureWithDevice:(AVCaptureDevice *)device
                        format:(AVCaptureDeviceFormat *)format
                           fps:(NSInteger)fps
             completionHandler:(nullable void (^)(NSError *))completionHandler {
  _willBeRunning = YES;
  [RTCDispatcher
      dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                    block:^{
                      RTCLogInfo("startCaptureWithDevice %@ @ %ld fps", format, (long)fps);

#if TARGET_OS_IPHONE
                      dispatch_async(dispatch_get_main_queue(), ^{
                        if (!self->_generatingOrientationNotifications) {
                          [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
                          self->_generatingOrientationNotifications = YES;
                        }
                      });
#endif

                      self.currentDevice = device;

                      NSError *error = nil;
                      if (![self.currentDevice lockForConfiguration:&error]) {
                        RTCLogError(@"Failed to lock device %@. Error: %@",
                                    self.currentDevice,
                                    error.userInfo);
                        if (completionHandler) {
                          completionHandler(error);
                        }
                        self.willBeRunning = NO;
                        return;
                      }
                      [self reconfigureCaptureSessionInput];
                      [self updateOrientation];
                      [self updateDeviceCaptureFormat:format fps:fps];
                      [self updateVideoDataOutputPixelFormat:format];
                      [self.captureSession startRunning];
                      [self.currentDevice unlockForConfiguration];
                      self.isRunning = YES;
                      if (completionHandler) {
                        completionHandler(nil);
                      }
                    }];
}

- (void)stopCaptureWithCompletionHandler:(nullable void (^)(void))completionHandler {
  _willBeRunning = NO;
  [RTCDispatcher
      dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                    block:^{
                      RTCLogInfo("Stop");
                      self.currentDevice = nil;
                      for (AVCaptureDeviceInput *oldInput in [self.captureSession.inputs copy]) {
                        [self.captureSession removeInput:oldInput];
                      }
                      [self.captureSession stopRunning];

#if TARGET_OS_IPHONE
                      dispatch_async(dispatch_get_main_queue(), ^{
                        if (self->_generatingOrientationNotifications) {
                          [[UIDevice currentDevice] endGeneratingDeviceOrientationNotifications];
                          self->_generatingOrientationNotifications = NO;
                        }
                      });
#endif
                      self.isRunning = NO;
                      if (completionHandler) {
                        completionHandler();
                      }
                    }];
}

#pragma mark iOS notifications

#if TARGET_OS_IPHONE
- (void)deviceOrientationDidChange:(NSNotification *)notification {
  [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                               block:^{
                                 [self updateOrientation];
                               }];
}
#endif

#pragma mark AVCaptureVideoDataOutputSampleBufferDelegate

//- (void)captureOutput:(AVCaptureOutput *)captureOutput
//    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
//           fromConnection:(AVCaptureConnection *)connection {
//  NSParameterAssert(captureOutput == _videoDataOutput);
//
//  if (CMSampleBufferGetNumSamples(sampleBuffer) != 1 || !CMSampleBufferIsValid(sampleBuffer) ||
//      !CMSampleBufferDataIsReady(sampleBuffer)) {
//    return;
//  }
//
//  CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
//  if (pixelBuffer == nil) {
//    return;
//  }
//
//#if TARGET_OS_IPHONE
//  // Default to portrait orientation on iPhone.
//  BOOL usingFrontCamera = NO;
//  // Check the image's EXIF for the camera the image came from as the image could have been
//  // delayed as we set alwaysDiscardsLateVideoFrames to NO.
//
//    AVCaptureDeviceInput *deviceInput =
//        (AVCaptureDeviceInput *)((AVCaptureInputPort *)connection.inputPorts.firstObject).input;
//    usingFrontCamera = AVCaptureDevicePositionFront == deviceInput.device.position;
//
//    switch (_orientation) {
//    case UIDeviceOrientationPortrait:
//      _rotation = RTCVideoRotation_90;
//      break;
//    case UIDeviceOrientationPortraitUpsideDown:
//      _rotation = RTCVideoRotation_270;
//      break;
//    case UIDeviceOrientationLandscapeLeft:
//      _rotation = usingFrontCamera ? RTCVideoRotation_180 : RTCVideoRotation_0;
//      break;
//    case UIDeviceOrientationLandscapeRight:
//      _rotation = usingFrontCamera ? RTCVideoRotation_0 : RTCVideoRotation_180;
//      break;
//    case UIDeviceOrientationFaceUp:
//    case UIDeviceOrientationFaceDown:
//    case UIDeviceOrientationUnknown:
//      // Ignore.
//      break;
//  }
//#else
//  // No rotation on Mac.
//  _rotation = RTCVideoRotation_0;
//#endif
//
//    if (self.delegate && [self.delegate respondsToSelector:@selector(rtcCameraVideoCapturer:didOutputSampleBuffer:)]) {
//        [self.delegate rtcCameraVideoCapturer:self didOutputSampleBuffer:sampleBuffer];
//    }
//}

#pragma mark - AVCaptureSession notifications

- (void)handleCaptureSessionInterruption:(NSNotification *)notification {
  NSString *reasonString = nil;
#if TARGET_OS_IPHONE
  NSNumber *reason = notification.userInfo[AVCaptureSessionInterruptionReasonKey];
  if (reason) {
    switch (reason.intValue) {
      case AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableInBackground:
        reasonString = @"VideoDeviceNotAvailableInBackground";
        break;
      case AVCaptureSessionInterruptionReasonAudioDeviceInUseByAnotherClient:
        reasonString = @"AudioDeviceInUseByAnotherClient";
        break;
      case AVCaptureSessionInterruptionReasonVideoDeviceInUseByAnotherClient:
        reasonString = @"VideoDeviceInUseByAnotherClient";
        break;
      case AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps:
        reasonString = @"VideoDeviceNotAvailableWithMultipleForegroundApps";
        break;
    }
  }
#endif
  RTCLog(@"Capture session interrupted: %@", reasonString);
}

- (void)handleCaptureSessionInterruptionEnded:(NSNotification *)notification {
  RTCLog(@"Capture session interruption ended.");
}

- (void)handleCaptureSessionRuntimeError:(NSNotification *)notification {
  NSError *error = [notification.userInfo objectForKey:AVCaptureSessionErrorKey];
  RTCLogError(@"Capture session runtime error: %@", error);

  [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                               block:^{
#if TARGET_OS_IPHONE
                                 if (error.code == AVErrorMediaServicesWereReset) {
                                   [self handleNonFatalError];
                                 } else {
                                   [self handleFatalError];
                                 }
#else
                                [self handleFatalError];
#endif
                               }];
}

- (void)handleCaptureSessionDidStartRunning:(NSNotification *)notification {
  RTCLog(@"Capture session started.");

  [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                               block:^{
                                 // If we successfully restarted after an unknown error,
                                 // allow future retries on fatal errors.
                                 self.hasRetriedOnFatalError = NO;
                               }];
}

- (void)handleCaptureSessionDidStopRunning:(NSNotification *)notification {
  RTCLog(@"Capture session stopped.");
}

- (void)handleFatalError {
  [RTCDispatcher
      dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                    block:^{
                      if (!self.hasRetriedOnFatalError) {
                        RTCLogWarning(@"Attempting to recover from fatal capture error.");
                        [self handleNonFatalError];
                        self.hasRetriedOnFatalError = YES;
                      } else {
                        RTCLogError(@"Previous fatal error recovery failed.");
                      }
                    }];
}

- (void)handleNonFatalError {
  [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                               block:^{
                                 RTCLog(@"Restarting capture session after error.");
                                 if (self.isRunning) {
                                   [self.captureSession startRunning];
                                 }
                               }];
}

#if TARGET_OS_IPHONE

#pragma mark - UIApplication notifications

- (void)handleApplicationDidBecomeActive:(NSNotification *)notification {
  [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                               block:^{
                                 if (self.isRunning && !self.captureSession.isRunning) {
                                   RTCLog(@"Restarting capture session on active.");
                                   [self.captureSession startRunning];
                                 }
                               }];
}

#endif  // TARGET_OS_IPHONE

#pragma mark - Private

- (dispatch_queue_t)frameQueue {
  if (!_frameQueue) {
    _frameQueue =
        dispatch_queue_create("org.webrtc.cameravideocapturer.video", DISPATCH_QUEUE_SERIAL);
    dispatch_set_target_queue(_frameQueue,
                              dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0));
  }
  return _frameQueue;
}

- (BOOL)setupCaptureSession:(AVCaptureSession *)captureSession {
  NSAssert(_captureSession == nil, @"Setup capture session called twice.");
  _captureSession = captureSession;
#if defined(WEBRTC_IOS)
  _captureSession.sessionPreset = AVCaptureSessionPresetInputPriority;
  _captureSession.usesApplicationAudioSession = NO;
#endif
  [self setupVideoDataOutput];
  // Add the output.
  if (![_captureSession canAddOutput:_videoDataOutput]) {
    RTCLogError(@"Video data output unsupported.");
    return NO;
  }
  [_captureSession addOutput:_videoDataOutput];

  return YES;
}

- (void)setupVideoDataOutput {
  NSAssert(_videoDataOutput == nil, @"Setup video data output called twice.");
  AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];

  // `videoDataOutput.availableVideoCVPixelFormatTypes` returns the pixel formats supported by the
  // device with the most efficient output format first. Find the first format that we support.
  NSSet<NSNumber *> *supportedPixelFormats = [RTCCVPixelBuffer supportedPixelFormats];
  NSMutableOrderedSet *availablePixelFormats =
      [NSMutableOrderedSet orderedSetWithArray:videoDataOutput.availableVideoCVPixelFormatTypes];
  [availablePixelFormats intersectSet:supportedPixelFormats];
  NSNumber *pixelFormat = availablePixelFormats.firstObject;
  NSAssert(pixelFormat, @"Output device has no supported formats.");

  _preferredOutputPixelFormat = [pixelFormat unsignedIntValue];
  _outputPixelFormat = _preferredOutputPixelFormat;
//  videoDataOutput.videoSettings = @{(NSString *)kCVPixelBufferPixelFormatTypeKey : pixelFormat};
    
//    [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];
    videoDataOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
    videoDataOutput.alwaysDiscardsLateVideoFrames = YES;
    
  [videoDataOutput setSampleBufferDelegate:self queue:self.frameQueue];
  _videoDataOutput = videoDataOutput;
    
//    // Set the video orientation and rotation
//    AVCaptureConnection *connection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
//    [connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
//    [connection setVideoMirrored:NO];
}

- (void)updateVideoDataOutputPixelFormat:(AVCaptureDeviceFormat *)format {
//  FourCharCode mediaSubType = CMFormatDescriptionGetMediaSubType(format.formatDescription);
//  if (![[RTCCVPixelBuffer supportedPixelFormats] containsObject:@(mediaSubType)]) {
//    mediaSubType = _preferredOutputPixelFormat;
//  }
//
//  if (mediaSubType != _outputPixelFormat) {
//    _outputPixelFormat = mediaSubType;
//    _videoDataOutput.videoSettings =
//        @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(mediaSubType) };
//  }
}

#pragma mark - Private, called inside capture queue

- (void)updateDeviceCaptureFormat:(AVCaptureDeviceFormat *)format fps:(NSInteger)fps {
  NSAssert([RTCDispatcher isOnQueueForType:RTCDispatcherTypeCaptureSession],
           @"updateDeviceCaptureFormat must be called on the capture queue.");
  @try {
    _currentDevice.activeFormat = format;
    _currentDevice.activeVideoMinFrameDuration = CMTimeMake(1, fps);
  } @catch (NSException *exception) {
    RTCLogError(@"Failed to set active format!\n User info:%@", exception.userInfo);
    return;
  }
}

- (void)reconfigureCaptureSessionInput {
  NSAssert([RTCDispatcher isOnQueueForType:RTCDispatcherTypeCaptureSession],
           @"reconfigureCaptureSessionInput must be called on the capture queue.");
  NSError *error = nil;
  AVCaptureDeviceInput *input =
      [AVCaptureDeviceInput deviceInputWithDevice:_currentDevice error:&error];
  if (!input) {
    RTCLogError(@"Failed to create front camera input: %@", error.localizedDescription);
    return;
  }
  [_captureSession beginConfiguration];
  for (AVCaptureDeviceInput *oldInput in [_captureSession.inputs copy]) {
    [_captureSession removeInput:oldInput];
  }
  if ([_captureSession canAddInput:input]) {
    [_captureSession addInput:input];
  } else {
    RTCLogError(@"Cannot add camera as an input to the session.");
  }
  [_captureSession commitConfiguration];
}

- (void)updateOrientation {
  NSAssert([RTCDispatcher isOnQueueForType:RTCDispatcherTypeCaptureSession],
           @"updateOrientation must be called on the capture queue.");
#if TARGET_OS_IPHONE
  _orientation = [UIDevice currentDevice].orientation;
#endif
}

- (instancetype)initWithDevicePosition:(AVCaptureDevicePosition)iDevicePosition
                        sessionPresset:(AVCaptureSessionPreset)sessionPreset
                                   fps:(int)iFPS
                         needYuvOutput:(BOOL)needYuvOutput
{
    self = [super init];
    if (self) {
        
        self.bSessionPause = YES;
        
        self.bufferQueue = dispatch_queue_create("STCameraBufferQueue", NULL);
        
        self.session = [[AVCaptureSession alloc] init];
        
        self.videoDevice = [self cameraDeviceWithPosition:iDevicePosition];
        _devicePosition = iDevicePosition;
        NSError *error = nil;
        self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:self.videoDevice
                                                                 error:&error];
        
        if (!self.deviceInput || error) {

            NSLog(@"create input error");

            return nil;
        }
        
        
        self.dataOutput = [[AVCaptureVideoDataOutput alloc] init];
        self.dataOutput.alwaysDiscardsLateVideoFrames = YES;
        self.dataOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(needYuvOutput ? kCVPixelFormatType_420YpCbCr8BiPlanarFullRange : kCVPixelFormatType_32BGRA)};
        [self.dataOutput setSampleBufferDelegate:self queue:self.bufferQueue];
        self.metaOutput = [[AVCaptureMetadataOutput alloc] init];
        [self.metaOutput setMetadataObjectsDelegate:self queue:self.bufferQueue];
        
        self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        self.stillImageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
        if ([self.stillImageOutput respondsToSelector:@selector(setHighResolutionStillImageOutputEnabled:)]) {

            self.stillImageOutput.highResolutionStillImageOutputEnabled = YES;
        }
        
        
        [self.session beginConfiguration];
        
        if ([self.session canAddInput:self.deviceInput]) {
            [self.session addInput:self.deviceInput];
        }else{
            NSLog( @"Could not add device input to the session" );
            return nil;
        }
        
        if ([self.session canSetSessionPreset:sessionPreset]) {
            [self.session setSessionPreset:sessionPreset];
            _sessionPreset = sessionPreset;
        }else if([self.session canSetSessionPreset:AVCaptureSessionPreset1920x1080]){
            [self.session setSessionPreset:AVCaptureSessionPreset1920x1080];
            _sessionPreset = AVCaptureSessionPreset1920x1080;
        }else if([self.session canSetSessionPreset:AVCaptureSessionPreset1280x720]){
            [self.session setSessionPreset:AVCaptureSessionPreset1280x720];
            _sessionPreset = AVCaptureSessionPreset1280x720;
        }else{
            [self.session setSessionPreset:AVCaptureSessionPreset640x480];
            _sessionPreset = AVCaptureSessionPreset640x480;
        }
        
        if ([self.session canAddOutput:self.dataOutput]) {
            
            [self.session addOutput:self.dataOutput];
        }else{
            
            NSLog( @"Could not add video data output to the session" );
            return nil;
        }
        
        if ([self.session canAddOutput:self.metaOutput]) {
            [self.session addOutput:self.metaOutput];
            self.metaOutput.metadataObjectTypes = @[AVMetadataObjectTypeFace].copy;
        }
        
        if ([self.session canAddOutput:self.stillImageOutput]) {

            [self.session addOutput:self.stillImageOutput];
        }else {

            NSLog(@"Could not add still image output to the session");
        }
        
        self.videoConnection =  [self.dataOutput connectionWithMediaType:AVMediaTypeVideo];
        
        
        if ([self.videoConnection isVideoOrientationSupported]) {
            
            [self.videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
            self.videoOrientation = AVCaptureVideoOrientationPortrait;
        }
        
        
        if ([self.videoConnection isVideoMirroringSupported]) {
            
            [self.videoConnection setVideoMirrored:YES];
            self.needVideoMirrored = YES;
        }
        
        if ([_videoDevice lockForConfiguration:NULL] == YES) {
//            _videoDevice.activeFormat = bestFormat;
            _videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, iFPS);
            _videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, iFPS);
            [_videoDevice unlockForConfiguration];
        }
        
        
        [self.session commitConfiguration];
        
        NSMutableDictionary *tmpSettings = [[self.dataOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie] mutableCopy];
        if (!EFMachineVersion.isiPhone5sOrLater) {
            NSNumber *tmpSettingValue = tmpSettings[AVVideoHeightKey];
            tmpSettings[AVVideoHeightKey] = tmpSettings[AVVideoWidthKey];
            tmpSettings[AVVideoWidthKey] = tmpSettingValue;
        }
        self.videoCompressingSettings = [tmpSettings copy];
        
        self.iExpectedFPS = iFPS;
        
        [self addObservers];
    }
    
    return self;
}

- (void)rotateCamera {
    
    if (self.devicePosition == AVCaptureDevicePositionFront) {
        self.devicePosition = AVCaptureDevicePositionBack;
    }else{
        self.devicePosition = AVCaptureDevicePositionFront;
    }
}

- (void)rotateCamera:(BOOL)isUseFrontCamera {
    
    if (isUseFrontCamera) {
        self.devicePosition = AVCaptureDevicePositionFront;
    }else{
        self.devicePosition = AVCaptureDevicePositionBack;
    }
}

- (void)setExposurePoint:(CGPoint)point inPreviewFrame:(CGRect)frame {
    
    BOOL isFrontCamera = self.devicePosition == AVCaptureDevicePositionFront;
    float fX = point.y / frame.size.height;
    float fY = isFrontCamera ? point.x / frame.size.width : (1 - point.x / frame.size.width);
    
    [self focusWithMode:self.videoDevice.focusMode exposureMode:self.videoDevice.exposureMode atPoint:CGPointMake(fX, fY)];
}

- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point{
    NSError *error = nil;
    AVCaptureDevice * device = self.videoDevice;
    
    if ( [device lockForConfiguration:&error] ) {
        device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
//        - (void)setISOValue:(float)value{
        [self setISOValue:0];
//device.exposureTargetBias
        // Setting (focus/exposure)PointOfInterest alone does not initiate a (focus/exposure) operation
        // Call -set(Focus/Exposure)Mode: to apply the new point of interest
        if ( focusMode != AVCaptureFocusModeLocked && device.isFocusPointOfInterestSupported && [device isFocusModeSupported:focusMode] ) {
            device.focusPointOfInterest = point;
            device.focusMode = focusMode;
        }
        
        if ( exposureMode != AVCaptureExposureModeCustom && device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode] ) {
            device.exposurePointOfInterest = point;
            device.exposureMode = exposureMode;
        }
        
        device.subjectAreaChangeMonitoringEnabled = YES;
        [device unlockForConfiguration];
        
        
    }
}

- (void)setWhiteBalance{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance]) {
            [captureDevice setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance];
        }
    }];
}

- (void)changeDeviceProperty:(void(^)(AVCaptureDevice *))propertyChange{
    AVCaptureDevice *captureDevice= self.videoDevice;
    NSError *error;
    if ([captureDevice lockForConfiguration:&error]) {
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
    }else{
        NSLog(@"设置设备属性过程发生错误,错误信息:%@",error.localizedDescription);
    }
}

- (void)setISOValue:(float)value exposeDuration:(int)duration{
    // Clamp the requested ISO to the range supported by the active format.
    // Note: `duration` is not applied here; the current exposure duration is kept.
    float currentISO = MAX(self.videoDevice.activeFormat.minISO,
                           MIN(value, self.videoDevice.activeFormat.maxISO));
    NSError *error;
    if ([self.videoDevice lockForConfiguration:&error]){
        [self.videoDevice setExposureModeCustomWithDuration:AVCaptureExposureDurationCurrent ISO:currentISO completionHandler:nil];
        [self.videoDevice unlockForConfiguration];
    }
}

- (void)setISOValue:(float)value{
//    float newVlaue = (value - 0.5) * (5.0 / 0.5); // mirror [0,1] to [-8,8]
    NSLog(@"%f", value);
    NSError *error = nil;
    if ( [self.videoDevice lockForConfiguration:&error] ) {
        [self.videoDevice setExposureTargetBias:value completionHandler:nil];
        [self.videoDevice unlockForConfiguration];
    }
    else {
        NSLog( @"Could not lock device for configuration: %@", error );
    }
}

- (void)setDevicePosition:(AVCaptureDevicePosition)devicePosition
{
    if (_devicePosition != devicePosition && devicePosition != AVCaptureDevicePositionUnspecified) {
        
        if (_session) {
            
            AVCaptureDevice *targetDevice = [self cameraDeviceWithPosition:devicePosition];
            
            if (targetDevice && [self judgeCameraAuthorization]) {
                
                NSError *error = nil;
                AVCaptureDeviceInput *deviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:targetDevice error:&error];
                
                if(!deviceInput || error) {
                    
                    NSLog(@"Error creating capture device input: %@", error.localizedDescription);
                    return;
                }
                
                _bSessionPause = YES;
                
                [_session beginConfiguration];
                
                [_session removeInput:_deviceInput];
                
                if ([_session canAddInput:deviceInput]) {
                    
                    [_session addInput:deviceInput];
                    
                    _deviceInput = deviceInput;
                    _videoDevice = targetDevice;
                    
                    _devicePosition = devicePosition;
                }
                
                _videoConnection =  [_dataOutput connectionWithMediaType:AVMediaTypeVideo];
                
                if ([_videoConnection isVideoOrientationSupported]) {
                    
                    [_videoConnection setVideoOrientation:_videoOrientation];
                }
                
                if ([_videoConnection isVideoMirroringSupported]) {
                    
                    [_videoConnection setVideoMirrored:devicePosition == AVCaptureDevicePositionFront];
                }
                
                [_session commitConfiguration];
                
                [self setSessionPreset:_sessionPreset];
                
                _bSessionPause = NO;
            }
        }
    }
}

- (void)setSessionPreset:(NSString *)sessionPreset
{
    if (_session && _sessionPreset) {
        
//        if (![sessionPreset isEqualToString:_sessionPreset]) {
        
        _bSessionPause = YES;
        
        [_session beginConfiguration];
        
        if ([_session canSetSessionPreset:sessionPreset]) {
            
            [_session setSessionPreset:sessionPreset];
            
            _sessionPreset = sessionPreset;
        }
        
        
        [_session commitConfiguration];
        
        self.videoCompressingSettings = [[self.dataOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie] copy];
        
//        [self setIExpectedFPS:_iExpectedFPS];
        
        _bSessionPause = NO;
//        }
    }
}

- (void)setIExpectedFPS:(int)iExpectedFPS
{
    _iExpectedFPS = iExpectedFPS;
    
    if (iExpectedFPS <= 0 || !_dataOutput.videoSettings || !_videoDevice) {
        
        return;
    }
    
    CGFloat fWidth = [[_dataOutput.videoSettings objectForKey:@"Width"] floatValue];
    CGFloat fHeight = [[_dataOutput.videoSettings objectForKey:@"Height"] floatValue];
    
    AVCaptureDeviceFormat *bestFormat = nil;
    AVFrameRateRange *bestFrameRateRange = nil;
    
    for (AVCaptureDeviceFormat *format in [_videoDevice formats]) {
        
        CMFormatDescriptionRef description = format.formatDescription;
        
        if (CMFormatDescriptionGetMediaSubType(description) != kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) {
            
            continue;
        }
        
        CMVideoDimensions videoDimension = CMVideoFormatDescriptionGetDimensions(description);
        if ((videoDimension.width == fWidth && videoDimension.height == fHeight)
            ||
            (videoDimension.height == fWidth && videoDimension.width == fHeight)) {
            
            for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
                
                if (range.maxFrameRate >= bestFrameRateRange.maxFrameRate) {
                    bestFormat = format;
                    bestFrameRateRange = range;
                }
            }
        }
    }
    
    if (bestFormat) {
        
        CMTime minFrameDuration;
        
        if (bestFrameRateRange.minFrameDuration.timescale / bestFrameRateRange.minFrameDuration.value < iExpectedFPS) {
            
            minFrameDuration = bestFrameRateRange.minFrameDuration;
        }else{
            
            minFrameDuration = CMTimeMake(1, iExpectedFPS);
        }
        
        if ([_videoDevice lockForConfiguration:NULL] == YES) {
            _videoDevice.activeFormat = bestFormat;
            _videoDevice.activeVideoMinFrameDuration = minFrameDuration;
            _videoDevice.activeVideoMaxFrameDuration = minFrameDuration;
            [_videoDevice unlockForConfiguration];
        }
    }
}

- (void)startRunning
{
    if (![self judgeCameraAuthorization]) {
        
        return;
    }
    
    if (!self.dataOutput) {

        return;
    }
    
    if (self.session && ![self.session isRunning]) {
        if (self.bufferQueue) {
            dispatch_async(self.bufferQueue, ^{
                [self.session startRunning];
            });
        }
        self.bSessionPause = NO;
    }
}


- (void)stopRunning
{
    if (self.session && [self.session isRunning]) {
        if (self.bufferQueue) {
            dispatch_async(self.bufferQueue, ^{
                [self.session stopRunning];
            });
        }
        self.bSessionPause = YES;
    }
}

- (CGRect)getZoomedRectWithRect:(CGRect)rect scaleToFit:(BOOL)bScaleToFit
{
    CGRect rectRet = rect;
    
    if (self.dataOutput.videoSettings) {
        
        CGFloat fWidth = [[self.dataOutput.videoSettings objectForKey:@"Width"] floatValue];
        CGFloat fHeight = [[self.dataOutput.videoSettings objectForKey:@"Height"] floatValue];
        
        float fScaleX = fWidth / CGRectGetWidth(rect);
        float fScaleY = fHeight / CGRectGetHeight(rect);
        float fScale = bScaleToFit ? fmaxf(fScaleX, fScaleY) : fminf(fScaleX, fScaleY);
        
        fWidth /= fScale;
        fHeight /= fScale;
        
        CGFloat fX = rect.origin.x - (fWidth - rect.size.width) / 2.0f;
        CGFloat fY = rect.origin.y - (fHeight - rect.size.height) / 2.0f;
        
        rectRet = CGRectMake(fX, fY, fWidth, fHeight);
    }
    
    return rectRet;
}

- (BOOL)judgeCameraAuthorization
{
    AVAuthorizationStatus authStatus = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    
    if (authStatus == AVAuthorizationStatusRestricted || authStatus == AVAuthorizationStatusDenied) {
        return NO;
    }
    
    return YES;
}

- (AVCaptureDevice *)cameraDeviceWithPosition:(AVCaptureDevicePosition)position
{
    AVCaptureDevice *deviceRet = nil;
    
    if (position != AVCaptureDevicePositionUnspecified) {
        
        NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        
        for (AVCaptureDevice *device in devices) {
            
            if ([device position] == position) {
                
                deviceRet = device;
            }
        }
    }
    
    return deviceRet;
}

- (AVCaptureVideoPreviewLayer *)previewLayer
{
    if (!_previewLayer) {
        
        _previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
    }
    
    return _previewLayer;
}

- (void)snapStillImageCompletionHandler:(void (^)(CMSampleBufferRef imageDataSampleBuffer, NSError *error))handler
{
    if ([self judgeCameraAuthorization]) {
        self.bSessionPause = YES;
        
        NSString *strSessionPreset = [self.sessionPreset mutableCopy];
        self.sessionPreset = AVCaptureSessionPresetPhoto;
        
        // Changing the preset causes a brief black-out of the preview
        [NSThread sleepForTimeInterval:0.3];
        
        dispatch_async(self.bufferQueue, ^{
            
            [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:[self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                
                self.bSessionPause = NO;
                self.sessionPreset = strSessionPreset;
                handler(imageDataSampleBuffer , error);
            }];
        } );
    }
}
BOOL updateExporureModel(STExposureModel model){
    if (currentExposureMode == model) return NO;
    currentExposureMode = model;
    return YES;
}

- (void)test:(AVCaptureExposureMode)model{
    if ([self.videoDevice lockForConfiguration:nil]) {
        if ([self.videoDevice  isExposureModeSupported:model]) {
            [self.videoDevice setExposureMode:model];
        }
        [self.videoDevice unlockForConfiguration];
    }
}

- (void)setExposureTime:(CMTime)time{
    if ([self.videoDevice lockForConfiguration:nil]) {
        if (@available(iOS 12.0, *)) {
            self.videoDevice.activeMaxExposureDuration = time;
        } else {
            // Fallback on earlier versions
        }
        [self.videoDevice unlockForConfiguration];
    }
}
- (void)setFPS:(float)fps{
    if ([_videoDevice lockForConfiguration:NULL] == YES) {
//            _videoDevice.activeFormat = bestFormat;
        _videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, fps);
        _videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, fps);
        [_videoDevice unlockForConfiguration];
    }
}

- (void)updateExposure:(CMSampleBufferRef)sampleBuffer{
    CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL,sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary * metadata = [[NSMutableDictionary alloc] initWithDictionary:(__bridge NSDictionary *)metadataDict];
    CFRelease(metadataDict);
    NSDictionary *exifMetadata = [[metadata objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
    float brightnessValue = [[exifMetadata objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];
    
    if(brightnessValue > 2 && updateExporureModel(STExposureModelPositive2)){
        [self setISOValue:500 exposeDuration:30];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
        [self setFPS:30];
    }else if(brightnessValue > 1 && brightnessValue < 2 && updateExporureModel(STExposureModelPositive1)){
        [self setISOValue:500 exposeDuration:30];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
        [self setFPS:30];
    }else if(brightnessValue > 0 && brightnessValue < 1 && updateExporureModel(STExposureModel0)){
        [self setISOValue:500 exposeDuration:30];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
        [self setFPS:30];
    }else if (brightnessValue > -1 && brightnessValue < 0 && updateExporureModel(STExposureModelNegative1)){
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:40];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    }else if (brightnessValue > -2 && brightnessValue < -1 && updateExporureModel(STExposureModelNegative2)){
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:35];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    }else if (brightnessValue > -2.5 && brightnessValue < -2 && updateExporureModel(STExposureModelNegative3)){
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:30];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    }else if (brightnessValue > -3 && brightnessValue < -2.5 && updateExporureModel(STExposureModelNegative4)){
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:25];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    }else if (brightnessValue > -3.5 && brightnessValue < -3 && updateExporureModel(STExposureModelNegative5)){
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:20];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    }else if (brightnessValue > -4 && brightnessValue < -3.5 && updateExporureModel(STExposureModelNegative6)){
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 250 exposeDuration:15];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    }else if (brightnessValue > -5 && brightnessValue < -4 && updateExporureModel(STExposureModelNegative7)){
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:10];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    }else if(brightnessValue < -5 && updateExporureModel(STExposureModelNegative8)){
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 150 exposeDuration:5];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    }

//    NSLog(@"current brightness %f min iso %f max iso %f min exposure %f max exposure %f", brightnessValue, self.videoDevice.activeFormat.minISO, self.videoDevice.activeFormat.maxISO, CMTimeGetSeconds(self.videoDevice.activeFormat.minExposureDuration),   CMTimeGetSeconds(self.videoDevice.activeFormat.maxExposureDuration));
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (!self.bSessionPause) {
        if (self.delegate && [self.delegate respondsToSelector:@selector(rtcCameraVideoCapturer:didOutputSampleBuffer:)]) {
            [self.delegate rtcCameraVideoCapturer:self didOutputSampleBuffer:sampleBuffer];
        }
//        if (self.delegate && [self.delegate respondsToSelector:@selector(captureOutput:didOutputSampleBuffer:fromConnection:)]) {
//            //[connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
//            [self.delegate captureOutput:captureOutput didOutputSampleBuffer:sampleBuffer fromConnection:connection];
//        }
    }
//    [self updateExposure:sampleBuffer];
}


- (void)captureOutput:(AVCaptureOutput *)output didOutputMetadataObjects:(NSArray<__kindof AVMetadataObject *> *)metadataObjects fromConnection:(AVCaptureConnection *)connection{
    AVMetadataFaceObject *faceObject = nil;
    for(AVMetadataObject *object  in metadataObjects){
        if (AVMetadataObjectTypeFace == object.type) {
            faceObject = (AVMetadataFaceObject*)object;
        }
    }
    static BOOL hasFace = NO;
    if (!hasFace && faceObject.faceID) {
        hasFace = YES;
    }
    if (!faceObject.faceID) {
        hasFace = NO;
    }
}

#pragma mark - Notifications
- (void)addObservers {
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(appWillResignActive) name:UIApplicationWillResignActiveNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(appDidBecomeActive) name:UIApplicationDidBecomeActiveNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self
               selector:@selector(dealCaptureSessionRuntimeError:)
                   name:AVCaptureSessionRuntimeErrorNotification
                 object:self.session];
}

- (void)removeObservers {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

#pragma mark - Notification
- (void)appWillResignActive {
    RTCLogInfo("SDCustomRTCCameraCapturer appWillResignActive");
}

- (void)appDidBecomeActive {
    RTCLogInfo("SDCustomRTCCameraCapturer appDidBecomeActive");
    [self startRunning];
}

- (void)dealCaptureSessionRuntimeError:(NSNotification *)notification {
    NSError *error = [notification.userInfo objectForKey:AVCaptureSessionErrorKey];
    RTCLogError(@"SDCustomRTCCameraCapturer dealCaptureSessionRuntimeError error: %@", error);

    if (error.code == AVErrorMediaServicesWereReset) {
        [self startRunning];
    }
}

#pragma mark - Dealloc
- (void)dealloc {
    DebugLog(@"SDCustomRTCCameraCapturer dealloc");
    [self removeObservers];

    if (self.session) {
        self.bSessionPause = YES;
        [self.session beginConfiguration];
        [self.session removeOutput:self.dataOutput];
        [self.session removeInput:self.deviceInput];
        [self.session commitConfiguration];
        if ([self.session isRunning]) {
            [self.session stopRunning];
        }
        self.session = nil;
    }
}

@end

3. WebRTC video calling with ossrs

For the iOS-side WebRTC + ossrs setup that implements live video calling, see:
https://blog.csdn.net/gloryFlow/article/details/132262724

The changes required are listed below.

Use the SDCustomRTCCameraCapturer class in createVideoTrack:

- (RTCVideoTrack *)createVideoTrack {
    RTCVideoSource *videoSource = [self.factory videoSource];
    self.localVideoSource = videoSource;

    // If running on the simulator, capture from a file instead
    if (TARGET_IPHONE_SIMULATOR) {
        if (@available(iOS 10, *)) {
            self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:self];
        } else {
            // Fallback on earlier versions
        }
    } else{
        self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDevicePosition:AVCaptureDevicePositionFront sessionPresset:AVCaptureSessionPreset1920x1080 fps:20 needYuvOutput:NO];
        
//        self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDelegate:self];
    }
    
    RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];
    
    return videoTrack;
}

When the local video needs to be rendered on screen, call startCaptureLocalVideo to add a render view to the localVideoTrack; the renderer is an RTCMTLVideoView.

self.localRenderer = [[RTCMTLVideoView alloc] initWithFrame:CGRectZero];
self.localRenderer.delegate = self;

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer {
    if (!self.isPublish) {
        return;
    }
    
    if (!renderer) {
        return;
    }
    
    if (!self.videoCapturer) {
        return;
    }
    
    [self setDegradationPreference:RTCDegradationPreferenceMaintainResolution];
    
    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {
        AVCaptureDevice *camera = [self findDeviceForPosition:self.usingFrontCamera?AVCaptureDevicePositionFront:AVCaptureDevicePositionBack];
        
        SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;
        [cameraVideoCapturer setISOValue:0.0];
        [cameraVideoCapturer rotateCamera:self.usingFrontCamera];
        self.videoCapturer.delegate = self;

        AVCaptureDeviceFormat *formatNilable = [self selectFormatForDevice:camera];
        
        if (!formatNilable) {
            return;
        }
        
        DebugLog(@"formatNilable:%@", formatNilable);
        
        NSInteger fps = [self selectFpsForFormat:formatNilable];
        
        CMVideoDimensions videoVideoDimensions = CMVideoFormatDescriptionGetDimensions(formatNilable.formatDescription);
        float width = videoVideoDimensions.width;
        float height = videoVideoDimensions.height;
                
        DebugLog(@"videoVideoDimensions width:%f,height:%f", width, height);
        [cameraVideoCapturer startRunning];
//        [cameraVideoCapturer startCaptureWithDevice:camera format:formatNilable fps:fps completionHandler:^(NSError *error) {
//            DebugLog(@"startCaptureWithDevice error:%@", error);
//        }];
        [self changeResolution:width height:height fps:(int)fps];
    }
    
    if (@available(iOS 10, *)) {
        if ([capturer isKindOfClass:[RTCFileVideoCapturer class]]) {
            RTCFileVideoCapturer *fileVideoCapturer = (RTCFileVideoCapturer *)capturer;
            [fileVideoCapturer startCapturingFromFileNamed:@"beautyPicture.mp4" onError:^(NSError * _Nonnull error) {
                DebugLog(@"startCaptureLocalVideo startCapturingFromFileNamed error:%@", error);
            }];
        }
    } else {
        // Fallback on earlier versions
    }
    
    [self.localVideoTrack addRenderer:renderer];
}

The captured camera frames are delivered through - (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer;

The implementation is as follows.

- (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {

    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (!pixelBuffer) {
        return;
    }

    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
    int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
    // The capturer sets portrait orientation on its video connection,
    // so the frame does not need additional rotation here.
    RTCVideoFrame *rtcVideoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                                rotation:RTCVideoRotation_0
                                                             timeStampNs:timeStampNs];

    [self.localVideoSource capturer:capturer didCaptureVideoFrame:rtcVideoFrame];
}

The complete WebRTCClient code is as follows.

WebRTCClient.h

#import <Foundation/Foundation.h>
#import <WebRTC/WebRTC.h>
#import <UIKit/UIKit.h>
#import "SDCustomRTCCameraCapturer.h"

#define kSelectedResolution @"kSelectedResolutionIndex"
#define kFramerateLimit 30.0

@protocol WebRTCClientDelegate;
@interface WebRTCClient : NSObject

@property (nonatomic, weak) id<WebRTCClientDelegate> delegate;

/**
 Peer connection factory
 */
@property (nonatomic, strong) RTCPeerConnectionFactory *factory;

/**
 Whether this client publishes (push)
 */
@property (nonatomic, assign) BOOL isPublish;

/**
 connect
 */
@property (nonatomic, strong) RTCPeerConnection *peerConnection;

/**
 RTCAudioSession
 */
@property (nonatomic, strong) RTCAudioSession *rtcAudioSession;

/**
 DispatchQueue
 */
@property (nonatomic) dispatch_queue_t audioQueue;

/**
 mediaConstrains
 */
@property (nonatomic, strong) NSDictionary *mediaConstrains;

/**
 publishMediaConstrains
 */
@property (nonatomic, strong) NSDictionary *publishMediaConstrains;

/**
 playMediaConstrains
 */
@property (nonatomic, strong) NSDictionary *playMediaConstrains;

/**
 optionalConstraints
 */
@property (nonatomic, strong) NSDictionary *optionalConstraints;

/**
 RTCVideoCapturer camera capturer
 */
@property (nonatomic, strong) RTCVideoCapturer *videoCapturer;

/**
 Local audio track (localAudioTrack)
 */
@property (nonatomic, strong) RTCAudioTrack *localAudioTrack;

/**
 localVideoTrack
 */
@property (nonatomic, strong) RTCVideoTrack *localVideoTrack;

/**
 remoteVideoTrack
 */
@property (nonatomic, strong) RTCVideoTrack *remoteVideoTrack;

/**
 RTCVideoRenderer
 */
@property (nonatomic, weak) id<RTCVideoRenderer> remoteRenderView;

/**
 localDataChannel
 */
@property (nonatomic, strong) RTCDataChannel *localDataChannel;

/**
 remoteDataChannel
 */
@property (nonatomic, strong) RTCDataChannel *remoteDataChannel;

/**
 RTCVideoSource
 */
@property (nonatomic, strong) RTCVideoSource *localVideoSource;


- (instancetype)initWithPublish:(BOOL)isPublish;

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer;

- (void)addIceCandidate:(RTCIceCandidate *)candidate;

- (void)answer:(void (^)(RTCSessionDescription *sdp))completionHandler;

- (void)offer:(void (^)(RTCSessionDescription *sdp))completionHandler;

- (void)setRemoteSdp:(RTCSessionDescription *)remoteSdp completion:(void (^)(NSError *error))completion;

- (void)setRemoteCandidate:(RTCIceCandidate *)remoteCandidate;

- (BOOL)changeResolution:(int)width height:(int)height fps:(int)fps;

- (NSArray<NSString *> *)availableVideoResolutions;

#pragma mark - switchCamera
- (void)switchCamera:(id<RTCVideoRenderer>)renderer;

#pragma mark - Hide or show Video
- (void)hidenVideo;

- (void)showVideo;

#pragma mark - Hide or show Audio
- (void)muteAudio;

- (void)unmuteAudio;

- (void)speakOff;

- (void)speakOn;

#pragma mark - Set video bitrate (BitrateBps)
- (void)setMaxBitrate:(int)maxBitrate;

- (void)setMinBitrate:(int)minBitrate;

#pragma mark - Set video framerate
- (void)setMaxFramerate:(int)maxFramerate;

- (void)close;

@end


@protocol WebRTCClientDelegate <NSObject>

// Hook for beauty-filter processing of captured frames
- (RTCVideoFrame *)webRTCClient:(WebRTCClient *)client didCaptureSampleBuffer:(CMSampleBufferRef)sampleBuffer;

- (void)webRTCClient:(WebRTCClient *)client didDiscoverLocalCandidate:(RTCIceCandidate *)candidate;
- (void)webRTCClient:(WebRTCClient *)client didChangeConnectionState:(RTCIceConnectionState)state;
- (void)webRTCClient:(WebRTCClient *)client didReceiveData:(NSData *)data;

@end

WebRTCClient.m

#import "WebRTCClient.h"


@interface WebRTCClient ()<RTCPeerConnectionDelegate, RTCDataChannelDelegate, RTCVideoCapturerDelegate, SDCustomRTCCameraCapturerDelegate>

@property (nonatomic, assign) BOOL usingFrontCamera;

@end

@implementation WebRTCClient

- (instancetype)initWithPublish:(BOOL)isPublish {
    self = [super init];
    if (self) {
        self.isPublish = isPublish;
        self.usingFrontCamera = YES;
                                        
        RTCMediaConstraints *constraints = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.publishMediaConstrains optionalConstraints:self.optionalConstraints];
        
        RTCConfiguration *newConfig = [[RTCConfiguration alloc] init];
        newConfig.sdpSemantics = RTCSdpSemanticsUnifiedPlan;
        newConfig.continualGatheringPolicy = RTCContinualGatheringPolicyGatherContinually;

        self.peerConnection = [self.factory peerConnectionWithConfiguration:newConfig constraints:constraints delegate:nil];
        
        [self createMediaSenders];
        
        [self createMediaReceivers];
        
        // SRS does not support data channels.
        // self.createDataChannel()
        [self configureAudioSession];
        
        self.peerConnection.delegate = self;
    }
    return self;
}


- (void)createMediaSenders {
    if (!self.isPublish) {
        return;
    }
    
    NSString *streamId = @"stream";
    
    // Audio
    RTCAudioTrack *audioTrack = [self createAudioTrack];
    self.localAudioTrack = audioTrack;
    
    RTCRtpTransceiverInit *audioTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
    audioTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
    audioTrackTransceiver.streamIds = @[streamId];
    
    [self.peerConnection addTransceiverWithTrack:audioTrack init:audioTrackTransceiver];
    
    // Video
    RTCRtpEncodingParameters *encodingParameters = [[RTCRtpEncodingParameters alloc] init];
    encodingParameters.maxBitrateBps = @(6000000);
//    encodingParameters.bitratePriority = 1.0;
    
    RTCVideoTrack *videoTrack = [self createVideoTrack];
    self.localVideoTrack = videoTrack;
    RTCRtpTransceiverInit *videoTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
    videoTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
    videoTrackTransceiver.streamIds = @[streamId];
    
    // With sendEncodings set here, the SRS service fails to play the video
//    videoTrackTransceiver.sendEncodings = @[encodingParameters];
    [self.peerConnection addTransceiverWithTrack:videoTrack init:videoTrackTransceiver];
    
    [self setDegradationPreference:RTCDegradationPreferenceBalanced];
    [self setMaxBitrate:6000000];
}

- (void)createMediaReceivers {
    if (!self.isPublish) {
        return;
    }
    
    if (self.peerConnection.transceivers.count > 0) {
        RTCRtpTransceiver *transceiver = self.peerConnection.transceivers.firstObject;
        if (transceiver.mediaType == RTCRtpMediaTypeVideo) {
            RTCVideoTrack *track = (RTCVideoTrack *)transceiver.receiver.track;
            self.remoteVideoTrack = track;
        }
    }
}

- (void)configureAudioSession {
    [self.rtcAudioSession lockForConfiguration];
    @try {
        RTCAudioSessionConfiguration *configuration =
            [[RTCAudioSessionConfiguration alloc] init];
        configuration.category = AVAudioSessionCategoryPlayAndRecord;
        configuration.categoryOptions = AVAudioSessionCategoryOptionDefaultToSpeaker;
        configuration.mode = AVAudioSessionModeDefault;
        BOOL hasSucceeded = NO;
        NSError *error = nil;
        if (self.rtcAudioSession.isActive) {
          hasSucceeded = [self.rtcAudioSession setConfiguration:configuration error:&error];
        } else {
          hasSucceeded = [self.rtcAudioSession setConfiguration:configuration
                                            active:YES
                                             error:&error];
        }
        if (!hasSucceeded) {
            DebugLog(@"Error setting configuration: %@", error.localizedDescription);
        }
    } @catch (NSException *exception) {
        DebugLog(@"configureAudioSession exception:%@", exception);
    }
    
    [self.rtcAudioSession unlockForConfiguration];
}

- (RTCAudioTrack *)createAudioTrack {
    /// enable google 3A algorithm.
    NSDictionary *mandatory = @{
        @"googEchoCancellation": kRTCMediaConstraintsValueTrue,
        @"googAutoGainControl": kRTCMediaConstraintsValueTrue,
        @"googNoiseSuppression": kRTCMediaConstraintsValueTrue,
    };
    
    RTCMediaConstraints *audioConstrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:mandatory optionalConstraints:self.optionalConstraints];
    RTCAudioSource *audioSource = [self.factory audioSourceWithConstraints:audioConstrains];
    
    RTCAudioTrack *audioTrack = [self.factory audioTrackWithSource:audioSource trackId:@"audio0"];
    
    return audioTrack;
}

- (RTCVideoTrack *)createVideoTrack {
    RTCVideoSource *videoSource = [self.factory videoSource];
    self.localVideoSource = videoSource;

    // If running on the simulator, capture from a file instead of the camera
    if (TARGET_IPHONE_SIMULATOR) {
        if (@available(iOS 10, *)) {
            self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:self];
        } else {
            // Fallback on earlier versions
        }
    } else{
        self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDevicePosition:AVCaptureDevicePositionFront sessionPresset:AVCaptureSessionPreset1920x1080 fps:20 needYuvOutput:NO];
        
//        self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDelegate:self];
    }
    
    RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];
    
    return videoTrack;
}

- (void)addIceCandidate:(RTCIceCandidate *)candidate {
    [self.peerConnection addIceCandidate:candidate];
}

- (void)offer:(void (^)(RTCSessionDescription *sdp))completion {
    if (self.isPublish) {
        self.mediaConstrains = self.publishMediaConstrains;
    } else {
        self.mediaConstrains = self.playMediaConstrains;
    }
    
    RTCMediaConstraints *constrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.mediaConstrains optionalConstraints:self.optionalConstraints];
    DebugLog(@"peerConnection:%@",self.peerConnection);
    __weak typeof(self) weakSelf = self;
    [weakSelf.peerConnection offerForConstraints:constrains completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {
        if (error) {
            DebugLog(@"offer offerForConstraints error:%@", error);
        }
        
        if (sdp) {
            [weakSelf.peerConnection setLocalDescription:sdp completionHandler:^(NSError * _Nullable error) {
                if (error) {
                    DebugLog(@"offer setLocalDescription error:%@", error);
                }
                if (completion) {
                    completion(sdp);
                }
            }];
        }
    }];
}

- (void)answer:(void (^)(RTCSessionDescription *sdp))completion {
    RTCMediaConstraints *constrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.mediaConstrains optionalConstraints:self.optionalConstraints];
    __weak typeof(self) weakSelf = self;
    [weakSelf.peerConnection answerForConstraints:constrains completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {
        if (error) {
            DebugLog(@"answer answerForConstraints error:%@", error);
        }
        
        if (sdp) {
            [weakSelf.peerConnection setLocalDescription:sdp completionHandler:^(NSError * _Nullable error) {
                if (error) {
                    DebugLog(@"answer setLocalDescription error:%@", error);
                }
                if (completion) {
                    completion(sdp);
                }
            }];
        }
    }];
}

- (void)setRemoteSdp:(RTCSessionDescription *)remoteSdp completion:(void (^)(NSError *error))completion {
    [self.peerConnection setRemoteDescription:remoteSdp completionHandler:completion];
}

- (void)setRemoteCandidate:(RTCIceCandidate *)remoteCandidate {
    [self.peerConnection addIceCandidate:remoteCandidate];
}

- (void)setMaxBitrate:(int)maxBitrate {
    if (self.peerConnection.senders.count > 0) {
        RTCRtpSender *videoSender = self.peerConnection.senders.firstObject;
        RTCRtpParameters *parametersToModify = videoSender.parameters;
        for (RTCRtpEncodingParameters *encoding in parametersToModify.encodings) {
            encoding.maxBitrateBps = @(maxBitrate);
        }
        [videoSender setParameters:parametersToModify];
    }
}

- (void)setDegradationPreference:(RTCDegradationPreference)degradationPreference {
    // RTCDegradationPreferenceMaintainResolution
    NSMutableArray *videoSenders = [NSMutableArray arrayWithCapacity:0];
    for (RTCRtpSender *sender in self.peerConnection.senders) {
        if (sender.track && [kRTCMediaStreamTrackKindVideo isEqualToString:sender.track.kind]) {
            // [videoSenders addObject:sender];
            RTCRtpParameters *parameters = sender.parameters;
            parameters.degradationPreference = [NSNumber numberWithInteger:degradationPreference];
            [sender setParameters:parameters];
            
            [videoSenders addObject:sender];
        }
    }
    
    for (RTCRtpSender *sender in videoSenders) {
        RTCRtpParameters *parameters = sender.parameters;
        parameters.degradationPreference = [NSNumber numberWithInteger:degradationPreference];
        [sender setParameters:parameters];
    }
}

- (void)setMinBitrate:(int)minBitrate {
    if (self.peerConnection.senders.count > 0) {
        RTCRtpSender *videoSender = self.peerConnection.senders.firstObject;
        RTCRtpParameters *parametersToModify = videoSender.parameters;
        for (RTCRtpEncodingParameters *encoding in parametersToModify.encodings) {
            encoding.minBitrateBps = @(minBitrate);
        }
        [videoSender setParameters:parametersToModify];
    }
}

- (void)setMaxFramerate:(int)maxFramerate {
    if (self.peerConnection.senders.count > 0) {
        RTCRtpSender *videoSender = self.peerConnection.senders.firstObject;
        RTCRtpParameters *parametersToModify = videoSender.parameters;
        // This WebRTC build may not expose maxFramerate; update to a newer version if the property is missing
        for (RTCRtpEncodingParameters *encoding in parametersToModify.encodings) {
            encoding.maxFramerate = @(maxFramerate);
        }
        [videoSender setParameters:parametersToModify];
    }
}

#pragma mark - Private
- (AVCaptureDevice *)findDeviceForPosition:(AVCaptureDevicePosition)position {
  NSArray<AVCaptureDevice *> *captureDevices =
      [SDCustomRTCCameraCapturer captureDevices];
  for (AVCaptureDevice *device in captureDevices) {
    if (device.position == position) {
      return device;
    }
  }
  return captureDevices[0];
}

- (NSArray<NSString *> *)availableVideoResolutions {
  NSMutableSet<NSArray<NSNumber *> *> *resolutions =
      [[NSMutableSet<NSArray<NSNumber *> *> alloc] init];
  for (AVCaptureDevice *device in [SDCustomRTCCameraCapturer captureDevices]) {
    for (AVCaptureDeviceFormat *format in
         [SDCustomRTCCameraCapturer supportedFormatsForDevice:device]) {
      CMVideoDimensions resolution =
          CMVideoFormatDescriptionGetDimensions(format.formatDescription);
      NSArray<NSNumber *> *resolutionObject = @[@(resolution.width), @(resolution.height) ];
      [resolutions addObject:resolutionObject];
    }
  }

  NSArray<NSArray<NSNumber *> *> *sortedResolutions =
      [[resolutions allObjects] sortedArrayUsingComparator:^NSComparisonResult(
                                    NSArray<NSNumber *> *obj1, NSArray<NSNumber *> *obj2) {
        NSComparisonResult cmp = [obj1.firstObject compare:obj2.firstObject];
        if (cmp != NSOrderedSame) {
          return cmp;
        }
        return [obj1.lastObject compare:obj2.lastObject];
      }];

  NSMutableArray<NSString *> *resolutionStrings = [[NSMutableArray<NSString *> alloc] init];
  for (NSArray<NSNumber *> *resolution in sortedResolutions) {
    NSString *resolutionString =
        [NSString stringWithFormat:@"%@x%@", resolution.firstObject, resolution.lastObject];
    [resolutionStrings addObject:resolutionString];
  }

  return [resolutionStrings copy];
}


- (int)videoResolutionComponentAtIndex:(int)index inString:(NSString *)resolution {
    if (index != 0 && index != 1) {
        return 0;
    }
    NSArray<NSString *> *components = [resolution componentsSeparatedByString:@"x"];
    if (components.count != 2) {
        return 0;
    }
    return components[index].intValue;
}

- (AVCaptureDeviceFormat *)selectFormatForDevice:(AVCaptureDevice *)device {
    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {
        SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;
        NSArray *availableVideoResolutions = [self availableVideoResolutions];
        NSString *selectedIndexStr = [[NSUserDefaults standardUserDefaults] objectForKey:kSelectedResolution];
        int selectedIndex = 0;
        if (selectedIndexStr && selectedIndexStr.length > 0) {
            selectedIndex = [selectedIndexStr intValue];
        }
        
        NSString *videoResolution = [availableVideoResolutions objectAtIndex:selectedIndex];
        DebugLog(@"availableVideoResolutions:%@, videoResolution:%@", availableVideoResolutions, videoResolution);
        NSArray<AVCaptureDeviceFormat *> *formats =
          [SDCustomRTCCameraCapturer supportedFormatsForDevice:device];
        int targetWidth = [self videoResolutionComponentAtIndex:0 inString:videoResolution];
        int targetHeight = [self videoResolutionComponentAtIndex:1 inString:videoResolution];
        AVCaptureDeviceFormat *selectedFormat = nil;
        int currentDiff = INT_MAX;

        for (AVCaptureDeviceFormat *format in formats) {
            CMVideoDimensions dimension = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
            FourCharCode pixelFormat = CMFormatDescriptionGetMediaSubType(format.formatDescription);
            int diff = abs(targetWidth - dimension.width) + abs(targetHeight - dimension.height);
            if (diff < currentDiff) {
              selectedFormat = format;
              currentDiff = diff;
            } else if (diff == currentDiff && pixelFormat == [cameraVideoCapturer preferredOutputPixelFormat]) {
              selectedFormat = format;
            }
        }

        return selectedFormat;
    }
    
    return nil;
}

- (NSInteger)selectFpsForFormat:(AVCaptureDeviceFormat *)format {
    Float64 maxSupportedFramerate = 0;
    for (AVFrameRateRange *fpsRange in format.videoSupportedFrameRateRanges) {
        maxSupportedFramerate = fmax(maxSupportedFramerate, fpsRange.maxFrameRate);
    }
//    return fmin(maxSupportedFramerate, kFramerateLimit);
    return 20;
}

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer {
    if (!self.isPublish) {
        return;
    }
    
    if (!renderer) {
        return;
    }
    
    if (!self.videoCapturer) {
        return;
    }
    
    [self setDegradationPreference:RTCDegradationPreferenceMaintainResolution];
    
    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {
        AVCaptureDevice *camera = [self findDeviceForPosition:self.usingFrontCamera?AVCaptureDevicePositionFront:AVCaptureDevicePositionBack];
        
        SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;
        [cameraVideoCapturer setISOValue:0.0];
        [cameraVideoCapturer rotateCamera:self.usingFrontCamera];
        self.videoCapturer.delegate = self;

        AVCaptureDeviceFormat *formatNilable = [self selectFormatForDevice:camera];
        
        if (!formatNilable) {
            return;
        }
        
        DebugLog(@"formatNilable:%@", formatNilable);
        
        NSInteger fps = [self selectFpsForFormat:formatNilable];
        
        CMVideoDimensions videoVideoDimensions = CMVideoFormatDescriptionGetDimensions(formatNilable.formatDescription);
        float width = videoVideoDimensions.width;
        float height = videoVideoDimensions.height;
                
        DebugLog(@"videoVideoDimensions width:%f,height:%f", width, height);
        [cameraVideoCapturer startRunning];
//        [cameraVideoCapturer startCaptureWithDevice:camera format:formatNilable fps:fps completionHandler:^(NSError *error) {
//            DebugLog(@"startCaptureWithDevice error:%@", error);
//        }];
        [self changeResolution:width height:height fps:(int)fps];
    }
    
    if (@available(iOS 10, *)) {
        if ([capturer isKindOfClass:[RTCFileVideoCapturer class]]) {
            RTCFileVideoCapturer *fileVideoCapturer = (RTCFileVideoCapturer *)capturer;
            [fileVideoCapturer startCapturingFromFileNamed:@"beautyPicture.mp4" onError:^(NSError * _Nonnull error) {
                DebugLog(@"startCaptureLocalVideo startCapturingFromFileNamed error:%@", error);
            }];
        }
    } else {
        // Fallback on earlier versions
    }
    
    [self.localVideoTrack addRenderer:renderer];
}

- (void)renderRemoteVideo:(id<RTCVideoRenderer>)renderer {
    if (!self.isPublish) {
        return;
    }
    
    self.remoteRenderView = renderer;
}

- (RTCDataChannel *)createDataChannel {
    RTCDataChannelConfiguration *config = [[RTCDataChannelConfiguration alloc] init];
    RTCDataChannel *dataChannel = [self.peerConnection dataChannelForLabel:@"WebRTCData" configuration:config];
    if (!dataChannel) {
        return nil;
    }
    
    dataChannel.delegate = self;
    self.localDataChannel = dataChannel;
    return dataChannel;
}

- (void)sendData:(NSData *)data {
    RTCDataBuffer *buffer = [[RTCDataBuffer alloc] initWithData:data isBinary:YES];
    [self.remoteDataChannel sendData:buffer];
}

#pragma mark - switchCamera
- (void)switchCamera:(id<RTCVideoRenderer>)renderer {
    self.usingFrontCamera = !self.usingFrontCamera;
    [self startCaptureLocalVideo:renderer];
}

#pragma mark - Change resolution
- (BOOL)changeResolution:(int)width height:(int)height fps:(int)fps {
    if (!self.localVideoSource) {
        RTCVideoSource *videoSource = [self.factory videoSource];
        [videoSource adaptOutputFormatToWidth:width height:height fps:fps];
    } else {
        [self.localVideoSource adaptOutputFormatToWidth:width height:height fps:fps];
    }
    return YES;
}

#pragma mark - Hide or show Video
- (void)hidenVideo {
    [self setVideoEnabled:NO];
    self.localVideoTrack.isEnabled = NO;
}

- (void)showVideo {
    [self setVideoEnabled:YES];
    self.localVideoTrack.isEnabled = YES;
}

- (void)setVideoEnabled:(BOOL)isEnabled {
    [self setTrackEnabled:[RTCVideoTrack class] isEnabled:isEnabled];
}

- (void)setTrackEnabled:(Class)track isEnabled:(BOOL)isEnabled {
    for (RTCRtpTransceiver *transceiver in self.peerConnection.transceivers) {
        // Check the sender's track class (the transceiver itself is never an RTCMediaStreamTrack).
        if (transceiver.sender.track && [transceiver.sender.track isKindOfClass:track]) {
            transceiver.sender.track.isEnabled = isEnabled;
        }
    }
}


#pragma mark - Hide or show Audio
- (void)muteAudio {
    [self setAudioEnabled:NO];
    self.localAudioTrack.isEnabled = NO;
}

- (void)unmuteAudio {
    [self setAudioEnabled:YES];
    self.localAudioTrack.isEnabled = YES;
}

- (void)speakOff {
    __weak typeof(self) weakSelf = self;
    dispatch_async(self.audioQueue, ^{
        [weakSelf.rtcAudioSession lockForConfiguration];
        @try {
            
            NSError *error;
            [self.rtcAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];
            
            NSError *ooapError;
            [self.rtcAudioSession overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&ooapError];
            DebugLog(@"speakOff error:%@, ooapError:%@", error, ooapError);
        } @catch (NSException *exception) {
            DebugLog(@"speakOff exception:%@", exception);
        }
        [weakSelf.rtcAudioSession unlockForConfiguration];
    });
}

- (void)speakOn {
    __weak typeof(self) weakSelf = self;
    dispatch_async(self.audioQueue, ^{
        [weakSelf.rtcAudioSession lockForConfiguration];
        @try {
            
            NSError *error;
            [self.rtcAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];
            
            NSError *ooapError;
            [self.rtcAudioSession overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&ooapError];
            
            NSError *activeError;
            [self.rtcAudioSession setActive:YES error:&activeError];
            
            DebugLog(@"speakOn error:%@, ooapError:%@, activeError:%@", error, ooapError, activeError);
        } @catch (NSException *exception) {
            DebugLog(@"speakOn exception:%@", exception);
        }
        [weakSelf.rtcAudioSession unlockForConfiguration];
    });
}

- (void)setAudioEnabled:(BOOL)isEnabled {
    [self setTrackEnabled:[RTCAudioTrack class] isEnabled:isEnabled];
}

- (void)close {
    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {
        SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;
        [cameraVideoCapturer stopRunning];
    }
    [self.peerConnection close];
    self.peerConnection = nil;
    self.videoCapturer = nil;
}

#pragma mark - RTCPeerConnectionDelegate
/** Called when the SignalingState changed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
didChangeSignalingState:(RTCSignalingState)stateChanged {
    DebugLog(@"peerConnection didChangeSignalingState:%ld", (long)stateChanged);
}

/** Called when media is received on a new stream from remote peer. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didAddStream:(RTCMediaStream *)stream {
    DebugLog(@"peerConnection didAddStream");
    if (self.isPublish) {
        return;
    }
    
    NSArray *videoTracks = stream.videoTracks;
    if (videoTracks && videoTracks.count > 0) {
        RTCVideoTrack *track = videoTracks.firstObject;
        self.remoteVideoTrack = track;
    }
    
    if (self.remoteVideoTrack && self.remoteRenderView) {
        id<RTCVideoRenderer> remoteRenderView = self.remoteRenderView;
        RTCVideoTrack *remoteVideoTrack = self.remoteVideoTrack;
        [remoteVideoTrack addRenderer:remoteRenderView];
    }
    
    /**
     if let audioTrack = stream.audioTracks.first{
     print("audio track faund")
     audioTrack.source.volume = 8
     }
     */
}

/** Called when a remote peer closes a stream.
 *  This is not called when RTCSdpSemanticsUnifiedPlan is specified.
 */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didRemoveStream:(RTCMediaStream *)stream {
    DebugLog(@"peerConnection didRemoveStream");
}

/** Called when negotiation is needed, for example ICE has restarted. */
- (void)peerConnectionShouldNegotiate:(RTCPeerConnection *)peerConnection {
    DebugLog(@"peerConnection peerConnectionShouldNegotiate");
}

/** Called any time the IceConnectionState changes. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didChangeIceConnectionState:(RTCIceConnectionState)newState {
    DebugLog(@"peerConnection didChangeIceConnectionState:%ld", newState);
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didChangeConnectionState:)]) {
        [self.delegate webRTCClient:self didChangeConnectionState:newState];
    }
    
    if (RTCIceConnectionStateConnected == newState || RTCIceConnectionStateChecking == newState) {
        [self setDegradationPreference:RTCDegradationPreferenceMaintainResolution];
    }
}

/** Called any time the IceGatheringState changes. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didChangeIceGatheringState:(RTCIceGatheringState)newState {
    DebugLog(@"peerConnection didChangeIceGatheringState:%ld", newState);
}

/** New ice candidate has been found. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didGenerateIceCandidate:(RTCIceCandidate *)candidate {
    DebugLog(@"peerConnection didGenerateIceCandidate:%@", candidate);
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didDiscoverLocalCandidate:)]) {
        [self.delegate webRTCClient:self didDiscoverLocalCandidate:candidate];
    }
}

/** Called when a group of local Ice candidates have been removed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didRemoveIceCandidates:(NSArray<RTCIceCandidate *> *)candidates {
    DebugLog(@"peerConnection didRemoveIceCandidates:%@", candidates);
}

/** New data channel has been opened. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didOpenDataChannel:(RTCDataChannel *)dataChannel {
    DebugLog(@"peerConnection didOpenDataChannel:%@", dataChannel);
    self.remoteDataChannel = dataChannel;
}

/** Called when signaling indicates a transceiver will be receiving media from
 *  the remote endpoint.
 *  This is only called with RTCSdpSemanticsUnifiedPlan specified.
 */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didStartReceivingOnTransceiver:(RTCRtpTransceiver *)transceiver {
    DebugLog(@"peerConnection didStartReceivingOnTransceiver:%@", transceiver);
}

/** Called when a receiver and its track are created. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
        didAddReceiver:(RTCRtpReceiver *)rtpReceiver
               streams:(NSArray<RTCMediaStream *> *)mediaStreams {
    DebugLog(@"peerConnection didAddReceiver");
}

/** Called when the receiver and its track are removed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
     didRemoveReceiver:(RTCRtpReceiver *)rtpReceiver {
    DebugLog(@"peerConnection didRemoveReceiver");
}

#pragma mark - RTCDataChannelDelegate
/** The data channel state changed. */
- (void)dataChannelDidChangeState:(RTCDataChannel *)dataChannel {
    DebugLog(@"dataChannelDidChangeState:%@", dataChannel);
}

/** The data channel successfully received a data buffer. */
- (void)dataChannel:(RTCDataChannel *)dataChannel
didReceiveMessageWithBuffer:(RTCDataBuffer *)buffer {
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didReceiveData:)]) {
        [self.delegate webRTCClient:self didReceiveData:buffer.data];
    }
}

#pragma mark - RTCVideoCapturerDelegate
- (void)capturer:(RTCVideoCapturer *)capturer didCaptureVideoFrame:(RTCVideoFrame *)frame {
//    DebugLog(@"capturer:%@ didCaptureVideoFrame:%@", capturer, frame);
//    RTCVideoFrame *aFilterVideoFrame;
//    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didCaptureVideoFrame:)]) {
//        aFilterVideoFrame = [self.delegate webRTCClient:self didCaptureVideoFrame:frame];
//    }
    
    //  When operating on the C-level buffer it must be released manually, otherwise memory grows rapidly
//      CVPixelBufferRelease(_buffer)
    //    Get the pixelBuffer from the frame
//        ((RTCCVPixelBuffer*)frame.buffer).pixelBuffer
    
//    if (!aFilterVideoFrame) {
//        aFilterVideoFrame = frame;
//    }
//
//    [self.localVideoSource capturer:capturer didCaptureVideoFrame:frame];
}

- (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {

    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (!pixelBuffer) {
        return;
    }

    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
    int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
    // Assumption: portrait capture; adjust the rotation to match the actual device orientation.
    RTCVideoFrame *rtcVideoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                                rotation:RTCVideoRotation_90
                                                             timeStampNs:timeStampNs];

    [self.localVideoSource capturer:capturer didCaptureVideoFrame:rtcVideoFrame];
}

#pragma mark - Lazy
- (RTCPeerConnectionFactory *)factory {
    if (!_factory) {
        RTCInitializeSSL();
        RTCDefaultVideoEncoderFactory *videoEncoderFactory = [[RTCDefaultVideoEncoderFactory alloc] init];
        
        RTCDefaultVideoDecoderFactory *videoDecoderFactory = [[RTCDefaultVideoDecoderFactory alloc] init];
        _factory = [[RTCPeerConnectionFactory alloc] initWithEncoderFactory:videoEncoderFactory decoderFactory:videoDecoderFactory];
    }
    return _factory;
}

- (dispatch_queue_t)audioQueue {
    if (!_audioQueue) {
        _audioQueue = dispatch_queue_create("cn.dface.webrtc", NULL);
    }
    return _audioQueue;
}

- (RTCAudioSession *)rtcAudioSession {
    if (!_rtcAudioSession) {
        _rtcAudioSession = [RTCAudioSession sharedInstance];
    }
    
    return _rtcAudioSession;
}

- (NSDictionary *)mediaConstrains {
    if (!_mediaConstrains) {
        _mediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:
                            kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveAudio,
                            kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveVideo,
                            kRTCMediaConstraintsValueTrue, @"IceRestart",
                            nil];
    }
    return _mediaConstrains;
}

- (NSDictionary *)publishMediaConstrains {
    if (!_publishMediaConstrains) {
        _publishMediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:
                                   kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveAudio,
                                   kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveVideo,
                                   kRTCMediaConstraintsValueTrue, @"IceRestart",
                                   kRTCMediaConstraintsValueTrue, @"googCpuOveruseDetection",
                                   kRTCMediaConstraintsValueFalse, @"googBandwidthLimitedResolution",
//                                   @"2160", kRTCMediaConstraintsMinWidth,
//                                   @"3840", kRTCMediaConstraintsMinHeight,
//                                   @"0.25", kRTCMediaConstraintsMinAspectRatio,
//                                   @"1", kRTCMediaConstraintsMaxAspectRatio,
                                   nil];
    }
    return _publishMediaConstrains;
}

- (NSDictionary *)playMediaConstrains {
    if (!_playMediaConstrains) {
        _playMediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:
                                kRTCMediaConstraintsValueTrue, kRTCMediaConstraintsOfferToReceiveAudio,
                                kRTCMediaConstraintsValueTrue, kRTCMediaConstraintsOfferToReceiveVideo,
                                kRTCMediaConstraintsValueTrue, @"IceRestart",
                                kRTCMediaConstraintsValueTrue, @"googCpuOveruseDetection",
                                kRTCMediaConstraintsValueFalse, @"googBandwidthLimitedResolution",
//                                @"2160", kRTCMediaConstraintsMinWidth,
//                                @"3840", kRTCMediaConstraintsMinHeight,
//                                @"0.25", kRTCMediaConstraintsMinAspectRatio,
//                                @"1", kRTCMediaConstraintsMaxAspectRatio,
                                nil];
    }
    return _playMediaConstrains;
}

- (NSDictionary *)optionalConstraints {
    if (!_optionalConstraints) {
        _optionalConstraints = [[NSDictionary alloc] initWithObjectsAndKeys:
                                kRTCMediaConstraintsValueTrue, @"DtlsSrtpKeyAgreement",
                                kRTCMediaConstraintsValueTrue, @"googCpuOveruseDetection",
                                nil];
    }
    return _optionalConstraints;
}

- (void)dealloc {
    [self.peerConnection close];
    self.peerConnection = nil;
}

@end
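With the class in place, a brief publisher-side usage sketch looks roughly like the following. The view controller context, the localRenderView name, and the signaling step are assumptions for illustration, not part of the original project:

// Sketch: create a publishing client, show the local preview, and start the SDP exchange.
WebRTCClient *webRTCClient = [[WebRTCClient alloc] initWithPublish:YES];
webRTCClient.delegate = self;

RTCMTLVideoView *localRenderView = [[RTCMTLVideoView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:localRenderView];

[webRTCClient startCaptureLocalVideo:localRenderView];

[webRTCClient offer:^(RTCSessionDescription *sdp) {
    // Send the offer SDP to the ossrs signaling API, then pass the answer back
    // via setRemoteSdp:completion:.
}];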

At this point, the custom RTCVideoCapturer camera for WebRTC video is complete.

Other
Setting up the ossrs service: https://blog.csdn.net/gloryFlow/article/details/132257196
Implementing iOS audio/video calls against ossrs: https://blog.csdn.net/gloryFlow/article/details/132262724
WebRTC call showing no picture at high resolutions: https://blog.csdn.net/gloryFlow/article/details/132240952
Modifying the bitrate in the SDP: https://blog.csdn.net/gloryFlow/article/details/132263021
GPUImage beauty filters for video calls: https://blog.csdn.net/gloryFlow/article/details/132265842
RTC streaming of a local or album video: https://blog.csdn.net/gloryFlow/article/details/132267068

四、Summary

WebRTC audio/video calling with a custom RTCVideoCapturer camera: the core idea is to obtain the captured CVPixelBufferRef from the camera, turn the processed CVPixelBufferRef into an RTCVideoFrame, and push it to WebRTC by calling didCaptureVideoFrame on localVideoSource. There is a lot of material here, so please excuse any inaccuracies in the description.
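If captured frames need processing (for example a beauty filter) before being pushed, the webRTCClient:didCaptureSampleBuffer: method declared in WebRTCClientDelegate can be wired into the same callback. A hedged sketch, not the final implementation from this article:

// Sketch: let the delegate return a filtered RTCVideoFrame; fall back to the raw pixel buffer.
- (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    RTCVideoFrame *frame = nil;
    if ([self.delegate respondsToSelector:@selector(webRTCClient:didCaptureSampleBuffer:)]) {
        frame = [self.delegate webRTCClient:self didCaptureSampleBuffer:sampleBuffer];
    }
    if (!frame) {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        if (!pixelBuffer) {
            return;
        }
        RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
        int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
        // Assumption: portrait capture; adjust rotation to the actual device orientation.
        frame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                             rotation:RTCVideoRotation_90
                                          timeStampNs:timeStampNs];
    }
    [self.localVideoSource capturer:capturer didCaptureVideoFrame:frame];
}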

https://blog.csdn.net/gloryFlow/article/details/132308673

A learning note; keep improving every day.
