WebRTC Audio/Video Calls: Using an iOS Client with the ossrs Video Call Service

For setting up the ossrs server itself, see: https://blog.csdn.net/gloryFlow/article/details/132257196
This article uses GoogleWebRTC on the iOS side to connect to ossrs and implement video calling.

1. Demo Screenshots

iOS client:
[screenshot]

ossrs console:
[screenshot]

2. What is WebRTC?

WebRTC (Web Real-Time Communications) is a real-time communication technology that lets web applications and sites establish peer-to-peer connections between browsers, without an intermediary, to transfer video, audio, or arbitrary data.

See https://zhuanlan.zhihu.com/p/421503695

Key terms to know:

  • NAT
    Network Address Translation
  • STUN
    Session Traversal Utilities for NAT
  • TURN
    Traversal Using Relays around NAT
  • ICE
    Interactive Connectivity Establishment
  • SDP
    Session Description Protocol
  • WebRTC
    Web Real-Time Communications

The WebRTC offer exchange flow:
[flow diagram]

3. Implementing the iOS Client That Calls ossrs

Create a project for the iOS client. For a pure P2P audio/video call you would have to run your own signaling server plus STUN/TURN servers for NAT traversal and relaying; ossrs already includes STUN/TURN traversal and relay functionality, so here the iOS client talks to the ossrs service directly.

3.1 Permission Setup

Calling ossrs for video requires camera and microphone permissions.

Add to Info.plist:

<key>NSCameraUsageDescription</key>
<string>The app needs access to your camera</string>
<key>NSMicrophoneUsageDescription</key>
<string>The app needs access to your microphone</string>
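Besides the Info.plist entries, it can help to request camera and microphone access up front so capture does not silently fail. A minimal sketch using AVFoundation (call this before starting capture):

#import <AVFoundation/AVFoundation.h>

// Request camera and microphone access; if either is denied,
// the capturer will produce no frames/audio.
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
    NSLog(@"camera access granted: %d", granted);
}];
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
    NSLog(@"microphone access granted: %d", granted);
}];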

3.2 Adding GoogleWebRTC to the Project

The project depends on the GoogleWebRTC library; add it in the Podfile. Note that the GoogleWebRTC API differs somewhat between versions.

target 'WebRTCApp' do
    
pod 'GoogleWebRTC'
pod 'ReactiveObjC'
pod 'SocketRocket'
pod 'HGAlertViewController', '~> 1.0.1'

end

Then run pod install.
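Because the API changes between GoogleWebRTC releases, it may be worth pinning the pod to a version you have actually tested (the version shown is only an example):

pod 'GoogleWebRTC', '~> 1.1'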

3.3 Key GoogleWebRTC APIs

Before using GoogleWebRTC, here are the main classes involved:

  • RTCPeerConnection

RTCPeerConnection is WebRTC's object for building a peer-to-peer connection.

  • RTCPeerConnectionFactory

RTCPeerConnectionFactory is the factory that creates RTCPeerConnection instances.

  • RTCVideoCapturer

RTCVideoCapturer is the camera capturer that produces the video and audio. It can be swapped out for a custom capturer, which makes it convenient to get at the CMSampleBufferRef for beauty filters, virtual avatars, and similar processing.

  • RTCVideoTrack

RTCVideoTrack is the video track.

  • RTCAudioTrack

RTCAudioTrack is the audio track.

  • RTCDataChannel

RTCDataChannel establishes a high-throughput, low-latency channel for transferring arbitrary data.

  • RTCMediaStream

RTCMediaStream is a synchronized media stream (camera video plus microphone audio).

  • SDP

SDP is the Session Description Protocol.
An SDP consists of one or more lines of UTF-8 text, each starting with a one-character type, followed by an equals sign (=) and structured text whose format depends on the type. For example, v= gives the protocol version, o= the originator, m= a media description, and a= attributes such as the RTP payload mapping. A sample SDP:

v=0
o=alice 2890844526 2890844526 IN IP4
s=
c=IN IP4
t=0 0
m=audio 49170 RTP/AVP 0
a=rtpmap:0 PCMU/8000
m=video 51372 RTP/AVP 31
a=rtpmap:31 H261/90000
m=video 53000 RTP/AVP 32
a=rtpmap:32 MPV/90000

These are the main WebRTC API classes we will use.

3.4 Implementation with WebRTC

The P2P audio/video flow with WebRTC:
[flow diagram]

The steps for calling ossrs:
[flow diagram]

Key setup points:

Initialize the RTCPeerConnectionFactory:

#pragma mark - Lazy
- (RTCPeerConnectionFactory *)factory {
    if (!_factory) {
        RTCInitializeSSL();
        RTCDefaultVideoEncoderFactory *videoEncoderFactory = [[RTCDefaultVideoEncoderFactory alloc] init];
        
        RTCDefaultVideoDecoderFactory *videoDecoderFactory = [[RTCDefaultVideoDecoderFactory alloc] init];
        
        _factory = [[RTCPeerConnectionFactory alloc] initWithEncoderFactory:videoEncoderFactory decoderFactory:videoDecoderFactory];
    }
    return _factory;
}

Create the RTCPeerConnection from the RTCPeerConnectionFactory:

self.peerConnection = [self.factory peerConnectionWithConfiguration:newConfig constraints:constraints delegate:nil];
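For reference, the newConfig and constraints used here are built the same way as in the full WebRTCClient implementation below: unified-plan SDP semantics plus the DTLS-SRTP option.

RTCConfiguration *newConfig = [[RTCConfiguration alloc] init];
newConfig.sdpSemantics = RTCSdpSemanticsUnifiedPlan;

RTCMediaConstraints *constraints = [[RTCMediaConstraints alloc]
    initWithMandatoryConstraints:nil
             optionalConstraints:@{@"DtlsSrtpKeyAgreement": kRTCMediaConstraintsValueTrue}];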

Add the RTCAudioTrack and RTCVideoTrack to the peerConnection. Because publishing to SRS is one-way, the transceivers are created with direction SendOnly:

NSString *streamId = @"stream";

// Audio
RTCAudioTrack *audioTrack = [self createAudioTrack];
self.localAudioTrack = audioTrack;

RTCRtpTransceiverInit *audioTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
audioTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
audioTrackTransceiver.streamIds = @[streamId];

[self.peerConnection addTransceiverWithTrack:audioTrack init:audioTrackTransceiver];

// Video
RTCVideoTrack *videoTrack = [self createVideoTrack];
self.localVideoTrack = videoTrack;
RTCRtpTransceiverInit *videoTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
videoTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
videoTrackTransceiver.streamIds = @[streamId];
[self.peerConnection addTransceiverWithTrack:videoTrack init:videoTrackTransceiver];

Set up the camera capturer (RTCCameraVideoCapturer) or, on the simulator, the file-based capturer (RTCFileVideoCapturer):

- (RTCVideoTrack *)createVideoTrack {
    RTCVideoSource *videoSource = [self.factory videoSource];
    
    // In testing, resolutions larger than 1920x1080 could not be played back through SRS
    [videoSource adaptOutputFormatToWidth:1920 height:1080 fps:20];

    // Use the file capturer on the simulator (no camera available)
    if (TARGET_IPHONE_SIMULATOR) {
        self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:videoSource];
    } else{
        self.videoCapturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:videoSource];
    }
    
    RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];
    
    return videoTrack;
}

Display the locally captured camera feed;
the renderer passed to startCaptureLocalVideo is an RTCEAGLVideoView:

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer {
    if (!self.isPublish) {
        return;
    }
    
    if (!renderer) {
        return;
    }
    
    if (!self.videoCapturer) {
        return;
    }
    
    
    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[RTCCameraVideoCapturer class]]) {
        if (!([RTCCameraVideoCapturer captureDevices].count > 0)) {
            return;
        }
        AVCaptureDevice *frontCamera = RTCCameraVideoCapturer.captureDevices.firstObject;
//        if (frontCamera.position != AVCaptureDevicePositionFront) {
//            return;
//        }
        
        RTCCameraVideoCapturer *cameraVideoCapturer = (RTCCameraVideoCapturer *)capturer;
        
        AVCaptureDeviceFormat *formatNilable;
        
        NSArray *supportDeviceFormats = [RTCCameraVideoCapturer supportedFormatsForDevice:frontCamera];
        NSLog(@"supportDeviceFormats:%@",supportDeviceFormats);
        // NOTE: hardcoded format index; see the format-selection sketch after this method for a safer approach
        formatNilable = supportDeviceFormats[4];
//        if (supportDeviceFormats && supportDeviceFormats.count > 0) {
//            NSMutableArray *formats = [NSMutableArray arrayWithCapacity:0];
//            for (AVCaptureDeviceFormat *format in supportDeviceFormats) {
//                CMVideoDimensions videoVideoDimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
//                float width = videoVideoDimensions.width;
//                float height = videoVideoDimensions.height;
//                // only use 16:9 format.
//                if ((width / height) >= (16.0/9.0)) {
//                    [formats addObject:format];
//                }
//            }
//
//            if (formats.count > 0) {
//                NSArray *sortedFormats = [formats sortedArrayUsingComparator:^NSComparisonResult(AVCaptureDeviceFormat *obj1, AVCaptureDeviceFormat *obj2) {
//                    CMVideoDimensions f1VD = CMVideoFormatDescriptionGetDimensions(obj1.formatDescription);
//                    CMVideoDimensions f2VD = CMVideoFormatDescriptionGetDimensions(obj2.formatDescription);
//                    float width1 = f1VD.width;
//                    float width2 = f2VD.width;
//                    float height2 = f2VD.height;
//                    // only use 16:9 format.
//                    if ((width2 / height2) >= (1.7)) {
//                        return NSOrderedAscending;
//                    } else {
//                        return NSOrderedDescending;
//                    }
//                }];
//
//                if (sortedFormats && sortedFormats.count > 0) {
//                    formatNilable = sortedFormats.lastObject;
//                }
//            }
//        }
        
        if (!formatNilable) {
            return;
        }
        
        NSArray *formatArr = [RTCCameraVideoCapturer supportedFormatsForDevice:frontCamera];
        for (AVCaptureDeviceFormat *format in formatArr) {
            NSLog(@"AVCaptureDeviceFormat format:%@", format);
        }
        
        [cameraVideoCapturer startCaptureWithDevice:frontCamera format:formatNilable fps:20 completionHandler:^(NSError *error) {
            NSLog(@"startCaptureWithDevice error:%@", error);
        }];
    }
    
    if ([capturer isKindOfClass:[RTCFileVideoCapturer class]]) {
        RTCFileVideoCapturer *fileVideoCapturer = (RTCFileVideoCapturer *)capturer;
        [fileVideoCapturer startCapturingFromFileNamed:@"beautyPicture.mp4" onError:^(NSError * _Nonnull error) {
            NSLog(@"startCaptureLocalVideo startCapturingFromFileNamed error:%@", error);
        }];
    }
    
    [self.localVideoTrack addRenderer:renderer];
}
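Note that the capture format above comes from a hardcoded index (supportDeviceFormats[4]), which may not exist on every device. A more robust sketch (a hypothetical helper, not part of the original code) picks the supported format closest to a target resolution:

// Hypothetical helper: choose the capture format closest to targetWidth x targetHeight.
- (AVCaptureDeviceFormat *)selectFormatForDevice:(AVCaptureDevice *)device
                                     targetWidth:(int)targetWidth
                                    targetHeight:(int)targetHeight {
    AVCaptureDeviceFormat *selected = nil;
    int smallestDiff = INT_MAX;
    for (AVCaptureDeviceFormat *format in [RTCCameraVideoCapturer supportedFormatsForDevice:device]) {
        CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
        int diff = abs(dims.width - targetWidth) + abs(dims.height - targetHeight);
        if (diff < smallestDiff) {
            smallestDiff = diff;
            selected = format;
        }
    }
    return selected; // nil only if the device reports no formats
}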

Create the offer (createOffer):

- (void)offer:(void (^)(RTCSessionDescription *sdp))completion {
    if (self.isPublish) {
        self.mediaConstrains = self.publishMediaConstrains;
    } else {
        self.mediaConstrains = self.playMediaConstrains;
    }
    
    RTCMediaConstraints *constrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.mediaConstrains optionalConstraints:self.optionalConstraints];
    NSLog(@"peerConnection:%@",self.peerConnection);
    __weak typeof(self) weakSelf = self;
    [weakSelf.peerConnection offerForConstraints:constrains completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {
        if (error) {
            NSLog(@"offer offerForConstraints error:%@", error);
        }
        
        if (sdp) {
            [weakSelf.peerConnection setLocalDescription:sdp completionHandler:^(NSError * _Nullable error) {
                if (error) {
                    NSLog(@"offer setLocalDescription error:%@", error);
                }
                if (completion) {
                    completion(sdp);
                }
            }];
        }
    }];
}

Set the remote description (setRemoteDescription):

- (void)setRemoteSdp:(RTCSessionDescription *)remoteSdp completion:(void (^)(NSError * _Nullable error))completion {
    [self.peerConnection setRemoteDescription:remoteSdp completionHandler:completion];
}

The complete code:

WebRTCClient.h

#import <Foundation/Foundation.h>
#import <WebRTC/WebRTC.h>
#import <UIKit/UIKit.h>

@protocol WebRTCClientDelegate;
@interface WebRTCClient : NSObject

@property (nonatomic, weak) id<WebRTCClientDelegate> delegate;

/**
 The RTCPeerConnectionFactory
 */
@property (nonatomic, strong) RTCPeerConnectionFactory *factory;

/**
 Whether this client publishes (push)
 */
@property (nonatomic, assign) BOOL isPublish;

/**
 The peer connection
 */
@property (nonatomic, strong) RTCPeerConnection *peerConnection;

/**
 RTCAudioSession
 */
@property (nonatomic, strong) RTCAudioSession *rtcAudioSession;

/**
 DispatchQueue
 */
@property (nonatomic) dispatch_queue_t audioQueue;

/**
 mediaConstrains
 */
@property (nonatomic, strong) NSDictionary *mediaConstrains;

/**
 publishMediaConstrains
 */
@property (nonatomic, strong) NSDictionary *publishMediaConstrains;

/**
 playMediaConstrains
 */
@property (nonatomic, strong) NSDictionary *playMediaConstrains;

/**
 optionalConstraints
 */
@property (nonatomic, strong) NSDictionary *optionalConstraints;

/**
 RTCVideoCapturer camera capturer
 */
@property (nonatomic, strong) RTCVideoCapturer *videoCapturer;

/**
 Local audio track (localAudioTrack)
 */
@property (nonatomic, strong) RTCAudioTrack *localAudioTrack;

/**
 localVideoTrack
 */
@property (nonatomic, strong) RTCVideoTrack *localVideoTrack;

/**
 remoteVideoTrack
 */
@property (nonatomic, strong) RTCVideoTrack *remoteVideoTrack;

/**
 RTCVideoRenderer
 */
@property (nonatomic, weak) id<RTCVideoRenderer> remoteRenderView;

/**
 localDataChannel
 */
@property (nonatomic, strong) RTCDataChannel *localDataChannel;

/**
 remoteDataChannel
 */
@property (nonatomic, strong) RTCDataChannel *remoteDataChannel;


- (instancetype)initWithPublish:(BOOL)isPublish;

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer;

- (void)answer:(void (^)(RTCSessionDescription *sdp))completionHandler;

- (void)offer:(void (^)(RTCSessionDescription *sdp))completionHandler;

#pragma mark - Hide or show Video
- (void)hidenVideo;

- (void)showVideo;

#pragma mark - Hide or show Audio
- (void)muteAudio;

- (void)unmuteAudio;

- (void)speakOff;

- (void)speakOn;

- (void)changeSDP2Server:(RTCSessionDescription *)sdp
                  urlStr:(NSString *)urlStr
               streamUrl:(NSString *)streamUrl
                 closure:(void (^)(BOOL isServerRetSuc))closure;

@end


@protocol WebRTCClientDelegate <NSObject>

- (void)webRTCClient:(WebRTCClient *)client didDiscoverLocalCandidate:(RTCIceCandidate *)candidate;
- (void)webRTCClient:(WebRTCClient *)client didChangeConnectionState:(RTCIceConnectionState)state;
- (void)webRTCClient:(WebRTCClient *)client didReceiveData:(NSData *)data;

@end

WebRTCClient.m

#import "WebRTCClient.h"
#import "HttpClient.h"

@interface WebRTCClient ()<RTCPeerConnectionDelegate, RTCDataChannelDelegate>

@property (nonatomic, strong) HttpClient *httpClient;

@end

@implementation WebRTCClient

- (instancetype)initWithPublish:(BOOL)isPublish {
    self = [super init];
    if (self) {
        self.isPublish = isPublish;
        
        self.httpClient = [[HttpClient alloc] init];
                        
        RTCMediaConstraints *constraints = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:nil optionalConstraints:self.optionalConstraints];
        
        RTCConfiguration *newConfig = [[RTCConfiguration alloc] init];
        newConfig.sdpSemantics = RTCSdpSemanticsUnifiedPlan;
        
        self.peerConnection = [self.factory peerConnectionWithConfiguration:newConfig constraints:constraints delegate:nil];
        
        [self createMediaSenders];
        
        [self createMediaReceivers];
        
        // SRS does not support data channels.
        // [self createDataChannel];
        [self configureAudioSession];
        
        self.peerConnection.delegate = self;
    }
    return self;
}


- (void)createMediaSenders {
    if (!self.isPublish) {
        return;
    }
    
    NSString *streamId = @"stream";
    
    // Audio
    RTCAudioTrack *audioTrack = [self createAudioTrack];
    self.localAudioTrack = audioTrack;
    
    RTCRtpTransceiverInit *audioTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
    audioTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
    audioTrackTransceiver.streamIds = @[streamId];
    
    [self.peerConnection addTransceiverWithTrack:audioTrack init:audioTrackTransceiver];
    
    // Video
    RTCVideoTrack *videoTrack = [self createVideoTrack];
    self.localVideoTrack = videoTrack;
    RTCRtpTransceiverInit *videoTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
    videoTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
    videoTrackTransceiver.streamIds = @[streamId];
    [self.peerConnection addTransceiverWithTrack:videoTrack init:videoTrackTransceiver];
}

- (void)createMediaReceivers {
    if (!self.isPublish) {
        return;
    }
    
    if (self.peerConnection.transceivers.count > 0) {
        RTCRtpTransceiver *transceiver = self.peerConnection.transceivers.firstObject;
        if (transceiver.mediaType == RTCRtpMediaTypeVideo) {
            RTCVideoTrack *track = (RTCVideoTrack *)transceiver.receiver.track;
            self.remoteVideoTrack = track;
        }
    }
}

- (void)configureAudioSession {
    [self.rtcAudioSession lockForConfiguration];
    @try {
        NSError *error;
        [self.rtcAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];
        
        NSError *modeError;
        [self.rtcAudioSession setMode:AVAudioSessionModeVoiceChat error:&modeError];
        NSLog(@"configureAudioSession error:%@, modeError:%@", error, modeError);
    } @catch (NSException *exception) {
        NSLog(@"configureAudioSession exception:%@", exception);
    }
    
    [self.rtcAudioSession unlockForConfiguration];
}

- (RTCAudioTrack *)createAudioTrack {
    /// Enable Google's 3A algorithms (echo cancellation, auto gain control, noise suppression).
    NSDictionary *mandatory = @{
        @"googEchoCancellation": kRTCMediaConstraintsValueTrue,
        @"googAutoGainControl": kRTCMediaConstraintsValueTrue,
        @"googNoiseSuppression": kRTCMediaConstraintsValueTrue,
    };
    
    RTCMediaConstraints *audioConstrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:mandatory optionalConstraints:self.optionalConstraints];
    RTCAudioSource *audioSource = [self.factory audioSourceWithConstraints:audioConstrains];
    
    RTCAudioTrack *audioTrack = [self.factory audioTrackWithSource:audioSource trackId:@"audio0"];
    
    return audioTrack;
}

- (RTCVideoTrack *)createVideoTrack {
    RTCVideoSource *videoSource = [self.factory videoSource];
    
    // In testing, resolutions larger than 1920x1080 could not be played back through SRS
    [videoSource adaptOutputFormatToWidth:1920 height:1080 fps:20];

    // Use the file capturer on the simulator (no camera available)
    if (TARGET_IPHONE_SIMULATOR) {
        self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:videoSource];
    } else{
        self.videoCapturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:videoSource];
    }
    
    RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];
    
    return videoTrack;
}

- (void)offer:(void (^)(RTCSessionDescription *sdp))completion {
    if (self.isPublish) {
        self.mediaConstrains = self.publishMediaConstrains;
    } else {
        self.mediaConstrains = self.playMediaConstrains;
    }
    
    RTCMediaConstraints *constrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.mediaConstrains optionalConstraints:self.optionalConstraints];
    NSLog(@"peerConnection:%@",self.peerConnection);
    __weak typeof(self) weakSelf = self;
    [weakSelf.peerConnection offerForConstraints:constrains completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {
        if (error) {
            NSLog(@"offer offerForConstraints error:%@", error);
        }
        
        if (sdp) {
            [weakSelf.peerConnection setLocalDescription:sdp completionHandler:^(NSError * _Nullable error) {
                if (error) {
                    NSLog(@"offer setLocalDescription error:%@", error);
                }
                if (completion) {
                    completion(sdp);
                }
            }];
        }
    }];
}

- (void)answer:(void (^)(RTCSessionDescription *sdp))completion {
    RTCMediaConstraints *constrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.mediaConstrains optionalConstraints:self.optionalConstraints];
    __weak typeof(self) weakSelf = self;
    [weakSelf.peerConnection answerForConstraints:constrains completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {
        if (error) {
            NSLog(@"answer answerForConstraints error:%@", error);
        }
        
        if (sdp) {
            [weakSelf.peerConnection setLocalDescription:sdp completionHandler:^(NSError * _Nullable error) {
                if (error) {
                    NSLog(@"answer setLocalDescription error:%@", error);
                }
                if (completion) {
                    completion(sdp);
                }
            }];
        }
    }];
}

- (void)setRemoteSdp:(RTCSessionDescription *)remoteSdp completion:(void (^)(NSError * _Nullable error))completion {
    [self.peerConnection setRemoteDescription:remoteSdp completionHandler:completion];
}

- (void)setRemoteCandidate:(RTCIceCandidate *)remoteCandidate {
    [self.peerConnection addIceCandidate:remoteCandidate];
}

- (void)setMaxBitrate:(int)maxBitrate {
    NSMutableArray *videoSenders = [NSMutableArray arrayWithCapacity:0];
    for (RTCRtpSender *sender in self.peerConnection.senders) {
        if (sender.track && [kRTCMediaStreamTrackKindVideo isEqualToString:sender.track.kind]) {
            [videoSenders addObject:sender];
        }
    }
    
    if (videoSenders.count > 0) {
        RTCRtpSender *firstSender = [videoSenders firstObject];
        RTCRtpParameters *parameters = firstSender.parameters;
        NSNumber *maxBitrateBps = [NSNumber numberWithInt:maxBitrate];
        parameters.encodings.firstObject.maxBitrateBps = maxBitrateBps;
        // Assign the modified parameters back; the getter returns a copy,
        // so mutating it alone does not take effect.
        firstSender.parameters = parameters;
    }
}

- (void)setMaxFramerate:(int)maxFramerate {
    NSMutableArray *videoSenders = [NSMutableArray arrayWithCapacity:0];
    for (RTCRtpSender *sender in self.peerConnection.senders) {
        if (sender.track && [kRTCMediaStreamTrackKindVideo isEqualToString:sender.track.kind]) {
            [videoSenders addObject:sender];
        }
    }
    
    if (videoSenders.count > 0) {
        RTCRtpSender *firstSender = [videoSenders firstObject];
        RTCRtpParameters *parameters = firstSender.parameters;
        NSNumber *maxFramerateNum = [NSNumber numberWithInt:maxFramerate];
        
        // maxFramerate is not available in older SDK versions; update to a recent release if this fails to compile
        parameters.encodings.firstObject.maxFramerate = maxFramerateNum;
        firstSender.parameters = parameters;
    }
}

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer {
    if (!self.isPublish) {
        return;
    }
    
    if (!renderer) {
        return;
    }
    
    if (!self.videoCapturer) {
        return;
    }
    
    
    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[RTCCameraVideoCapturer class]]) {
        if (!([RTCCameraVideoCapturer captureDevices].count > 0)) {
            return;
        }
        AVCaptureDevice *frontCamera = RTCCameraVideoCapturer.captureDevices.firstObject;
//        if (frontCamera.position != AVCaptureDevicePositionFront) {
//            return;
//        }
        
        RTCCameraVideoCapturer *cameraVideoCapturer = (RTCCameraVideoCapturer *)capturer;
        
        AVCaptureDeviceFormat *formatNilable;
        
        NSArray *supportDeviceFormats = [RTCCameraVideoCapturer supportedFormatsForDevice:frontCamera];
        NSLog(@"supportDeviceFormats:%@",supportDeviceFormats);
        // NOTE: hardcoded format index; prefer selecting a format known to exist on the device
        formatNilable = supportDeviceFormats[4];
//        if (supportDeviceFormats && supportDeviceFormats.count > 0) {
//            NSMutableArray *formats = [NSMutableArray arrayWithCapacity:0];
//            for (AVCaptureDeviceFormat *format in supportDeviceFormats) {
//                CMVideoDimensions videoVideoDimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
//                float width = videoVideoDimensions.width;
//                float height = videoVideoDimensions.height;
//                // only use 16:9 format.
//                if ((width / height) >= (16.0/9.0)) {
//                    [formats addObject:format];
//                }
//            }
//
//            if (formats.count > 0) {
//                NSArray *sortedFormats = [formats sortedArrayUsingComparator:^NSComparisonResult(AVCaptureDeviceFormat *obj1, AVCaptureDeviceFormat *obj2) {
//                    CMVideoDimensions f1VD = CMVideoFormatDescriptionGetDimensions(obj1.formatDescription);
//                    CMVideoDimensions f2VD = CMVideoFormatDescriptionGetDimensions(obj2.formatDescription);
//                    float width1 = f1VD.width;
//                    float width2 = f2VD.width;
//                    float height2 = f2VD.height;
//                    // only use 16:9 format.
//                    if ((width2 / height2) >= (1.7)) {
//                        return NSOrderedAscending;
//                    } else {
//                        return NSOrderedDescending;
//                    }
//                }];
//
//                if (sortedFormats && sortedFormats.count > 0) {
//                    formatNilable = sortedFormats.lastObject;
//                }
//            }
//        }
        
        if (!formatNilable) {
            return;
        }
        
        NSArray *formatArr = [RTCCameraVideoCapturer supportedFormatsForDevice:frontCamera];
        for (AVCaptureDeviceFormat *format in formatArr) {
            NSLog(@"AVCaptureDeviceFormat format:%@", format);
        }
        
        [cameraVideoCapturer startCaptureWithDevice:frontCamera format:formatNilable fps:20 completionHandler:^(NSError *error) {
            NSLog(@"startCaptureWithDevice error:%@", error);
        }];
    }
    
    if ([capturer isKindOfClass:[RTCFileVideoCapturer class]]) {
        RTCFileVideoCapturer *fileVideoCapturer = (RTCFileVideoCapturer *)capturer;
        [fileVideoCapturer startCapturingFromFileNamed:@"beautyPicture.mp4" onError:^(NSError * _Nonnull error) {
            NSLog(@"startCaptureLocalVideo startCapturingFromFileNamed error:%@", error);
        }];
    }
    
    [self.localVideoTrack addRenderer:renderer];
}

- (void)renderRemoteVideo:(id<RTCVideoRenderer>)renderer {
    if (!self.isPublish) {
        return;
    }
    
    self.remoteRenderView = renderer;
}

- (RTCDataChannel *)createDataChannel {
    RTCDataChannelConfiguration *config = [[RTCDataChannelConfiguration alloc] init];
    RTCDataChannel *dataChannel = [self.peerConnection dataChannelForLabel:@"WebRTCData" configuration:config];
    if (!dataChannel) {
        return nil;
    }
    
    dataChannel.delegate = self;
    self.localDataChannel = dataChannel;
    return dataChannel;
}

- (void)sendData:(NSData *)data {
    RTCDataBuffer *buffer = [[RTCDataBuffer alloc] initWithData:data isBinary:YES];
    [self.remoteDataChannel sendData:buffer];
}

- (void)changeSDP2Server:(RTCSessionDescription *)sdp
                  urlStr:(NSString *)urlStr
               streamUrl:(NSString *)streamUrl
                 closure:(void (^)(BOOL isServerRetSuc))closure {
    
    __weak typeof(self) weakSelf = self;
    [self.httpClient changeSDP2Server:sdp urlStr:urlStr streamUrl:streamUrl closure:^(NSDictionary *result) {
        if (result && [result isKindOfClass:[NSDictionary class]]) {
            NSString *sdp = [result objectForKey:@"sdp"];
            if (sdp && [sdp isKindOfClass:[NSString class]] && sdp.length > 0) {
                RTCSessionDescription *remoteSDP = [[RTCSessionDescription alloc] initWithType:RTCSdpTypeAnswer sdp:sdp];
                [weakSelf setRemoteSdp:remoteSDP completion:^(NSError * _Nullable error) {
                    NSLog(@"changeSDP2Server setRemoteDescription error:%@", error);
                }];
            }
        }
    }];
}

#pragma mark - Hide or show Video
- (void)hidenVideo {
    [self setVideoEnabled:NO];
}

- (void)showVideo {
    [self setVideoEnabled:YES];
}

- (void)setVideoEnabled:(BOOL)isEnabled {
    [self setTrackEnabled:[RTCVideoTrack class] isEnabled:isEnabled];
}

- (void)setTrackEnabled:(Class)track isEnabled:(BOOL)isEnabled {
    for (RTCRtpTransceiver *transceiver in self.peerConnection.transceivers) {
        // Check the sender's track class; the transceiver itself is never an
        // RTCVideoTrack/RTCAudioTrack, so the original isKindOfClass: check never matched.
        if (transceiver.sender.track && [transceiver.sender.track isKindOfClass:track]) {
            transceiver.sender.track.isEnabled = isEnabled;
        }
    }
}


#pragma mark - Hide or show Audio
- (void)muteAudio {
    [self setAudioEnabled:NO];
}

- (void)unmuteAudio {
    [self setAudioEnabled:YES];
}

- (void)speakOff {
    __weak typeof(self) weakSelf = self;
    dispatch_async(self.audioQueue, ^{
        [weakSelf.rtcAudioSession lockForConfiguration];
        @try {
            
            NSError *error;
            [self.rtcAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];
            
            NSError *ooapError;
            [self.rtcAudioSession overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&ooapError];
            NSLog(@"speakOff error:%@, ooapError:%@", error, ooapError);
        } @catch (NSException *exception) {
            NSLog(@"speakOff exception:%@", exception);
        }
        [weakSelf.rtcAudioSession unlockForConfiguration];
    });
}

- (void)speakOn {
    __weak typeof(self) weakSelf = self;
    dispatch_async(self.audioQueue, ^{
        [weakSelf.rtcAudioSession lockForConfiguration];
        @try {
            
            NSError *error;
            [self.rtcAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];
            
            NSError *ooapError;
            [self.rtcAudioSession overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&ooapError];
            
            NSError *activeError;
            [self.rtcAudioSession setActive:YES error:&activeError];
            
            NSLog(@"speakOn error:%@, ooapError:%@, activeError:%@", error, ooapError, activeError);
        } @catch (NSException *exception) {
            NSLog(@"speakOn exception:%@", exception);
        }
        [weakSelf.rtcAudioSession unlockForConfiguration];
    });
}

- (void)setAudioEnabled:(BOOL)isEnabled {
    [self setTrackEnabled:[RTCAudioTrack class] isEnabled:isEnabled];
}


#pragma mark - RTCPeerConnectionDelegate
/** Called when the SignalingState changed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
didChangeSignalingState:(RTCSignalingState)stateChanged {
    NSLog(@"peerConnection didChangeSignalingState:%ld", (long)stateChanged);
}

/** Called when media is received on a new stream from remote peer. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didAddStream:(RTCMediaStream *)stream {
    NSLog(@"peerConnection didAddStream");
    if (self.isPublish) {
        return;
    }
    
    NSArray *videoTracks = stream.videoTracks;
    if (videoTracks && videoTracks.count > 0) {
        RTCVideoTrack *track = videoTracks.firstObject;
        self.remoteVideoTrack = track;
    }
    
    if (self.remoteVideoTrack && self.remoteRenderView) {
        id<RTCVideoRenderer> remoteRenderView = self.remoteRenderView;
        RTCVideoTrack *remoteVideoTrack = self.remoteVideoTrack;
        [remoteVideoTrack addRenderer:remoteRenderView];
    }
    
    /**
     if let audioTrack = stream.audioTracks.first {
         print("audio track found")
         audioTrack.source.volume = 8
     }
     */
}

/** Called when a remote peer closes a stream.
 *  This is not called when RTCSdpSemanticsUnifiedPlan is specified.
 */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didRemoveStream:(RTCMediaStream *)stream {
    NSLog(@"peerConnection didRemoveStream");
}

/** Called when negotiation is needed, for example ICE has restarted. */
- (void)peerConnectionShouldNegotiate:(RTCPeerConnection *)peerConnection {
    NSLog(@"peerConnection peerConnectionShouldNegotiate");
}

/** Called any time the IceConnectionState changes. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didChangeIceConnectionState:(RTCIceConnectionState)newState {
    NSLog(@"peerConnection didChangeIceConnectionState:%ld", newState);
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didChangeConnectionState:)]) {
        [self.delegate webRTCClient:self didChangeConnectionState:newState];
    }
}

/** Called any time the IceGatheringState changes. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didChangeIceGatheringState:(RTCIceGatheringState)newState {
    NSLog(@"peerConnection didChangeIceGatheringState:%ld", newState);
}

/** New ice candidate has been found. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didGenerateIceCandidate:(RTCIceCandidate *)candidate {
    NSLog(@"peerConnection didGenerateIceCandidate:%@", candidate);
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didDiscoverLocalCandidate:)]) {
        [self.delegate webRTCClient:self didDiscoverLocalCandidate:candidate];
    }
}

/** Called when a group of local Ice candidates have been removed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didRemoveIceCandidates:(NSArray<RTCIceCandidate *> *)candidates {
    NSLog(@"peerConnection didRemoveIceCandidates:%@", candidates);
}

/** New data channel has been opened. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didOpenDataChannel:(RTCDataChannel *)dataChannel {
    NSLog(@"peerConnection didOpenDataChannel:%@", dataChannel);
    self.remoteDataChannel = dataChannel;
}

/** Called when signaling indicates a transceiver will be receiving media from
 *  the remote endpoint.
 *  This is only called with RTCSdpSemanticsUnifiedPlan specified.
 */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
    didStartReceivingOnTransceiver:(RTCRtpTransceiver *)transceiver {
    NSLog(@"peerConnection didStartReceivingOnTransceiver:%@", transceiver);
}

/** Called when a receiver and its track are created. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
        didAddReceiver:(RTCRtpReceiver *)rtpReceiver
               streams:(NSArray<RTCMediaStream *> *)mediaStreams {
    NSLog(@"peerConnection didAddReceiver");
}

/** Called when the receiver and its track are removed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection
     didRemoveReceiver:(RTCRtpReceiver *)rtpReceiver {
    NSLog(@"peerConnection didRemoveReceiver");
}

#pragma mark - RTCDataChannelDelegate
/** The data channel state changed. */
- (void)dataChannelDidChangeState:(RTCDataChannel *)dataChannel {
    NSLog(@"dataChannelDidChangeState:%@", dataChannel);
}

/** The data channel successfully received a data buffer. */
- (void)dataChannel:(RTCDataChannel *)dataChannel
didReceiveMessageWithBuffer:(RTCDataBuffer *)buffer {
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didReceiveData:)]) {
        [self.delegate webRTCClient:self didReceiveData:buffer.data];
    }
}


#pragma mark - Lazy
- (RTCPeerConnectionFactory *)factory {
    if (!_factory) {
        RTCInitializeSSL();
        RTCDefaultVideoEncoderFactory *videoEncoderFactory = [[RTCDefaultVideoEncoderFactory alloc] init];
        
        RTCDefaultVideoDecoderFactory *videoDecoderFactory = [[RTCDefaultVideoDecoderFactory alloc] init];
        
        for (RTCVideoCodecInfo *codec in videoEncoderFactory.supportedCodecs) {
            if (codec.parameters) {
                NSString *profile_level_id = codec.parameters[@"profile-level-id"];
                if (profile_level_id && [profile_level_id isEqualToString:@"42e01f"]) {
                    videoEncoderFactory.preferredCodec = codec;
                    break;
                }
            }
        }
        _factory = [[RTCPeerConnectionFactory alloc] initWithEncoderFactory:videoEncoderFactory decoderFactory:videoDecoderFactory];
    }
    return _factory;
}

- (dispatch_queue_t)audioQueue {
    if (!_audioQueue) {
        _audioQueue = dispatch_queue_create("cn.ifour.webrtc", NULL);
    }
    return _audioQueue;
}

- (RTCAudioSession *)rtcAudioSession {
    if (!_rtcAudioSession) {
        _rtcAudioSession = [RTCAudioSession sharedInstance];
    }
    
    return _rtcAudioSession;
}

- (NSDictionary *)mediaConstrains {
    if (!_mediaConstrains) {
        _mediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:
                            kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveAudio,
                            kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveVideo,
                            kRTCMediaConstraintsValueTrue, @"IceRestart",
                            nil];
    }
    return _mediaConstrains;
}

- (NSDictionary *)publishMediaConstrains {
    if (!_publishMediaConstrains) {
        _publishMediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:
                                   kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveAudio,
                                   kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveVideo,
                                   kRTCMediaConstraintsValueTrue, @"IceRestart",
                                   nil];
    }
    return _publishMediaConstrains;
}

- (NSDictionary *)playMediaConstrains {
    if (!_playMediaConstrains) {
        _playMediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:
                                kRTCMediaConstraintsValueTrue, kRTCMediaConstraintsOfferToReceiveAudio,
                                kRTCMediaConstraintsValueTrue, kRTCMediaConstraintsOfferToReceiveVideo,
                                kRTCMediaConstraintsValueTrue, @"IceRestart",
                                nil];
    }
    return _playMediaConstrains;
}

- (NSDictionary *)optionalConstraints {
    if (!_optionalConstraints) {
        _optionalConstraints = [[NSDictionary alloc] initWithObjectsAndKeys:
                                kRTCMediaConstraintsValueTrue, @"DtlsSrtpKeyAgreement",
                                nil];
    }
    return _optionalConstraints;
}

@end

4. Displaying the Local Video Feed

Use RTCEAGLVideoView to show the local camera feed:

self.localRenderer = [[RTCEAGLVideoView alloc] initWithFrame:CGRectZero];
// self.localRenderer.videoContentMode = UIViewContentModeScaleAspectFill;
[self addSubview:self.localRenderer];

[self.webRTCClient startCaptureLocalVideo:self.localRenderer];
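On devices that support Metal, the SDK also provides RTCMTLVideoView, which is generally preferred over the OpenGL-based RTCEAGLVideoView. A sketch of the swap (Metal is unavailable on the simulator):

#if !TARGET_IPHONE_SIMULATOR
RTCMTLVideoView *metalRenderer = [[RTCMTLVideoView alloc] initWithFrame:CGRectZero];
metalRenderer.videoContentMode = UIViewContentModeScaleAspectFill;
[self addSubview:metalRenderer];
[self.webRTCClient startCaptureLocalVideo:metalRenderer];
#endif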

The full view code:

PublishView.h

#import <UIKit/UIKit.h>
#import "WebRTCClient.h"

@interface PublishView : UIView

- (instancetype)initWithFrame:(CGRect)frame webRTCClient:(WebRTCClient *)webRTCClient;

@end

PublishView.m

#import "PublishView.h"

@interface PublishView ()

@property (nonatomic, strong) WebRTCClient *webRTCClient;
@property (nonatomic, strong) RTCEAGLVideoView *localRenderer;

@end

@implementation PublishView

- (instancetype)initWithFrame:(CGRect)frame webRTCClient:(WebRTCClient *)webRTCClient {
    self = [super initWithFrame:frame];
    if (self) {
        self.webRTCClient = webRTCClient;
        
        self.localRenderer = [[RTCEAGLVideoView alloc] initWithFrame:CGRectZero];
//        self.localRenderer.videoContentMode = UIViewContentModeScaleAspectFill;
        [self addSubview:self.localRenderer];

        [self.webRTCClient startCaptureLocalVideo:self.localRenderer];
    }
    return self;
}

- (void)layoutSubviews {
    [super layoutSubviews];
    self.localRenderer.frame = self.bounds;
    NSLog(@"self.localRenderer frame:%@", NSStringFromCGRect(self.localRenderer.frame));
}

@end

5. The ossrs RTC Publish Service

I obtain the remote SDP from ossrs by calling rtc/v1/publish/. The request URL here is: https://192.168.10.100:1990/rtc/v1/publish/
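The publish API accepts a JSON body whose fields match the ones assembled in HttpClient below, and its response carries the answer SDP. Roughly (shape inferred from this client's usage; consult the SRS documentation for the authoritative schema):

// request body
{
  "api": "https://192.168.10.100:1990/rtc/v1/publish/",
  "tid": "3f1b2c4",                                      // short random transaction id
  "streamurl": "webrtc://192.168.10.100:1990/live/livestream",
  "sdp": "v=0\r\no=- ...",                               // the local offer SDP
  "clientip": "192.168.10.23"                            // hypothetical LAN address
}

// response body ("code" == 0 means success)
{
  "code": 0,
  "sdp": "v=0\r\no=- ..."                                // the remote answer SDP
}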

The HTTP request is implemented with NSURLSessionDataTask; the code follows.

HttpClient.h

#import <Foundation/Foundation.h>
#import <WebRTC/WebRTC.h>

@interface HttpClient : NSObject<NSURLSessionDelegate>

- (void)changeSDP2Server:(RTCSessionDescription *)sdp
                  urlStr:(NSString *)urlStr
               streamUrl:(NSString *)streamUrl
                 closure:(void (^)(NSDictionary *result))closure;

@end

HttpClient.m

#import "HttpClient.h"
#import "IPUtil.h"

@interface HttpClient ()

@property (nonatomic, strong) NSURLSession *session;

@end

@implementation HttpClient

- (instancetype)init
{
    self = [super init];
    if (self) {
        self.session = [NSURLSession sessionWithConfiguration:[NSURLSessionConfiguration defaultSessionConfiguration] delegate:self delegateQueue:[NSOperationQueue mainQueue]];
    }
    return self;
}

- (void)changeSDP2Server:(RTCSessionDescription *)sdp
                  urlStr:(NSString *)urlStr
               streamUrl:(NSString *)streamUrl
                 closure:(void (^)(NSDictionary *result))closure {
    
    // Build the request URL
    NSURL *url = [NSURL URLWithString:urlStr];

    // Create a mutable request object
    NSMutableURLRequest *mutableRequest = [[NSMutableURLRequest alloc] initWithURL:url];

    // Use POST
    [mutableRequest setHTTPMethod:@"POST"];
    // Build the dictionary of data to send
    NSMutableDictionary *dict = [[NSMutableDictionary alloc] init];
    [dict setValue:urlStr forKey:@"api"];
    [dict setValue:[self createTid] forKey:@"tid"];
    [dict setValue:streamUrl forKey:@"streamurl"];
    [dict setValue:sdp.sdp forKey:@"sdp"];
    [dict setValue:[IPUtil localWiFiIPAddress] forKey:@"clientip"];

    // Serialize the dictionary to NSData
    NSData *bodyData = [NSJSONSerialization dataWithJSONObject:dict options:0 error:nil];
    // Set the request body
    [mutableRequest setHTTPBody:bodyData];
    // Set the request headers
    [mutableRequest addValue:@"application/json" forHTTPHeaderField:@"Content-Type"];
    [mutableRequest addValue:@"application/json" forHTTPHeaderField:@"Accept"];
    // Create the task
    NSURLSessionDataTask *dataTask = [self.session dataTaskWithRequest:mutableRequest completionHandler:^(NSData * _Nullable data, NSURLResponse * _Nullable response, NSError * _Nullable error) {
        if (error == nil) {
            NSLog(@"request succeeded: %@", data);
            // Bug fix: the original used kCFStringEncodingUTF8 (a CFStringEncoding); this API takes an NSStringEncoding
            NSString *dataString = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
            NSLog(@"request succeeded dataString: %@", dataString);
            NSDictionary *result = [NSJSONSerialization JSONObjectWithData:data options:NSJSONReadingAllowFragments error:nil];
            NSLog(@"NSURLSessionDataTask result:%@", result);
            if (closure) {
                closure(result);
            }
        } else {
            NSLog(@"network request failed: %@", error);
        }
    }];

    // Start the task
    [dataTask resume];
}

- (NSString *)createTid {
    NSDate *date = [[NSDate alloc] init];
    int timeInterval = (int)([date timeIntervalSince1970]);
    int random = (int)(arc4random());
    NSString *str = [NSString stringWithFormat:@"%d*%d", timeInterval, random];
    if (str.length > 7) {
        NSString *tid = [str substringToIndex:7];
        return tid;
    }
    return @"";
}

#pragma mark - NSURLSessionDelegate
-(void)URLSession:(NSURLSession *)session didReceiveChallenge:(NSURLAuthenticationChallenge *)challenge completionHandler:(void (^)(NSURLSessionAuthChallengeDisposition, NSURLCredential * _Nullable))completionHandler {

    NSURLSessionAuthChallengeDisposition disposition = NSURLSessionAuthChallengePerformDefaultHandling;
    __block NSURLCredential *credential = nil;

    if ([challenge.protectionSpace.authenticationMethod isEqualToString:NSURLAuthenticationMethodServerTrust]) {
        credential = [NSURLCredential credentialForTrust:challenge.protectionSpace.serverTrust];
        if (credential) {
            disposition = NSURLSessionAuthChallengeUseCredential;
        } else {
            disposition = NSURLSessionAuthChallengePerformDefaultHandling;
        }
    } else {
        disposition = NSURLSessionAuthChallengePerformDefaultHandling;
    }

    if (completionHandler) {
        completionHandler(disposition, credential);
    }
}

@end
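One more note: the URLSession delegate above accepts the server's certificate, but App Transport Security may still block a self-signed HTTPS endpoint on a LAN IP. For development builds you may need an ATS exception in Info.plist (do not ship this in production):

<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>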

This uses a small utility class to get the local Wi-Fi IP address:

IPUtil.h

#import <Foundation/Foundation.h>

@interface IPUtil : NSObject

+ (NSString *)localWiFiIPAddress;

@end

IPUtil.m

#import "IPUtil.h"

#include <arpa/inet.h>
#include <netdb.h>

#include <net/if.h>

#include <ifaddrs.h>
#import <dlfcn.h>

#import <SystemConfiguration/SystemConfiguration.h>


@implementation IPUtil

+ (NSString *)localWiFiIPAddress
{
    NSString *address = nil;
    struct ifaddrs *addrs;
    const struct ifaddrs *cursor;

    if (getifaddrs(&addrs) == 0) {
        cursor = addrs;
        while (cursor != NULL) {
            // the second test keeps from picking up the loopback address
            if (cursor->ifa_addr->sa_family == AF_INET && (cursor->ifa_flags & IFF_LOOPBACK) == 0) {
                NSString *name = [NSString stringWithUTF8String:cursor->ifa_name];
                if ([name isEqualToString:@"en0"]) { // Wi-Fi adapter
                    address = [NSString stringWithUTF8String:inet_ntoa(((struct sockaddr_in *)cursor->ifa_addr)->sin_addr)];
                    break; // remember the address, but still free the list below
                }
            }
            cursor = cursor->ifa_next;
        }
        freeifaddrs(addrs); // the original returned early here and leaked this list
    }
    return address;
}

@end

6. Calling the ossrs RTC Publish Service

To publish to the ossrs RTC service: create the offer locally, apply it with setLocalDescription, then call rtc/v1/publish/.

The code:

- (void)publishBtnClick {
    __weak typeof(self) weakSelf = self;
    [self.webRTCClient offer:^(RTCSessionDescription *sdp) {
        [weakSelf.webRTCClient changeSDP2Server:sdp urlStr:@"https://192.168.10.100:1990/rtc/v1/publish/" streamUrl:@"webrtc://192.168.10.100:1990/live/livestream" closure:^(BOOL isServerRetSuc) {
            NSLog(@"isServerRetSuc:%@",(isServerRetSuc?@"YES":@"NO"));
        }];
    }];
}
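Publishing is only half of a call; the play side follows the same pattern against the play endpoint. A sketch, assuming SRS exposes rtc/v1/play/ on the same port and that a second client is created with initWithPublish:NO:

// Hypothetical play-side call, mirroring publishBtnClick.
WebRTCClient *playClient = [[WebRTCClient alloc] initWithPublish:NO];
[playClient offer:^(RTCSessionDescription *sdp) {
    [playClient changeSDP2Server:sdp
                          urlStr:@"https://192.168.10.100:1990/rtc/v1/play/"
                       streamUrl:@"webrtc://192.168.10.100:1990/live/livestream"
                         closure:^(BOOL isServerRetSuc) {
        NSLog(@"play isServerRetSuc:%@", (isServerRetSuc ? @"YES" : @"NO"));
    }];
}];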

The UI and the publish action in the view controller:

PublishViewController.h

#import <UIKit/UIKit.h>
#import "PublishView.h"

@interface PublishViewController : UIViewController

@end

PublishViewController.m

#import "PublishViewController.h"

@interface PublishViewController ()<WebRTCClientDelegate>

@property (nonatomic, strong) WebRTCClient *webRTCClient;

@property (nonatomic, strong) PublishView *publishView;

@property (nonatomic, strong) UIButton *publishBtn;

@end

@implementation PublishViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    
    self.view.backgroundColor = [UIColor whiteColor];
    
    self.publishView = [[PublishView alloc] initWithFrame:CGRectZero webRTCClient:self.webRTCClient];
    [self.view addSubview:self.publishView];
    self.publishView.backgroundColor = [UIColor lightGrayColor];
    self.publishView.frame = self.view.bounds;
    
    CGFloat screenWidth = CGRectGetWidth(self.view.bounds);
    CGFloat screenHeight = CGRectGetHeight(self.view.bounds);
    self.publishBtn = [UIButton buttonWithType:UIButtonTypeCustom];
    self.publishBtn.frame = CGRectMake(50, screenHeight - 160, screenWidth - 2*50, 46);
    self.publishBtn.layer.cornerRadius = 4;
    self.publishBtn.backgroundColor = [UIColor grayColor];
    [self.publishBtn setTitle:@"publish" forState:UIControlStateNormal];
    [self.publishBtn addTarget:self action:@selector(publishBtnClick) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:self.publishBtn];
    
    self.webRTCClient.delegate = self;
}

- (void)publishBtnClick {
    __weak typeof(self) weakSelf = self;
    [self.webRTCClient offer:^(RTCSessionDescription *sdp) {
        [weakSelf.webRTCClient changeSDP2Server:sdp urlStr:@"https://192.168.10.100:1990/rtc/v1/publish/" streamUrl:@"webrtc://192.168.10.100:1990/live/livestream" closure:^(BOOL isServerRetSuc) {
            NSLog(@"isServerRetSuc:%@",(isServerRetSuc?@"YES":@"NO"));
        }];
    }];
}

#pragma mark - WebRTCClientDelegate
- (void)webRTCClient:(WebRTCClient *)client didDiscoverLocalCandidate:(RTCIceCandidate *)candidate {
    NSLog(@"webRTCClient didDiscoverLocalCandidate");
}

- (void)webRTCClient:(WebRTCClient *)client didChangeConnectionState:(RTCIceConnectionState)state {
    NSLog(@"webRTCClient didChangeConnectionState");
    /**
     RTCIceConnectionStateNew,
     RTCIceConnectionStateChecking,
     RTCIceConnectionStateConnected,
     RTCIceConnectionStateCompleted,
     RTCIceConnectionStateFailed,
     RTCIceConnectionStateDisconnected,
     RTCIceConnectionStateClosed,
     RTCIceConnectionStateCount,
     */
    UIColor *textColor = [UIColor blackColor];
    BOOL openSpeak = NO;
    switch (state) {
        case RTCIceConnectionStateCompleted:
        case RTCIceConnectionStateConnected:
            textColor = [UIColor greenColor];
            openSpeak = YES;
            break;
            
        case RTCIceConnectionStateDisconnected:
            textColor = [UIColor orangeColor];
            break;
            
        case RTCIceConnectionStateFailed:
        case RTCIceConnectionStateClosed:
            textColor = [UIColor redColor];
            break;
            
        case RTCIceConnectionStateNew:
        case RTCIceConnectionStateChecking:
        case RTCIceConnectionStateCount:
            textColor = [UIColor blackColor];
            break;
            
        default:
            break;
    }
    
    dispatch_async(dispatch_get_main_queue(), ^{
        NSString *text = [NSString stringWithFormat:@"%ld", (long)state];
        [self.publishBtn setTitle:text forState:UIControlStateNormal];
        [self.publishBtn setTitleColor:textColor forState:UIControlStateNormal];
        
        if (openSpeak) {
            [self.webRTCClient speakOn];
        }
    });
}

- (void)webRTCClient:(WebRTCClient *)client didReceiveData:(NSData *)data {
    NSLog(@"webRTCClient didReceiveData");
}


#pragma mark - Lazy
- (WebRTCClient *)webRTCClient {
    if (!_webRTCClient) {
        _webRTCClient = [[WebRTCClient alloc] initWithPublish:YES];
    }
    return _webRTCClient;
}

@end

Tap the button to start RTC publishing. Screenshot:

[screenshot]

7. Publishing a Video File with WebRTC

WebRTC also provides RTCFileVideoCapturer for publishing from a video file (the file must be bundled with the app):

if ([capturer isKindOfClass:[RTCFileVideoCapturer class]]) {
    RTCFileVideoCapturer *fileVideoCapturer = (RTCFileVideoCapturer *)capturer;
    [fileVideoCapturer startCapturingFromFileNamed:@"beautyPicture.mp4" onError:^(NSError * _Nonnull error) {
        NSLog(@"startCaptureLocalVideo startCapturingFromFileNamed error:%@", error);
    }];
}

The published local video looks like this:

[screenshot]

This completes the iOS client calling the ossrs service for WebRTC audio/video calls.

8. Summary

WebRTC audio/video calls: using an iOS client with the ossrs video call service. This covered a lot of ground, so apologies if any description is imprecise. Original post: https://blog.csdn.net/gloryFlow/article/details/132262724

Learning notes; steady progress every day.
