Step by Step: Implementing WeChat-Style Short Video on iOS
A while ago, a project required adding a WeChat-style short-video feature to its chat module. This post summarizes the problems I ran into and how I solved them, in the hope that it helps anyone with the same requirement.
Preview of the result:
First, the main problems I encountered:
1. Video cropping: WeChat's small video shows only a portion of the frame captured by the camera.
2. Stuttering while scrolling previews: playing videos with AVPlayer while the list scrolls is very janky.
Let's work through the implementation step by step.
Part 1 Implementing video recording
1. The recorder class WKMovieRecorder
Create a recorder class, WKMovieRecorder, responsible for video recording.
@interface WKMovieRecorder : NSObject

+ (WKMovieRecorder *)sharedRecorder;
- (instancetype)initWithMaxDuration:(NSTimeInterval)duration;

@end
Define the callback blocks:
/**
 * Recording finished
 *
 * @param info         callback info
 * @param finishReason why recording ended (cancelled or finished normally)
 */
typedef void(^FinishRecordingBlock)(NSDictionary *info, WKRecorderFinishedReason finishReason);

/**
 * Focus area changed
 */
typedef void(^FocusAreaDidChanged)();

/**
 * Authorization check
 *
 * @param success whether access was granted
 */
typedef void(^AuthorizationResult)(BOOL success);

@interface WKMovieRecorder : NSObject

// Callbacks
@property (nonatomic, copy) FinishRecordingBlock finishBlock; // recording finished
@property (nonatomic, copy) FocusAreaDidChanged focusAreaDidChangedBlock;
@property (nonatomic, copy) AuthorizationResult authorizationResultBlock;

@end
Define a cropSize used for video cropping:
@property (nonatomic, assign) CGSize cropSize;
Next comes the capture implementation. The code here is fairly long; if you'd rather not read it all, skip ahead to the video-cropping part.
Recording configuration:
@interface WKMovieRecorder () <
AVCaptureVideoDataOutputSampleBufferDelegate,
AVCaptureAudioDataOutputSampleBufferDelegate,
WKMovieWriterDelegate
>
{
    AVCaptureSession* _session;
    AVCaptureVideoPreviewLayer* _preview;
    WKMovieWriter* _writer;

    // Pause state
    BOOL _isCapturing;
    BOOL _isPaused;
    BOOL _discont;
    int _currentFile;
    CMTime _timeOffset;
    CMTime _lastVideo;
    CMTime _lastAudio;

    NSTimeInterval _maxDuration;
}

// Session management.
@property (nonatomic, strong) dispatch_queue_t sessionQueue;
@property (nonatomic, strong) dispatch_queue_t videoDataOutputQueue;
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureDevice *captureDevice;
@property (nonatomic, strong) AVCaptureDeviceInput *videoDeviceInput;
@property (nonatomic, strong) AVCaptureStillImageOutput *stillImageOutput;
@property (nonatomic, strong) AVCaptureConnection *videoConnection;
@property (nonatomic, strong) AVCaptureConnection *audioConnection;
@property (nonatomic, strong) NSDictionary *videoCompressionSettings;
@property (nonatomic, strong) NSDictionary *audioCompressionSettings;
@property (nonatomic, strong) AVAssetWriterInputPixelBufferAdaptor *adaptor;
@property (nonatomic, strong) AVCaptureVideoDataOutput *videoDataOutput;

// Utilities
@property (nonatomic, strong) NSMutableArray *frames; // recorded frames
@property (nonatomic, assign) CaptureAVSetupResult result;
@property (atomic, readwrite) BOOL isCapturing;
@property (atomic, readwrite) BOOL isPaused;
@property (nonatomic, strong) NSTimer *durationTimer;
@property (nonatomic, assign) WKRecorderFinishedReason finishReason;

@end
Instantiation methods:
+ (WKMovieRecorder *)sharedRecorder
{
    static WKMovieRecorder *recorder;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        recorder = [[WKMovieRecorder alloc] initWithMaxDuration:CGFLOAT_MAX];
    });
    return recorder;
}

- (instancetype)initWithMaxDuration:(NSTimeInterval)duration
{
    if (self = [self init]) {
        _maxDuration = duration;
        _duration = 0.f;
    }
    return self;
}

- (instancetype)init
{
    self = [super init];
    if (self) {
        _maxDuration = CGFLOAT_MAX;
        _duration = 0.f;
        _sessionQueue = dispatch_queue_create("wukong.movieRecorder.queue", DISPATCH_QUEUE_SERIAL);
        _videoDataOutputQueue = dispatch_queue_create("wukong.movieRecorder.video", DISPATCH_QUEUE_SERIAL);
        dispatch_set_target_queue(_videoDataOutputQueue, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0));
    }
    return self;
}
2. Initial setup
Initial setup consists of creating the session, checking permissions, and configuring the session.
1) Creating the session
self.session = [[AVCaptureSession alloc] init];
self.result = CaptureAVSetupResultSuccess;
2) Permission check
// Permission check
switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) {
    case AVAuthorizationStatusNotDetermined:
    {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            if (granted) {
                self.result = CaptureAVSetupResultSuccess;
            }
        }];
        break;
    }
    case AVAuthorizationStatusAuthorized:
    {
        break;
    }
    default:
    {
        self.result = CaptureAVSetupResultCameraNotAuthorized;
    }
}

if (self.result != CaptureAVSetupResultSuccess) {
    if (self.authorizationResultBlock) {
        self.authorizationResultBlock(NO);
    }
    return;
}
3) Session configuration
Note that an AVCaptureSession must not be configured on the main thread; create a dedicated serial queue for this work.
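As a minimal sketch, the configuration work can be dispatched onto the sessionQueue created in the initializer (the method name here is assumed, not from the original code):

```objc
// Hypothetical sketch: run all session configuration on the serial
// sessionQueue, never on the main thread.
- (void)setupCaptureSession
{
    dispatch_async(self.sessionQueue, ^{
        [self.session beginConfiguration];
        // ... add inputs and outputs here (see the following sections) ...
        [self.session commitConfiguration];
    });
}
```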
3.1.1 Getting the input device and device input
AVCaptureDevice *captureDevice = [[self class] deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
_captureDevice = captureDevice;

NSError *error = nil;
_videoDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:captureDevice error:&error];

if (!_videoDeviceInput) {
    NSLog(@"Device not found");
}
3.1.2 Frame-rate settings
The frame-rate settings mainly exist to accommodate the single-core iPhone 4, obsolete as it may be.
int frameRate;
if ([NSProcessInfo processInfo].processorCount == 1) {
    if ([self.session canSetSessionPreset:AVCaptureSessionPresetLow]) {
        [self.session setSessionPreset:AVCaptureSessionPresetLow];
    }
    frameRate = 10;
} else {
    if ([self.session canSetSessionPreset:AVCaptureSessionPreset640x480]) {
        [self.session setSessionPreset:AVCaptureSessionPreset640x480];
    }
    frameRate = 30;
}

CMTime frameDuration = CMTimeMake(1, frameRate);

if ([_captureDevice lockForConfiguration:&error]) {
    _captureDevice.activeVideoMaxFrameDuration = frameDuration;
    _captureDevice.activeVideoMinFrameDuration = frameDuration;
    [_captureDevice unlockForConfiguration];
} else {
    NSLog(@"videoDevice lockForConfiguration returned error %@", error);
}
3.1.3 Video output settings
When configuring the video output, be sure to set the videoConnection's orientation; only then will the picture display correctly when the device rotates.
// Video
if ([self.session canAddInput:_videoDeviceInput]) {
    [self.session addInput:_videoDeviceInput];
    self.videoDeviceInput = _videoDeviceInput;
    [self.session removeOutput:_videoDataOutput];

    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    _videoDataOutput = videoOutput;
    videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [videoOutput setSampleBufferDelegate:self queue:_videoDataOutputQueue];
    videoOutput.alwaysDiscardsLateVideoFrames = NO;

    if ([_session canAddOutput:videoOutput]) {
        [_session addOutput:videoOutput];

        [_captureDevice addObserver:self forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:FocusAreaChangedContext];

        _videoConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
        if (_videoConnection.isVideoStabilizationSupported) {
            _videoConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
        }

        UIInterfaceOrientation statusBarOrientation = [UIApplication sharedApplication].statusBarOrientation;
        AVCaptureVideoOrientation initialVideoOrientation = AVCaptureVideoOrientationPortrait;
        if (statusBarOrientation != UIInterfaceOrientationUnknown) {
            initialVideoOrientation = (AVCaptureVideoOrientation)statusBarOrientation;
        }
        _videoConnection.videoOrientation = initialVideoOrientation;
    }
} else {
    NSLog(@"Could not add video device input to the session");
}
3.1.4 Audio settings
Note that, to avoid dropping frames, the audio output's callback must be put on its own serial queue.
// Audio
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (!audioDeviceInput) {
    NSLog(@"Could not create audio device input: %@", error);
}

if ([self.session canAddInput:audioDeviceInput]) {
    [self.session addInput:audioDeviceInput];
} else {
    NSLog(@"Could not add audio device input to the session");
}

AVCaptureAudioDataOutput *audioOut = [[AVCaptureAudioDataOutput alloc] init];
// Put audio on its own queue to ensure that our video processing doesn't cause us to drop audio
dispatch_queue_t audioCaptureQueue = dispatch_queue_create("wukong.movieRecorder.audio", DISPATCH_QUEUE_SERIAL);
[audioOut setSampleBufferDelegate:self queue:audioCaptureQueue];

if ([self.session canAddOutput:audioOut]) {
    [self.session addOutput:audioOut];
}
_audioConnection = [audioOut connectionWithMediaType:AVMediaTypeAudio];
One more thing to watch out for: any session configuration code should be wrapped like this:

[self.session beginConfiguration];
// ... configuration code ...
[self.session commitConfiguration];
For reasons of space, I'll only cover the highlights of the remaining recording code.
3.2 Saving the video
Now we need to write the audio and video to the sandbox from the AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate callbacks. One thing to note here: the first frame captured after the session starts is black and needs to be discarded.
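A minimal sketch of dropping that first black frame in the video data callback (the _hasSkippedFirstFrame flag is an assumed addition, not part of the original code, and would be reset whenever recording starts):

```objc
// Hypothetical sketch: skip the first (black) video frame after the session starts.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (connection == _videoConnection && !_hasSkippedFirstFrame) {
        _hasSkippedFirstFrame = YES; // assumed BOOL ivar
        return;                      // drop the black first frame
    }
    // ... hand the sample buffer to the writer as usual ...
}
```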
3.2.1 Creating a WKMovieWriter class to encapsulate video storage
WKMovieWriter's main job is to use AVAssetWriter to take each CMSampleBufferRef, crop it, and write it to the sandbox.
Below is the cropping configuration; AVAssetWriter crops the video according to cropSize. Note that cropSize's width must be an integer multiple of 320, otherwise the cropped video shows a green line along its right edge.
NSDictionary *videoSettings;
if (_cropSize.height == 0 || _cropSize.width == 0) {
    _cropSize = [UIScreen mainScreen].bounds.size;
}

videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                 AVVideoCodecH264, AVVideoCodecKey,
                 [NSNumber numberWithInt:_cropSize.width], AVVideoWidthKey,
                 [NSNumber numberWithInt:_cropSize.height], AVVideoHeightKey,
                 AVVideoScalingModeResizeAspectFill, AVVideoScalingModeKey,
                 nil];
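As a sketch of how these settings might be fed to the writer (the writer variable name is assumed; the original WKMovieWriter source is in the demo):

```objc
// Hypothetical sketch: attach the cropping settings to an AVAssetWriterInput.
AVAssetWriterInput *videoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];
videoInput.expectsMediaDataInRealTime = YES; // we are feeding live capture frames

if ([assetWriter canAddInput:videoInput]) {  // assetWriter: the writer's AVAssetWriter
    [assetWriter addInput:videoInput];
}
```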
With that, video recording is done.
Next, we need to solve the preview problem.
Part 2 Fixing the stutter
1.1 Generating the GIF
While researching, I found a blog post explaining that the WeChat team solved the preview stutter by playing GIF images instead of video. The sample code in that post was broken, though: playing the images through Core Animation made memory balloon until the app crashed. Still, it gave me an idea. An earlier project's launch screen had played a GIF, so I wondered: could I convert the video into images, then into a GIF, and play that instead? Some googling paid off, and I found a way to turn an array of images into a GIF.
GIF conversion code:
static void makeAnimatedGif(NSArray *images, NSURL *gifURL, NSTimeInterval duration)
{
    NSTimeInterval perSecond = duration / images.count;

    NSDictionary *fileProperties = @{
        (__bridge id)kCGImagePropertyGIFDictionary: @{
            (__bridge id)kCGImagePropertyGIFLoopCount: @0, // 0 means loop forever
        }
    };

    NSDictionary *frameProperties = @{
        (__bridge id)kCGImagePropertyGIFDictionary: @{
            (__bridge id)kCGImagePropertyGIFDelayTime: @(perSecond), // a float (not double!) in seconds, rounded to centiseconds in the GIF data
        }
    };

    CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)gifURL, kUTTypeGIF, images.count, NULL);
    CGImageDestinationSetProperties(destination, (__bridge CFDictionaryRef)fileProperties);

    for (UIImage *image in images) {
        @autoreleasepool {
            CGImageDestinationAddImage(destination, image.CGImage, (__bridge CFDictionaryRef)frameProperties);
        }
    }

    if (!CGImageDestinationFinalize(destination)) {
        NSLog(@"failed to finalize image destination");
    }
    CFRelease(destination);
}
The conversion worked, but a new problem appeared: generating a GIF with ImageIO makes memory spike past 100 MB in an instant, and generating several GIFs concurrently still crashes the app. To solve this, GIF generation has to go through a serial queue.
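A sketch of serializing GIF generation (the queue name is assumed), reusing the makeAnimatedGif function above so at most one conversion holds its frames in memory at a time:

```objc
// Hypothetical sketch: funnel every GIF conversion through one serial queue.
static dispatch_queue_t gifQueue;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
    gifQueue = dispatch_queue_create("wukong.movieRecorder.gif", DISPATCH_QUEUE_SERIAL);
});

dispatch_async(gifQueue, ^{
    @autoreleasepool {
        // images/gifURL/duration come from the video conversion step
        makeAnimatedGif(images, gifURL, duration);
    }
});
```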
1.2 Converting the video to UIImages
The conversion is done with AVAssetReader, AVAssetTrack, and AVAssetReaderTrackOutput.
// Convert a video to UIImages
- (void)convertVideoUIImagesWithURL:(NSURL *)url finishBlock:(void (^)(id images, NSTimeInterval duration))finishBlock
{
    AVAsset *asset = [AVAsset assetWithURL:url];
    NSError *error = nil;
    self.reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];

    NSTimeInterval duration = CMTimeGetSeconds(asset.duration);
    __weak typeof(self) weakSelf = self;
    dispatch_queue_t backgroundQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
    dispatch_async(backgroundQueue, ^{
        __strong typeof(weakSelf) strongSelf = weakSelf;
        if (error) {
            NSLog(@"%@", [error localizedDescription]);
        }

        NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        AVAssetTrack *videoTrack = [videoTracks firstObject];
        if (!videoTrack) {
            return;
        }

        int m_pixelFormatType;
        // For video playback:
        m_pixelFormatType = kCVPixelFormatType_32BGRA;
        // For other uses, such as video compression:
        // m_pixelFormatType = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange;

        NSMutableDictionary *options = [NSMutableDictionary dictionary];
        [options setObject:@(m_pixelFormatType) forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        AVAssetReaderTrackOutput *videoReaderOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:options];
        if ([strongSelf.reader canAddOutput:videoReaderOutput]) {
            [strongSelf.reader addOutput:videoReaderOutput];
        }
        [strongSelf.reader startReading];

        NSMutableArray *images = [NSMutableArray array];
        // Make sure nominalFrameRate > 0; we've run into 0-fps videos recorded on Android
        while ([strongSelf.reader status] == AVAssetReaderStatusReading && videoTrack.nominalFrameRate > 0) {
            @autoreleasepool {
                // Read a video sample
                CMSampleBufferRef videoBuffer = [videoReaderOutput copyNextSampleBuffer];
                if (!videoBuffer) {
                    break;
                }
                [images addObject:[WKVideoConverter convertSampleBufferRefToUIImage:videoBuffer]];
                CFRelease(videoBuffer);
            }
        }

        if (finishBlock) {
            dispatch_async(dispatch_get_main_queue(), ^{
                finishBlock(images, duration);
            });
        }
    });
}
One point worth noting here: because each conversion is very fast, the videoBuffers are not released promptly, and converting several videos at once still runs into memory problems. An @autoreleasepool ensures each buffer is released in time:
@autoreleasepool {
    // Read a video sample
    CMSampleBufferRef videoBuffer = [videoReaderOutput copyNextSampleBuffer];
    if (!videoBuffer) {
        break;
    }
    [images addObject:[WKVideoConverter convertSampleBufferRefToUIImage:videoBuffer]];
    CFRelease(videoBuffer);
}
At this point the hard parts of the WeChat-style short video (as I see them, anyway) are solved. For the rest of the implementation, please see the demo, which can be downloaded here.
Pausing video recording: http://www.gdcl.co.uk/2013/02/20/iPhone-Pause.html
Fixing the green edge when cropping video: http://stackoverflow.com/questions/22883525/avassetexportsession-giving-me-a-green-border-on-right-and-bottom-of-output-vide
Video cropping: http://stackoverflow.com/questions/15737781/video-capture-with-11-aspect-ratio-in-ios/16910263#16910263
Converting CMSampleBufferRef to an image: https://developer.apple.com/library/ios/qa/qa1702/_index.html
WeChat short-video analysis: http://www.jianshu.com/p/3d5ccbde0de1
Thanks to the authors of the articles above.
That's all for this article. I hope it helps you in your own work.