Implementing WeChat Moments-Style Video Trimming on iOS
Preface
WeChat is everywhere these days and its features keep getting more powerful. You may have seen the video-trimming step when posting a video to Moments, or the similar editing feature when shooting video with Apple's own camera app (my guess is that WeChat modeled its version on Apple's). It is a genuinely handy feature, and since I have been digging into audio/video programming lately, I went ahead and implemented it myself.
The feature looks simple, but the implementation involved quite a few pitfalls. This post is partly a record of those, and partly a second pass over the implementation, which should also make the code easier to follow.
The result
Let's first look at what the finished feature looks like.
Implementation
Breaking down the implementation
The whole feature splits into three parts:
- Video playback
For this part we simply wrap a standalone video player.
- The scrubber view at the bottom
This part is fairly involved and itself splits into four pieces: the gray masks, the left and right drag handles, the two lines above and below the selection, and the thumbnail container view.
- Assembling the views in the controller and wiring up the logic
- Wrapping the video player
The player is built from three classes: AVPlayer, AVPlayerLayer, and AVPlayerItem. Since all of its events arrive through KVO, the wrapper exposes blocks so callers can observe them as well.
#import "FOFMoviePlayer.h"
@interface FOFMoviePlayer()
{
AVPlayerLooper *_playerLooper;
AVPlayerItem *_playItem;
BOOL _loop;
}
@property(nonatomic,strong)NSURL *url;
@property(nonatomic,strong)AVPlayer *player;
@property(nonatomic,strong)AVPlayerLayer *playerLayer;
@property(nonatomic,strong)AVPlayerItem *playItem;
@property (nonatomic,assign) CMTime duration;
@end
@implementation FOFMoviePlayer
-(instancetype)initWithFrame:(CGRect)frame url:(NSURL *)url superLayer:(CALayer *)superLayer{
self = [super init];
if (self) {
[self initplayers:superLayer];
_playerLayer.frame = frame;
self.url = url;
}
return self;
}
-(instancetype)initWithFrame:(CGRect)frame url:(NSURL *)url superLayer:(CALayer *)superLayer loop:(BOOL)loop{
self = [self initWithFrame:frame url:url superLayer:superLayer];
if (self) {
_loop = loop;
}
return self;
}
- (void)initplayers:(CALayer *)superLayer{
self.player = [[AVPlayer alloc] init];
self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
self.playerLayer.videoGravity = AVLayerVideoGravityResize;
[superLayer addSublayer:self.playerLayer];
}
- (void)initLoopPlayers:(CALayer *)superLayer{
self.player = [[AVQueuePlayer alloc] init];
self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
self.playerLayer.videoGravity = AVLayerVideoGravityResize;
[superLayer addSublayer:self.playerLayer];
}
-(void)fof_play{
[self.player play];
}
-(void)fof_pause{
[self.player pause];
}
#pragma mark - Observe
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary*)change context:(void *)context{
if ([keyPath isEqualToString:@"status"]) {
AVPlayerItem *item = (AVPlayerItem *)object;
AVPlayerItemStatus status = [[change objectForKey:@"new"] intValue]; // read the new status from the change dictionary
if (status == AVPlayerItemStatusReadyToPlay) {
_duration = item.duration;// only readable in this state, not right after the AVPlayerItem is created
NSLog(@"ready to play");
if (self.blockStatusReadyPlay) {
self.blockStatusReadyPlay(item);
}
} else if (status == AVPlayerItemStatusFailed) {
if (self.blockStatusFailed) {
self.blockStatusFailed();
}
AVPlayerItem *item = (AVPlayerItem *)object;
NSLog(@"%@",item.error);
NSLog(@"AVPlayerStatusFailed");
} else {
if (self.blockStatusUnknown) {
self.blockStatusUnknown();
}
NSLog(@"%@",item.error);
NSLog(@"AVPlayerStatusUnknown");
}
}else if ([keyPath isEqualToString:@"tracking"]){
NSInteger status = [change[@"new"] integerValue];
if (self.blockTracking) {
self.blockTracking(status);
}
if (status) {// dragging
[self.player pause];
}else{// drag ended
}
}else if ([keyPath isEqualToString:@"loadedTimeRanges"]){
NSArray *array = _playItem.loadedTimeRanges;
CMTimeRange timeRange = [array.firstObject CMTimeRangeValue];// the buffered time range
CGFloat startSeconds = CMTimeGetSeconds(timeRange.start);
CGFloat durationSeconds = CMTimeGetSeconds(timeRange.duration);
NSTimeInterval totalBuffer = startSeconds + durationSeconds;// total buffered length
double progress = totalBuffer/CMTimeGetSeconds(_duration);
if (self.blockLoadedTimeRanges) {
self.blockLoadedTimeRanges(progress);
}
NSLog(@"buffered so far: %f",totalBuffer);
}else if ([keyPath isEqualToString:@"playbackBufferEmpty"]){
NSLog(@"buffer ran dry, playback cannot continue!");
}else if ([keyPath isEqualToString:@"playbackLikelyToKeepUp"]){
if (self.blockPlaybackLikelyToKeepUp) {
self.blockPlaybackLikelyToKeepUp([change[@"new"] boolValue]);
}
}
}
-(void)setUrl:(NSURL *)url{
_url = url;
[self.player replaceCurrentItemWithPlayerItem:self.playItem];
}
-(AVPlayerItem *)playItem{
_playItem = [[AVPlayerItem alloc] initWithURL:_url];
//observe the item status: ready to play, failed, or unknown
[_playItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
// observe the buffered time ranges
[_playItem addObserver:self forKeyPath:@"loadedTimeRanges" options:NSKeyValueObservingOptionNew context:nil];
// observe the case where the buffer runs dry and the video stalls
[_playItem addObserver:self forKeyPath:@"playbackBufferEmpty" options:NSKeyValueObservingOptionNew context:nil];
// observe when the buffer is likely to keep playback going
[_playItem addObserver:self forKeyPath:@"playbackLikelyToKeepUp" options:NSKeyValueObservingOptionNew context:nil];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(private_playerMovieFinish) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
return _playItem;
}
- (void)private_playerMovieFinish{
NSLog(@"playback finished");
if (self.blockPlayToEndTime) {
self.blockPlayToEndTime();
}
if (_loop) {// a built-in loop-playback option
[self.player pause];
CMTime time = CMTimeMake(1, 1);
__weak typeof(self)this = self;
[self.player seekToTime:time completionHandler:^(BOOL finished) {
[this.player play];
}];
}
}
-(void)dealloc{
// KVO observers must be removed before the item goes away
[_playItem removeObserver:self forKeyPath:@"status"];
[_playItem removeObserver:self forKeyPath:@"loadedTimeRanges"];
[_playItem removeObserver:self forKeyPath:@"playbackBufferEmpty"];
[_playItem removeObserver:self forKeyPath:@"playbackLikelyToKeepUp"];
[[NSNotificationCenter defaultCenter] removeObserver:self];
NSLog(@"----- dealloc -----");
}
@end
I won't go into the player in depth here; I plan to write a separate post just about video players.
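For context, here is how a controller might wire this player up. This is only a sketch: the block property names follow the FOFMoviePlayer interface above, but `self.preview` and `videoURL` are assumed placeholders, not names from the original code.

```objc
// Sketch only; self.preview / videoURL are assumptions.
FOFMoviePlayer *player = [[FOFMoviePlayer alloc] initWithFrame:self.preview.bounds
                                                           url:videoURL
                                                    superLayer:self.preview.layer
                                                          loop:YES];
__weak FOFMoviePlayer *weakPlayer = player;
player.blockStatusReadyPlay = ^(AVPlayerItem *item) {
    [weakPlayer fof_play]; // start playback once the item is ready
};
player.blockStatusFailed = ^{
    NSLog(@"failed to load the video");
};
```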
The scrubber view at the bottom
The gray masks
The gray masks are simple; a plain UIView each is enough.
self.leftMaskView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 0, height)];
self.leftMaskView.backgroundColor = [UIColor grayColor];
self.leftMaskView.alpha = 0.8;
[self addSubview:self.leftMaskView];
self.rightMaskView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 0, height)];
self.rightMaskView.backgroundColor = [UIColor grayColor];
self.rightMaskView.alpha = 0.8;
The two lines above and below the selection
These two lines got their own view, Line. At first I tried a plain UIView here as well, but then the line did not keep pace with the moving handle: the line noticeably lagged behind.
@implementation Line
-(void)setBeginPoint:(CGPoint)beginPoint{
_beginPoint = beginPoint;
[self setNeedsDisplay];
}
-(void)setEndPoint:(CGPoint)endPoint{
_endPoint = endPoint;
[self setNeedsDisplay];
}
- (void)drawRect:(CGRect)rect {
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetLineWidth(context, 3);
CGContextSetStrokeColorWithColor(context, [UIColor colorWithWhite:0.9 alpha:1].CGColor);
CGContextMoveToPoint(context, self.beginPoint.x, self.beginPoint.y);
CGContextAddLineToPoint(context, self.endPoint.x, self.endPoint.y);
CGContextStrokePath(context);
}
The thumbnail container view
VideoPieces is the view that assembles the handles, lines, and masks, and it also displays the thumbnails. Since there are only 10 images, this is just a for loop adding 10 UIImageViews.
@interface VideoPieces()
{
CGPoint _beginPoint;
}
@property(nonatomic,strong) Haft *leftHaft;
@property(nonatomic,strong) Haft *rightHaft;
@property(nonatomic,strong) Line *topLine;
@property(nonatomic,strong) Line *bottomLine;
@property(nonatomic,strong) UIView *leftMaskView;
@property(nonatomic,strong) UIView *rightMaskView;
@end
@implementation VideoPieces
-(instancetype)initWithFrame:(CGRect)frame{
self = [super initWithFrame:frame];
if (self) {
[self initSubViews:frame];
}
return self;
}
- (void)initSubViews:(CGRect)frame{
CGFloat height = CGRectGetHeight(frame);
CGFloat width = CGRectGetWidth(frame);
CGFloat minGap = 30;
CGFloat widthHaft = 10;
CGFloat heightLine = 3;
_leftHaft = [[Haft alloc] initWithFrame:CGRectMake(0, 0, widthHaft, height)];
_leftHaft.alpha = 0.8;
_leftHaft.backgroundColor = [UIColor colorWithWhite:0.9 alpha:1];
_leftHaft.rightEdgeInset = 20;
_leftHaft.lefEdgeInset = 5;
__weak typeof(self) this = self;
[_leftHaft setBlockMove:^(CGPoint point) {
CGFloat maxX = this.rightHaft.frame.origin.x-minGap;
if (point.x<=maxX&&point.x>=0) {
this.topLine.beginPoint = CGPointMake(point.x, heightLine/2.0);
this.bottomLine.beginPoint = CGPointMake(point.x, heightLine/2.0);
this.leftHaft.frame = CGRectMake(point.x, 0, widthHaft, height);
this.leftMaskView.frame = CGRectMake(0, 0, point.x, height);
if (this.blockSeekOffLeft) {
this.blockSeekOffLeft(point.x);
}
}
}];
_rightHaft = [[Haft alloc] initWithFrame:CGRectMake(width-widthHaft, 0, widthHaft, height)];
_rightHaft.alpha = 0.8;
_rightHaft.backgroundColor = [UIColor colorWithWhite:0.9 alpha:1];
_rightHaft.lefEdgeInset = 20;
_rightHaft.rightEdgeInset = 5;
[_rightHaft setBlockMove:^(CGPoint point) {
CGFloat minX = CGRectGetMaxX(this.leftHaft.frame)+minGap;
if (point.x>=minX&&point.x<=width-widthHaft) {
this.topLine.endPoint = CGPointMake(point.x-widthHaft, heightLine/2.0);
this.bottomLine.endPoint = CGPointMake(point.x-widthHaft, heightLine/2.0);
this.rightHaft.frame = CGRectMake(point.x, 0, widthHaft, height);
this.rightMaskView.frame = CGRectMake(point.x+widthHaft, 0, width-point.x-widthHaft, height);
if (this.blockSeekOffRight) {
this.blockSeekOffRight(point.x);
}
}
}];
[_rightHaft setBlockMoveEnd:^{
if (this.blockMoveEnd) {
this.blockMoveEnd();
}
}];
_topLine = [[Line alloc] init];
_topLine.alpha = 0.8;
_topLine.frame = CGRectMake(widthHaft, 0, width-2*widthHaft, heightLine);
_topLine.beginPoint = CGPointMake(0, heightLine/2.0);
_topLine.endPoint = CGPointMake(CGRectGetWidth(_topLine.bounds), heightLine/2.0);
_topLine.backgroundColor = [UIColor clearColor];
[self addSubview:_topLine];
_bottomLine = [[Line alloc] init];
_bottomLine.alpha = 0.8;
_bottomLine.frame = CGRectMake(widthHaft, height-heightLine, width-2*widthHaft, heightLine);
_bottomLine.beginPoint = CGPointMake(0, heightLine/2.0);
_bottomLine.endPoint = CGPointMake(CGRectGetWidth(_bottomLine.bounds), heightLine/2.0);
_bottomLine.backgroundColor = [UIColor clearColor];
[self addSubview:_bottomLine];
[self addSubview:_leftHaft];
[self addSubview:_rightHaft];
self.leftMaskView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 0, height)];
self.leftMaskView.backgroundColor = [UIColor grayColor];
self.leftMaskView.alpha = 0.8;
[self addSubview:self.leftMaskView];
self.rightMaskView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 0, height)];
self.rightMaskView.backgroundColor = [UIColor grayColor];
self.rightMaskView.alpha = 0.8;
[self addSubview:self.rightMaskView];
}
-(void)touchesBegan:(NSSet*)touches withEvent:(UIEvent *)event{
UITouch *touch = touches.anyObject;
_beginPoint = [touch locationInView:self];
}
@end
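The for loop that adds the 10 UIImageViews is not shown in the excerpt above; a minimal sketch of what it might look like inside VideoPieces follows, where `updateImages:` is an assumed helper name, not from the original post.

```objc
// Sketch: lay the thumbnails edge to edge behind the handles/lines/masks.
- (void)updateImages:(NSArray<UIImage *> *)images {
    CGFloat w = CGRectGetWidth(self.bounds)/images.count;
    CGFloat h = CGRectGetHeight(self.bounds);
    for (NSInteger i = 0; i < images.count; i++) {
        UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(i*w, 0, w, h)];
        imageView.image = images[i];
        [self insertSubview:imageView atIndex:0]; // keep the overlay views on top
    }
}
```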
Implementing the handles
One refinement here concerns sensitivity: at first, dragging was not responsive enough, and a finger would often move without the handle following.
The fix is simply to enlarge the area that receives touch events, by overriding -(BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event:
@implementation Haft
-(instancetype)initWithFrame:(CGRect)frame{
self = [super initWithFrame:frame];
if (self) {
self.userInteractionEnabled = true;
}
return self;
}
-(BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event{
CGRect rect = CGRectMake(self.bounds.origin.x-self.lefEdgeInset, self.bounds.origin.y-self.topEdgeInset, CGRectGetWidth(self.bounds)+self.lefEdgeInset+self.rightEdgeInset, CGRectGetHeight(self.bounds)+self.bottomEdgeInset+self.topEdgeInset);
if (CGRectContainsPoint(rect, point)) {
return YES;
}
return NO;
}
-(void)touchesBegan:(NSSet*)touches withEvent:(UIEvent *)event{
NSLog(@"touches began");
}
-(void)touchesMoved:(NSSet*)touches withEvent:(UIEvent *)event{
NSLog(@"touches moved");
UITouch *touch = touches.anyObject;
CGPoint point = [touch locationInView:self.superview];
CGFloat maxX = CGRectGetWidth(self.superview.bounds)-CGRectGetWidth(self.bounds);
if (point.x>maxX) {
point.x = maxX;
}
if (point.x>=0&&point.x<=(CGRectGetWidth(self.superview.bounds)-CGRectGetWidth(self.bounds))&&self.blockMove) {
self.blockMove(point);
}
}
-(void)touchesEnded:(NSSet*)touches withEvent:(UIEvent *)event{
if (self.blockMoveEnd) {
self.blockMoveEnd();
}
}
- (void)drawRect:(CGRect)rect {
CGFloat width = CGRectGetWidth(self.bounds);
CGFloat height = CGRectGetHeight(self.bounds);
CGFloat lineWidth = 1.5;
CGFloat lineHeight = 12;
CGFloat gap = (width-lineWidth*2)/3.0;
CGFloat lineY = (height-lineHeight)/2.0;
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetLineWidth(context, lineWidth);
CGContextSetStrokeColorWithColor(context, [[UIColor grayColor] colorWithAlphaComponent:0.8].CGColor);
CGContextMoveToPoint(context, gap+lineWidth/2, lineY);
CGContextAddLineToPoint(context, gap+lineWidth/2, lineY+lineHeight);
CGContextStrokePath(context);
CGContextSetLineWidth(context, lineWidth);
CGContextSetStrokeColorWithColor(context, [[UIColor grayColor] colorWithAlphaComponent:0.8].CGColor);
CGContextMoveToPoint(context, gap*2+lineWidth+lineWidth/2, lineY);
CGContextAddLineToPoint(context, gap*2+lineWidth+lineWidth/2, lineY+lineHeight);
CGContextStrokePath(context);
}
@end
Assembling the views in the controller and wiring up the logic
This part of the logic is the most important and the most complex.
Grabbing the 10 thumbnails
- (NSArray *)getVideoThumbnail:(NSString *)path count:(NSInteger)count splitCompleteBlock:(void(^)(BOOL success, NSMutableArray *splitimgs))splitCompleteBlock {
AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:path]];
NSMutableArray *arrayImages = [NSMutableArray array];
[asset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
// generator.maximumSize = CGSizeMake(480,136);// with CGSizeMake(480,136) the returned image is {240, 136}: scaled down in proportion to the source
generator.appliesPreferredTrackTransform = YES;// ensures the frames come out with the correct orientation (some videos are stored rotated)
/** Both tolerances need to be set; without them the captured times can be noticeably off, with them the error is tiny **/
generator.requestedTimeToleranceAfter = kCMTimeZero;
generator.requestedTimeToleranceBefore = kCMTimeZero;
Float64 seconds = CMTimeGetSeconds(asset.duration);
NSMutableArray *array = [NSMutableArray array];
for (int i = 0; i < count; i++) {
CMTime time = CMTimeMakeWithSeconds(i*(seconds/count),1);// the timestamp to capture a frame at
[array addObject:[NSValue valueWithCMTime:time]];
}
__block int i = 0;
[generator generateCGImagesAsynchronouslyForTimes:array completionHandler:^(CMTime requestedTime, CGImageRef _Nullable imageRef, CMTime actualTime, AVAssetImageGeneratorResult result, NSError * _Nullable error) {
i++;
if (result==AVAssetImageGeneratorSucceeded) {
UIImage *image = [UIImage imageWithCGImage:imageRef];
[arrayImages addObject:image];
}else{
NSLog(@"獲取圖片失?。。?!");
}
if (i==count) {
dispatch_async(dispatch_get_main_queue(), ^{
splitCompleteBlock(YES,arrayImages);
});
}
}];
}];
return arrayImages;
}
The 10 thumbnails are easy to obtain, but note one thing: dispatch the completion block asynchronously onto the main queue, otherwise the images show up with a noticeable delay.
Listening to the two handles
[_videoPieces setBlockSeekOffLeft:^(CGFloat offX) {
this.seeking = true;
[this.moviePlayer fof_pause];
this.lastStartSeconds = this.totalSeconds*offX/CGRectGetWidth(this.videoPieces.bounds);
[this.moviePlayer.player seekToTime:CMTimeMakeWithSeconds(this.lastStartSeconds, 1) toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
}];
[_videoPieces setBlockSeekOffRight:^(CGFloat offX) {
this.seeking = true;
[this.moviePlayer fof_pause];
this.lastEndSeconds = this.totalSeconds*offX/CGRectGetWidth(this.videoPieces.bounds);
[this.moviePlayer.player seekToTime:CMTimeMakeWithSeconds(this.lastEndSeconds, 1) toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
}];
These handlers convert each handle's horizontal offset into a timestamp, which sets the player's start and end time.
Loop playback
self.timeObserverToken = [self.moviePlayer.player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(0.5, NSEC_PER_SEC) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
if (!this.seeking) {
if (fabs(CMTimeGetSeconds(time)-this.lastEndSeconds)<=0.02) {
[this.moviePlayer fof_pause];
[this private_replayAtBeginTime:this.lastStartSeconds];
}
}
}];
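`private_replayAtBeginTime:` is called here but its body is not shown in the post; a plausible sketch (implementation assumed) is a zero-tolerance seek followed by play:

```objc
// Assumed implementation of the replay helper used above.
- (void)private_replayAtBeginTime:(CGFloat)beginSeconds {
    __weak typeof(self) this = self;
    [self.moviePlayer.player seekToTime:CMTimeMakeWithSeconds(beginSeconds, 1)
                        toleranceBefore:kCMTimeZero
                         toleranceAfter:kCMTimeZero
                      completionHandler:^(BOOL finished) {
        if (finished) {
            [this.moviePlayer fof_play];
        }
    }];
}
```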
Two things to note here:
1. The observer added with addPeriodicTimeObserverForInterval must be removed, otherwise it leaks memory:
-(void)dealloc{
[self.moviePlayer.player removeTimeObserver:self.timeObserverToken];
}
2. The time observer watches the playback position and checks whether it has reached the time chosen with the right handle; if it has, playback restarts from the left handle's time. I puzzled for a long time over how to "trim while previewing" and almost went down the wrong path of actually cutting the video on every change. There is no need: during preview you only control the start and end times, and you perform a single trim at the very end.
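That final one-time trim is not shown in the post; under the approach described above it would be a single AVAssetExportSession pass over the selected range. A minimal sketch, where names like `videoURL` and `outputURL` are assumptions:

```objc
// Sketch: export the selected [lastStartSeconds, lastEndSeconds] range once.
AVAsset *asset = [AVAsset assetWithURL:videoURL];
AVAssetExportSession *session =
    [AVAssetExportSession exportSessionWithAsset:asset
                                      presetName:AVAssetExportPresetHighestQuality];
session.outputURL = outputURL;
session.outputFileType = AVFileTypeMPEG4;
CMTime start = CMTimeMakeWithSeconds(self.lastStartSeconds, 600);
CMTime duration = CMTimeMakeWithSeconds(self.lastEndSeconds - self.lastStartSeconds, 600);
session.timeRange = CMTimeRangeMake(start, duration);
[session exportAsynchronouslyWithCompletionHandler:^{
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"trim finished: %@", outputURL);
    }
}];
```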
Summary
Implementing this WeChat-style short-video editor threw up quite a few small problems, but after some careful digging everything came together in the end, which was a relief.
Source code