
Implementing a single-threaded asynchronous video player in C with ffmpeg

Updated: 2022-12-16 10:22:46   Author: CodeOfCC
This article walks through how to implement a single-threaded, asynchronous video player in C with ffmpeg. The sample code is explained in detail; readers who are interested are encouraged to try it out.

Preface

ffplay is a decent player, built on multiple threads: playing one video generally takes at least 4 threads (packet reading, video decoding, audio decoding, video rendering). For multi-channel playback the thread count adds up quickly; playing 8 streams, for example, takes 32 threads, which is a noticeable performance cost. This led to the idea of a single-threaded player. Practice shows it is feasible: local files can be played entirely on one thread, while network streams need one extra thread to make packet reading asynchronous.

I. Playback flow

II. Key implementation

Because the player runs on a single thread, a few details deserve attention.

1. Video

(1) Decoding

When decoding, enable multi-threaded decoding or hardware decoding to guarantee decoding speed: on a single thread, slow decoding shows up directly as stuttering video.

//Enable multi-threaded decoding
if (!av_dict_get(opts, "threads", NULL, 0))
    av_dict_set(&opts, "threads", "auto", 0);
//Open the decoder
if (avcodec_open2(decoder->codecContext, codec, &opts) < 0) {
    LOG_ERROR("Could not open codec");
    av_dict_free(&opts);
    return ERRORCODE_DECODER_OPENFAILED;
}

Or, depending on the situation, select a hardware decoder:

//Select a hardware decoder (here Intel Quick Sync for HEVC); note that
//avcodec_find_decoder_by_name returns NULL if the build does not include it
codec = avcodec_find_decoder_by_name("hevc_qsv");
//Open the decoder
if (avcodec_open2(decoder->codecContext, codec, &opts) < 0) {
    LOG_ERROR("Could not open codec");
    av_dict_free(&opts);
    return ERRORCODE_DECODER_OPENFAILED;
}

2. Audio

(1) Clock correction

Audio playback is stream-based, so the clock could in principle be derived from the amount of data played. But packet loss or seeking makes a purely cumulative count drift, so the clock should be corrected whenever decoded data is pushed into the play queue.

For synchronize_setClockTime, see the author's earlier article《c語言 將音視頻時(shí)鐘同步封裝成通用模塊》(wrapping A/V clock synchronization into a reusable C module). After audio decoding:

//Read a decoded audio frame
av_fifo_generic_read(play->audio.decoder.fifoFrame, &frame, sizeof(AVFrame*), NULL);
//Synchronize (correct) the clock
AVRational timebase = play->formatContext->streams[audio->decoder.streamIndex]->time_base;
//Timestamp of the current frame
double pts = (double)frame->pts * timebase.num / timebase.den;
//Subtracting the duration of the data still in the play queue gives the current audio clock
pts -= (double)av_audio_fifo_size(play->audio.playFifo) / play->audio.spec.freq;
synchronize_setClockTime(&play->synchronize, &play->synchronize.audio, pts);
//Synchronize (correct) the clock--end
//Write into the play queue
av_audio_fifo_write(play->audio.playFifo, (void**)&data, samples);

3. Clock synchronization

Clock synchronization happens in three places. One is after audio decoding, i.e. section 2 (1) above; the other two are audio playback and video rendering.

(1) Audio playback

For synchronize_updateAudio, see《c語言 將音視頻時(shí)鐘同步封裝成通用模塊》.

//SDL audio callback
static void audio_callback(void* userdata, uint8_t* stream, int len) {
    Play* play = (Play*)userdata;
    //Amount of data that needs to be written
    int samples = play->audio.spec.samples;
    //Clock sync: get the amount that should be written. When syncing to audio, the two are always equal.
    samples = synchronize_updateAudio(&play->synchronize, samples, play->audio.spec.freq);
    //omitted
}

(2) Video rendering

At the video rendering site, implement the following; for synchronize_updateVideo, see《c語言 將音視頻時(shí)鐘同步封裝成通用模塊》.

//---------------Clock synchronization--------------
AVRational timebase = play->formatContext->streams[video->decoder.streamIndex]->time_base;
//Compute the video frame's pts
double pts = frame->pts * (double)timebase.num / timebase.den;
//Duration of the video frame
double duration = frame->pkt_duration * (double)timebase.num / timebase.den;
double delay = synchronize_updateVideo(&play->synchronize, pts, duration);
if (delay > 0)
    //Delay
{
    play->wakeupTime = getCurrentTime() + delay;
    return 0;
}
else if (delay < 0)
    //Drop the frame
{
    av_fifo_generic_read(video->decoder.fifoFrame, &frame, sizeof(AVFrame*), NULL);
    av_frame_unref(frame);
    av_frame_free(&frame);
    return 0;
}
else
    //Render
{
    av_fifo_generic_read(video->decoder.fifoFrame, &frame, sizeof(AVFrame*), NULL);
}
//---------------Clock synchronization-------------- end

4. Asynchronous packet reading

Playing a local file works fine fully single-threaded. For network streams, however, av_read_frame is not asynchronous, and under poor network conditions it can block long enough to stall the rest of the player, so packet reading has to be moved to a worker thread. The async/await idea is borrowed here to implement the asynchrony.

(1) async

Run av_read_frame in a thread pool.

//Read a packet asynchronously; this method is called on a worker thread
static int packet_readAsync(void* arg)
{
    Play* play = (Play*)arg;
    play->eofPacket = av_read_frame(play->formatContext, &play->packet);
    //Return to the player thread to handle the packet
    play_beginInvoke(play, packet_readAwait, play);
    return 0;
}

(2) await

When the read completes, the worker notifies the player thread through the message queue so that the follow-up work runs on the player thread.

//Follow-up work after the asynchronous read completes
static int packet_readAwait(void* arg)
{
    Play* play = (Play*)arg;
    if (play->eofPacket == 0)
    {
        if (play->packet.stream_index == play->video.decoder.streamIndex)
            //Write to the video packet queue
        {
            AVPacket* packet = av_packet_clone(&play->packet);
            av_fifo_generic_write(play->video.decoder.fifoPacket, &packet, sizeof(AVPacket*), NULL);
        }
        else if (play->packet.stream_index == play->audio.decoder.streamIndex)
            //Write to the audio packet queue
        {
            AVPacket* packet = av_packet_clone(&play->packet);
            av_fifo_generic_write(play->audio.decoder.fifoPacket, &packet, sizeof(AVPacket*), NULL);
        }
        av_packet_unref(&play->packet);
    }
    else if (play->eofPacket == AVERROR_EOF)
    {
        play->eofPacket = 1;
        //Write an empty packet to flush the frames cached in the decoder
        AVPacket* packet = &play->packet;
        if (play->audio.decoder.fifoPacket)
            av_fifo_generic_write(play->audio.decoder.fifoPacket, &packet, sizeof(AVPacket*), NULL);
        if (play->video.decoder.fifoPacket)
            av_fifo_generic_write(play->video.decoder.fifoPacket, &packet, sizeof(AVPacket*), NULL);
    }
    else
    {
        LOG_ERROR("read packet error!\n");
        play->exitFlag = 1;
        play->isAsyncReading = 0;
        return ERRORCODE_PACKET_READFRAMEFAILED;
    }
    play->isAsyncReading = 0;
    return 0;
}

(3) Message handling

Call the following method on the player thread to process events. Once the await callback has been posted to the message queue, the message loop picks it up and executes it on the player thread.

//Event handling
static void play_eventHandler(Play* play) {
    PlayMessage msg;
    while (messageQueue_poll(&play->mq, &msg)) {
        switch (msg.type)
        {
        case PLAYMESSAGETYPE_INVOKE: {
            SDL_ThreadFunction fn = (SDL_ThreadFunction)msg.param1;
            fn(msg.param2);
            break;
        }
        }
    }
}

III. Complete code

The complete code builds as either C or C++, and uses ffmpeg 4.3 and SDL2.

main.c/cpp

#include <stdio.h>
#include <stdint.h>
#include <string.h>
#include <time.h>
#include "SDL.h"
#ifdef  __cplusplus
extern "C" {
#endif 
#include "libavformat/avformat.h"
#include "libavcodec/avcodec.h"
#include "libswscale/swscale.h"
#include "libavutil/imgutils.h"
#include "libavutil/avutil.h"
#include "libavutil/time.h"
#include "libavutil/audio_fifo.h"
#include "libswresample/swresample.h"
#ifdef  __cplusplus
}
#endif 

/************************************************************************
* @Project:  	play
* @Description: Video player
* A single-threaded video player. Playing a local file can be done entirely
* on one thread; for network streams packet reading is asynchronous, but the
* main flow still runs on a single thread. Packet reading is currently always
* asynchronous; synchronous reading for local files is not special-cased yet.
* @Version:  	v0.0.0
* @Author:  	Xin Nie
* @Create:  	2022/12/12 21:21:00
* @LastUpdate:  2022/12/12 21:21:00
************************************************************************
* Copyright @ 2022. All rights reserved.
************************************************************************/


/// <summary>
/// 消息隊(duì)列
/// </summary>
typedef struct {
	//隊(duì)列長度
	int _capacity;
	//消息對象大小
	int _elementSize;
	//隊(duì)列
	AVFifoBuffer* _queue;
	//互斥變量
	SDL_mutex* _mtx;
	//條件變量
	SDL_cond* _cv;
}MessageQueue;
/// <summary>
/// 對象池
/// </summary>
typedef struct {
	//對象緩存
	void* buffer;
	//對象大小
	int elementSize;
	//對象個(gè)數(shù)
	int arraySize;
	//對象使用狀態(tài)1使用,0未使用
	int* _arrayUseState;
	//互斥變量
	SDL_mutex* _mtx;
	//條件變量
	SDL_cond* _cv;
}OjectPool;
/// <summary>
/// 線程池
/// </summary>
typedef struct {
	//最大線程數(shù)
	int maxThreadCount;
	//線程信息對象池
	OjectPool _pool;
}ThreadPool;
/// <summary>
/// 線程信息
/// </summary>
typedef struct {
	//所屬線程池
	ThreadPool* _threadPool;
	//線程句柄
	SDL_Thread* _thread;
	//消息隊(duì)列
	MessageQueue _queue;
	//線程回調(diào)方法
	SDL_ThreadFunction _fn;
	//線程回調(diào)參數(shù)
	void* _arg;
}ThreadInfo;
//解碼器
typedef  struct {
	//解碼上下文
	AVCodecContext* codecContext;
	//解碼器
	const AVCodec* codec;
	//解碼臨時(shí)幀
	AVFrame* frame;
	//包隊(duì)列
	AVFifoBuffer* fifoPacket;
	//幀隊(duì)列
	AVFifoBuffer* fifoFrame;
	//流下標(biāo)
	int	streamIndex;
	//解碼結(jié)束標(biāo)記
	int eofFrame;
}Decoder;
/// <summary>
/// 時(shí)鐘對象
/// </summary>
typedef  struct {
	//起始時(shí)間
	double startTime;
	//當(dāng)前pts
	double currentPts;
}Clock;
/// <summary>
/// 時(shí)鐘同步類型
/// </summary>
typedef enum {
	//同步到音頻
	SYNCHRONIZETYPE_AUDIO,
	//同步到視頻
	SYNCHRONIZETYPE_VIDEO,
	//同步到絕對時(shí)鐘
	SYNCHRONIZETYPE_ABSOLUTE
}SynchronizeType;
/// <summary>
/// 時(shí)鐘同步對象
/// </summary>
typedef  struct {
	/// <summary>
	/// 音頻時(shí)鐘
	/// </summary>
	Clock audio;
	/// <summary>
	/// 視頻時(shí)鐘
	/// </summary>
	Clock video;
	/// <summary>
	/// 絕對時(shí)鐘
	/// </summary>
	Clock absolute;
	/// <summary>
	/// 時(shí)鐘同步類型
	/// </summary>
	SynchronizeType type;
	/// <summary>
	/// 估算的視頻幀時(shí)長
	/// </summary>
	double estimateVideoDuration;
	/// <summary>
	/// 估算視頻幀數(shù)
	/// </summary>
	double n;
}Synchronize;
//視頻模塊
typedef  struct {
	//解碼器
	Decoder decoder;
	//輸出格式
	enum AVPixelFormat forcePixelFormat;
	//重采樣對象
	struct SwsContext* swsContext;
	//重采樣緩存
	uint8_t* swsBuffer;
	//渲染器
	SDL_Renderer* sdlRenderer;
	//紋理
	SDL_Texture* sdlTexture;
	//窗口
	SDL_Window* screen;
	//窗口寬
	int screen_w;
	//窗口高
	int	screen_h;
	//旋轉(zhuǎn)角度
	 double  angle;
	//播放結(jié)束標(biāo)記
	int eofDisplay;
	//播放開始標(biāo)記
	int sofDisplay;
}Video;

//音頻模塊
typedef  struct {
	//解碼器
	Decoder decoder;
	//輸出格式
	enum AVSampleFormat forceSampleFormat;
	//音頻設(shè)備id
	SDL_AudioDeviceID audioId;
	//期望的音頻設(shè)備參數(shù)
	SDL_AudioSpec wantedSpec;
	//實(shí)際的音頻設(shè)備參數(shù)
	SDL_AudioSpec spec;
	//重采樣對象
	struct SwrContext* swrContext;
	//重采樣緩存
	uint8_t* swrBuffer;
	//播放隊(duì)列
	AVAudioFifo* playFifo;
	//播放隊(duì)列互斥鎖
	SDL_mutex* mutex;
	//累積的待播放采樣數(shù)
	int accumulateSamples;
	//音量
	int volume;
	//聲音混合buffer
	uint8_t* mixBuffer;
	//播放結(jié)束標(biāo)記
	int eofPlay;
	//播放開始標(biāo)記
	int sofPlay;
}Audio;

//播放器
typedef  struct {
	//視頻url
	char* url;
	//解復(fù)用上下文
	AVFormatContext* formatContext;
	//包
	AVPacket packet;
	//是否正在讀取包
	int isAsyncReading;
	//包讀取結(jié)束標(biāo)記
	int eofPacket;
	//視頻模塊
	Video video;
	//音頻模塊
	Audio audio;
	//時(shí)鐘同步
	Synchronize synchronize;
	//延時(shí)結(jié)束時(shí)間
	double wakeupTime;
	//播放一幀
	int step;
	//是否暫停
	int isPaused;
	//是否循環(huán)
	int isLoop;
	//退出標(biāo)記
	int exitFlag;
	//消息隊(duì)列
	MessageQueue mq;
}Play;

//播放消息類型
typedef enum {
	//調(diào)用方法
	PLAYMESSAGETYPE_INVOKE
}PlayMessageType;

//播放消息
typedef  struct {
	PlayMessageType type;
	void* param1;
	void* param2;
}PlayMessage;

//格式映射
static const struct TextureFormatEntry {
	enum AVPixelFormat format;
	int texture_fmt;
} sdl_texture_format_map[] = {
	{ AV_PIX_FMT_RGB8, SDL_PIXELFORMAT_RGB332 },
	{ AV_PIX_FMT_RGB444, SDL_PIXELFORMAT_RGB444 },
	{ AV_PIX_FMT_RGB555, SDL_PIXELFORMAT_RGB555 },
	{ AV_PIX_FMT_BGR555, SDL_PIXELFORMAT_BGR555 },
	{ AV_PIX_FMT_RGB565, SDL_PIXELFORMAT_RGB565 },
	{ AV_PIX_FMT_BGR565, SDL_PIXELFORMAT_BGR565 },
	{ AV_PIX_FMT_RGB24, SDL_PIXELFORMAT_RGB24 },
	{ AV_PIX_FMT_BGR24, SDL_PIXELFORMAT_BGR24 },
	{ AV_PIX_FMT_0RGB32, SDL_PIXELFORMAT_RGB888 },
	{ AV_PIX_FMT_0BGR32, SDL_PIXELFORMAT_BGR888 },
	{ AV_PIX_FMT_NE(RGB0, 0BGR), SDL_PIXELFORMAT_RGBX8888 },
	{ AV_PIX_FMT_NE(BGR0, 0RGB), SDL_PIXELFORMAT_BGRX8888 },
	{ AV_PIX_FMT_RGB32, SDL_PIXELFORMAT_ARGB8888 },
	{ AV_PIX_FMT_RGB32_1, SDL_PIXELFORMAT_RGBA8888 },
	{ AV_PIX_FMT_BGR32, SDL_PIXELFORMAT_ABGR8888 },
	{ AV_PIX_FMT_BGR32_1, SDL_PIXELFORMAT_BGRA8888 },
	{ AV_PIX_FMT_YUV420P, SDL_PIXELFORMAT_IYUV },
	{ AV_PIX_FMT_YUYV422, SDL_PIXELFORMAT_YUY2 },
	{ AV_PIX_FMT_UYVY422, SDL_PIXELFORMAT_UYVY },
	{ AV_PIX_FMT_NONE, SDL_PIXELFORMAT_UNKNOWN },
};

/// <summary>
/// 錯(cuò)誤碼
/// </summary>
typedef  enum {
	//無錯(cuò)誤
	ERRORCODE_NONE = 0,
	//播放
	ERRORCODE_PLAY_OPENINPUTSTREAMFAILED = -0xffff,//打開輸入流失敗
	ERRORCODE_PLAY_VIDEOINITFAILED,//視頻初始化失敗
	ERRORCODE_PLAY_AUDIOINITFAILED,//音頻初始化失敗
	ERRORCODE_PLAY_LOOPERROR,//播放循環(huán)錯(cuò)誤
	ERRORCODE_PLAY_READPACKETERROR,//解包錯(cuò)誤
	ERRORCODE_PLAY_VIDEODECODEERROR,//視頻解碼錯(cuò)誤
	ERRORCODE_PLAY_AUDIODECODEERROR,//音頻解碼錯(cuò)誤
	ERRORCODE_PLAY_VIDEODISPLAYERROR,//視頻播放錯(cuò)誤
	ERRORCODE_PLAY_AUDIOPLAYERROR,//音頻播放錯(cuò)誤
	//解包
	ERRORCODE_PACKET_CANNOTOPENINPUTSTREAM,//無法打開輸入流
	ERRORCODE_PACKET_CANNOTFINDSTREAMINFO,//查找不到流信息
	ERRORCODE_PACKET_DIDNOTFINDDANYSTREAM,//找不到任何流
	ERRORCODE_PACKET_READFRAMEFAILED,//讀取包失敗
	//解碼
	ERRORCODE_DECODER_CANNOTALLOCATECONTEXT,//解碼器上下文申請內(nèi)存失敗
	ERRORCODE_DECODER_SETPARAMFAILED,//解碼器上下文設(shè)置參數(shù)失敗
	ERRORCODE_DECODER_CANNOTFINDDECODER,//找不到解碼器
	ERRORCODE_DECODER_OPENFAILED,//打開解碼器失敗
	ERRORCODE_DECODER_SENDPACKEDFAILED,//解碼失敗
	ERRORCODE_DECODER_MISSINGASTREAMTODECODE,//缺少用于解碼的流
	//視頻
	ERRORCODE_VIDEO_DECODERINITFAILED,//音頻解碼器初始化失敗
	ERRORCODE_VIDEO_CANNOTGETSWSCONTEX,//無法獲取ffmpeg swsContext
	ERRORCODE_VIDEO_IMAGEFILLARRAYFAILED,//將圖像數(shù)據(jù)映射到數(shù)組時(shí)失?。篴v_image_fill_arrays
	ERRORCODE_VIDEO_CANNOTRESAMPLEAFRAME,//無法重采樣視頻幀
	ERRORCODE_VIDEO_MISSINGSTREAM,//缺少視頻流
	//音頻
	ERRORCODE_AUDIO_DECODERINITFAILED,//音頻解碼器初始化失敗
	ERRORCODE_AUDIO_UNSUPORTDEVICESAMPLEFORMAT,//不支持音頻設(shè)備采樣格式
	ERRORCODE_AUDIO_SAMPLESSIZEINVALID,//采樣大小不合法
	ERRORCODE_AUDIO_MISSINGSTREAM,//缺少音頻流
	ERRORCODE_AUDIO_SWRINITFAILED,//ffmpeg swr重采樣對象初始化失敗
	ERRORCODE_AUDIO_CANNOTCONVERSAMPLE,//音頻重采樣失敗
	ERRORCODE_AUDIO_QUEUEISEMPTY,//隊(duì)列數(shù)據(jù)為空
	//幀
	ERRORCODE_FRAME_ALLOCFAILED,//初始化幀失敗
	//隊(duì)列
	ERRORCODE_FIFO_ALLOCFAILED,//初始化隊(duì)列失敗
	//sdl
	ERRORCODE_SDL_INITFAILED,//sdl初始化失敗
	ERRORCODE_SDL_CANNOTCREATEMUTEX,//無法創(chuàng)建互斥鎖
	ERRORCODE_SDL_CANNOTOPENDEVICE, //無法打開音頻設(shè)備
	ERRORCODE_SDL_CREATEWINDOWFAILED,//創(chuàng)建窗口失敗
	ERRORCODE_SDL_CREATERENDERERFAILED,//創(chuàng)建渲染器失敗
	ERRORCODE_SDL_CREATETEXTUREFAILED,//創(chuàng)建紋理失敗
	//內(nèi)存
	ERRORCODE_MEMORY_ALLOCFAILED,//申請內(nèi)存失敗
	ERRORCODE_MEMORY_LEAK,//內(nèi)存泄漏
	//參數(shù)
	ERRORCODE_ARGUMENT_INVALID,//參數(shù)不合法
	ERRORCODE_ARGUMENT_OUTOFRANGE,//超出范圍
}ErrorCode;

/// <summary>
/// 日志等級
/// </summary>
typedef  enum {
	LOGLEVEL_NONE = 0,
	LOGLEVEL_INFO = 1,
	LOGLEVEL_DEBUG = 2,
	LOGLEVEL_TRACE = 4,
	LOGLEVEL_WARNNING = 8,
	LOGLEVEL_ERROR = 16,
	LOGLEVEL_ALL = LOGLEVEL_INFO | LOGLEVEL_DEBUG | LOGLEVEL_TRACE | LOGLEVEL_WARNNING | LOGLEVEL_ERROR
}
LogLevel;

//輸出日志
#define LOGHELPERINTERNALLOG(message,level,...)  aclog(__FILE__,__FUNCTION__,__LINE__,level,message,##__VA_ARGS__)  
#define LOG_INFO(message,...) LOGHELPERINTERNALLOG(message,LOGLEVEL_INFO, ##__VA_ARGS__)
#define LOG_DEBUG(message,...) LOGHELPERINTERNALLOG(message,LOGLEVEL_DEBUG,##__VA_ARGS__)
#define LOG_TRACE(message,...) LOGHELPERINTERNALLOG(message,LOGLEVEL_TRACE,##__VA_ARGS__)
#define LOG_WARNNING(message,...) LOGHELPERINTERNALLOG(message,LOGLEVEL_WARNNING,##__VA_ARGS__)
#define LOG_ERROR(message,...) LOGHELPERINTERNALLOG(message,LOGLEVEL_ERROR,##__VA_ARGS__)

static int logLevelFilter = LOGLEVEL_ALL;
static ThreadPool* _pool = NULL;

//寫日志
void aclog(const char* fileName, const char* methodName, int line, LogLevel level, const char* message, ...) {
	if ((logLevelFilter & level) == 0)
		return;
	char dateTime[32];
	time_t tt = time(0);
	struct tm* t;
	va_list valist;
	char buf[512];
	char* pBuf = buf;
	va_start(valist, message);
	int size = vsnprintf(pBuf, sizeof(buf), message, valist);
	if (size > sizeof(buf))
	{
		pBuf = (char*)av_malloc(size + 1);
		vsnprintf(pBuf, size + 1, message, valist);
	}
	va_end(valist);
	t = localtime(&tt);
	sprintf(dateTime, "%04d-%02d-%02d %02d:%02d:%02d", t->tm_year + 1900, t->tm_mon + 1, t->tm_mday, t->tm_hour, t->tm_min, t->tm_sec);
	//在此處可替換為寫文件
	printf("%s %d %d %s %s %d: %s\n", dateTime, level, SDL_ThreadID(), fileName, methodName, line, pBuf);
	if (pBuf != buf)
		av_free(pBuf);
}
//日志過濾,設(shè)為LOGLEVEL_NONE則不輸出日志
void setLogFilter(LogLevel level) {
	logLevelFilter = level;
}


//初始化消息隊(duì)列
int messageQueue_init(MessageQueue* _this, int capacity, int elementSize) {
	_this->_queue = av_fifo_alloc(elementSize * capacity);
	if (!_this->_queue)
		return ERRORCODE_MEMORY_ALLOCFAILED;
	_this->_mtx = SDL_CreateMutex();
	if (!_this->_mtx)
		return ERRORCODE_MEMORY_ALLOCFAILED;
	_this->_cv = SDL_CreateCond();
	if (!_this->_cv)
		return ERRORCODE_MEMORY_ALLOCFAILED;
	_this->_capacity = capacity;
	_this->_elementSize = elementSize;
	return 0;
}
//反初始化消息隊(duì)列
void messageQueue_deinit(MessageQueue* _this) {
	if (_this->_queue)
		av_fifo_free(_this->_queue);
	if (_this->_cv)
		SDL_DestroyCond(_this->_cv);
	if (_this->_mtx)
		SDL_DestroyMutex(_this->_mtx);
	memset(_this, 0, sizeof(MessageQueue));
}

//推入消息
int messageQueue_push(MessageQueue* _this, void* msg) {
	int ret = 0;
	SDL_LockMutex(_this->_mtx);
	ret = av_fifo_generic_write(_this->_queue, msg, _this->_elementSize, NULL);
	SDL_CondSignal(_this->_cv);
	SDL_UnlockMutex(_this->_mtx);
	return  ret > 0;
}
//輪詢序消息
int messageQueue_poll(MessageQueue* _this, void* msg) {
	SDL_LockMutex(_this->_mtx);
	int size = av_fifo_size(_this->_queue);
	if (size >= _this->_elementSize)
	{
		av_fifo_generic_read(_this->_queue, msg, _this->_elementSize, NULL);
	}
	SDL_UnlockMutex(_this->_mtx);
	return size;
}
//等待消息
void messageQueue_wait(MessageQueue* _this, void* msg) {
	SDL_LockMutex(_this->_mtx);
	while (1) {
		int size = av_fifo_size(_this->_queue);
		if (size >= _this->_elementSize)
		{
			av_fifo_generic_read(_this->_queue, msg, _this->_elementSize, NULL);
			break;
		}
		SDL_CondWait(_this->_cv, _this->_mtx);
	}
	SDL_UnlockMutex(_this->_mtx);
}

//初始化對象池
int ojectPool_init(OjectPool* _this, void* bufferArray, int elementSize, int arraySize)
{
	if (elementSize < 1 || arraySize < 1)
		return ERRORCODE_ARGUMENT_INVALID;
	_this->buffer = (unsigned char*)bufferArray;
	_this->elementSize = elementSize;
	_this->arraySize = arraySize;
	_this->_arrayUseState = (int*)av_mallocz(sizeof(int) * arraySize);
	if (!_this->_arrayUseState)
		return ERRORCODE_MEMORY_ALLOCFAILED;
	_this->_mtx = SDL_CreateMutex();
	if (!_this->_mtx)
		return ERRORCODE_MEMORY_ALLOCFAILED;
	_this->_cv = SDL_CreateCond();
	if (!_this->_cv)
		return ERRORCODE_MEMORY_ALLOCFAILED;
	return 0;
}
//反初始化對象池
void objectPool_deinit(OjectPool* _this)
{
	av_free(_this->_arrayUseState);
	if (_this->_cv)
		SDL_DestroyCond(_this->_cv);
	if (_this->_mtx)
		SDL_DestroyMutex(_this->_mtx);
	memset(_this, 0, sizeof(OjectPool));
}

//取出對象
void* objectPool_take(OjectPool* _this, int timeout) {
	void* element = NULL;
	SDL_LockMutex(_this->_mtx);
	while (1)
	{
		for (int i = 0; i < _this->arraySize; i++)
		{
			if (!_this->_arrayUseState[i])
			{
				element = &((Uint8*)_this->buffer)[i * _this->elementSize];
				_this->_arrayUseState[i] = 1;
				break;
			}
		}
		if (!element)
		{
			if (timeout == -1)
			{
				int ret = SDL_CondWait(_this->_cv, _this->_mtx);
				if (ret == -1)
				{
					LOG_ERROR("SDL_CondWait error");
					break;
				}
			}
			else
			{
				int ret = SDL_CondWaitTimeout(_this->_cv, _this->_mtx, timeout);
				if (ret != 0)
				{
					if (ret == -1)
					{
						LOG_ERROR("SDL_CondWait error");
					}
					break;
				}
			}
		}
		else
		{
			break;
		}
	}
	SDL_UnlockMutex(_this->_mtx);
	return element;
}

//歸還對象
void objectPool_restore(OjectPool* _this, void* element) {
	SDL_LockMutex(_this->_mtx);
	for (int i = 0; i < _this->arraySize; i++)
	{
		if (_this->_arrayUseState[i] && &((Uint8*)_this->buffer)[i * _this->elementSize] == element)
		{
			SDL_CondSignal(_this->_cv);
			_this->_arrayUseState[i] = 0;
			break;
		}
	}
	SDL_UnlockMutex(_this->_mtx);
}

//初始化線程池
int threadPool_init(ThreadPool* _this, int maxThreadCount) {
	_this->maxThreadCount = maxThreadCount;
	return ojectPool_init(&_this->_pool, av_mallocz(sizeof(ThreadInfo) * maxThreadCount), sizeof(ThreadInfo), maxThreadCount);
}
//反初始化線程池
void threadPool_denit(ThreadPool* _this) {
	ThreadInfo* threads = (ThreadInfo*)_this->_pool.buffer;
	if (threads)
	{
		for (int i = 0; i < _this->maxThreadCount; i++)
		{
			int status;
			if (threads[i]._thread)
			{
				int msg = 0;
				messageQueue_push(&threads[i]._queue, &msg);
				SDL_WaitThread(threads[i]._thread, &status);
				messageQueue_deinit(&threads[i]._queue);
			}
		}
	}
	av_freep(&_this->_pool.buffer);
	objectPool_deinit(&_this->_pool);
}
//線程池,線程處理過程
int threadPool_threadProc(void* data)
{
	ThreadInfo* info = (ThreadInfo*)data;
	int msg = 1;
	while (msg) {
		info->_fn(info->_arg);
		objectPool_restore(&info->_threadPool->_pool, info);
		messageQueue_wait(&info->_queue, &msg);
	}
	return 0;
}
//在線程池中運(yùn)行方法
void threadPool_run(ThreadPool* _this, SDL_ThreadFunction fn, void* arg) {
	ThreadInfo* info = (ThreadInfo*)objectPool_take(&_this->_pool, -1);
	info->_fn = fn;
	info->_arg = arg;
	if (info->_thread)
	{
		int msg = 1;
		messageQueue_push(&info->_queue, &msg);
	}
	else
	{
		info->_threadPool = _this;
		messageQueue_init(&info->_queue, 1, sizeof(int));
		info->_thread = SDL_CreateThread(threadPool_threadProc, "threadPool_threadProc", info);
	}
}

//在播放線程中運(yùn)行方法
void play_beginInvoke(Play* _this, SDL_ThreadFunction fn, void* arg)
{
	PlayMessage msg;
	msg.type = PLAYMESSAGETYPE_INVOKE;
	msg.param1 = fn;
	msg.param2 = arg;
	messageQueue_push(&_this->mq, &msg);
}

//#include<chrono>
/// <summary>
/// 返回當(dāng)前時(shí)間
/// </summary>
/// <returns>當(dāng)前時(shí)間,單位秒,精度微秒</returns>
static double  getCurrentTime()
{
	//此處用的是ffmpeg的av_gettime_relative。如果沒有ffmpeg環(huán)境,則可替換成平臺獲取時(shí)鐘的方法:單位為秒,精度需要微妙,相對絕對時(shí)鐘都可以。
	return av_gettime_relative() / 1000000.0;
	//return std::chrono::time_point_cast <std::chrono::nanoseconds>(std::chrono::high_resolution_clock::now()).time_since_epoch().count() / 1e+9;
}

/// <summary>
/// 重置時(shí)鐘同步
/// 通常用于暫停、定位
/// </summary>
/// <param name="syn">時(shí)鐘同步對象</param>
void synchronize_reset(Synchronize* syn) {
	SynchronizeType type = syn->type;
	memset(syn, 0, sizeof(Synchronize));
	syn->type = type;
}

/// <summary>
/// 獲取主時(shí)鐘
/// </summary>
/// <param name="syn">時(shí)鐘同步對象</param>
/// <returns>主時(shí)鐘對象</returns>
Clock* synchronize_getMasterClock(Synchronize* syn) {
	switch (syn->type)
	{
	case SYNCHRONIZETYPE_AUDIO:
		return &syn->audio;
	case SYNCHRONIZETYPE_VIDEO:
		return &syn->video;
	case SYNCHRONIZETYPE_ABSOLUTE:
		return &syn->absolute;
	default:
		break;
	}
	return 0;
}

/// <summary>
/// 獲取主時(shí)鐘的時(shí)間
/// </summary>
/// <param name="syn">時(shí)鐘同步對象</param>
/// <returns>時(shí)間,單位s</returns>
double synchronize_getMasterTime(Synchronize* syn) {
	return getCurrentTime() - synchronize_getMasterClock(syn)->startTime;
}

/// <summary>
/// 設(shè)置時(shí)鐘的時(shí)間
/// </summary>
/// <param name="syn">時(shí)鐘同步對象</param>
/// <param name="pts">當(dāng)前時(shí)間,單位s</param>
void synchronize_setClockTime(Synchronize* syn, Clock* clock, double pts)
{
	clock->currentPts = pts;
	clock->startTime = getCurrentTime() - pts;
}

/// <summary>
/// 獲取時(shí)鐘的時(shí)間
/// </summary>
/// <param name="syn">時(shí)鐘同步對象</param>
/// <param name="clock">時(shí)鐘對象</param>
/// <returns>時(shí)間,單位s</returns>
double synchronize_getClockTime(Synchronize* syn, Clock* clock)
{
	return  getCurrentTime() - clock->startTime;
}

/// <summary>
/// 更新視頻時(shí)鐘
/// </summary>
/// <param name="syn">時(shí)鐘同步對象</param>
/// <param name="pts">視頻幀pts,單位為s</param>
/// <param name="duration">視頻幀時(shí)長,單位為s。缺省值為0,內(nèi)部自動(dòng)估算duration</param>
/// <returns>大于0則延時(shí)值為延時(shí)時(shí)長,等于0顯示,小于0丟幀</returns>
double synchronize_updateVideo(Synchronize* syn, double pts, double duration)
{
	if (duration == 0)
		//估算duration
	{
		if (pts != syn->video.currentPts)
			syn->estimateVideoDuration = (syn->estimateVideoDuration * syn->n + pts - syn->video.currentPts) / (double)(syn->n + 1);
		duration = syn->estimateVideoDuration;
		//只估算最新3幀
		if (syn->n++ > 3)
			syn->estimateVideoDuration = syn->n = 0;
		if (duration == 0)
			duration = 0.1;
	}
	if (syn->video.startTime == 0)
	{
		syn->video.startTime = getCurrentTime() - pts;
	}
	//以下變量時(shí)間單位為s	
	//當(dāng)前時(shí)間
	double currentTime = getCurrentTime() - syn->video.startTime;
	//計(jì)算時(shí)間差,大于0則late,小于0則early。
	double diff = currentTime - pts;
	double sDiff = 0;
	if (syn->type != SYNCHRONIZETYPE_VIDEO && synchronize_getMasterClock(syn)->startTime != 0)
		//同步到主時(shí)鐘	
	{
		sDiff = syn->video.startTime - synchronize_getMasterClock(syn)->startTime;
		diff += sDiff;
	}
	//修正時(shí)間,時(shí)鐘和視頻幀偏差超過0.1s時(shí)重新設(shè)置起點(diǎn)時(shí)間。
	//if (diff > 0.1)
	//{
	//	syn->video.startTime = getCurrentTime() - pts;
	//	currentTime = pts;
	//	diff = 0;
	//}
	//時(shí)間早了延時(shí)
	if (diff < -0.001)
	{
		if (diff < -0.1)
		{
			diff = -0.1;
		}
		//printf("video-time:%.3lfs audio-time:%.3lfs avDiff:%.4lfms early:%.4lfms  \n", getCurrentTime() - syn->video.startTime, getCurrentTime() - syn->audio.startTime, sDiff * 1000, diff * 1000);
		return -diff;
	}
	syn->video.currentPts = pts;
	//時(shí)間晚了丟幀,duration為一幀的持續(xù)時(shí)間,在一個(gè)duration內(nèi)是正常時(shí)間,加一個(gè)duration作為閾值來判斷丟幀。
	if (diff > 2 * duration)
	{
		//printf("time:%.3lfs avDiff %.4lfms late for:%.4lfms droped\n", pts, sDiff * 1000, diff * 1000);
		return -1;
	}
	//更新視頻時(shí)鐘
	printf("video-time:%.3lfs  audio-time:%.3lfs absolute-time:%.3lfs synDiff:%.4lfms diff:%.4lfms  \r", pts, getCurrentTime() - syn->audio.startTime, getCurrentTime() - syn->absolute.startTime, sDiff * 1000, diff * 1000);
	syn->video.startTime = getCurrentTime() - pts;
	if (syn->absolute.startTime == 0)
	{
		syn->absolute.startTime = syn->video.startTime;
	}
	return 0;
}

//double lastTime = 0;
/// <summary>
/// 更新音頻時(shí)鐘
/// </summary>
/// <param name="syn">時(shí)鐘同步對象</param>
/// <param name="samples">采樣數(shù)</param>
/// <param name="samplerate">采樣率</param>
/// <returns>應(yīng)該播放的采樣數(shù)</returns>
int synchronize_updateAudio(Synchronize* syn, int samples, int samplerate) {

	if (syn->type != SYNCHRONIZETYPE_AUDIO && synchronize_getMasterClock(syn)->startTime != 0)
	{
		//同步到主時(shí)鐘	
		double audioTime = getCurrentTime() - syn->audio.startTime;
		double diff = 0;
		diff = synchronize_getMasterTime(syn) - audioTime;
		int oldSamples = samples;
		if (fabs(diff) > 0.01) {
			samples += diff * samplerate;
		}
		if (samples < 0)
		{
			samples = 0;
		}
		if (samples > oldSamples * 2)
		{
			samples = oldSamples * 2;
		}
	}
	syn->audio.currentPts += (double)samples / samplerate;
	syn->audio.startTime = getCurrentTime() - syn->audio.currentPts;
	if (syn->absolute.startTime == 0)
	{
		syn->absolute.startTime = syn->audio.startTime;
	}
	return samples;
}

/// <summary>
/// 更新音頻時(shí)鐘,通過數(shù)據(jù)長度
/// </summary>
/// <param name="syn">時(shí)鐘同步對象</param>
/// <param name="bytesSize">數(shù)據(jù)長度</param>
/// <param name="samplerate">采樣率</param>
/// <param name="channels">聲道數(shù)</param>
/// <param name="bitsPerSample">位深</param>
/// <returns>應(yīng)該播放的數(shù)據(jù)長度</returns>
int synchronize_updateAudioByBytesSize(Synchronize* syn, size_t bytesSize, int samplerate, int channels, int bitsPerSample) {
	return synchronize_updateAudio(syn, bytesSize / (channels * bitsPerSample / 8), samplerate) * (bitsPerSample / 8) * channels;
}

//初始化解碼器
static int decoder_init(Play* play, Decoder* decoder, int wantFifoPacketSize, int wantFifoFrameSize) {
	//創(chuàng)建解碼上下文
	decoder->codecContext = avcodec_alloc_context3(NULL);
	AVDictionary* opts = NULL;
	if (decoder->codecContext == NULL)
	{
		LOG_ERROR("Could not allocate AVCodecContext");
		return ERRORCODE_DECODER_CANNOTALLOCATECONTEXT;
	}
	//獲取解碼器
	if (avcodec_parameters_to_context(decoder->codecContext, play->formatContext->streams[decoder->streamIndex]->codecpar) < 0)
	{
		LOG_ERROR("Could not init AVCodecContext");
		return ERRORCODE_DECODER_SETPARAMFAILED;
	}
	AVCodec* codec = avcodec_find_decoder(decoder->codecContext->codec_id);
	if (codec == NULL) {
		LOG_ERROR("Codec not found");
		return ERRORCODE_DECODER_CANNOTFINDDECODER;
	}
	//使用多線程解碼
	if (!av_dict_get(opts, "threads", NULL, 0))
		av_dict_set(&opts, "threads", "auto", 0);
	//打開解碼器
	if (avcodec_open2(decoder->codecContext, codec, &opts) < 0) {
		LOG_ERROR("Could not open codec");
		av_dict_free(&opts);
		return ERRORCODE_DECODER_OPENFAILED;
	}
	av_dict_free(&opts);
	//初始化臨時(shí)幀
	decoder->frame = av_frame_alloc();
	if (!decoder->frame)
	{
		LOG_ERROR("Alloc avframe failed");
		return ERRORCODE_FRAME_ALLOCFAILED;
	}
	//初始化包隊(duì)列
	decoder->fifoPacket = av_fifo_alloc(sizeof(AVPacket*) * wantFifoPacketSize);
	if (!decoder->fifoPacket)
	{
		LOG_ERROR("alloc packet fifo failed");
		return ERRORCODE_FIFO_ALLOCFAILED;
	}
	//初始化幀隊(duì)列
	decoder->fifoFrame = av_fifo_alloc(sizeof(AVFrame*) * wantFifoFrameSize);
	if (!decoder->fifoFrame)
	{
		LOG_ERROR("alloc frame fifo failed");
		return ERRORCODE_FIFO_ALLOCFAILED;
	}
	return 0;
}
//清空解碼隊(duì)列
static void decoder_clear(Play* play, Decoder* decoder) {
	//清空包隊(duì)列
	if (decoder->fifoPacket)
	{
		while (av_fifo_size(decoder->fifoPacket) > 0)
		{
			AVPacket* packet;
			av_fifo_generic_read(decoder->fifoPacket, &packet, sizeof(AVPacket*), NULL);
			if (packet != &play->packet)
			{
				av_packet_unref(packet);
				av_packet_free(&packet);
			}
		}
	}
	//清空幀隊(duì)列
	if (decoder->fifoFrame)
	{
		while (av_fifo_size(decoder->fifoFrame) > 0)
		{
			AVFrame* frame;
			av_fifo_generic_read(decoder->fifoFrame, &frame, sizeof(AVFrame*), NULL);
			av_frame_unref(frame);
			av_frame_free(&frame);
		}
	}

	//清空解碼器緩存
	if (decoder->codecContext)
	{
		avcodec_flush_buffers(decoder->codecContext);
	}
}
//反初始化解碼器
static void decoder_deinit(Play* play, Decoder* decoder)
{
	decoder_clear(play, decoder);
	if (decoder->codecContext)
	{
		avcodec_close(decoder->codecContext);
		avcodec_free_context(&decoder->codecContext);
	}
	if (decoder->fifoPacket)
	{
		av_fifo_free(decoder->fifoPacket);
		decoder->fifoPacket = NULL;
	}
	if (decoder->fifoFrame)
	{
		av_fifo_free(decoder->fifoFrame);
		decoder->fifoFrame = NULL;
	}
	if (decoder->frame)
	{
		if (decoder->frame->format != -1)
		{
			av_frame_unref(decoder->frame);
		}
		av_frame_free(&decoder->frame);
	}
	decoder->eofFrame = 0;
}
//解碼
static int decoder_decode(Play* play, Decoder* decoder) {
	int ret = 0;
	AVPacket* packet = NULL;
	if (decoder->streamIndex == -1)
	{
		LOG_ERROR("Decoder missing a stream");
		return ERRORCODE_DECODER_MISSINGASTREAMTODECODE;
	}
	if (av_fifo_space(decoder->fifoFrame) < 1)
		//幀隊(duì)列已滿
	{
		goto end;
	}
	//接收上次解碼的幀
	while (avcodec_receive_frame(decoder->codecContext, decoder->frame) == 0) {
		AVFrame* frame = av_frame_clone(decoder->frame);
		av_frame_unref(decoder->frame);
		av_fifo_generic_write(decoder->fifoFrame, &frame, sizeof(AVFrame*), NULL);
		if (av_fifo_space(decoder->fifoFrame) < 1)
			//幀隊(duì)列已滿
		{
			goto end;
		}
	}
	if (av_fifo_size(decoder->fifoPacket) > 0)
		//包隊(duì)列有數(shù)據(jù),開始解碼
	{
		av_fifo_generic_read(decoder->fifoPacket, &packet, sizeof(AVPacket*), NULL);
		//發(fā)送包
		if (avcodec_send_packet(decoder->codecContext, packet) < 0)
		{
			LOG_ERROR("Decode error");
			ret = ERRORCODE_DECODER_SENDPACKEDFAILED;
			goto end;
		}
		//接收解碼的幀
		while (avcodec_receive_frame(decoder->codecContext, decoder->frame) == 0) {
			AVFrame* frame = av_frame_clone(decoder->frame);
			av_frame_unref(decoder->frame);
			av_fifo_generic_write(decoder->fifoFrame, &frame, sizeof(AVFrame*), NULL);
			if (av_fifo_space(decoder->fifoFrame) < 1)
				//幀隊(duì)列已滿
			{
				goto end;
			}
		}
	}
	if (play->eofPacket)
	{
		decoder->eofFrame = 1;
	}
end:
	if (packet && packet != &play->packet)
	{
		av_packet_unref(packet);
		av_packet_free(&packet);
	}
	return ret;
}



//音頻設(shè)備播放回調(diào)
static void audio_callback(void* userdata, uint8_t* stream, int len) {
	Play* play = (Play*)userdata;
	int samples = 0;
	//讀取隊(duì)列中的音頻數(shù)據(jù),由于AVAudioFifo非線程安全,且是子線程觸發(fā)此回調(diào),所以需要加鎖
	SDL_LockMutex(play->audio.mutex);
	if (!play->isPaused)
	{
		if (av_audio_fifo_size(play->audio.playFifo) >= play->audio.spec.samples)
		{
			int drain = 0;
			//需要寫入的數(shù)據(jù)量
			samples = play->audio.spec.samples;
			//時(shí)鐘同步,獲取應(yīng)該寫入的數(shù)據(jù)量,如果是同步到音頻,則需要寫入的數(shù)據(jù)量始終等于應(yīng)該寫入的數(shù)據(jù)量。
			samples = synchronize_updateAudio(&play->synchronize, samples, play->audio.spec.freq);
			samples += play->audio.accumulateSamples;
			if (samples > av_audio_fifo_size(play->audio.playFifo))
			{
				play->audio.accumulateSamples = samples - av_audio_fifo_size(play->audio.playFifo);
				samples = av_audio_fifo_size(play->audio.playFifo);
			}
			else
			{
				play->audio.accumulateSamples = 0;
			}
			if (samples > play->audio.spec.samples)
				//比需要寫入的數(shù)據(jù)量大,則丟棄一部分
			{
				drain = samples - play->audio.spec.samples;
				samples = play->audio.spec.samples;
			}
			if (play->audio.volume + SDL_MIX_MAXVOLUME != SDL_MIX_MAXVOLUME)
				//改變音量
			{
				if (!play->audio.mixBuffer)
				{
					play->audio.mixBuffer = (uint8_t*)av_malloc(len);
					if (!play->audio.mixBuffer)
					{
						LOG_ERROR("mixBuffer alloc failed");
						return;
					}
				}
				av_audio_fifo_read(play->audio.playFifo, (void**)&play->audio.mixBuffer, samples);
				int len2 = av_samples_get_buffer_size(0, play->audio.spec.channels, samples, play->audio.forceSampleFormat, 1);
				memset(stream, 0, len2);
				SDL_MixAudioFormat(stream, play->audio.mixBuffer, play->audio.spec.format, len2, play->audio.volume + SDL_MIX_MAXVOLUME);
			}
			else
				//直接寫入
			{
				av_audio_fifo_read(play->audio.playFifo, (void**)&stream, samples);
			}
			av_audio_fifo_drain(play->audio.playFifo, drain);
		}
	}
	SDL_UnlockMutex(play->audio.mutex);
	//補(bǔ)充靜音數(shù)據(jù)
	int fillSize = av_samples_get_buffer_size(0, play->audio.spec.channels, samples, play->audio.forceSampleFormat, 1);
	if (fillSize < 0)
		fillSize = 0;
	if (len - fillSize > 0)
	{
		memset(stream + fillSize, 0, len - fillSize);
	}

}
//Initialize the audio module
static int audio_init(Play* play, Audio* audio) {
	//Initialize the decoder
	if (decoder_init(play, &audio->decoder, 600, 100) != 0)
	{
		LOG_ERROR("audio decoder init error");
		return 	ERRORCODE_AUDIO_DECODERINITFAILED;
	}
	//Initialize SDL
	if ((SDL_WasInit(0) & (SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_TIMER)) == 0)
	{
		if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_TIMER)) {
			LOG_ERROR("Could not initialize SDL - %s", SDL_GetError());
			return ERRORCODE_SDL_INITFAILED;
		}
	}
	//Open the audio device
	audio->wantedSpec.channels = av_get_channel_layout_nb_channels(audio->decoder.codecContext->channel_layout);
	audio->wantedSpec.freq = audio->decoder.codecContext->sample_rate;
	audio->wantedSpec.format = AUDIO_F32SYS;
	audio->wantedSpec.silence = 0;
	audio->wantedSpec.samples = FFMAX(512, 2 << av_log2(audio->wantedSpec.freq / 30));
	audio->wantedSpec.callback = audio_callback;
	audio->wantedSpec.userdata = play;
	audio->audioId = SDL_OpenAudioDevice(NULL, 0, &audio->wantedSpec, &audio->spec, SDL_AUDIO_ALLOW_ANY_CHANGE);
	if (audio->audioId < 2)
	{
		LOG_ERROR("Open audio device error");
		return ERRORCODE_SDL_CANNOTOPENDEVICE;
	}
	//Match the audio sample format
	switch (audio->spec.format)
	{
	case	AUDIO_S16SYS:
		audio->forceSampleFormat = AV_SAMPLE_FMT_S16;
		break;
	case	AUDIO_S32SYS:
		audio->forceSampleFormat = AV_SAMPLE_FMT_S32;
		break;
	case	AUDIO_F32SYS:
		audio->forceSampleFormat = AV_SAMPLE_FMT_FLT;
		break;
	default:
		LOG_ERROR("audio device format was not supported %d", (int)audio->spec.format);
		return ERRORCODE_AUDIO_UNSUPORTDEVICESAMPLEFORMAT;
	}
	//Initialize the mutex protecting the audio frame queue
	audio->mutex = SDL_CreateMutex();
	if (!audio->mutex)
	{
		LOG_ERROR("alloc mutex failed");
		return ERRORCODE_SDL_CANNOTCREATEMUTEX;
	}
	//Audio playback queue
	audio->playFifo = av_audio_fifo_alloc(audio->forceSampleFormat, audio->spec.channels, audio->spec.samples * 30);
	if (!audio->playFifo)
	{
		LOG_ERROR("alloc audio fifo failed");
		return ERRORCODE_FIFO_ALLOCFAILED;
	}
	//Start device playback
	SDL_PauseAudioDevice(audio->audioId, 0);
	return 0;
}
//Deinitialize the audio module
static void audio_deinit(Play* play, Audio* audio)
{
	if (audio->audioId >= 2)
	{
		SDL_PauseAudioDevice(audio->audioId, 1);
		SDL_CloseAudioDevice(audio->audioId);
		audio->audioId = 0;
	}
	if (audio->mutex)
	{
		SDL_DestroyMutex(audio->mutex);
		audio->mutex = NULL;
	}
	if (play->audio.playFifo)
	{
		av_audio_fifo_free(play->audio.playFifo);
		play->audio.playFifo = NULL;
	}
	if (audio->swrContext)
	{
		swr_free(&audio->swrContext);
	}
	if (audio->swrBuffer)
	{
		av_freep(&audio->swrBuffer);
	}
	if (audio->mixBuffer)
	{
		av_freep(&audio->mixBuffer);
	}
	decoder_deinit(play, &audio->decoder);
	audio->eofPlay = 0;
	audio->sofPlay = 0;
}


//Audio playback
static int audio_play(Play* play, Audio* audio) {
	if (audio->decoder.streamIndex == -1)
	{
		LOG_ERROR("audio play missing audio stream");
		//No audio stream
		return ERRORCODE_AUDIO_MISSINGSTREAM;
	}
	if (play->video.decoder.streamIndex != -1 && !play->video.sofDisplay)
	{
		return 0;
	}
	while (av_fifo_size(play->audio.decoder.fifoFrame) > 0)
	{
		AVFrame* frame = NULL;
		uint8_t* data = NULL;
		int dataSize = 0;
		int samples = 0;
		av_fifo_generic_peek(play->audio.decoder.fifoFrame, &frame, sizeof(AVFrame*), NULL);
		if (play->audio.forceSampleFormat != play->audio.decoder.codecContext->sample_fmt || play->audio.spec.freq != frame->sample_rate || play->audio.spec.channels != frame->channels)
			//Resample
		{
			//Compute the output sample count (with some headroom)
			int out_count = (int64_t)frame->nb_samples * play->audio.spec.freq / frame->sample_rate + 256;
			//Compute the output data size
			int out_size = av_samples_get_buffer_size(NULL, play->audio.spec.channels, out_count, play->audio.forceSampleFormat, 0);
			//Input data pointers
			const uint8_t** in = (const uint8_t**)frame->extended_data;
			//Output buffer pointer
			uint8_t** out = &play->audio.swrBuffer;
			int len2 = 0;
			if (out_size < 0) {
				LOG_ERROR("sample output size value %d was invalid", out_size);
				return ERRORCODE_AUDIO_SAMPLESSIZEINVALID;
			}
			if (!play->audio.swrContext)
				//Initialize the resampler
			{
				play->audio.swrContext = swr_alloc_set_opts(NULL, av_get_default_channel_layout(play->audio.spec.channels), play->audio.forceSampleFormat, play->audio.spec.freq, play->audio.decoder.codecContext->channel_layout, play->audio.decoder.codecContext->sample_fmt, play->audio.decoder.codecContext->sample_rate, 0, NULL);
				if (!play->audio.swrContext || swr_init(play->audio.swrContext) < 0) {
					LOG_ERROR("swr_alloc_set_opts or swr_init failed");
					return ERRORCODE_AUDIO_SWRINITFAILED;
				}
			}
			if (!play->audio.swrBuffer)
				//Allocate the output buffer
			{
				play->audio.swrBuffer = (uint8_t*)av_mallocz(out_size);
				if (!play->audio.swrBuffer)
				{
					LOG_ERROR("audio swr output buffer alloc failed");
					return ERRORCODE_MEMORY_ALLOCFAILED;
				}
			}
			//Perform the resampling
			len2 = swr_convert(play->audio.swrContext, out, out_count, in, frame->nb_samples);
			if (len2 < 0) {
				LOG_ERROR("swr_convert failed");
				return ERRORCODE_AUDIO_CANNOTCONVERSAMPLE;
			}
			//Take the output data
			data = play->audio.swrBuffer;
			//Output data length
			dataSize = av_samples_get_buffer_size(0, play->audio.spec.channels, len2, play->audio.forceSampleFormat, 1);
			samples = len2;
		}
		else
			//No resampling needed
		{
			data = frame->data[0];
			dataSize = av_samples_get_buffer_size(frame->linesize, frame->channels, frame->nb_samples, play->audio.forceSampleFormat, 0);
			samples = frame->nb_samples;
		}
		if (dataSize < 0)
		{
			LOG_ERROR("sample data size value %d was invalid", dataSize);
			return ERRORCODE_AUDIO_SAMPLESSIZEINVALID;
		}
		//Write into the playback queue
		SDL_LockMutex(play->audio.mutex);
		if (av_audio_fifo_space(play->audio.playFifo) >= samples)
		{
			//Synchronize (correct) the clock
			AVRational timebase = play->formatContext->streams[audio->decoder.streamIndex]->time_base;
			//Timestamp of the current frame
			double pts = (double)frame->pts * timebase.num / timebase.den;
			//Subtracting the duration of the data still in the playback queue gives the current audio clock
			pts -= (double)av_audio_fifo_size(play->audio.playFifo) / play->audio.spec.freq;
			//Set the audio clock
			synchronize_setClockTime(&play->synchronize, &play->synchronize.audio, pts);
			//Synchronize (correct) the clock--end
			//Write into the playback queue
			av_audio_fifo_write(play->audio.playFifo, (void**)&data, samples);
			//Pop the frame from the decode queue
			av_fifo_generic_read(play->audio.decoder.fifoFrame, &frame, sizeof(AVFrame*), NULL);
			av_frame_unref(frame);
			av_frame_free(&frame);
			if (!audio->sofPlay)
				//Mark start of playback
			{
				audio->sofPlay = 1;
			}
		}
		else
		{
			SDL_UnlockMutex(play->audio.mutex);
			break;
		}
		SDL_UnlockMutex(play->audio.mutex);
	}
	//Compute the sleep delay
	SDL_LockMutex(play->audio.mutex);
	double canSleepTime = (double)av_audio_fifo_size(play->audio.playFifo) / play->audio.spec.freq;
	double wakeupTime = getCurrentTime() + canSleepTime;
	if (play->video.decoder.streamIndex == -1 || wakeupTime < play->wakeupTime)
	{
		play->wakeupTime = wakeupTime;
	}
	SDL_UnlockMutex(play->audio.mutex);
	if (av_fifo_size(play->audio.decoder.fifoFrame) < 1 && audio->decoder.eofFrame)
		//Mark end of playback
	{
		audio->eofPlay = 1;
	}
	return 0;
}

//Initialize the video module
static int video_init(Play* play, Video* video) {
	//Initialize the decoder
	if (decoder_init(play, &video->decoder, 600, 1) != 0)
	{
		LOG_ERROR("video decoder init error");
		return ERRORCODE_VIDEO_DECODERINITFAILED;
	}
	//Initialize SDL
	if ((SDL_WasInit(0) & (SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_TIMER)) == 0)
	{
		if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_TIMER)) {
			LOG_ERROR("Could not initialize SDL - %s", SDL_GetError());
			return ERRORCODE_SDL_INITFAILED;
		}
	}
	return 0;
}
//Deinitialize the video module
static void video_deinit(Play* play, Video* video) {
	if (video->swsContext)
	{
		sws_freeContext(video->swsContext);
		video->swsContext = NULL;
	}
	if (video->swsBuffer)
	{
		av_free(video->swsBuffer);
		video->swsBuffer = NULL;
	}
	if (video->sdlTexture)
	{
		SDL_DestroyTexture(video->sdlTexture);
		video->sdlTexture = NULL;
	}
	if (video->sdlRenderer)
	{
		SDL_DestroyRenderer(video->sdlRenderer);
		video->sdlRenderer = NULL;
	}
	if (video->screen)
	{
		SDL_DestroyWindow(video->screen);
		video->screen = NULL;
	}
	decoder_deinit(play, &video->decoder);
	video->eofDisplay = 0;
	video->sofDisplay = 0;

}


double get_rotation(AVStream* st)
{
	AVDictionaryEntry* rotate_tag = av_dict_get(st->metadata, "rotate", NULL, 0);
	double theta = 0;
	if (rotate_tag && *rotate_tag->value && strcmp(rotate_tag->value, "0")) {
		theta = atof(rotate_tag->value);
	}
	theta -= 360 * floor(theta / 360 + 0.9 / 360);
	if (fabs(theta - 90 * round(theta / 90)) > 2)
	{
		LOG_INFO("Odd rotation angle");
	}
	return theta;

}


/// <summary>
/// Compute the rectangle size after rotation
/// </summary>
/// <param name="srcRect">The original image area</param>
/// <param name="dstRect">The target area</param>
/// <param name="angle">The rotation angle</param>
/// <returns></returns>
static SDL_Rect getRotateRect(SDL_Rect *srcRect, SDL_Rect* dstRect, double angle) {
	SDL_Rect targetRect;
	const double PI = 3.14159265358979323846;
	double theta = PI / 180.0 * angle;
	//Compute the size of the bounding box after rotation
	int width = srcRect->h * fabs(sin(theta)) + srcRect->w * fabs(cos(theta)) + 0.5;
	int height = srcRect->h * fabs(cos(theta)) + srcRect->w * fabs(sin(theta)) + 0.5;
	double srcRatio = (double)srcRect->w / srcRect->h;
	double srcBorderRatio = (double)width / height;
	double dstRatio = (double)dstRect->w / dstRect->h;
	//Compute the size of the bounding box scaled to fit the target area
	int zoomWidth;
	int zoomHeight;
	if (srcBorderRatio > dstRatio)
	{
		zoomWidth = dstRect->w;
		zoomHeight = dstRect->w / srcBorderRatio;
	}
	else
	{
		zoomWidth = dstRect->h * srcBorderRatio;
		zoomHeight = dstRect->h;
	}
	//Recover the image size from the scaled bounding box
	targetRect.h = (double)zoomWidth / (fabs(sin(theta)) + srcRatio * fabs(cos(theta)));
	targetRect.w = targetRect.h * srcRatio;
	targetRect.x = (dstRect->w - targetRect.w) / 2;
	targetRect.y = (dstRect->h - targetRect.h) / 2;
	return targetRect;
}

//Render to the window
static int video_present(Play* play, Video* video, AVFrame* frame)
{
	SDL_Rect sdlRect;
	SDL_Rect sdlRect2;
	uint8_t* dst_data[4];
	int dst_linesize[4];

	if (!video->screen)
	{
		//Create the window
		video->screen = SDL_CreateWindow("video play window", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
			video->screen_w, video->screen_h,
			SDL_WINDOW_OPENGL);
		if (!video->screen) {
			LOG_ERROR("SDL: could not create window - exiting:%s\n", SDL_GetError());
			return ERRORCODE_SDL_CREATEWINDOWFAILED;
		}
	}
	if (!video->sdlRenderer)
		//Initialize the SDL renderer and texture
	{
		video->angle = get_rotation(play->formatContext->streams[play->video.decoder.streamIndex]);
		video->sdlRenderer = SDL_CreateRenderer(video->screen, -1, 0);
		if (!video->sdlRenderer)
		{
			LOG_ERROR("Create sdl renderer error");
			return ERRORCODE_SDL_CREATERENDERERFAILED;
		}
		//Pick a suitable pixel format
		struct TextureFormatEntry format;
		format.format = AV_PIX_FMT_YUV420P;
		format.texture_fmt = SDL_PIXELFORMAT_IYUV;
		for (int i = 0; i < sizeof(sdl_texture_format_map) / sizeof(struct TextureFormatEntry); i++)
		{
			if (sdl_texture_format_map[i].format == video->decoder.codecContext->pix_fmt)
			{
				format = sdl_texture_format_map[i];
				break;
			}
		}
		video->forcePixelFormat = format.format;

		//Create a texture the same size as the video
		video->sdlTexture = SDL_CreateTexture(video->sdlRenderer, format.texture_fmt, SDL_TEXTUREACCESS_STREAMING, video->decoder.codecContext->width, video->decoder.codecContext->height);
		if (!video->sdlTexture)
		{
			LOG_ERROR("Create sdl texture error");
			return ERRORCODE_SDL_CREATETEXTUREFAILED;
		}
	}

	if (video->forcePixelFormat != video->decoder.codecContext->pix_fmt)
		//Rescale - pixel format conversion
	{
		video->swsContext = sws_getCachedContext(video->swsContext, video->decoder.codecContext->width, video->decoder.codecContext->height, video->decoder.codecContext->pix_fmt, video->decoder.codecContext->width, video->decoder.codecContext->height, video->forcePixelFormat, SWS_FAST_BILINEAR, NULL, NULL, NULL);
		if (!video->swsContext)
		{
			LOG_ERROR("sws_getCachedContext failed");
			return ERRORCODE_VIDEO_CANNOTGETSWSCONTEX;
		}
		if (!video->swsBuffer)
		{
			video->swsBuffer = (uint8_t*)av_malloc(av_image_get_buffer_size(video->forcePixelFormat, video->decoder.codecContext->width, video->decoder.codecContext->height, 64));
			if (!video->swsBuffer)
			{
				LOG_ERROR("video sws output buffer alloc failed");
				return ERRORCODE_MEMORY_ALLOCFAILED;
			}
		}
		if (av_image_fill_arrays(dst_data, dst_linesize, video->swsBuffer, video->forcePixelFormat, video->decoder.codecContext->width, video->decoder.codecContext->height, 1) < 0)
		{
			LOG_ERROR("av_image_fill_arrays failed");
			return ERRORCODE_VIDEO_IMAGEFILLARRAYFAILED;
		}

		if (sws_scale(video->swsContext, frame->data, frame->linesize, 0, frame->height, dst_data, dst_linesize) < 0)
		{
			LOG_ERROR("Call sws_scale error");
			return ERRORCODE_VIDEO_CANNOTRESAMPLEAFRAME;
		}
	}
	else
		//No conversion needed
	{
		memcpy(dst_data, frame->data, sizeof(uint8_t*) * 4);
		memcpy(dst_linesize, frame->linesize, sizeof(int) * 4);
	}

	//Window area
	sdlRect.x = 0;
	sdlRect.y = 0;
	sdlRect.w = video->screen_w;
	sdlRect.h = video->screen_h;
	//Video area
	sdlRect2.x = 0;
	sdlRect2.y = 0;
	sdlRect2.w = video->decoder.codecContext->width;
	sdlRect2.h = video->decoder.codecContext->height;
	//Render to the SDL window
	SDL_RenderClear(video->sdlRenderer);
	SDL_UpdateYUVTexture(video->sdlTexture, &sdlRect2, dst_data[0], dst_linesize[0], dst_data[1], dst_linesize[1], dst_data[2], dst_linesize[2]);
	if (video->angle == 0)
		SDL_RenderCopy(video->sdlRenderer, video->sdlTexture, NULL, &sdlRect);
	else
		//Rotate the video
	{
		SDL_Rect sdlRect3;
		sdlRect3 = getRotateRect(&sdlRect2, &sdlRect, video->angle);
		SDL_RenderCopyEx(video->sdlRenderer, video->sdlTexture, NULL
			, &sdlRect3, video->angle, 0, SDL_FLIP_NONE);
	}
	SDL_RenderPresent(video->sdlRenderer);
	return 0;
}

//Video display
static int video_display(Play* play, Video* video) {
	if (play->video.decoder.streamIndex == -1)
		//No video stream
	{
		return ERRORCODE_VIDEO_MISSINGSTREAM;
	}
	if (play->audio.decoder.streamIndex != -1 && video->sofDisplay && !play->audio.sofPlay)
		return 0;
	AVFrame* frame = NULL;
	if (av_fifo_size(video->decoder.fifoFrame) > 0)
	{
		av_fifo_generic_peek(video->decoder.fifoFrame, &frame, sizeof(AVFrame*), NULL);
		//---------------Clock synchronization--------------
		AVRational timebase = play->formatContext->streams[video->decoder.streamIndex]->time_base;
		//Compute the pts of the video frame
		double	pts = frame->pts * (double)timebase.num / timebase.den;
		//Duration of the video frame
		double duration = frame->pkt_duration * (double)timebase.num / timebase.den;
		double delay = synchronize_updateVideo(&play->synchronize, pts, duration);
		if (delay > 0)
			//Wait
		{
			play->wakeupTime = getCurrentTime() + delay;
			return 0;
		}
		else if (delay < 0)
			//Drop the frame
		{
			av_fifo_generic_read(video->decoder.fifoFrame, &frame, sizeof(AVFrame*), NULL);
			av_frame_unref(frame);
			av_frame_free(&frame);
			return 0;
		}
		else
			//Display
		{
			av_fifo_generic_read(video->decoder.fifoFrame, &frame, sizeof(AVFrame*), NULL);
		}
		//---------------Clock synchronization--------------end
	}
	else if (video->decoder.eofFrame)
	{
		video->sofDisplay = 1;
		//Mark end of display
		video->eofDisplay = 1;
	}
	if (frame)
	{
		//Render
		video_present(play, video, frame);
		av_frame_unref(frame);
		av_frame_free(&frame);
		if (!video->sofDisplay)
			video->sofDisplay = 1;
		if (play->step)
		{
			play->step--;
		}
	}
	return 0;
}



static int interrupt_cb(void* arg) {
	Play* play = (Play*)arg;
	return play->exitFlag;
}


//Open the input stream
static int packet_open(Play* play) {
	//Open the input stream
	play->formatContext = avformat_alloc_context();
	play->formatContext->interrupt_callback.callback = interrupt_cb;
	play->formatContext->interrupt_callback.opaque = play;
	if (avformat_open_input(&play->formatContext, play->url, NULL, NULL) != 0) {
		LOG_ERROR("Couldn't open input stream");
		return ERRORCODE_PACKET_CANNOTOPENINPUTSTREAM;
	}
	//Find the input stream information
	if (avformat_find_stream_info(play->formatContext, NULL) < 0) {
		LOG_ERROR("Couldn't find stream information");
		return ERRORCODE_PACKET_CANNOTFINDSTREAMINFO;
	}
	play->video.decoder.streamIndex = -1;
	//Find the video stream
	for (unsigned i = 0; i < play->formatContext->nb_streams; i++)
		if (play->formatContext->streams[i]->codecpar->codec_type == AVMEDIA_TYPE_VIDEO) {
			play->video.decoder.streamIndex = i;
			break;
		}
	play->audio.decoder.streamIndex = -1;
	//Find the audio stream
	for (unsigned i = 0; i < play->formatContext->nb_streams; i++)
		if (play->formatContext->streams[i]->codecpar->codec_type == AVMEDIA_TYPE_AUDIO) {
			play->audio.decoder.streamIndex = i;
			break;
		}
	//No stream found at all
	if (play->video.decoder.streamIndex == -1 && play->audio.decoder.streamIndex == -1) {
		LOG_ERROR("Didn't find any stream.");
		return ERRORCODE_PACKET_DIDNOTFINDDANYSTREAM;
	}
	play->eofPacket = 0;
	return 0;
}

//異步讀取包完成后的操作
static int packet_readAwait(void* arg)
{
	Play* play = (Play*)arg;
	if (play->eofPacket == 0)
	{
		if (play->packet.stream_index == play->video.decoder.streamIndex)
			//寫入視頻包隊(duì)
		{
			AVPacket* packet = av_packet_clone(&play->packet);
			av_fifo_generic_write(play->video.decoder.fifoPacket, &packet, sizeof(AVPacket*), NULL);
		}
		else if (play->packet.stream_index == play->audio.decoder.streamIndex)
			//寫入音頻包隊(duì)
		{
			AVPacket* packet = av_packet_clone(&play->packet);
			av_fifo_generic_write(play->audio.decoder.fifoPacket, &packet, sizeof(AVPacket*), NULL);
		}
		av_packet_unref(&play->packet);
	}
	else if (play->eofPacket == AVERROR_EOF)
	{
		play->eofPacket = 1;
		//寫入空包flush解碼器中的緩存
		AVPacket* packet = &play->packet;
		if (play->audio.decoder.fifoPacket)
			av_fifo_generic_write(play->audio.decoder.fifoPacket, &packet, sizeof(AVPacket*), NULL);
		if (play->video.decoder.fifoPacket)
			av_fifo_generic_write(play->video.decoder.fifoPacket, &packet, sizeof(AVPacket*), NULL);
	}
	else
	{
		LOG_ERROR("read packet erro!\n");
		play->exitFlag = 1;
		play->isAsyncReading = 0;
		return ERRORCODE_PACKET_READFRAMEFAILED;
	}
	play->isAsyncReading = 0;
	return 0;
}

//Read a packet asynchronously
static int packet_readAsync(void* arg)
{
	Play* play = (Play*)arg;
	play->eofPacket = av_read_frame(play->formatContext, &play->packet);
	//Hand the packet back to the playback thread for processing
	play_beginInvoke(play, packet_readAwait, play);
	return 0;
}

//Read a packet
static int packet_read(Play* play) {
	if (play->isAsyncReading)
		return 0;
	if (play->eofPacket)
	{
		return 0;
	}
	if (play->video.decoder.streamIndex != -1 && av_fifo_space(play->video.decoder.fifoPacket) < 1)
		//The video packet queue is full
	{
		return 0;
	}
	if (play->audio.decoder.streamIndex != -1 && av_fifo_space(play->audio.decoder.fifoPacket) < 1)
		//The audio packet queue is full
	{
		return 0;
	}
	if (!_pool)
		//Initialize the thread pool
	{
		_pool = (ThreadPool*)av_mallocz(sizeof(ThreadPool));
		threadPool_init(_pool, 32);
	}
	play->isAsyncReading = 1;
	//Read the packet asynchronously
	threadPool_run(_pool, packet_readAsync, play);
	return 0;
}
static void play_eventHandler(Play* play);
//Wait for the asynchronous read to finish
static void packet_waitAsyncReadFinished(Play* play) {
	//Make sure the asynchronous operation has completed
	while (play->isAsyncReading)
	{
		av_usleep(0.01 * 1000000);
		play_eventHandler(play);
	}
}

//關(guān)閉輸入流
static void packet_close(Play* play) {
	packet_waitAsyncReadFinished(play);
	if (play->packet.data)
	{
		av_packet_unref(&play->packet);
	}
	if (play->formatContext)
	{
		avformat_close_input(&play->formatContext);
	}
}

//定位
static int packet_seek(Play* play, double time) {
	packet_waitAsyncReadFinished(play);
	return avformat_seek_file(play->formatContext, -1, INT64_MIN, time * AV_TIME_BASE, INT64_MAX, 0) >= 0;
}



//設(shè)置窗口大小
void play_setWindowSize(Play* play, int width, int height) {
	play->video.screen_w = width;
	play->video.screen_h = height;
}


//Seek
void play_seek(Play* play, double time) {

	if (time < 0)
	{
		time = 0;
	}
	if (packet_seek(play, time))
	{
		//Reset state
		play->audio.accumulateSamples = 0;
		play->audio.sofPlay = 0;
		play->video.sofDisplay = 0;
		//Clear the caches
		decoder_clear(play, &play->video.decoder);
		decoder_clear(play, &play->audio.decoder);
		avformat_flush(play->formatContext);
		if (play->audio.playFifo)
		{
			SDL_LockMutex(play->audio.mutex);
			av_audio_fifo_reset(play->audio.playFifo);
			synchronize_reset(&play->synchronize);
			SDL_UnlockMutex(play->audio.mutex);
		}
		else
		{
			synchronize_reset(&play->synchronize);
		}
		//While paused, one frame still needs to be displayed
		play->step = 1;
	}
}

//Pause
void play_pause(Play* play, int isPaused) {
	if (play->isPaused == isPaused)
		return;
	if (!isPaused)
	{
		play->audio.sofPlay = 0;
		play->video.sofDisplay = 0;
		synchronize_reset(&play->synchronize);
	}
	play->isPaused = isPaused;
}

//Set the volume
void play_setVolume(Play* play, int value) {
	//Shift so that 0 is the maximum volume; then a memset-initialized player needs no default volume setting.
	if (value < 0)
		value = 0;
	value -= SDL_MIX_MAXVOLUME;
	if (play->audio.volume == value)
		return;
	play->audio.volume = value;
}

int play_getVolume(Play* play) {
	return play->audio.volume + SDL_MIX_MAXVOLUME;
}

//Event handling
static void play_eventHandler(Play* play) {

	PlayMessage msg;
	while (messageQueue_poll(&play->mq, &msg)) {
		switch (msg.type)
		{
		case PLAYMESSAGETYPE_INVOKE:
			SDL_ThreadFunction fn = (SDL_ThreadFunction)msg.param1;
			fn(msg.param2);
			break;
		}
	}

	//Handle window messages
	SDL_Event sdl_event;
	if (SDL_PollEvent(&sdl_event))
	{
		switch (sdl_event.type)
		{
		case SDL_WINDOWEVENT:
			if (sdl_event.window.event == SDL_WINDOWEVENT_CLOSE)
				play->exitFlag = 1;
			break;
		case SDL_KEYDOWN:
			switch (sdl_event.key.keysym.sym) {
			case SDLK_UP:
				play_setVolume(play, play_getVolume(play) + 20);
				break;
			case SDLK_DOWN:
				play_setVolume(play, play_getVolume(play) - 20);
				break;
			case SDLK_LEFT:
				play_seek(play, synchronize_getMasterTime(&play->synchronize) - 10);
				break;
			case SDLK_RIGHT:
				play_seek(play, synchronize_getMasterTime(&play->synchronize) + 10);
				break;
			case SDLK_SPACE:
				play_pause(play, !play->isPaused);
				break;
			default:
				break;
			}
			break;
		}
	}
}

//Playback loop
static int play_loop(Play* play)
{
	int ret = 0;
	double remainingTime = 0;
	while (!play->exitFlag)
	{
		if (!play->isPaused || play->step)
		{
			//Demultiplex
			if ((ret = packet_read(play)) != 0)
			{
				LOG_ERROR("read packet error");
				ret = ERRORCODE_PLAY_READPACKETERROR;
				break;
			}
			//Video decoding
			if ((ret = decoder_decode(play, &play->video.decoder)) != 0)
			{
				LOG_ERROR("video decode error");
				ret = ERRORCODE_PLAY_VIDEODECODEERROR;
				break;
			}
			//Audio decoding
			if ((ret = decoder_decode(play, &play->audio.decoder)) != 0)
			{
				LOG_ERROR("audio decode error");
				ret = ERRORCODE_PLAY_AUDIODECODEERROR;
				break;
			}
			//Sleep until the wakeup time
			remainingTime = (play->wakeupTime - getCurrentTime()) / 2;
			if (remainingTime > 0)
			{
				av_usleep(remainingTime * 1000000);
			}
			//Video display
			if ((ret = video_display(play, &play->video)) != 0)
			{
				LOG_ERROR("video display error");
				ret = ERRORCODE_PLAY_VIDEODISPLAYERROR;
				break;
			}
			//Audio playback
			if ((ret = audio_play(play, &play->audio)) != 0)
			{
				LOG_ERROR("audio play error");
				ret = ERRORCODE_PLAY_AUDIOPLAYERROR;
				break;
			}
			//Check for end of stream
			if ((play->video.decoder.streamIndex == -1 || play->video.eofDisplay) && (play->audio.decoder.streamIndex == -1 || play->audio.eofPlay))
			{
				if (!play->isLoop)
					break;
				//Loop playback: seek back to the start
				play->eofPacket = 0;
				play->audio.decoder.eofFrame = 0;
				play->video.decoder.eofFrame = 0;
				play->audio.eofPlay = 0;
				play->video.eofDisplay = 0;
				play_seek(play, 0);
				continue;
			}
		}
		else
		{
			av_usleep(0.01 * 1000000);
		}
		//Handle messages
		play_eventHandler(play);
	}

	return ret;
}

/// <summary>
/// Play.
/// Single-threaded blocking playback.
/// </summary>
/// <param name="play">The player object</param>
/// <param name="url">Address of the input stream; can be a local path, http, https, rtmp, rtsp, etc.</param>
/// <returns>Error code; 0 means no error</returns>
int play_exec(Play* play, const char* url) {
	int ret = 0;
	play->url = (char*)url;
	//Open the input stream
	if ((ret = packet_open(play)) != 0)
	{
		LOG_ERROR("open input error");
		ret = ERRORCODE_PLAY_OPENINPUTSTREAMFAILED;
		goto end;
	}
	//Initialize the video module
	if (play->video.decoder.streamIndex != -1)
	{
		if ((ret = video_init(play, &play->video)) != 0)
		{
			LOG_ERROR("init video error");
			ret = ERRORCODE_PLAY_VIDEOINITFAILED;
			goto end;
		}
	}
	//Initialize the audio module
	if (play->audio.decoder.streamIndex != -1)
	{
		if ((ret = audio_init(play, &play->audio)) != 0)
		{
			LOG_ERROR("init audio error");
			ret = ERRORCODE_PLAY_AUDIOINITFAILED;
			goto end;
		}
	}
	else
	{
		play->synchronize.type = SYNCHRONIZETYPE_ABSOLUTE;
	}
	//Initialize the message queue
	if ((ret = messageQueue_init(&play->mq, 500, sizeof(PlayMessage))) != 0)
	{
		LOG_ERROR("message queue init error");
		ret = ERRORCODE_PLAY_OPENINPUTSTREAMFAILED;
		goto end;
	}
	//Enter the playback loop: decode - render - play
	if ((ret = play_loop(play)) != 0)
	{
		ret = ERRORCODE_PLAY_LOOPERROR;
	}
end:
	//Release resources
	if (play->video.decoder.streamIndex != -1)
	{
		video_deinit(play, &play->video);
	}
	if (play->audio.decoder.streamIndex != -1)
	{
		audio_deinit(play, &play->audio);
	}

	packet_close(play);
	synchronize_reset(&play->synchronize);
	messageQueue_deinit(&play->mq);
	play->url = NULL;
	play->exitFlag = 0;
	return ret;
}


/// <summary>
/// Exit playback
/// </summary>
/// <param name="play">The player object</param>
void play_exit(Play* play)
{
	play->exitFlag = 1;
}
#undef main
int main(int argc, char** argv) {
	Play play;
	//memset acts as initialization
	memset(&play, 0, sizeof(Play));
	play_setWindowSize(&play, 640, 360);
	play.isLoop = 1;
	//Single-threaded blocking playback
	return play_exec(&play, "D:\\test.mp4");
}

完整代碼項(xiàng)目:vs2022、makefile,Windows、Linux都可以運(yùn)行,Linux需要自行配置ffmpeg和sdl

IV. Usage Example

#undef main
int main(int argc, char** argv) {
    Play play;
    //memset acts as initialization
    memset(&play, 0, sizeof(Play));
    play_setWindowSize(&play, 640, 360);
    play.isLoop = 1;
    //Single-threaded blocking playback. Left: rewind, Right: fast-forward, Up/Down: volume, Space: pause.
    return play_exec(&play, "D:\\test.mp4");
}

Summary

That wraps up today's topic. The player in this article verifies that single-threaded playback is feasible; playing a local file in particular can be done entirely on one thread, which is a great help for building video editing tools, since each additional track costs only one extra thread. With asynchronous packet reading, network streams also play back normally, and at two threads per stream this is still leaner than ffplay.

This concludes the article on implementing a single-threaded asynchronous video player in C with ffmpeg. For more on C and ffmpeg video players, please search 腳本之家's earlier articles or continue browsing the related articles below, and we hope you will keep supporting 腳本之家!

相關(guān)文章

  • C++設(shè)計(jì)模式之簡單工廠模式的實(shí)現(xiàn)示例

    C++設(shè)計(jì)模式之簡單工廠模式的實(shí)現(xiàn)示例

    這篇文章主要給大家介紹了關(guān)于C++設(shè)計(jì)模式之簡單工廠模式的相關(guān)資料,簡單工廠模式,主要用于創(chuàng)建對象,添加類時(shí),不會影響以前的系統(tǒng)代碼,需要的朋友可以參考下
    2021-06-06
  • 一文帶你了解Qt中槽的使用

    一文帶你了解Qt中槽的使用

    這篇文章主要為大家詳細(xì)介紹了Qt中槽的使用教程,文中的示例代碼講解詳細(xì),對我們學(xué)習(xí)Qt有一定的幫助,感興趣的小伙伴可以跟隨小編一起學(xué)習(xí)一下
    2022-12-12
  • 如何用C++實(shí)現(xiàn)雙向循環(huán)鏈表

    如何用C++實(shí)現(xiàn)雙向循環(huán)鏈表

    本篇文章是對用C++實(shí)現(xiàn)雙向循環(huán)鏈表的方法進(jìn)行了詳細(xì)的分析介紹,需要的朋友參考下
    2013-05-05
  • C語言中關(guān)于scanf讀取緩存區(qū)的問題

    C語言中關(guān)于scanf讀取緩存區(qū)的問題

    scanf()函數(shù)是通用終端格式化輸入函數(shù),它從標(biāo)準(zhǔn)輸入設(shè)備(鍵盤) 讀取輸入的信息,接下來通過本文給大家介紹C語言中關(guān)于scanf讀取緩存區(qū)的問題,需要的朋友一起看看吧
    2021-09-09
  • C++深入淺出講解函數(shù)重載

    C++深入淺出講解函數(shù)重載

    C++允許多個(gè)函數(shù)擁有相同的名字,只要它們的參數(shù)列表不同就可以,這就是函數(shù)的重載(Function?Overloading),借助重載,一個(gè)函數(shù)名可以有多種用途
    2022-05-05
  • MFC程序執(zhí)行過程深入剖析

    MFC程序執(zhí)行過程深入剖析

    這篇文章主要介紹了MFC程序執(zhí)行過程,包括對MFC執(zhí)行流程的分析以及斷點(diǎn)調(diào)試分析出的SDI程序執(zhí)行流程,需要的朋友可以參考下
    2014-09-09
  • C/C++ Crypto密碼庫調(diào)用的實(shí)現(xiàn)方法

    C/C++ Crypto密碼庫調(diào)用的實(shí)現(xiàn)方法

    Crypto 庫是C/C++的加密算法庫,這個(gè)加密庫很流行,基本上涵蓋了市面上的各類加密解密算法,感興趣的可以參考一下
    2021-06-06
  • C++編譯報(bào)錯(cuò):||error: ld returned 1 exit status|的解決

    C++編譯報(bào)錯(cuò):||error: ld returned 1 exit 

    這篇文章主要介紹了C++編譯報(bào)錯(cuò):||error: ld returned 1 exit status|的解決方式,具有很好的參考價(jià)值,希望對大家有所幫助,如有錯(cuò)誤或未考慮完全的地方,望不吝賜教
    2024-01-01
  • OpenCV獲取視頻的每一幀并保存為.jpg圖片

    OpenCV獲取視頻的每一幀并保存為.jpg圖片

    這篇文章主要為大家詳細(xì)介紹了OpenCV獲取視頻的每一幀,并保存為.jpg圖片,文中示例代碼介紹的非常詳細(xì),具有一定的參考價(jià)值,感興趣的小伙伴們可以參考一下
    2019-07-07
  • C++模板以及實(shí)現(xiàn)vector實(shí)例詳解

    C++模板以及實(shí)現(xiàn)vector實(shí)例詳解

    模板是為了實(shí)現(xiàn)泛型編程,所謂泛型編程,就是指編寫與類型無關(guān)的代碼,下面這篇文章主要給大家介紹了關(guān)于C++模板以及實(shí)現(xiàn)vector的相關(guān)資料,需要的朋友可以參考下
    2021-11-11

最新評論