• Android RTMP push/pull streaming with MediaCodec hardware encoding/decoding


    MediaCodec overview

    The MediaCodec class gives access to low-level media codecs, i.e. the encoder/decoder components. It is part of Android's low-level multimedia support infrastructure (normally used together with MediaExtractor, MediaSync, MediaMuxer, MediaCrypto, MediaDrm, Image, Surface and AudioTrack).

    A codec processes three kinds of data: compressed data, raw audio data and raw video data.

    a Compressed Buffers

    Input and output buffers contain compressed data according to the format type. For video this is normally a single compressed video frame. For audio it is normally a single access unit (an encoded audio segment, typically a few milliseconds of audio, as dictated by the format type), but this requirement is slightly relaxed: a buffer may contain multiple encoded access units of audio. In either case, buffers do not start or end on arbitrary byte boundaries but on frame/access-unit boundaries, unless they are flagged with BUFFER_FLAG_PARTIAL_FRAME.

    b Raw Audio Buffers

    Raw audio buffers contain entire frames of PCM audio data: one sample for each channel, in channel order. Each sample is a 16-bit signed integer in native byte order.
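    To make the buffer arithmetic concrete, here is a small sketch (plain Java; the class and method names are illustrative, not part of the MediaCodec API) computing the byte size of a raw 16-bit PCM buffer from sample rate, channel count and duration:

```java
class PcmBufferMath {
    // Bytes needed for `ms` milliseconds of 16-bit PCM audio.
    static int pcmBytes(int sampleRateHz, int channels, int ms) {
        int bytesPerFrame = channels * 2;      // one 16-bit sample per channel
        int frames = sampleRateHz * ms / 1000; // sample instants in the window
        return frames * bytesPerFrame;
    }
}
```

    For example, 20 ms of 44.1 kHz stereo is 882 frames of 4 bytes each, i.e. 3528 bytes.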

    c Raw Video Buffers

    In ByteBuffer mode, video buffers are laid out according to their color format. The supported color formats can be queried via getCodecInfo().getCapabilitiesForType(…).colorFormats. Video codecs may support three kinds of color formats:

    native raw video format: marked COLOR_FormatSurface, used with an input or output Surface.

    flexible YUV buffers: such as COLOR_FormatYUV420Flexible. These can be used with an input/output Surface and, in ByteBuffer mode, can also be accessed via getInput/OutputImage(int).

    other, specific formats: these are normally only supported in ByteBuffer mode. Some are vendor specific; the others are defined in MediaCodecInfo.CodecCapabilities.

    Since Android 5.1 (LOLLIPOP_MR1), all video codecs support flexible YUV 4:2:0 buffers.
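    As a sanity check on raw video buffer sizes: a YUV 4:2:0 frame carries a full-resolution luma plane plus two 2x2-subsampled chroma planes, i.e. 1.5 bytes per pixel. A minimal sketch (illustrative helper, not an Android API):

```java
class Yuv420Math {
    // Byte size of one YUV 4:2:0 frame: full-res Y plane plus quarter-res U and V planes.
    static int yuv420FrameSize(int width, int height) {
        int ySize = width * height;
        int uvSize = (width / 2) * (height / 2); // each chroma plane is subsampled 2x2
        return ySize + 2 * uvSize;               // == width * height * 3 / 2
    }
}
```

    For a 1280x720 frame this gives 921600 + 2 * 230400 = 1382400 bytes.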

    d Accessing Raw Video ByteBuffers on Older Devices


    States

    Conceptually, a codec is in one of three states: Stopped, Executing or Released. Stopped is itself a conglomerate of three sub-states (Uninitialized, Configured and Error), and Executing also has three sub-states (Flushed, Running and End-of-Stream).

    0) A codec created with one of the factory methods is in the Uninitialized state.

    1) configure(…) moves it to the Configured state.

    2) start() moves it to the Executing state. Only then can data be processed through the buffer queues described above.

    3) Immediately after start() the codec is in the Flushed sub-state, where it holds all of its buffers. As soon as the first input buffer is dequeued, the codec moves to the Running sub-state, where it spends most of its life. When an input buffer queued with the end-of-stream flag is submitted, the codec transitions to the End-of-Stream sub-state; in this state it no longer accepts input buffers, but it still produces output buffers until end-of-stream is reached on the output.

    4) While Executing, you can return to the Flushed sub-state at any time by calling flush().

    5) stop() returns the codec to the Uninitialized state; it must be configured again before it can be used.

    6) When you are done with the codec, release() it.

    In rare cases the codec may encounter an error and move to the Error state, which is signalled by an invalid return value or an exception. Calling reset() makes the codec usable again by returning it to the Uninitialized state. Calling release() moves it to the terminal Released state.
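    The transitions above can be summarized as a toy state machine. The enum and methods below are purely illustrative (they are not the Android API); they just encode steps 0) through 6):

```java
class CodecLifecycle {
    enum State { UNINITIALIZED, CONFIGURED, FLUSHED, RUNNING, END_OF_STREAM, ERROR, RELEASED }

    private State state = State.UNINITIALIZED;                                        // 0) factory method

    void configure()         { require(State.UNINITIALIZED); state = State.CONFIGURED; } // 1)
    void start()             { require(State.CONFIGURED);    state = State.FLUSHED; }    // 2) + 3) Flushed
    void dequeueFirstInput() { require(State.FLUSHED);       state = State.RUNNING; }    // 3) Running
    void queueEndOfStream()  { require(State.RUNNING);       state = State.END_OF_STREAM; } // 3) EOS
    void flush()   { if (isExecuting()) state = State.FLUSHED; else state = State.ERROR; }  // 4)
    void stop()    { state = State.UNINITIALIZED; }  // 5) must configure again
    void reset()   { state = State.UNINITIALIZED; }  // error recovery
    void release() { state = State.RELEASED; }       // 6) terminal

    boolean isExecuting() {
        return state == State.FLUSHED || state == State.RUNNING || state == State.END_OF_STREAM;
    }
    State state() { return state; }

    private void require(State s) { if (state != s) state = State.ERROR; }
}
```

    Walking a real MediaCodec through configure()/start()/queueInputBuffer(…, BUFFER_FLAG_END_OF_STREAM)/stop()/release() follows exactly this path.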

    1 Creating

    You can create a codec for a specific MediaFormat using MediaCodecList.

    When decoding a file or a stream, the desired format can be obtained from MediaExtractor.getTrackFormat.

    If you want to enable format-specific features with MediaFormat.setFeatureEnabled, inject them into the format first, then call MediaCodecList.findDecoderForFormat to get the name of a codec that can handle that exact format.

    Finally, create the codec with createByCodecName(String).

    Alternatively, create a codec for a MIME type with createDecoder/EncoderByType(String).

    2 Initialization

    After creating the codec, set a callback with setCallback if you want to process data asynchronously, then call configure with the concrete media format. For video you can specify an output Surface; you can also configure secure codecs (see MediaCrypto). Finally, since some codecs can operate in multiple modes, you must specify whether you want it to work as an encoder or a decoder.

    If you want to feed an encoder raw video natively, create an input Surface after configuration via createInputSurface(), or make the codec use a previously created persistent input Surface by calling setInputSurface(Surface).

    AAC audio and the MPEG-4, H.264 and H.265 video formats require the actual data to be prefixed by setup (codec-specific) data. When processing such compressed formats, this data must be submitted to the codec after start() and before any frame data, flagged with BUFFER_FLAG_CODEC_CONFIG in the call to queueInputBuffer.

    Codec-specific data can also be included in the format passed to configure; it can be obtained from MediaExtractor as part of the MediaFormat. In that case it is submitted to the codec automatically at start().

    An encoder creates and returns the codec-specific data before any valid output, in output buffers flagged with codec-config; buffers containing codec-specific data carry no meaningful timestamps.

    3 Data processing

    In synchronous mode, dequeue an input buffer, fill it with data, and submit it to the codec with queueInputBuffer; do not submit multiple input buffers with the same timestamp. Once the codec has processed the data, it returns a read-only output buffer.

    In asynchronous mode, output buffers arrive via onOutputBufferAvailable; in synchronous mode they are read with dequeueOutputBuffer. Finally, call releaseOutputBuffer to return the buffer to the codec.

    II Usage examples

    // Play an mp4 file
    package com.cclin.jubaohe.activity.Media;

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;
    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import android.os.Bundle;
    import android.view.Surface;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.View;
    import android.widget.Button;

    import com.cclin.jubaohe.R;
    import com.cclin.jubaohe.base.BaseActivity;
    import com.cclin.jubaohe.util.CameraUtil;
    import com.cclin.jubaohe.util.LogUtil;
    import com.cclin.jubaohe.util.SDPathConfig;

    import java.io.File;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    /**
     * Created by LinChengChun on 2018/4/14.
     */
    public class MediaTestActivity extends BaseActivity implements SurfaceHolder.Callback, View.OnClickListener {

        private final static String MEDIA_FILE_PATH = SDPathConfig.LIVE_MOVIE_PATH + "/18-04-12-10:47:06-0.mp4";
        private Surface mSurface;
        private SurfaceView mSvRenderFromCamera;
        private SurfaceView mSvRenderFromFile;
        Button mBtnCameraPreview;
        Button mBtnPlayMediaFile;
        private MediaStream mMediaStream;
        private Thread mVideoDecoderThread;
        private AudioTrack mAudioTrack;
        private Thread mAudioDecoderThread;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            mBtnCameraPreview = retrieveView(R.id.btn_camera_preview);
            mBtnPlayMediaFile = retrieveView(R.id.btn_play_media_file);
            mBtnCameraPreview.setOnClickListener(this);
            mBtnPlayMediaFile.setOnClickListener(this);
            mSvRenderFromCamera = retrieveView(R.id.sv_render);
            mSvRenderFromCamera.setOnClickListener(this);
            mSvRenderFromCamera.getHolder().addCallback(this);
            mSvRenderFromFile = retrieveView(R.id.sv_display);
            mSvRenderFromFile.setOnClickListener(this);
            mSvRenderFromFile.getHolder().addCallback(this);
            init();
        }

        @Override
        protected int initLayout() {
            return R.layout.activity_media_test;
        }

        private void init() {
            File file = new File(MEDIA_FILE_PATH);
            if (!file.exists()) {
                LogUtil.e("File does not exist!");
                return;
            }
            LogUtil.e("Target file exists!");
        }

        private void startVideoDecoder() {
            mVideoDecoderThread = new Thread("mVideoDecoderThread") {
                @Override
                public void run() {
                    super.run();
                    MediaFormat mMfVideo = null, mMfAudio = null;
                    String value = null;
                    String strVideoMime = null;
                    String strAudioMime = null;
                    try {
                        MediaExtractor mediaExtractor = new MediaExtractor(); // the extractor reads audio/video samples from the file
                        mediaExtractor.setDataSource(MEDIA_FILE_PATH);
                        int numTracks = mediaExtractor.getTrackCount(); // track count, usually 2
                        LogUtil.e("track count: " + numTracks);
                        for (int i = 0; i < numTracks; i++) { // inspect the format of each track
                            MediaFormat mediaFormat = mediaExtractor.getTrackFormat(i);
                            LogUtil.e("track MediaFormat: " + mediaFormat);
                            value = mediaFormat.getString(MediaFormat.KEY_MIME);
                            if (value.contains("audio")) {
                                mMfAudio = mediaFormat;
                                strAudioMime = value;
                            } else {
                                mMfVideo = mediaFormat;
                                strVideoMime = value;
                                mediaExtractor.selectTrack(i);
                            }
                        }
                        mSurface = mSvRenderFromFile.getHolder().getSurface();
                        MediaCodec codec = MediaCodec.createDecoderByType(strVideoMime); // create the decoder
                        codec.configure(mMfVideo, mSurface, null, 0); // render decoded frames directly to the Surface
                        codec.setVideoScalingMode(MediaCodec.VIDEO_SCALING_MODE_SCALE_TO_FIT);
                        codec.start(); // move the codec into the Executing state
                        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo(); // per-buffer metadata
                        int size = -1, outputBufferIndex = -1;
                        LogUtil.e("decoding started...");
                        long previewStampUs = 0L;
                        do {
                            int inputBufferId = codec.dequeueInputBuffer(10); // dequeue an input buffer from the decoder
                            if (inputBufferId >= 0) {
                                ByteBuffer inputBuffer = codec.getInputBuffer(inputBufferId); // look up the input buffer
                                // fill inputBuffer with valid data
                                inputBuffer.clear(); // reset the buffer
                                size = mediaExtractor.readSampleData(inputBuffer, 0); // read one sample from the extractor into the input buffer
                                LogUtil.e("readSampleData: size = " + size);
                                if (size < 0)
                                    break;
                                int trackIndex = mediaExtractor.getSampleTrackIndex();
                                long presentationTimeUs = mediaExtractor.getSampleTime(); // sample timestamp
                                LogUtil.e("queueInputBuffer: submitting data to the decoder...");
                                codec.queueInputBuffer(inputBufferId, 0, size, presentationTimeUs, 0); // queue the filled buffer back to the decoder
                                mediaExtractor.advance(); // advance to the next sample
                                LogUtil.e("advance: moving to the next sample...");
                                outputBufferIndex = codec.dequeueOutputBuffer(bufferInfo, 10000); // dequeue a decoded output buffer
                                LogUtil.e("outputBufferIndex = " + outputBufferIndex);
                                switch (outputBufferIndex) {
                                    case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                                        // MediaFormat mf = codec.getOutputFormat(outputBufferIndex); // broke playback
                                        MediaFormat mf = codec.getOutputFormat();
                                        LogUtil.e("INFO_OUTPUT_FORMAT_CHANGED:" + mf);
                                        break;
                                    case MediaCodec.INFO_TRY_AGAIN_LATER:
                                        LogUtil.e("timed out decoding the current frame");
                                        break;
                                    case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                                        // outputBuffers = videoCodec.getOutputBuffers();
                                        LogUtil.e("output buffers changed");
                                        break;
                                    default:
                                        // the outputBuffer itself is not needed when rendering straight to a Surface
                                        // ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
                                        // pacing: if the buffer's presentation time is ahead of playback, sleep a little
                                        boolean firstTime = previewStampUs == 0L;
                                        long newSleepUs = -1;
                                        long sleepUs = (bufferInfo.presentationTimeUs - previewStampUs);
                                        if (!firstTime) {
                                            long cache = 0;
                                            newSleepUs = CameraUtil.fixSleepTime(sleepUs, cache, -100000);
                                        }
                                        previewStampUs = bufferInfo.presentationTimeUs;
                                        // render
                                        if (newSleepUs < 0)
                                            newSleepUs = 0;
                                        Thread.sleep(newSleepUs / 1000);
                                        codec.releaseOutputBuffer(outputBufferIndex, true); // release the output buffer and render it to the Surface
                                        break;
                                }
                            }
                        } while (!this.isInterrupted());
                        LogUtil.e("decoding finished...");
                        codec.stop();
                        codec.release();
                        codec = null;
                        mediaExtractor.release();
                        mediaExtractor = null;
                    } catch (IOException e) {
                        e.printStackTrace();
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                }
            };
            mVideoDecoderThread.start();
        }

        private void startAudioDecoder() {
            mAudioDecoderThread = new Thread("AudioDecoderThread") {
                @Override
                public void run() {
                    super.run();
                    try {
                        MediaFormat mMfVideo = null, mMfAudio = null;
                        String value = null;
                        String strVideoMime = null;
                        String strAudioMime = null;
                        MediaExtractor mediaExtractor = new MediaExtractor();
                        mediaExtractor.setDataSource(MEDIA_FILE_PATH);
                        int numTracks = mediaExtractor.getTrackCount();
                        LogUtil.e("track count: " + numTracks);
                        for (int i = 0; i < numTracks; i++) {
                            MediaFormat mediaFormat = mediaExtractor.getTrackFormat(i);
                            LogUtil.e("track MediaFormat: " + mediaFormat);
                            value = mediaFormat.getString(MediaFormat.KEY_MIME);
                            if (value.contains("audio")) {
                                mMfAudio = mediaFormat;
                                strAudioMime = value;
                                mediaExtractor.selectTrack(i);
                            } else {
                                mMfVideo = mediaFormat;
                                strVideoMime = value;
                            }
                        }
                        // mMfAudio.setInteger(MediaFormat.KEY_IS_ADTS, 1);
                        mMfAudio.setInteger(MediaFormat.KEY_BIT_RATE, 16000);
                        MediaCodec codec = MediaCodec.createDecoderByType(strAudioMime);
                        codec.configure(mMfAudio, null, null, 0);
                        codec.start();
                        ByteBuffer outputByteBuffer = null;
                        ByteBuffer[] outputByteBuffers = null;
                        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                        int size = -1, outputBufferIndex = -1;
                        long previewStampUs = 0L; // (fixed: the original had the literal 01)
                        LogUtil.e("decoding started...");
                        if (mAudioTrack == null) {
                            int sample_rate = mMfAudio.getInteger(MediaFormat.KEY_SAMPLE_RATE);
                            int channels = mMfAudio.getInteger(MediaFormat.KEY_CHANNEL_COUNT);
                            int sampleRateInHz = (int) (sample_rate * 1.004);
                            int channelConfig = channels == 1 ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO;
                            int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
                            int bfSize = AudioTrack.getMinBufferSize(sampleRateInHz, channelConfig, audioFormat) * 4;
                            mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRateInHz, channelConfig, audioFormat, bfSize, AudioTrack.MODE_STREAM);
                        }
                        mAudioTrack.play();
                        // outputByteBuffers = codec.getOutputBuffers();
                        do {
                            int inputBufferId = codec.dequeueInputBuffer(10);
                            if (inputBufferId >= 0) {
                                ByteBuffer inputBuffer = codec.getInputBuffer(inputBufferId);
                                // fill inputBuffer with valid data
                                inputBuffer.clear();
                                size = mediaExtractor.readSampleData(inputBuffer, 0);
                                if (size < 0)
                                    break;
                                long presentationTimeUs = mediaExtractor.getSampleTime();
                                // LogUtil.e("queueInputBuffer: submitting data to the decoder...");
                                codec.queueInputBuffer(inputBufferId, 0, size, presentationTimeUs, 0);
                                mediaExtractor.advance();
                                // LogUtil.e("advance: moving to the next sample...");
                                outputBufferIndex = codec.dequeueOutputBuffer(bufferInfo, 50000);
                                switch (outputBufferIndex) {
                                    case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                                        // MediaFormat mf = codec.getOutputFormat(outputBufferIndex);
                                        MediaFormat mf = codec.getOutputFormat();
                                        LogUtil.e("INFO_OUTPUT_FORMAT_CHANGED:" + mf);
                                        break;
                                    case MediaCodec.INFO_TRY_AGAIN_LATER:
                                        LogUtil.e("timed out decoding the current frame");
                                        break;
                                    case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                                        // outputByteBuffer = codec.getOutputBuffers();
                                        LogUtil.e("output buffers changed");
                                        break;
                                    default:
                                        LogUtil.e("outputBufferIndex = " + outputBufferIndex);
                                        // outputByteBuffer = outputByteBuffers[outputBufferIndex];
                                        outputByteBuffer = codec.getOutputBuffer(outputBufferIndex); // fetch the decoded PCM
                                        outputByteBuffer.clear();
                                        byte[] outData = new byte[bufferInfo.size];
                                        outputByteBuffer.get(outData);
                                        // pacing: if the buffer's presentation time is ahead of playback, sleep a little
                                        boolean firstTime = previewStampUs == 0L;
                                        long newSleepUs = -1;
                                        long sleepUs = (bufferInfo.presentationTimeUs - previewStampUs);
                                        if (!firstTime) {
                                            long cache = 0;
                                            newSleepUs = CameraUtil.fixSleepTime(sleepUs, cache, -100000);
                                        }
                                        previewStampUs = bufferInfo.presentationTimeUs;
                                        // render
                                        if (newSleepUs < 0)
                                            newSleepUs = 0;
                                        Thread.sleep(newSleepUs / 1000);
                                        mAudioTrack.write(outData, 0, outData.length); // play the decoded PCM
                                        codec.releaseOutputBuffer(outputBufferIndex, false); // release the output buffer
                                        break;
                                }
                            }
                        } while (!this.isInterrupted());
                        LogUtil.e("decoding finished...");
                        codec.stop();
                        codec.release();
                        codec = null;
                        mAudioTrack.stop();
                        mAudioTrack.release();
                        mAudioTrack = null;
                        mediaExtractor.release();
                        mediaExtractor = null;
                    } catch (IOException e) {
                        e.printStackTrace();
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                }
            };
            mAudioDecoderThread.start();
        }

        @Override
        public void onClick(View view) {
            switch (view.getId()) {
                case R.id.sv_render:
                    mMediaStream.getCamera().autoFocus(null);
                    break;
                case R.id.sv_display:
                    break;
                case R.id.btn_camera_preview:
                    break;
                case R.id.btn_play_media_file:
                    break;
                default:
                    break;
            }
        }

        private int getDgree() {
            int rotation = getWindowManager().getDefaultDisplay().getRotation();
            int degrees = 0;
            switch (rotation) {
                case Surface.ROTATION_0:
                    degrees = 0;
                    break; // natural orientation
                case Surface.ROTATION_90:
                    degrees = 90;
                    break; // landscape left
                case Surface.ROTATION_180:
                    degrees = 180;
                    break; // upside down
                case Surface.ROTATION_270:
                    degrees = 270;
                    break; // landscape right
            }
            return degrees;
        }

        private void onMediaStreamCreate() {
            if (mMediaStream == null)
                mMediaStream = new MediaStream(this, mSvRenderFromCamera.getHolder());
            mMediaStream.setDgree(getDgree());
            mMediaStream.createCamera();
            mMediaStream.startPreview();
        }

        private void onMediaStreamDestroy() {
            mMediaStream.release();
            mMediaStream = null;
        }

        @Override
        protected void onPause() {
            super.onPause();
            onMediaStreamDestroy();
            if (mVideoDecoderThread != null)
                mVideoDecoderThread.interrupt();
            if (mAudioDecoderThread != null)
                mAudioDecoderThread.interrupt();
        }

        @Override
        protected void onResume() {
            super.onResume();
            if (isSurfaceCreated && mMediaStream == null) {
                onMediaStreamCreate();
            }
        }

        private boolean isSurfaceCreated = false;

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            LogUtil.e("surfaceCreated: " + holder);
            if (holder.getSurface() == mSvRenderFromCamera.getHolder().getSurface()) {
                isSurfaceCreated = true;
                onMediaStreamCreate();
            } else if (holder.getSurface() == mSvRenderFromFile.getHolder().getSurface()) {
                if (new File(MEDIA_FILE_PATH).exists()) {
                    startVideoDecoder();
                    startAudioDecoder();
                }
            }
        }

        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            LogUtil.e("surfaceChanged: "
                    + "\nholder = " + holder
                    + "\nformat = " + format
                    + "\nwidth = " + width
                    + "\nheight = " + height);
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            LogUtil.e("surfaceDestroyed: ");
            if (holder.getSurface() == mSvRenderFromCamera.getHolder().getSurface()) {
                isSurfaceCreated = false;
            }
        }
    }
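    Both decode loops pace rendering by sleeping for the gap between consecutive presentation timestamps. fixSleepTime is a project helper whose source is not shown; the class below is a simplified stand-in for that pacing logic (an assumption, ignoring the cache and negative-threshold parameters) that captures the core idea: first frame renders immediately, later frames wait out the timestamp delta, converted from microseconds to milliseconds.

```java
class RenderPacer {
    private long lastPtsUs = 0L;

    // How long to sleep (ms) before rendering a frame with the given
    // presentation timestamp, based on the gap to the previous frame.
    long sleepMsFor(long ptsUs) {
        long sleepUs = 0L;
        if (lastPtsUs != 0L) {           // first frame renders immediately
            sleepUs = ptsUs - lastPtsUs; // gap between consecutive frames
            if (sleepUs < 0) sleepUs = 0; // clamp out-of-order timestamps
        }
        lastPtsUs = ptsUs;
        return sleepUs / 1000;           // microseconds -> milliseconds
    }
}
```

    For a 30 fps stream the timestamps arrive ~33000 µs apart, so each call after the first returns about 33 ms.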
    // Encode camera frames to H.264
    final int millisPerframe = 1000 / 20;
    long lastPush = 0;

    @Override
    public void run() {
        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        int outputBufferIndex = 0;
        byte[] mPpsSps = new byte[0];
        byte[] h264 = new byte[mWidth * mHeight];
        do {
            outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 10000); // dequeue an encoded output buffer
            if (outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // no output available yet
            } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                // not expected for an encoder
                outputBuffers = mMediaCodec.getOutputBuffers();
            } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                synchronized (HWConsumer.this) {
                    newFormat = mMediaCodec.getOutputFormat();
                    EasyMuxer muxer = mMuxer;
                    if (muxer != null) {
                        // should happen before receiving buffers, and should only happen once
                        muxer.addTrack(newFormat, true);
                    }
                }
            } else if (outputBufferIndex < 0) {
                // let's ignore it
            } else {
                ByteBuffer outputBuffer;
                if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
                    outputBuffer = mMediaCodec.getOutputBuffer(outputBufferIndex);
                } else {
                    outputBuffer = outputBuffers[outputBufferIndex];
                }
                outputBuffer.position(bufferInfo.offset);
                outputBuffer.limit(bufferInfo.offset + bufferInfo.size);
                EasyMuxer muxer = mMuxer;
                if (muxer != null) {
                    muxer.pumpStream(outputBuffer, bufferInfo, true);
                }
                boolean sync = false;
                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) { // the codec emits SPS/PPS as codec-config data
                    sync = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) != 0; // check the sync-frame flag
                    if (!sync) { // a codec-config buffer without the sync flag carries the SPS/PPS
                        byte[] temp = new byte[bufferInfo.size];
                        outputBuffer.get(temp);
                        mPpsSps = temp;
                        mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                        continue; // wait for the next frame
                    } else {
                        mPpsSps = new byte[0];
                    }
                }
                sync |= (bufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) != 0; // is this a key frame?
                int len = mPpsSps.length + bufferInfo.size;
                if (len > h264.length) {
                    h264 = new byte[len];
                }
                if (sync) {
                    // key frame
                    if (BuildConfig.DEBUG)
                        Log.i(TAG, String.format("push i video stamp:%d", bufferInfo.presentationTimeUs / 1000));
                } else { // non-key frames are copied out directly
                    outputBuffer.get(h264, 0, bufferInfo.size);
                    if (BuildConfig.DEBUG)
                        Log.i(TAG, String.format("push video stamp:%d", bufferInfo.presentationTimeUs / 1000));
                }
                mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
            }
        } while (mVideoStarted);
    }

    @Override
    public int onVideo(byte[] data, int format) {
        if (!mVideoStarted) return 0;
        try {
            if (lastPush == 0) {
                lastPush = System.currentTimeMillis();
            }
            long time = System.currentTimeMillis() - lastPush;
            if (time >= 0) {
                time = millisPerframe - time;
                if (time > 0) Thread.sleep(time / 2);
            }
            if (format == ImageFormat.YV12) {
                JNIUtil.yV12ToYUV420P(data, mWidth, mHeight);
            } else {
                JNIUtil.nV21To420SP(data, mWidth, mHeight);
            }
            int bufferIndex = mMediaCodec.dequeueInputBuffer(0);
            if (bufferIndex >= 0) {
                ByteBuffer buffer = null;
                if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.LOLLIPOP) {
                    buffer = mMediaCodec.getInputBuffer(bufferIndex);
                } else {
                    buffer = inputBuffers[bufferIndex];
                }
                buffer.clear();
                buffer.put(data);
                buffer.clear();
                mMediaCodec.queueInputBuffer(bufferIndex, 0, data.length, System.nanoTime() / 1000, MediaCodec.BUFFER_FLAG_KEY_FRAME); // flag the input as a key frame
            }
            if (time > 0) Thread.sleep(time / 2); // sleep to hold the target frame rate
            lastPush = System.currentTimeMillis();
        } catch (InterruptedException ex) {
            ex.printStackTrace();
        }
        return 0;
    }

    /**
     * Initialize the encoder.
     *
     * Reference encoding settings:
     *                    SD (low)       SD (high)      HD 720p         HD 1080p
     * Video resolution   320 x 240 px   720 x 480 px   1280 x 720 px   1920 x 1080 px
     * Video frame rate   20 fps         30 fps         30 fps          30 fps
     * Video bitrate      384 Kbps       2 Mbps         4 Mbps          10 Mbps
     */
    private void startMediaCodec() throws IOException {
        int framerate = 20;
        // if (width == 640 || height == 640) {
        //     bitrate = 2000000;
        // } else if (width == 1280 || height == 1280) {
        //     bitrate = 4000000;
        // } else {
        //     bitrate = 2 * width * height;
        // }
        int bitrate = (int) (mWidth * mHeight * 20 * 2 * 0.05f);
        if (mWidth >= 1920 || mHeight >= 1920) bitrate *= 0.3;
        else if (mWidth >= 1280 || mHeight >= 1280) bitrate *= 0.4;
        else if (mWidth >= 720 || mHeight >= 720) bitrate *= 0.6;
        EncoderDebugger debugger = EncoderDebugger.debug(mContext, mWidth, mHeight);
        mVideoConverter = debugger.getNV21Convertor();
        mMediaCodec = MediaCodec.createByCodecName(debugger.getEncoderName());
        MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", mWidth, mHeight);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, framerate);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, debugger.getEncoderColorFormat());
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mMediaCodec.start();
        Bundle params = new Bundle();
        params.putInt(MediaCodec.PARAMETER_KEY_REQUEST_SYNC_FRAME, 0);
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
            mMediaCodec.setParameters(params);
        }
    }
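    The bitrate heuristic in startMediaCodec is plain arithmetic and can be checked in isolation. The helper below is illustrative, mirroring those lines exactly (base of width * height * fps * 2 * 0.05, then scaled down for larger resolutions):

```java
class BitrateHeuristic {
    // Mirrors the heuristic in startMediaCodec: base rate from resolution and
    // a 20 fps frame rate, then scaled down for larger frames.
    static int bitrateFor(int width, int height) {
        int bitrate = (int) (width * height * 20 * 2 * 0.05f);
        if (width >= 1920 || height >= 1920) bitrate *= 0.3;
        else if (width >= 1280 || height >= 1280) bitrate *= 0.4;
        else if (width >= 720 || height >= 720) bitrate *= 0.6;
        return bitrate;
    }
}
```

    For 1280x720 this yields 737280 bps (~0.7 Mbps) and for 1920x1080 about 1.24 Mbps, i.e. far below the 4-10 Mbps in the reference table: this heuristic targets low-bandwidth streaming rather than recording quality.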
    // Extract the SPS/PPS produced by the H.264 encoder
    inputBuffers = mMediaCodec.getInputBuffers();
    outputBuffers = mMediaCodec.getOutputBuffers();
    int bufferIndex = mMediaCodec.dequeueInputBuffer(0);
    if (bufferIndex >= 0) {
        inputBuffers[bufferIndex].clear();
        mConvertor.convert(data, inputBuffers[bufferIndex]);
        mMediaCodec.queueInputBuffer(bufferIndex, 0, inputBuffers[bufferIndex].position(), System.nanoTime() / 1000, 0);
        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
        while (outputBufferIndex >= 0) {
            ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
            // String data0 = String.format("%x %x %x %x %x %x %x %x %x %x ", outData[0], outData[1], outData[2], outData[3], outData[4], outData[5], outData[6], outData[7], outData[8], outData[9]);
            // Log.e("out_data", data0);
            // record SPS and PPS
            int type = outputBuffer.get(4) & 0x1F; // NAL unit type is the low 5 bits (the original used mask 0x07, under which PPS (0x68) would never match 8)
            // LogUtil.e(TAG, String.format("type is %d", type));
            if (type == 7 || type == 8) { // SPS or PPS
                byte[] outData = new byte[bufferInfo.size];
                outputBuffer.get(outData);
                mPpsSps = outData;
                ArrayList<Integer> posLists = new ArrayList<>(2);
                for (int i = 0; i < bufferInfo.size - 3; i++) { // locate the SPS/PPS start codes
                    if (outData[i] == 0 && outData[i + 1] == 0 && outData[i + 2] == 0 && outData[i + 3] == 1) {
                        posLists.add(i);
                    }
                }
                int sps_pos = posLists.get(0);
                int pps_pos = posLists.get(1);
                posLists.clear();
                posLists = null;
                ByteBuffer csd0 = ByteBuffer.allocate(pps_pos - sps_pos);
                csd0.put(outData, sps_pos, pps_pos - sps_pos); // length fixed: the original passed pps_pos, which only works when sps_pos == 0
                csd0.clear();
                mCSD0 = csd0;
                LogUtil.e(TAG, "CSD-0 found!!!");
                ByteBuffer csd1 = ByteBuffer.allocate(outData.length - pps_pos);
                csd1.put(outData, pps_pos, outData.length - pps_pos);
                csd1.clear();
                mCSD1 = csd1;
                LogUtil.e(TAG, "CSD-1 found!!!");
                LocalBroadcastManager.getInstance(mApplicationContext).sendBroadcast(new Intent(ACTION_H264_SPS_PPS_GOT));
            } else if (type == 5) {
                // key frame (IDR)
                if (mEasyMuxer != null && !isRecordPause) {
                    bufferInfo.presentationTimeUs = TimeStamp.getInstance().getCurrentTimeUS();
                    mEasyMuxer.pumpStream(outputBuffer, bufferInfo, true); // save the stream to a local recording
                    isWaitKeyFrame = false; // got a key frame, stop waiting for one
                    // LocalBroadcastManager.getInstance(mApplicationContext).sendBroadcast(new Intent(ACTION_I_KEY_FRAME_GOT));
                }
            } else {
                outputBuffer.get(h264, 0, bufferInfo.size);
                if (System.currentTimeMillis() - timeStamp >= 3000) {
                    timeStamp = System.currentTimeMillis();
                    if (Build.VERSION.SDK_INT >= 23) {
                        Bundle params = new Bundle();
                        params.putInt(MediaCodec.PARAMETER_KEY_REQUEST_SYNC_FRAME, 0);
                        mMediaCodec.setParameters(params);
                    }
                }
                if (mEasyMuxer != null && !isRecordPause && !isWaitKeyFrame) {
                    bufferInfo.presentationTimeUs = TimeStamp.getInstance().getCurrentTimeUS();
                    mEasyMuxer.pumpStream(outputBuffer, bufferInfo, true); // save the stream to a local recording
                }
            }
            mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
            outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
        }
    } else {
        Log.e(TAG, "No buffer available !");
    }
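    The SPS/PPS extraction in this snippet scans the codec-config buffer for 4-byte Annex-B start codes (00 00 00 01) and splits it at the second one: csd-0 is the SPS NAL unit, csd-1 the PPS. A standalone sketch of that scan (plain Java; class and method names are illustrative, with the csd-0 slice taken as sps_pos..pps_pos):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

class AnnexB {
    // Positions of every 4-byte start code (00 00 00 01) in an Annex-B buffer.
    static List<Integer> startCodePositions(byte[] data) {
        List<Integer> positions = new ArrayList<>();
        for (int i = 0; i + 3 < data.length; i++) {
            if (data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 0 && data[i + 3] == 1) {
                positions.add(i);
            }
        }
        return positions;
    }

    // Split a codec-config buffer into SPS (csd-0) and PPS (csd-1) NAL units,
    // assuming it holds exactly those two units back to back.
    static byte[][] splitSpsPps(byte[] codecConfig) {
        List<Integer> pos = startCodePositions(codecConfig);
        int spsPos = pos.get(0), ppsPos = pos.get(1);
        byte[] sps = Arrays.copyOfRange(codecConfig, spsPos, ppsPos);
        byte[] pps = Arrays.copyOfRange(codecConfig, ppsPos, codecConfig.length);
        return new byte[][] { sps, pps };
    }
}
```

    On a synthetic buffer `{0,0,0,1,0x67,...,0,0,0,1,0x68,...}` this finds the two start codes and returns the SPS (header byte 0x67) and PPS (header byte 0x68) slices, which is exactly what the csd-0/csd-1 ByteBuffers above are filled with.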



  • Original article: https://blog.csdn.net/m0_60259116/article/details/127109840