    • Audio/Video Development: Recording MP4 Files with MediaCodec


    Overview

    • Hardware encoding and decoding of audio

    • Hardware encoding and decoding of video

    • Recording audio and video into an MP4 file

    Introduction to MediaCodec

    Android 4.1 introduced MediaCodec to access the device's codecs. It uses hardware encoding and decoding, so it is generally faster than software codecs.

    How MediaCodec Works

    The two Client ends represent the input side and the output side respectively.

    On the input side, the client requests an empty ByteBuffer from MediaCodec, fills it with data, and hands it back to MediaCodec for processing.

    MediaCodec writes the processed data into an empty output ByteBuffer.

    On the output side, the client obtains the filled output ByteBuffer from MediaCodec, consumes its contents, and then releases the buffer back to MediaCodec.
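    This producer/consumer handoff can be sketched with a toy stand-in that has the same dequeue/queue/drain shape as the synchronous MediaCodec API. `ToyCodec` below is a hypothetical codec (not part of Android) that just upper-cases ASCII bytes, so the loop can run anywhere:

```java
import java.nio.ByteBuffer;
import java.util.ArrayDeque;

// Hypothetical stand-in mirroring the synchronous MediaCodec handoff:
// the client fills empty input buffers, the codec fills output buffers.
class ToyCodec {
    private final ArrayDeque<ByteBuffer> outputs = new ArrayDeque<>();

    // Client requests an empty input buffer to fill.
    ByteBuffer dequeueInputBuffer() {
        return ByteBuffer.allocate(64);
    }

    // Client hands the filled buffer back; the "codec" processes it
    // (here: upper-casing bytes) into an output buffer.
    void queueInputBuffer(ByteBuffer in) {
        in.flip();
        ByteBuffer out = ByteBuffer.allocate(in.remaining());
        while (in.hasRemaining()) {
            out.put((byte) Character.toUpperCase((char) in.get()));
        }
        out.flip();
        outputs.add(out);
    }

    // Client drains processed output; null plays the role of
    // INFO_TRY_AGAIN_LATER ("no output available yet").
    ByteBuffer dequeueOutputBuffer() {
        return outputs.poll();
    }

    static String process(String text) {
        ToyCodec codec = new ToyCodec();
        ByteBuffer in = codec.dequeueInputBuffer();   // 1. get an empty input buffer
        in.put(text.getBytes());                      // 2. fill it with data
        codec.queueInputBuffer(in);                   // 3. hand it to the codec
        StringBuilder sb = new StringBuilder();
        ByteBuffer out;
        while ((out = codec.dequeueOutputBuffer()) != null) { // 4. drain output
            while (out.hasRemaining()) sb.append((char) out.get());
        }   // 5. real code would call releaseOutputBuffer here
        return sb.toString();
    }
}
```

    The real android.media.MediaCodec works with buffer indices and timeouts rather than direct ByteBuffer returns, but the request/fill/queue/drain cycle is the same.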

    The MediaCodec Lifecycle

    MediaCodec has three lifecycle states: Stopped, Executing, and Released.

    Stopped comprises three sub-states: Uninitialized, Configured, and Error.

    Executing comprises three sub-states: Flushed, Running, and End-of-Stream.


    The three sub-states of Stopped

    Uninitialized: a newly created MediaCodec object is in the Uninitialized state. Calling reset() from any state returns it to Uninitialized.

    Configured: calling configure(…) moves MediaCodec into the Configured state.

    Error: MediaCodec enters the Error state when it encounters an error.

    The three sub-states of Executing

    Flushed: after start() is called, MediaCodec enters the Flushed state, in which it holds all of its buffers. flush() can be called at any point while Executing to return to the Flushed state.

    Running: as soon as the first input buffer is dequeued, MediaCodec moves to the Running state, where it spends most of its lifecycle. Calling stop() returns it to the Uninitialized state.

    End-of-Stream: when an input buffer flagged with end-of-stream is queued, MediaCodec enters the End-of-Stream state. It accepts no further input buffers, but continues producing output buffers until the end-of-stream flag appears on the output.

    Released

    When you are finished with a MediaCodec, call release() to free its resources; it then enters the terminal Released state.
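    The transitions above can be summarized as a small state machine. This is a simplified plain-Java sketch for illustration only (the real transitions live inside android.media.MediaCodec, and some edge cases are omitted):

```java
// Sketch of the MediaCodec state model: Stopped = {Uninitialized, Configured,
// Error}, Executing = {Flushed, Running, End-of-Stream}, plus Released.
enum CodecState {
    UNINITIALIZED, CONFIGURED, FLUSHED, RUNNING, END_OF_STREAM, ERROR, RELEASED;

    // Returns the state reached by an API call; throws if the call is not
    // valid in the current state (simplified).
    CodecState on(String call) {
        switch (call) {
            case "reset":     return this == RELEASED ? invalid(call) : UNINITIALIZED;
            case "configure": return this == UNINITIALIZED ? CONFIGURED : invalid(call);
            case "start":     return this == CONFIGURED ? FLUSHED : invalid(call);
            case "dequeueInputBuffer": return this == FLUSHED ? RUNNING : this;
            case "queueEOS":  return (this == FLUSHED || this == RUNNING) ? END_OF_STREAM : invalid(call);
            case "flush":     return (this == FLUSHED || this == RUNNING || this == END_OF_STREAM) ? FLUSHED : invalid(call);
            case "stop":      return UNINITIALIZED;
            case "release":   return RELEASED;
            default:          return invalid(call);
        }
    }

    private CodecState invalid(String call) {
        throw new IllegalStateException(call + " not valid in " + this);
    }
}
```

    For example, `UNINITIALIZED.on("configure").on("start")` lands in FLUSHED, matching the configure() → start() sequence used throughout the code below.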

    MediaCodec API Overview

    createDecoderByType/createEncoderByType

    Creates a MediaCodec for a specific MIME type, e.g. (video/avc).

    createByCodecName

    Creates a MediaCodec from the exact component name (e.g. OMX.google.mp3.decoder) when you know it; component names can be obtained from the MediaCodecList class.

    configure

    public void configure(
            MediaFormat format,
            Surface surface, MediaCrypto crypto, int flags);

    MediaFormat format: the format of the media data.

    Surface surface: the surface on which the decoder renders its output. Pass null if the codec does not produce raw video output, or if you want the decoder to output to ByteBuffers instead.

    MediaCrypto crypto: a crypto object for secure decryption of the media data. Pass null for a non-secure codec.

    int flags: specify CONFIGURE_FLAG_ENCODE when the component is used as an encoder.

    MediaFormat

    Encapsulates the information describing a media data format, along with optional feature metadata.

    The media format is expressed as key/value pairs: keys are strings, and values may be integer, long, float, String, or ByteBuffer.

    Feature metadata is specified as string/boolean pairs.

    dequeueInputBuffer

    public final int dequeueInputBuffer(long timeoutUs)
    

    Returns the index of an input buffer to be filled with valid data, or -1 if no buffer is currently available.

    long timeoutUs: how long to wait for an available buffer:

    timeoutUs == 0: return immediately

    timeoutUs < 0: wait indefinitely for an available buffer

    timeoutUs > 0: wait up to timeoutUs microseconds

    queueInputBuffer

    Submits a filled input buffer to MediaCodec.

    public native final void queueInputBuffer(
            int index,
            int offset, int size, long presentationTimeUs, int flags)

    int index: the index returned by dequeueInputBuffer

    int offset: the byte offset into the buffer where the input data starts

    int size: the number of valid input bytes

    long presentationTimeUs: the PTS of this buffer (in microseconds).

    int flags: buffer flags, e.g. BUFFER_FLAG_END_OF_STREAM to mark the last buffer or BUFFER_FLAG_CODEC_CONFIG for codec-specific data; pass 0 for ordinary data, as the code below does.

    dequeueOutputBuffer

    Retrieves an output buffer from MediaCodec.

    public final int dequeueOutputBuffer(
            @NonNull BufferInfo info, long timeoutUs)

    Return value:

    INFO_TRY_AGAIN_LATER is returned when timeoutUs was non-negative and the call timed out.

    INFO_OUTPUT_FORMAT_CHANGED indicates that the output format has changed; subsequent data follows the new format.

    Parameters:

    BufferInfo info: metadata for the output buffer

    long timeoutUs: timeout, with the same semantics as above

    releaseOutputBuffer

    Returns the output buffer to the codec, optionally rendering it to the surface.

    public void releaseOutputBuffer(int index, boolean render)

    boolean render: if the codec was configured with a valid surface and this parameter is true, the buffer is rendered to the surface; once the surface no longer uses the buffer, it is returned to the codec.

    Implementation

    Straight to the code.

    Hardware-encoding NV21 video data to H.264

    public class CameraToH264 {
        public ArrayBlockingQueue<byte[]> yuv420Queue = new ArrayBlockingQueue<>(10);
        private boolean isRunning;
        private byte[] input;
        private int width;
        private int height;
        private MediaCodec mediaCodec;
        private MediaMuxer mediaMuxer;
        private int mVideoTrack = -1;
        private long nanoTime;

        public void init(int width, int height) {
            nanoTime = System.nanoTime();
            this.width = width;
            this.height = height;
            MediaFormat videoFormat = MediaFormat.createVideoFormat("video/avc", width, height);
            videoFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
            videoFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            videoFormat.setInteger(MediaFormat.KEY_BIT_RATE, width * height * 5);
            videoFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            try {
                mediaCodec = MediaCodec.createEncoderByType("video/avc");
                mediaCodec.configure(videoFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
                mediaCodec.start();
                mediaMuxer = new MediaMuxer("sdcard/aaapcm/camer1.mp4", MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }

        public void putData(byte[] buffer) {
            if (yuv420Queue.size() >= 10) {
                yuv420Queue.poll(); // drop the oldest frame when the queue is full
            }
            yuv420Queue.add(buffer);
        }

        public void startEncoder() {
            new Thread(new Runnable() {
                @Override
                public void run() {
                    isRunning = true;
                    while (isRunning) {
                        if (yuv420Queue.size() > 0) {
                            input = yuv420Queue.poll();
                            byte[] yuv420sp = new byte[width * height * 3 / 2];
                            // The format conversion is required; otherwise the recording plays back as a green screen
                            NV21ToNV12(input, yuv420sp, width, height);
                            input = yuv420sp;
                        } else {
                            input = null;
                        }
                        if (input != null) {
                            ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
                            ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
                            int inputBufferIndex = mediaCodec.dequeueInputBuffer(0);
                            if (inputBufferIndex >= 0) {
                                ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
                                inputBuffer.clear();
                                inputBuffer.put(input);
                                mediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, (System.nanoTime() - nanoTime) / 1000, 0);
                            }
                            MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                            int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
                            if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                                mVideoTrack = mediaMuxer.addTrack(mediaCodec.getOutputFormat());
                                Log.d("mmm", "output format changed");
                                if (mVideoTrack >= 0) {
                                    mediaMuxer.start();
                                    Log.d("mmm", "muxer started");
                                }
                            }
                            while (outputBufferIndex >= 0) { // index 0 is a valid buffer
                                ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
                                if (mVideoTrack >= 0) {
                                    mediaMuxer.writeSampleData(mVideoTrack, outputBuffer, bufferInfo);
                                    Log.d("mmm", "writing sample");
                                }
                                mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                                outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
                                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                                    Log.e("mmm", "video end");
                                }
                            }
                        }
                    }
                    Log.d("mmm", "stop writing");
                    mediaMuxer.stop();
                    mediaMuxer.release();
                    mediaCodec.stop();
                    mediaCodec.release();
                }
            }).start();
        }

        public void stop() {
            isRunning = false;
        }

        private void NV21ToNV12(byte[] nv21, byte[] nv12, int width, int height) {
            if (nv21 == null || nv12 == null) return;
            int frameSize = width * height;
            // The Y plane is identical in both formats
            System.arraycopy(nv21, 0, nv12, 0, frameSize);
            // Swap each interleaved chroma pair: NV21 stores V,U; NV12 stores U,V
            for (int j = 0; j + 1 < frameSize / 2; j += 2) {
                nv12[frameSize + j] = nv21[frameSize + j + 1];
                nv12[frameSize + j + 1] = nv21[frameSize + j];
            }
        }
    }

    The putData method accepts video data from outside; here it comes from the camera in NV21 format.

    The init method initializes the MediaFormat with key encoder parameters such as frame rate and bit rate, creates the MediaCodec used for encoding, and creates the MediaMuxer used to produce the MP4 file.

    NV21ToNV12 converts NV21 data to NV12.
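    As a standalone illustration of what NV21ToNV12 does (a sketch independent of the class above): both formats share the same Y plane, and only the interleaved chroma bytes are swapped from V,U order to U,V order.

```java
// NV21 layout:  Y0..Yn  V0 U0 V1 U1 ...   (V first)
// NV12 layout:  Y0..Yn  U0 V0 U1 V1 ...   (U first)
public class Nv21Demo {
    static byte[] nv21ToNv12(byte[] nv21, int width, int height) {
        int frameSize = width * height;
        byte[] nv12 = new byte[nv21.length];
        System.arraycopy(nv21, 0, nv12, 0, frameSize); // Y plane is identical
        for (int i = frameSize; i + 1 < nv21.length; i += 2) {
            nv12[i] = nv21[i + 1];   // U comes from NV21's second chroma byte
            nv12[i + 1] = nv21[i];   // V comes from NV21's first chroma byte
        }
        return nv12;
    }
}
```

    For a 2x2 frame, the buffer {Y,Y,Y,Y, V,U} becomes {Y,Y,Y,Y, U,V}: four luma bytes untouched, one chroma pair swapped.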

    Hardware-encoding PCM audio data to AAC

    public class PcmToAAC {
        private MediaCodec encoder;
        private MediaMuxer mediaMuxer;
        private boolean isRun;
        private int mAudioTrack;
        private long prevOutputPTSUs;
        private Queue queue;

        public void init() {
            queue = new Queue();
            queue.init(1024 * 100);
            try {
                MediaFormat audioFormat = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, 16000, 1);
                // Bit rate
                audioFormat.setInteger(MediaFormat.KEY_BIT_RATE, 64000);
                audioFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
                audioFormat.setInteger(MediaFormat.KEY_CHANNEL_COUNT, 1);
                audioFormat.setInteger(MediaFormat.KEY_SAMPLE_RATE, 16000);
                encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_AUDIO_AAC);
                encoder.configure(audioFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
                encoder.start();
                mediaMuxer = new MediaMuxer("sdcard/pcm.aac", MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
                Log.d("mmm", "aac init");
            } catch (IOException e) {
                e.printStackTrace();
            }
        }

        public void putData(byte[] buffer, int len) {
            queue.addAll(buffer);
        }

        public void start() {
            isRun = true;
            Log.d("mmm", "aac start");
            new Thread(new Runnable() {
                @Override
                public void run() {
                    ByteBuffer[] inputBuffers = encoder.getInputBuffers();
                    ByteBuffer[] outputBuffers = encoder.getOutputBuffers();
                    while (isRun) {
                        byte[] bytes = new byte[640];
                        int all = queue.getAll(bytes, 640);
                        if (all < 0) {
                            // Not enough PCM data buffered yet
                            try {
                                Thread.sleep(50);
                            } catch (InterruptedException e) {
                                e.printStackTrace();
                            }
                            continue;
                        }
                        int inputBufferIndex = encoder.dequeueInputBuffer(0);
                        if (inputBufferIndex >= 0) {
                            ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
                            inputBuffer.clear();
                            inputBuffer.put(bytes);
                            encoder.queueInputBuffer(inputBufferIndex, 0, bytes.length, getPTSUs(), 0);
                        }
                        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                        int outputBufferIndex = encoder.dequeueOutputBuffer(bufferInfo, 0);
                        if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                            Log.d("mmm", "output format changed");
                            mAudioTrack = mediaMuxer.addTrack(encoder.getOutputFormat());
                            if (mAudioTrack >= 0) {
                                mediaMuxer.start();
                                Log.d("mmm", "muxer started");
                            }
                        } else if (outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                            Log.d("mmm", "try again later");
                        }
                        while (outputBufferIndex >= 0) {
                            if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                                bufferInfo.size = 0; // the muxer gets codec config from the format, not from this buffer
                            }
                            bufferInfo.presentationTimeUs = getPTSUs();
                            ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
                            if (bufferInfo.size != 0 && mAudioTrack >= 0) {
                                mediaMuxer.writeSampleData(mAudioTrack, outputBuffer, bufferInfo);
                                Log.d("mmm", "writing sample");
                            }
                            prevOutputPTSUs = bufferInfo.presentationTimeUs;
                            encoder.releaseOutputBuffer(outputBufferIndex, false);
                            outputBufferIndex = encoder.dequeueOutputBuffer(bufferInfo, 0);
                        }
                    }
                    Log.d("mmm", "aac stop");
                    encoder.stop();
                    encoder.release();
                    mediaMuxer.stop();
                    mediaMuxer.release();
                }
            }).start();
        }

        public void stop() {
            isRun = false;
        }

        private long getPTSUs() {
            long result = System.nanoTime() / 1000L;
            // presentationTimeUs must be monotonic, otherwise the muxer fails to write
            if (result < prevOutputPTSUs)
                result = prevOutputPTSUs;
            return result;
        }
    }

    The putData method accepts PCM audio data from outside; here it comes from AudioRecord.

    The init method initializes the MediaFormat with key parameters such as channel count and sample rate, creates the MediaCodec used for encoding, and creates the MediaMuxer that wraps the encoded AAC (the output file is named .aac here, but the muxer actually writes an MPEG-4 container).
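    The getPTSUs() helper in the code above enforces one MediaMuxer requirement: presentationTimeUs must never decrease. Its clamp reduces to the following (note that `(prevOutputPTSUs - result) + result` in commonly copied versions of this code is just `prevOutputPTSUs`):

```java
public class PtsClamp {
    // Clamp a wall-clock timestamp so the PTS sequence handed to the muxer
    // is monotonically non-decreasing.
    static long monotonicPts(long nowUs, long prevUs) {
        return nowUs < prevUs ? prevUs : nowUs;
    }
}
```

    Timestamps that move forward pass through unchanged; a timestamp that would go backwards is pinned to the previous value.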


    Capturing audio and video and muxing them into an MP4 file

    public class AudioThread extends Thread {
        private static final int TIMEOUT_USEC = 10000;
        private static final String MIME_TYPE = "audio/mp4a-latm";
        private static final int SAMPLE_RATE = 16000;
        private static final int BIT_RATE = 64000;
        private MediaFormat audioFormat;
        private MediaCodec mMediaCodec;
        private final Queue queue;
        private boolean isRun;
        private long prevOutputPTSUs;
        private MuxerThread muxerThread;

        public AudioThread(MuxerThread muxerThread) {
            Log.d("mmm", "AudioThread");
            this.muxerThread = muxerThread;
            queue = new Queue();
            queue.init(1024 * 100);
            prepare();
        }

        private void prepare() {
            audioFormat = MediaFormat.createAudioFormat(MIME_TYPE, SAMPLE_RATE, 1);
            audioFormat.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
            audioFormat.setInteger(MediaFormat.KEY_CHANNEL_COUNT, 1);
            audioFormat.setInteger(MediaFormat.KEY_SAMPLE_RATE, SAMPLE_RATE);
            try {
                mMediaCodec = MediaCodec.createEncoderByType(MIME_TYPE);
                mMediaCodec.configure(audioFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
                mMediaCodec.start();
                Log.d("mmm", "prepare");
            } catch (IOException e) {
                e.printStackTrace();
            }
            isRun = true;
        }

        public void addAudioData(byte[] data) {
            if (!isRun) return;
            queue.addAll(data);
        }

        public void audioStop() {
            isRun = false;
        }

        @Override
        public void run() {
            while (isRun) {
                byte[] bytes = new byte[640];
                int all = queue.getAll(bytes, 640);
                if (all < 0) {
                    // Not enough PCM data buffered yet
                    try {
                        Thread.sleep(50);
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                    continue;
                }
                ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
                ByteBuffer[] outputBuffers = mMediaCodec.getOutputBuffers();
                int inputBufferIndex = mMediaCodec.dequeueInputBuffer(TIMEOUT_USEC);
                if (inputBufferIndex >= 0) {
                    ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
                    inputBuffer.clear();
                    inputBuffer.put(bytes);
                    mMediaCodec.queueInputBuffer(inputBufferIndex, 0, bytes.length, getPTSUs(), 0);
                }
                MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC);
                do {
                    if (outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                        // no output available yet
                    } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                        outputBuffers = mMediaCodec.getOutputBuffers();
                    } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                        Log.e("mmm", "audio INFO_OUTPUT_FORMAT_CHANGED");
                        MediaFormat format = mMediaCodec.getOutputFormat(); // API >= 16
                        if (muxerThread != null) {
                            Log.e("mmm", "adding audio track, INFO_OUTPUT_FORMAT_CHANGED " + format.toString());
                            muxerThread.addTrackIndex(MuxerThread.TRACK_AUDIO, format);
                        }
                    } else if (outputBufferIndex < 0) {
                        Log.e("mmm", "encoderStatus < 0");
                    } else {
                        final ByteBuffer encodedData = outputBuffers[outputBufferIndex];
                        if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                            bufferInfo.size = 0;
                        }
                        if (bufferInfo.size != 0 && muxerThread != null && muxerThread.isStart()) {
                            bufferInfo.presentationTimeUs = getPTSUs();
                            muxerThread.addMuxerData(new MuxerData(MuxerThread.TRACK_AUDIO, encodedData, bufferInfo));
                            prevOutputPTSUs = bufferInfo.presentationTimeUs;
                        }
                        mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                    }
                    outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC);
                } while (outputBufferIndex >= 0);
            }
            mMediaCodec.stop();
            mMediaCodec.release();
            Log.d("mmm", "audio MediaCodec released");
        }

        private long getPTSUs() {
            long result = System.nanoTime() / 1000L;
            // presentationTimeUs must be monotonic, otherwise the muxer fails to write
            if (result < prevOutputPTSUs)
                result = prevOutputPTSUs;
            return result;
        }
    }
    public class VideoThread extends Thread {
        private final MuxerThread muxerThread;
        private final int mWidth;
        private final int mHeight;
        public static final int IMAGE_HEIGHT = 1080;
        public static final int IMAGE_WIDTH = 1920;
        // Encoding parameters
        private static final String MIME_TYPE = "video/avc"; // H.264 Advanced Video Coding
        private static final int FRAME_RATE = 25; // frame rate
        private static final int IFRAME_INTERVAL = 10; // I-frame interval (GOP)
        private static final int TIMEOUT_USEC = 10000; // dequeue timeout
        private static final int COMPRESS_RATIO = 256;
        private static final int BIT_RATE = IMAGE_HEIGHT * IMAGE_WIDTH * 3 * 8 * FRAME_RATE / COMPRESS_RATIO; // bit rate
        private final Vector<byte[]> frameBytes;
        private final byte[] mFrameData;
        private MediaFormat mediaFormat;
        private MediaCodec mMediaCodec;
        private boolean isRun;

        public VideoThread(int width, int height, MuxerThread muxerThread) {
            Log.d("mmm", "VideoThread");
            this.muxerThread = muxerThread;
            this.mWidth = width;
            this.mHeight = height;
            mFrameData = new byte[this.mWidth * this.mHeight * 3 / 2];
            frameBytes = new Vector<byte[]>();
            prepare();
        }

        private void prepare() {
            mediaFormat = MediaFormat.createVideoFormat(MIME_TYPE, mWidth, mHeight);
            mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
            mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
            mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
            mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
            try {
                mMediaCodec = MediaCodec.createEncoderByType(MIME_TYPE);
                mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
                mMediaCodec.start();
                Log.d("mmm", "prepare");
            } catch (IOException e) {
                e.printStackTrace();
            }
        }

        public void add(byte[] data) {
            if (!isRun) return;
            if (frameBytes.size() > 10) {
                frameBytes.remove(0); // drop the oldest frame
            }
            frameBytes.add(data);
        }

        @Override
        public void run() {
            isRun = true;
            while (isRun) {
                if (!frameBytes.isEmpty()) {
                    byte[] bytes = this.frameBytes.remove(0);
                    Log.e("ang-->", "encoding video data: " + bytes.length);
                    NV21toI420SemiPlanar(bytes, mFrameData, mWidth, mHeight);
                    ByteBuffer[] outputBuffers = mMediaCodec.getOutputBuffers();
                    ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
                    int inputBufferIndex = mMediaCodec.dequeueInputBuffer(TIMEOUT_USEC);
                    if (inputBufferIndex >= 0) {
                        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
                        inputBuffer.clear();
                        inputBuffer.put(mFrameData);
                        mMediaCodec.queueInputBuffer(inputBufferIndex, 0, mFrameData.length, System.nanoTime() / 1000, 0);
                    }
                    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                    int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC);
                    do {
                        if (outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                            // no output available yet
                        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                            outputBuffers = mMediaCodec.getOutputBuffers();
                        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                            Log.e("mmm", "video INFO_OUTPUT_FORMAT_CHANGED");
                            MediaFormat newFormat = mMediaCodec.getOutputFormat();
                            muxerThread.addTrackIndex(MuxerThread.TRACK_VIDEO, newFormat);
                        } else if (outputBufferIndex < 0) {
                            Log.e("mmm", "outputBufferIndex < 0");
                        } else {
                            ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
                            if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                                Log.d("mmm", "ignoring BUFFER_FLAG_CODEC_CONFIG");
                                bufferInfo.size = 0;
                            }
                            if (bufferInfo.size != 0 && muxerThread.isStart()) {
                                muxerThread.addMuxerData(new MuxerData(MuxerThread.TRACK_VIDEO, outputBuffer, bufferInfo));
                            }
                            mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                        }
                        outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC);
                    } while (outputBufferIndex >= 0);
                }
            }
            mMediaCodec.stop();
            mMediaCodec.release();
            Log.d("mmm", "video MediaCodec released");
        }

        public void stopvideo() {
            isRun = false;
        }

        private static void NV21toI420SemiPlanar(byte[] nv21bytes, byte[] i420bytes, int width, int height) {
            // Copy the Y plane, then swap each interleaved V/U pair to U/V
            System.arraycopy(nv21bytes, 0, i420bytes, 0, width * height);
            for (int i = width * height; i < nv21bytes.length; i += 2) {
                i420bytes[i] = nv21bytes[i + 1];
                i420bytes[i + 1] = nv21bytes[i];
            }
        }
    }
    public class MuxerThread extends Thread {
        public static final String TRACK_AUDIO = "TRACK_AUDIO";
        public static final String TRACK_VIDEO = "TRACK_VIDEO";
        private Vector<MuxerData> muxerDatas;
        private MediaMuxer mediaMuxer;
        private boolean isAddAudioTrack;
        private boolean isAddVideoTrack;
        private static MuxerThread muxerThread = new MuxerThread();
        private int videoTrack;
        private int audioTrack;
        private VideoThread videoThread;
        private AudioThread audioThread;
        private boolean isRun;

        private MuxerThread() {
            Log.d("mmm", "MuxerThread");
        }

        public static MuxerThread getInstance() {
            return muxerThread;
        }

        public synchronized void addTrackIndex(String track, MediaFormat format) {
            if (isAddAudioTrack && isAddVideoTrack) {
                return;
            }
            if (!isAddVideoTrack && track.equals(TRACK_VIDEO)) {
                Log.e("mmm", "adding video track");
                videoTrack = mediaMuxer.addTrack(format);
                if (videoTrack >= 0) {
                    isAddVideoTrack = true;
                    Log.e("mmm", "video track added");
                }
            }
            if (!isAddAudioTrack && track.equals(TRACK_AUDIO)) {
                Log.e("mmm", "adding audio track");
                audioTrack = mediaMuxer.addTrack(format);
                if (audioTrack >= 0) {
                    isAddAudioTrack = true;
                    Log.e("mmm", "audio track added");
                }
            }
            // Start the muxer only after both tracks have been added
            if (isStart()) {
                mediaMuxer.start();
            }
        }

        public boolean isStart() {
            return isAddAudioTrack && isAddVideoTrack;
        }

        public void addMuxerData(MuxerData muxerData) {
            muxerDatas.add(muxerData);
        }

        public void addVideoData(byte[] data) {
            if (!isRun || videoThread == null) return;
            videoThread.add(data);
        }

        public void addAudioData(byte[] data) {
            if (!isRun || audioThread == null) return;
            audioThread.addAudioData(data);
        }

        public void startMuxer(int width, int height) {
            Log.d("mmm", "startMuxer");
            try {
                mediaMuxer = new MediaMuxer("sdcard/camer111.mp4", MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
                muxerDatas = new Vector<>();
            } catch (IOException e) {
                e.printStackTrace();
            }
            isRun = true;
            videoThread = new VideoThread(width, height, this);
            videoThread.start();
            audioThread = new AudioThread(this);
            audioThread.start();
            start();
        }

        public void exit() {
            isRun = false;
            videoThread.stopvideo();
            audioThread.audioStop();
        }

        @Override
        public void run() {
            while (isRun) {
                if (!muxerDatas.isEmpty() && isStart()) {
                    MuxerData muxerData = muxerDatas.remove(0);
                    if (muxerData.trackIndex.equals(TRACK_VIDEO) && videoTrack >= 0) {
                        Log.d("mmm", "writing video sample " + muxerData.bufferInfo.size);
                        mediaMuxer.writeSampleData(videoTrack, muxerData.byteBuf, muxerData.bufferInfo);
                    }
                    if (muxerData.trackIndex.equals(TRACK_AUDIO) && audioTrack >= 0) {
                        Log.d("mmm", "writing audio sample " + muxerData.bufferInfo.size);
                        mediaMuxer.writeSampleData(audioTrack, muxerData.byteBuf, muxerData.bufferInfo);
                    }
                }
            }
            mediaMuxer.stop();
            mediaMuxer.release();
            Log.d("mmm", "mediaMuxer stopped");
        }
    }
    public class Queue {
        private byte[] buffer;
        private int head;
        private int tail;
        private int count;
        private int size;

        public void init(int n) {
            buffer = new byte[n];
            size = n;
            head = 0;
            tail = 0;
            count = 0;
        }

        public void add(byte data) {
            if (size == count) {
                // Queue is full: drop the oldest byte
                get();
            }
            if (tail == size) {
                tail = 0;
            }
            buffer[tail] = data;
            tail++;
            count++;
        }

        public byte get() {
            if (count == 0) {
                Log.d("mmm", "queue is empty");
                return -1;
            }
            if (head == size) {
                head = 0;
            }
            byte data = buffer[head];
            head++;
            count--;
            return data;
        }

        public void addAll(byte[] data) {
            synchronized (this) {
                for (byte b : data) {
                    add(b);
                }
            }
        }

        public int getAll(byte[] data, int len) {
            synchronized (this) {
                if (count < len) {
                    return -1;
                }
                for (int i = 0; i < len; i++) {
                    data[i] = get();
                }
                return len;
            }
        }
    }
    public class MuxerData {
        String trackIndex;
        ByteBuffer byteBuf;
        MediaCodec.BufferInfo bufferInfo;

        public MuxerData(String trackIndex, ByteBuffer byteBuf, MediaCodec.BufferInfo bufferInfo) {
            this.trackIndex = trackIndex;
            this.byteBuf = byteBuf;
            this.bufferInfo = bufferInfo;
        }
    }

    The roles of the classes:

    AudioThread: collects PCM audio, hardware-encodes it to AAC, and hands the encoded data to the muxer, MuxerThread.

    VideoThread: collects YUV video, hardware-encodes it to H.264, and hands the encoded data to MuxerThread.

    Queue: a plain circular queue that buffers the PCM audio data.

    MuxerData: wraps an encoded AAC or H.264 sample.

    MuxerThread: muxes the encoded AAC and H.264 into an audio/video MP4.
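    Two properties of Queue matter for the audio path: old PCM is discarded rather than blocking the capture thread, and a read is all-or-nothing. A self-contained sketch with the same behavior (ByteRing is a simplified illustration, not the class used above):

```java
// Minimal ring buffer mirroring Queue's semantics: addAll() overwrites the
// oldest byte when full; getAll() is all-or-nothing and returns -1 when
// fewer than `len` bytes are buffered.
public class ByteRing {
    private final byte[] buf;
    private int head, count;

    public ByteRing(int capacity) {
        buf = new byte[capacity];
    }

    public synchronized void addAll(byte[] data) {
        for (byte b : data) {
            if (count == buf.length) {          // full: drop the oldest byte
                head = (head + 1) % buf.length;
                count--;
            }
            buf[(head + count) % buf.length] = b;
            count++;
        }
    }

    public synchronized int getAll(byte[] out, int len) {
        if (count < len) return -1;             // not enough data yet
        for (int i = 0; i < len; i++) {
            out[i] = buf[head];
            head = (head + 1) % buf.length;
            count--;
        }
        return len;
    }
}
```

    With a capacity of 4, writing the bytes 1..5 drops the 1, and a subsequent read of 4 bytes yields 2, 3, 4, 5; asking for more than is buffered returns -1, which is why the encoder threads sleep and retry.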

    Hardware-decoding AAC

    public void init1() {
        // Audio
        audioExtractor = new MediaExtractor();
        try {
            audioExtractor.setDataSource("/sdcard/camer111.mp4");
            int audioTrack = -1;
            int trackCount = audioExtractor.getTrackCount();
            for (int i = 0; i < trackCount; i++) {
                MediaFormat trackFormat = audioExtractor.getTrackFormat(i);
                String mime = trackFormat.getString(MediaFormat.KEY_MIME);
                if (mime.startsWith("audio/")) {
                    audioTrack = i;
                    Log.d("mmm", "found audio track " + audioTrack);
                }
            }
            audioExtractor.selectTrack(audioTrack);
            audiodecoder = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_AUDIO_AAC);
            MediaFormat trackFormat = audioExtractor.getTrackFormat(audioTrack);
            audiodecoder.configure(trackFormat, null, null, 0);
            audiodecoder.start();
            int bufferSize = AudioTrack.getMinBufferSize(16000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
            audioTrackplay = new AudioTrack(AudioManager.STREAM_MUSIC, 16000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize, AudioTrack.MODE_STREAM);
            audioTrackplay.play();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public void start1() {
        new Thread(new Runnable() {
            @Override
            public void run() {
                while (true) {
                    int inputBufferIndex = audiodecoder.dequeueInputBuffer(0);
                    if (inputBufferIndex >= 0) {
                        ByteBuffer inputBuffer = audiodecoder.getInputBuffer(inputBufferIndex);
                        int sampleSize = audioExtractor.readSampleData(inputBuffer, 0);
                        if (sampleSize > 0) {
                            Log.d("mmm", "queuing AAC sample");
                            audiodecoder.queueInputBuffer(inputBufferIndex, 0, sampleSize, audioExtractor.getSampleTime(), 0);
                            audioExtractor.advance();
                        }
                    }
                    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                    int outputBufferIndex = audiodecoder.dequeueOutputBuffer(bufferInfo, 0);
                    while (outputBufferIndex >= 0) {
                        Log.d("mmm", "playing PCM");
                        ByteBuffer outputBuffer = audiodecoder.getOutputBuffer(outputBufferIndex);
                        byte[] bytes = new byte[bufferInfo.size];
                        outputBuffer.get(bytes);
                        outputBuffer.clear();
                        audioTrackplay.write(bytes, 0, bufferInfo.size);
                        audiodecoder.releaseOutputBuffer(outputBufferIndex, false); // no surface, nothing to render
                        outputBufferIndex = audiodecoder.dequeueOutputBuffer(bufferInfo, 0);
                    }
                }
            }
        }).start();
    }

    The logic is simple: extract the audio track from the MP4 file, hardware-decode the AAC to PCM, and play the PCM with AudioTrack.

    Hardware-decoding H.264 and rendering to a surface

    public void init(SurfaceHolder surfaceHolder) {
        try {
            // Video
            videoExtractor = new MediaExtractor();
            videoExtractor.setDataSource("/sdcard/camer111.mp4");
            int videoTrack = -1;
            int trackCount = videoExtractor.getTrackCount();
            for (int i = 0; i < trackCount; i++) {
                MediaFormat trackFormat = videoExtractor.getTrackFormat(i);
                String mime = trackFormat.getString(MediaFormat.KEY_MIME);
                Log.d("mmm", "found track " + mime);
                if (mime.startsWith("video/")) {
                    videoTrack = i;
                    Log.d("mmm", "found video track " + videoTrack);
                }
            }
            videoExtractor.selectTrack(videoTrack);
            videodecoder = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            MediaFormat trackFormat = videoExtractor.getTrackFormat(videoTrack);
            videodecoder.configure(trackFormat, surfaceHolder.getSurface(), null, 0);
            videodecoder.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public void start() {
        isRunning = true;
        new Thread(new Runnable() {
            @Override
            public void run() {
                while (isRunning) {
                    int inputBufferIndex = videodecoder.dequeueInputBuffer(0);
                    if (inputBufferIndex >= 0) {
                        ByteBuffer inputBuffer = videodecoder.getInputBuffer(inputBufferIndex);
                        int sampleSize = videoExtractor.readSampleData(inputBuffer, 0);
                        if (sampleSize > 0) {
                            Log.d("mmm", "queuing H.264 sample");
                            videodecoder.queueInputBuffer(inputBufferIndex, 0, sampleSize, videoExtractor.getSampleTime(), 0);
                            videoExtractor.advance();
                            try {
                                // Crude pacing: sleep roughly one frame interval
                                Thread.sleep(30);
                            } catch (InterruptedException e) {
                                e.printStackTrace();
                            }
                        }
                    }
                    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                    int outputBufferIndex = videodecoder.dequeueOutputBuffer(bufferInfo, 0);
                    while (outputBufferIndex >= 0) {
                        Log.d("mmm", "rendering frame");
                        videodecoder.releaseOutputBuffer(outputBufferIndex, true); // render to the surface
                        outputBufferIndex = videodecoder.dequeueOutputBuffer(bufferInfo, 0);
                    }
                }
            }
        }).start();
    }

    The logic is simple: separate the video track from the MP4, then feed the H.264 data to the decoder and render it directly onto the surface.

    Converting H.264 to bitmaps

    This part of the code was also taken from the internet; recorded here for reference.

    import android.graphics.ImageFormat;
    import android.graphics.Rect;
    import android.graphics.YuvImage;
    import android.media.Image;
    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import android.util.Log;

    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    public class H264ToBitmap {
        private final int decodeColorFormat = MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible;
        public static final int FILE_TypeI420 = 1;
        public static final int FILE_TypeNV21 = 2;
        public static final int FILE_TypeJPEG = 3;
        private int outputImageFileType = FILE_TypeJPEG;
        private static final int COLOR_FormatI420 = 1;
        private static final int COLOR_FormatNV21 = 2;

        public void videoDecode() {
            try {
                MediaExtractor mediaExtractor = new MediaExtractor();
                mediaExtractor.setDataSource("/sdcard/eee.mp4");
                int videoTrack = selectTrack(mediaExtractor);
                if (videoTrack < 0) {
                    Log.d("mmm", "no valid video track found");
                    return;
                }
                mediaExtractor.selectTrack(videoTrack);
                MediaFormat mediaFormat = mediaExtractor.getTrackFormat(videoTrack);
                String mime = mediaFormat.getString(MediaFormat.KEY_MIME);
                MediaCodec decoder = MediaCodec.createDecoderByType(mime);
                showSupportedColorFormat(decoder.getCodecInfo().getCapabilitiesForType(mime));
                if (isColorFormatSupported(decodeColorFormat, decoder.getCodecInfo().getCapabilitiesForType(mime))) {
                    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, decodeColorFormat);
                } else {
                    Log.d("mmm", "color format not supported by this decoder");
                }
                decodeFramesToImage(decoder, mediaExtractor, mediaFormat);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }

        // Returns the index of the first video track, or -1 if none exists.
        private int selectTrack(MediaExtractor mediaExtractor) {
            int trackCount = mediaExtractor.getTrackCount();
            for (int i = 0; i < trackCount; i++) {
                MediaFormat trackFormat = mediaExtractor.getTrackFormat(i);
                String mime = trackFormat.getString(MediaFormat.KEY_MIME);
                if (mime.startsWith("video/")) {
                    return i;
                }
            }
            return -1;
        }

        private void showSupportedColorFormat(MediaCodecInfo.CodecCapabilities capabilitiesForType) {
            for (int a : capabilitiesForType.colorFormats) {
                Log.d("mmm", "supported color format " + a + "/");
            }
        }

        private boolean isColorFormatSupported(int decodeColorFormat, MediaCodecInfo.CodecCapabilities capabilitiesForType) {
            for (int a : capabilitiesForType.colorFormats) {
                if (a == decodeColorFormat) {
                    return true;
                }
            }
            return false;
        }

        private void decodeFramesToImage(MediaCodec decoder, MediaExtractor mediaExtractor, MediaFormat mediaFormat) {
            MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
            boolean sawInputEOS = false;
            boolean sawOutputEOS = false;
            decoder.configure(mediaFormat, null, null, 0);
            decoder.start();
            final int width = mediaFormat.getInteger(MediaFormat.KEY_WIDTH);
            final int height = mediaFormat.getInteger(MediaFormat.KEY_HEIGHT);
            int outputFrameCount = 0;
            while (!sawOutputEOS) {
                if (!sawInputEOS) {
                    int inputBufferId = decoder.dequeueInputBuffer(0);
                    if (inputBufferId >= 0) {
                        ByteBuffer inputBuffer = decoder.getInputBuffer(inputBufferId);
                        int sampleSize = mediaExtractor.readSampleData(inputBuffer, 0);
                        if (sampleSize >= 0) {
                            long sampleTime = mediaExtractor.getSampleTime();
                            decoder.queueInputBuffer(inputBufferId, 0, sampleSize, sampleTime, 0);
                            mediaExtractor.advance();
                        } else {
                            // No more samples: signal end of stream to the decoder.
                            decoder.queueInputBuffer(inputBufferId, 0, 0, 0L, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                            sawInputEOS = true;
                        }
                    }
                }
                int outputBufferId = decoder.dequeueOutputBuffer(bufferInfo, 0);
                if (outputBufferId >= 0) {
                    if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                        sawOutputEOS = true;
                    }
                    if (bufferInfo.size != 0) {
                        outputFrameCount++;
                        Image outputImage = decoder.getOutputImage(outputBufferId);
                        Log.d("mmm", "image format " + outputImage.getFormat());
                        if (outputImageFileType != -1) {
                            String fileName;
                            switch (outputImageFileType) {
                                case FILE_TypeI420:
                                    fileName = "/sdcard/aaapcm" + String.format("frame_%05d_I420_%dx%d.yuv", outputFrameCount, width, height);
                                    dumpFile(fileName, getDataFromImage(outputImage, COLOR_FormatI420));
                                    break;
                                case FILE_TypeNV21:
                                    fileName = "/sdcard/aaapcm" + String.format("frame_%05d_NV21_%dx%d.yuv", outputFrameCount, width, height);
                                    dumpFile(fileName, getDataFromImage(outputImage, COLOR_FormatNV21));
                                    break;
                                case FILE_TypeJPEG:
                                    fileName = "/sdcard/aaapcm/" + String.format("frame_%05d.jpg", outputFrameCount);
                                    compressToJpeg(fileName, outputImage);
                                    break;
                            }
                        }
                        outputImage.close();
                    }
                    // Release the buffer even for zero-size output, or the codec runs out of buffers.
                    decoder.releaseOutputBuffer(outputBufferId, false);
                }
            }
        }

        // Copies a flexible YUV_420_888 Image into a tightly packed I420 or
        // NV21 byte array, honoring each plane's row and pixel strides.
        private byte[] getDataFromImage(Image image, int colorFormat) {
            if (colorFormat != COLOR_FormatI420 && colorFormat != COLOR_FormatNV21) {
                throw new IllegalArgumentException("only support COLOR_FormatI420 and COLOR_FormatNV21");
            }
            if (!isImageFormatSupported(image)) {
                throw new RuntimeException("can't convert Image to byte array, format " + image.getFormat());
            }
            Rect crop = image.getCropRect();
            int format = image.getFormat();
            int width = crop.width();
            int height = crop.height();
            Image.Plane[] planes = image.getPlanes();
            byte[] data = new byte[width * height * ImageFormat.getBitsPerPixel(format) / 8];
            byte[] rowData = new byte[planes[0].getRowStride()];
            int channelOffset = 0;
            int outputStride = 1;
            for (int i = 0; i < planes.length; i++) {
                switch (i) {
                    case 0: // Y plane
                        channelOffset = 0;
                        outputStride = 1;
                        break;
                    case 1: // U plane
                        if (colorFormat == COLOR_FormatI420) {
                            channelOffset = width * height;
                            outputStride = 1;
                        } else if (colorFormat == COLOR_FormatNV21) {
                            channelOffset = width * height + 1;
                            outputStride = 2;
                        }
                        break;
                    case 2: // V plane
                        if (colorFormat == COLOR_FormatI420) {
                            channelOffset = (int) (width * height * 1.25);
                            outputStride = 1;
                        } else if (colorFormat == COLOR_FormatNV21) {
                            channelOffset = width * height;
                            outputStride = 2;
                        }
                        break;
                }
                ByteBuffer buffer = planes[i].getBuffer();
                int rowStride = planes[i].getRowStride();
                int pixelStride = planes[i].getPixelStride();
                int shift = (i == 0) ? 0 : 1; // chroma planes are subsampled by 2
                int w = width >> shift;
                int h = height >> shift;
                buffer.position(rowStride * (crop.top >> shift) + pixelStride * (crop.left >> shift));
                for (int row = 0; row < h; row++) {
                    int length;
                    if (pixelStride == 1 && outputStride == 1) {
                        // Fast path: the source row is already contiguous.
                        length = w;
                        buffer.get(data, channelOffset, length);
                        channelOffset += length;
                    } else {
                        length = (w - 1) * pixelStride + 1;
                        buffer.get(rowData, 0, length);
                        for (int col = 0; col < w; col++) {
                            data[channelOffset] = rowData[col * pixelStride];
                            channelOffset += outputStride;
                        }
                    }
                    if (row < h - 1) {
                        buffer.position(buffer.position() + rowStride - length);
                    }
                }
            }
            return data;
        }

        private static boolean isImageFormatSupported(Image image) {
            switch (image.getFormat()) {
                case ImageFormat.YUV_420_888:
                case ImageFormat.NV21:
                case ImageFormat.YV12:
                    return true;
            }
            return false;
        }

        private void dumpFile(String fileName, byte[] data) {
            FileOutputStream outStream;
            try {
                outStream = new FileOutputStream(fileName);
            } catch (IOException ioe) {
                throw new RuntimeException("Unable to create output file " + fileName, ioe);
            }
            try {
                outStream.write(data);
                outStream.close();
            } catch (IOException ioe) {
                throw new RuntimeException("failed writing data to file " + fileName, ioe);
            }
        }

        private void compressToJpeg(String fileName, Image image) {
            FileOutputStream outStream;
            try {
                outStream = new FileOutputStream(fileName);
            } catch (IOException ioe) {
                throw new RuntimeException("Unable to create output file " + fileName, ioe);
            }
            Rect rect = image.getCropRect();
            YuvImage yuvImage = new YuvImage(getDataFromImage(image, COLOR_FormatNV21), ImageFormat.NV21, rect.width(), rect.height(), null);
            yuvImage.compressToJpeg(rect, 100, outStream);
            try {
                outStream.close();
            } catch (IOException ioe) {
                throw new RuntimeException("failed closing output file " + fileName, ioe);
            }
            Log.d("mmm", "wrote image " + fileName);
        }
    }
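The `channelOffset`/`outputStride` arithmetic in `getDataFromImage` encodes the two target byte layouts: I420 stores the planes back to back (Y at 0, U at `width*height`, V at `width*height*1.25`), while NV21 stores Y followed by interleaved VU pairs, so V samples start at `width*height` and U samples at `width*height + 1`, both written with stride 2. A small plain-Java check of those offsets (the helper names are made up for illustration):

```java
// Plain-Java check of the plane offsets used by getDataFromImage above,
// for a tightly packed 4:2:0 frame of w x h pixels (12 bits per pixel).
public class YuvLayout {
    static int i420UOffset(int w, int h) { return w * h; }                 // U plane after Y
    static int i420VOffset(int w, int h) { return w * h + (w * h) / 4; }   // == w*h*1.25
    static int nv21VOffset(int w, int h) { return w * h; }                 // V leads each VU pair
    static int nv21UOffset(int w, int h) { return w * h + 1; }             // U interleaved, stride 2
    static int frameSize(int w, int h)   { return w * h * 3 / 2; }         // total packed size
}
```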

Audio/video synchronization in the player

With the code above, the MP4's audio and video can already be played back separately.
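A common way to keep the two streams in step is to treat the audio clock as the master and schedule each video frame from its presentation timestamp instead of the fixed `sleep(30)` used earlier: wait for early frames, render slightly late frames immediately, and drop hopelessly late ones. The helper below is a hypothetical plain-Java sketch of that decision (the 100 ms drop threshold is an assumption, not a value from this article):

```java
// Minimal A/V sync sketch: given the next video frame's PTS and the current
// audio playback position (both in microseconds), decide what to do.
public class AvSync {
    static final long DROP_THRESHOLD_US = 100_000; // frames >100 ms late get dropped

    // Returns milliseconds to sleep before rendering, 0 to render immediately,
    // or -1 to drop the frame because it is too far behind the audio clock.
    static long videoDelayMs(long videoPtsUs, long audioClockUs) {
        long diffUs = videoPtsUs - audioClockUs;
        if (diffUs <= -DROP_THRESHOLD_US) return -1; // far behind: drop
        if (diffUs <= 0) return 0;                   // slightly behind: render now
        return diffUs / 1000;                        // ahead: wait it out
    }
}
```

The video thread would call this once per decoded frame before `releaseOutputBuffer`, sleeping or dropping as instructed.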


  • Original article: https://blog.csdn.net/m0_60259116/article/details/126560206