Android Advanced: Video Compression


Video compression is an indispensable part of any video-related project, and picking a suitable, stable compression tool is one of the more painful choices a developer has to make. Compression tools are everywhere online, but once you commit to one and problems surface later, the cost of dealing with them is daunting. This article may save you a few of those detours.
First of all, the compression here is done with VideoProcessor (if that is not the library you are after, feel free to skip this article). It was written only after the approach had been running stably for a long time in a production audio/video project; the compression ratio stays at roughly 7:3.

What follows is the hands-on usage, along with the problems encountered along the way.

Importing the dependency

com.github.yellowcath:VideoProcessor:2.4.2
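A minimal Gradle sketch of pulling the library in, assuming it is distributed through JitPack (an inference from the com.github.* coordinate, not stated in the original article); depending on your Gradle version the repository block may belong in the root build script instead of settings.gradle.kts:

// settings.gradle.kts
dependencyResolutionManagement {
    repositories {
        google()
        mavenCentral()
        maven { url = uri("https://jitpack.io") }
    }
}

// app module build.gradle.kts
dependencies {
    implementation("com.github.yellowcath:VideoProcessor:2.4.2")
}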

How to call it

VideoProcessor.processor(mPresenter)
              .input(url)
              .outWidth(1600)
              .outHeight(1200)
              .output(outputUrl)
              .bitrate(mBitrate)
              .frameRate(10)
              .process()

Parameter overview

.processor(mPresenter) - mPresenter is the current context/reference
.input(url) - url is the local video path
.outWidth(1600) - width after compression
.outHeight(1200) - height after compression
.output(outputUrl) - path of the compressed output file
.bitrate(mBitrate) - bitrate
.frameRate(10) - frame rate

The bitrate has a direct impact on how the compressed video turns out; you can set the bitrate and frame rate dynamically to tune the result, as shown below.

//Default is 3,000,000; when metadata is available, use 40% of the source bitrate
var mBitrate = 3000000
try {
    //Read the source video's bitrate
    val media = MediaMetadataRetriever()
    media.setDataSource(locationVideoUrl)
    val extractMetadata =
        media.extractMetadata(MediaMetadataRetriever.METADATA_KEY_BITRATE)
    Log.e(ContentValues.TAG, "Source video bitrate ->:${extractMetadata}")
    if (extractMetadata != null && extractMetadata.isNotEmpty()) {
        mBitrate = (extractMetadata.toInt() * 0.4).toInt()
    }
} catch (e: Exception) {
    e.printStackTrace()
}
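The frame rate can be chosen in a similar way. A rough sketch of my own (not from the original article, which simply passes 10): on API 28 and above, estimate the source fps from the frame count and duration, then keep about half of it. It reuses locationVideoUrl from the snippet above and relies on android.os.Build and MediaMetadataRetriever:

var mFrameRate = 10
try {
    val media = MediaMetadataRetriever()
    media.setDataSource(locationVideoUrl)
    //METADATA_KEY_VIDEO_FRAME_COUNT is only available on API 28+
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.P) {
        val frameCount = media
            .extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_FRAME_COUNT)?.toIntOrNull()
        val durationMs = media
            .extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION)?.toLongOrNull()
        if (frameCount != null && durationMs != null && durationMs > 0) {
            //Estimate the source fps, then keep roughly half of it, but never go below 10
            mFrameRate = ((frameCount * 1000L / durationMs) / 2).toInt().coerceAtLeast(10)
        }
    }
} catch (e: Exception) {
    e.printStackTrace()
}

mFrameRate can then be passed to .frameRate() instead of the hard-coded 10.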

That is the whole procedure for compressing a video.
Now for the problems that came up.

The first obvious flaw in the code above is that the output width and height are hard-coded, so any video whose shape differs from 1600x1200 will come out distorted. Instead, we should base the output size on the original video's dimensions.

Normally, picking a local resource goes through the onActivityResult callback, which hands us the Intent data.
The selection can then be extracted with:

public static List<LocalMedia> obtainMultipleResult(Intent data) {
    List<LocalMedia> result = new ArrayList<>();
    if (data != null) {
        result = (List<LocalMedia>) data.getSerializableExtra(PictureConfig.EXTRA_RESULT_SELECTION);
        if (result == null) {
            result = new ArrayList<>();
        }
        return result;
    }
    return result;
}

LocalMedia

import android.os.Parcel;
import android.os.Parcelable;
import android.text.TextUtils;

public class LocalMedia implements Parcelable {
    private String path;
    private String compressPath;
    private String cutPath;
    private long duration;
    private boolean isChecked;
    private boolean isCut;
    public int position;
    private int num;
    private int mimeType;
    private String pictureType;
    private boolean compressed;
    private int width;
    private int height;
    public String ossUrl; //records the remote URL after a successful upload
    public boolean isFail; //added business field: whether the item was flagged as violating the rules

    public LocalMedia() {
    }

    public LocalMedia(String path, long duration, int mimeType, String pictureType) {
        this.path = path;
        this.duration = duration;
        this.mimeType = mimeType;
        this.pictureType = pictureType;
    }

    public LocalMedia(String path, long duration, int mimeType, String pictureType, int width, int height) {
        this.path = path;
        this.duration = duration;
        this.mimeType = mimeType;
        this.pictureType = pictureType;
        this.width = width;
        this.height = height;
    }

    public LocalMedia(String path, long duration,
                      boolean isChecked, int position, int num, int mimeType) {
        this.path = path;
        this.duration = duration;
        this.isChecked = isChecked;
        this.position = position;
        this.num = num;
        this.mimeType = mimeType;
    }

    public String getPictureType() {
        if (TextUtils.isEmpty(pictureType)) {
            pictureType = "image/jpeg";
        }
        return pictureType;
    }

    public void setPictureType(String pictureType) {
        this.pictureType = pictureType;
    }

    public String getPath() {
        return path;
    }

    public void setPath(String path) {
        this.path = path;
    }

    public String getCompressPath() {
        return compressPath;
    }

    public void setCompressPath(String compressPath) {
        this.compressPath = compressPath;
    }

    public String getCutPath() {
        return cutPath;
    }

    public void setCutPath(String cutPath) {
        this.cutPath = cutPath;
    }

    public long getDuration() {
        return duration;
    }

    public void setDuration(long duration) {
        this.duration = duration;
    }

    public boolean isChecked() {
        return isChecked;
    }

    public void setChecked(boolean checked) {
        isChecked = checked;
    }

    public boolean isCut() {
        return isCut;
    }

    public void setCut(boolean cut) {
        isCut = cut;
    }

    public int getPosition() {
        return position;
    }

    public void setPosition(int position) {
        this.position = position;
    }

    public int getNum() {
        return num;
    }

    public void setNum(int num) {
        this.num = num;
    }

    public int getMimeType() {
        return mimeType;
    }

    public void setMimeType(int mimeType) {
        this.mimeType = mimeType;
    }

    public boolean isCompressed() {
        return compressed;
    }

    public void setCompressed(boolean compressed) {
        this.compressed = compressed;
    }

    public int getWidth() {
        return width;
    }

    public void setWidth(int width) {
        this.width = width;
    }

    public int getHeight() {
        return height;
    }

    public void setHeight(int height) {
        this.height = height;
    }

    @Override
    public int describeContents() {
        return 0;
    }

    @Override
    public void writeToParcel(Parcel dest, int flags) {
        dest.writeString(this.path);
        dest.writeString(this.compressPath);
        dest.writeString(this.cutPath);
        dest.writeLong(this.duration);
        dest.writeByte(this.isChecked ? (byte) 1 : (byte) 0);
        dest.writeByte(this.isCut ? (byte) 1 : (byte) 0);
        dest.writeInt(this.position);
        dest.writeInt(this.num);
        dest.writeInt(this.mimeType);
        dest.writeString(this.pictureType);
        dest.writeByte(this.compressed ? (byte) 1 : (byte) 0);
        dest.writeInt(this.width);
        dest.writeInt(this.height);
    }

    protected LocalMedia(Parcel in) {
        this.path = in.readString();
        this.compressPath = in.readString();
        this.cutPath = in.readString();
        this.duration = in.readLong();
        this.isChecked = in.readByte() != 0;
        this.isCut = in.readByte() != 0;
        this.position = in.readInt();
        this.num = in.readInt();
        this.mimeType = in.readInt();
        this.pictureType = in.readString();
        this.compressed = in.readByte() != 0;
        this.width = in.readInt();
        this.height = in.readInt();
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        LocalMedia that = (LocalMedia) o;
        return path != null && path.equals(that.path);
    }

    @Override
    public int hashCode() {
        if (path != null) {
            return path.hashCode();
        } else {
            return 0;
        }
    }

    public static final Parcelable.Creator<LocalMedia> CREATOR = new Parcelable.Creator<LocalMedia>() {
        @Override
        public LocalMedia createFromParcel(Parcel source) {
            return new LocalMedia(source);
        }

        @Override
        public LocalMedia[] newArray(int size) {
            return new LocalMedia[size];
        }
    };
}

Convert the received data into a List<LocalMedia> and, by default, take the first element with result.get(0).
At this point we have a LocalMedia object, which carries the width, height, duration, local path and the other information we need.
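For context, a minimal sketch of how this might be wired up in the picking Activity. The request code constant is hypothetical, and obtainMultipleResult refers to the static helper shown above (qualify it with its enclosing class, e.g. a PictureSelector-style utility, depending on where it lives in your project):

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    //REQUEST_CODE_PICK_VIDEO is a hypothetical request code used when launching the picker
    if (resultCode == Activity.RESULT_OK && requestCode == REQUEST_CODE_PICK_VIDEO && data != null) {
        val selection: List<LocalMedia> = obtainMultipleResult(data)
        if (selection.isNotEmpty()) {
            //Take the first selected item by default
            val videoMedia = selection[0]
            //videoMedia.path, videoMedia.width, videoMedia.height and videoMedia.duration
            //are now available to feed into the compression call
        }
    }
}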
Now we can rewrite the earlier call as:

//width and height are primitive ints on LocalMedia, so only the zero check is needed
var videoWidth = 1600
var videoHeight = 1200
if (videoMedia.width != 0) {
    videoWidth = videoMedia.width
}
if (videoMedia.height != 0) {
    videoHeight = videoMedia.height
}
VideoProcessor.processor(mPresenter)
              .input(url)
              .outWidth(videoWidth)
              .outHeight(videoHeight)
              .output(outputUrl)
              .bitrate(mBitrate)
              .frameRate(10)
              .process()

With that, the distortion problem after compression is solved. If you also want to reduce the output resolution, see the sketch below.
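Passing the source dimensions back in avoids distortion but keeps the original resolution. If you also want to shrink the resolution, one option (my own suggestion, not part of the original workflow) is to cap the longer side and scale both dimensions by the same factor; VideoProcessor itself pads odd sizes up to even values, but we can keep them even here as well:

//Hypothetical helper: scale (width, height) down so the longer side is at most maxSide,
//preserving the aspect ratio and rounding down to even values
fun scaleToMaxSide(width: Int, height: Int, maxSide: Int = 1280): Pair<Int, Int> {
    val longer = maxOf(width, height)
    if (longer <= maxSide) return width to height
    val factor = maxSide.toFloat() / longer
    val w = ((width * factor).toInt() / 2) * 2
    val h = ((height * factor).toInt() / 2) * 2
    return w to h
}

//Usage with the LocalMedia obtained earlier, replacing the width/height selection above
val (videoWidth, videoHeight) = scaleToMaxSide(videoMedia.width, videoMedia.height)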

Compression leaves a new file at the output path. If it is not cleaned up in time, an extra compressed file sits on the user's phone, which hurts the user experience, so delete the compressed file as soon as you are done with it (the original author does not paste the code for this; a minimal sketch follows the path example below).
When deleting, some device models throw an error, and this is where you need to pay attention: Android no longer allows arbitrary operations on files under the /storage/emulated/0 root, i.e. the system's default shared directory, so do not place the compressed output path under that directory.
You can do it like this instead:

//Compress the video; don't forget to delete the compressed file once you're done with it
val FILENAME_FORMAT = "yyyy-MM-dd-HH-mm-ss-SSS"
var outputUrl = mPresenter.getExternalFilesDir("").toString() + "/Kome" +
    SimpleDateFormat(FILENAME_FORMAT, Locale.CHINA).format(System.currentTimeMillis()) + ".mp4"
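And the cleanup sketch promised above, assuming outputUrl is the path built here and the compressed copy has already been uploaded or otherwise consumed. Because the file lives in the app-specific external files directory, no storage permission is involved; it uses java.io.File and the same ContentValues.TAG as earlier:

val compressedFile = File(outputUrl)
//delete() returns false instead of throwing when the file cannot be removed
if (compressedFile.exists() && !compressedFile.delete()) {
    Log.w(ContentValues.TAG, "Failed to delete compressed file: $outputUrl")
}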

With that, the compression flow is basically complete and, barring surprises, ready to ship. Once it is released, however, you may find that some device models throw a NullPointerException. Tracing it down, the error occurs in the library source, inside the processVideo() method.

The latest version may already have fixed this. If not, you can copy and override the VideoProcessor class yourself and add some guards, keeping the defaults equal to the original values and changing as little else as possible, for example:

int originWidth = 1600;
if (retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH) != null) {
    originWidth = Integer.parseInt(retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH));
}

Finally, here is the VideoProcessor class with the guards added:

import android.annotation.SuppressLint;import android.annotation.TargetApi;import android.content.Context;import android.media.AudioFormat;import android.media.MediaCodec;import android.media.MediaCodecInfo;import android.media.MediaExtractor;import android.media.MediaFormat;import android.media.MediaMetadataRetriever;import android.media.MediaMuxer;import android.media.MediaRecorder;import android.net.Uri;import android.util.Pair;import org.jetbrains.annotations.NotNull;import org.jetbrains.annotations.Nullable;import java.io.File;import java.io.IOException;import java.nio.ByteBuffer;import java.util.ArrayList;import java.util.List;import java.util.concurrent.CountDownLatch;import java.util.concurrent.atomic.AtomicBoolean;import static com.hw.videoprocessor.util.AudioUtil.getAudioBitrate;import com.hw.videoprocessor.AudioProcessThread;import com.hw.videoprocessor.VideoDecodeThread;import com.hw.videoprocessor.VideoEncodeThread;import com.hw.videoprocessor.VideoUtil;import com.hw.videoprocessor.util.AudioFadeUtil;import com.hw.videoprocessor.util.AudioUtil;import com.hw.videoprocessor.util.CL;import com.hw.videoprocessor.util.PcmToWavUtil;import com.hw.videoprocessor.util.VideoMultiStepProgress;import com.hw.videoprocessor.util.VideoProgressAve;import com.hw.videoprocessor.util.VideoProgressListener;@TargetApi(21)public class VideoProcessor {    final static String TAG = "VideoProcessor";    final static String OUTPUT_MIME_TYPE = "video/avc";    public static int DEFAULT_FRAME_RATE = 20;        public final static int DEFAULT_I_FRAME_INTERVAL = 1;    public final static int DEFAULT_AAC_BITRATE = 192 * 1000;        public static boolean AUDIO_MIX_REPEAT = true;    final static int TIMEOUT_USEC = 2500;    public static void scaleVideo(Context context, Uri input, String output,      int outWidth, int outHeight) throws Exception {        processor(context)                .input(input)                .output(output)                .outWidth(outWidth)                .outHeight(outHeight)                .process();    }    public static void cutVideo(Context context, Uri input, String output, int startTimeMs, int endTimeMs) throws Exception {        processor(context)                .input(input)                .output(output)                .startTimeMs(startTimeMs)                .endTimeMs(endTimeMs)                .process();    }    public static void changeVideoSpeed(Context context, Uri input, String output, float speed) throws Exception {        processor(context)                .input(input)                .output(output)                .speed(speed)                .process();    }        public static void reverseVideo(Context context, com.hw.videoprocessor.VideoProcessor.MediaSource input, String output, boolean reverseAudio, @Nullable VideoProgressListener listener) throws Exception {        File tempFile = new File(context.getCacheDir(), System.currentTimeMillis() + ".temp");        File temp2File = new File(context.getCacheDir(), System.currentTimeMillis() + ".temp2");        try {            MediaExtractor extractor = new MediaExtractor();            input.setDataSource(extractor);            int trackIndex = VideoUtil.selectTrack(extractor, false);            extractor.selectTrack(trackIndex);            int keyFrameCount = 0;            int frameCount = 0;            List frameTimeStamps = new ArrayList<>();            while (true) {                int flags = extractor.getSampleFlags();                if (flags > 0 && (flags & MediaExtractor.SAMPLE_FLAG_SYNC) != 0) {             
       keyFrameCount++;                }                long sampleTime = extractor.getSampleTime();                if (sampleTime < 0) {                    break;                }                frameTimeStamps.add(sampleTime);                frameCount++;                extractor.advance();            }            extractor.release();            if (frameCount == keyFrameCount || frameCount == keyFrameCount + 1) {                reverseVideoNoDecode(input, output, reverseAudio, frameTimeStamps, listener);            } else {                VideoMultiStepProgress stepProgress = new VideoMultiStepProgress(new float[]{0.45f, 0.1f, 0.45f}, listener);                stepProgress.setCurrentStep(0);                float bitrateMultiple = (frameCount - keyFrameCount) / (float) keyFrameCount + 1;                MediaMetadataRetriever retriever = new MediaMetadataRetriever();                input.setDataSource(retriever);                int oriBitrate = Integer.parseInt(retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_BITRATE));                int duration = Integer.parseInt(retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION));                try {                    processor(context).input(input).output(tempFile.getAbsolutePath()).bitrate((int) (oriBitrate * bitrateMultiple)).iFrameInterval(0).progressListener(stepProgress).process();                } catch (MediaCodec.CodecException e) {                    CL.e(e);                                        processor(context).input(input).output(tempFile.getAbsolutePath()).bitrate((int) (oriBitrate * bitrateMultiple)).iFrameInterval(-1).progressListener(stepProgress).process();                }                stepProgress.setCurrentStep(1);                reverseVideoNoDecode(new com.hw.videoprocessor.VideoProcessor.MediaSource(tempFile.getAbsolutePath()), temp2File.getAbsolutePath(), reverseAudio, null, stepProgress);                int oriIFrameInterval = (int) (keyFrameCount / (duration / 1000f));                oriIFrameInterval = oriIFrameInterval == 0 ? 
1 : oriIFrameInterval;                stepProgress.setCurrentStep(2);                processor(context)                        .input(temp2File.getAbsolutePath())                        .output(output)                        .bitrate(oriBitrate)                        .iFrameInterval(oriIFrameInterval)                        .progressListener(stepProgress)                        .process();            }        } finally {            tempFile.delete();            temp2File.delete();        }    }        public static void processVideo(@NotNull Context context, @NotNull Processor processor) throws Exception {        MediaMetadataRetriever retriever = new MediaMetadataRetriever();        processor.input.setDataSource(retriever);        int originWidth = 1600;        if (retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH) != null){            originWidth = Integer.parseInt(retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH));        }        int originHeight = 1200;        if (retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_HEIGHT) != null){            originHeight = Integer.parseInt(retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_HEIGHT));        }        int rotationValue = 270;        if (retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_ROTATION) != null){            rotationValue = Integer.parseInt(retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_ROTATION)); //270        }        int oriBitrate = 8338866;        if (retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_BITRATE) != null){            oriBitrate = Integer.parseInt(retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_BITRATE)); //8338866        }        int durationMs = 15111;        if (retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION) != null){            durationMs = Integer.parseInt(retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION)); //15111        }        retriever.release();        if (processor.bitrate == null) {            processor.bitrate = oriBitrate;        }        if (processor.iFrameInterval == null) {            processor.iFrameInterval = DEFAULT_I_FRAME_INTERVAL;        }        int resultWidth = processor.outWidth == null ? originWidth : processor.outWidth;        int resultHeight = processor.outHeight == null ? originHeight : processor.outHeight;        resultWidth = resultWidth % 2 == 0 ? resultWidth : resultWidth + 1;        resultHeight = resultHeight % 2 == 0 ? resultHeight : resultHeight + 1;        if (rotationValue == 90 || rotationValue == 270) {            int temp = resultHeight;            resultHeight = resultWidth;            resultWidth = temp;        }        MediaExtractor extractor = new MediaExtractor();        processor.input.setDataSource(extractor);        int videoIndex = VideoUtil.selectTrack(extractor, false);        int audioIndex = VideoUtil.selectTrack(extractor, true);        MediaMuxer mediaMuxer = new MediaMuxer(processor.output, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);        int muxerAudioTrackIndex = 0;        boolean shouldChangeAudioSpeed = processor.changeAudioSpeed == null ? 
true : processor.changeAudioSpeed;        Integer audioEndTimeMs = processor.endTimeMs;        if (audioIndex >= 0) {            MediaFormat audioTrackFormat = extractor.getTrackFormat(audioIndex);            String audioMimeType = MediaFormat.MIMETYPE_AUDIO_AAC;            int bitrate = getAudioBitrate(audioTrackFormat);            int channelCount = audioTrackFormat.getInteger(MediaFormat.KEY_CHANNEL_COUNT);            int sampleRate = audioTrackFormat.getInteger(MediaFormat.KEY_SAMPLE_RATE);            int maxBufferSize = AudioUtil.getAudioMaxBufferSize(audioTrackFormat);            MediaFormat audioEncodeFormat = MediaFormat.createAudioFormat(audioMimeType, sampleRate, channelCount);//参数对应-> mime type、采样率、声道数            audioEncodeFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);//比特率            audioEncodeFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);            audioEncodeFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, maxBufferSize);            if (shouldChangeAudioSpeed) {                if (processor.startTimeMs != null || processor.endTimeMs != null || processor.speed != null) {                    long durationUs = audioTrackFormat.getLong(MediaFormat.KEY_DURATION);                    if (processor.startTimeMs != null && processor.endTimeMs != null) {                        durationUs = (processor.endTimeMs - processor.startTimeMs) * 1000;                    }                    if (processor.speed != null) {                        durationUs /= processor.speed;                    }                    audioEncodeFormat.setLong(MediaFormat.KEY_DURATION, durationUs);                }            } else {                long videoDurationUs = durationMs * 1000;                long audioDurationUs = audioTrackFormat.getLong(MediaFormat.KEY_DURATION);                if (processor.startTimeMs != null || processor.endTimeMs != null || processor.speed != null) {                    if (processor.startTimeMs != null && processor.endTimeMs != null) {                        videoDurationUs = (processor.endTimeMs - processor.startTimeMs) * 1000;                    }                    if (processor.speed != null) {                        videoDurationUs /= processor.speed;                    }                    long avDurationUs = videoDurationUs < audioDurationUs ? videoDurationUs : audioDurationUs;                    audioEncodeFormat.setLong(MediaFormat.KEY_DURATION, avDurationUs);                    audioEndTimeMs = (processor.startTimeMs == null ? 0 : processor.startTimeMs) + (int) (avDurationUs / 1000);                }            }            AudioUtil.checkCsd(audioEncodeFormat,                    MediaCodecInfo.CodecProfileLevel.AACObjectLC,                    sampleRate,                    channelCount            );            //提前推断出音頻格式加到MeidaMuxer,不然实际上应该到音频预处理完才能addTrack,会卡住视频编码的进度            muxerAudioTrackIndex = mediaMuxer.addTrack(audioEncodeFormat);        }        extractor.selectTrack(videoIndex);        if (processor.startTimeMs != null) {            extractor.seekTo(processor.startTimeMs * 1000, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);        } else {            extractor.seekTo(0, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);        }        VideoProgressAve progressAve = new VideoProgressAve(processor.listener);        progressAve.setSpeed(processor.speed);        progressAve.setStartTimeMs(processor.startTimeMs == null ? 0 : processor.startTimeMs);        progressAve.setEndTimeMs(processor.endTimeMs == null ? 
durationMs : processor.endTimeMs);        AtomicBoolean decodeDone = new AtomicBoolean(false);        CountDownLatch muxerStartLatch = new CountDownLatch(1);        VideoEncodeThread encodeThread = new VideoEncodeThread(extractor, mediaMuxer,processor.bitrate,                resultWidth, resultHeight, processor.iFrameInterval, processor.frameRate == null ? DEFAULT_FRAME_RATE : processor.frameRate, videoIndex,                decodeDone, muxerStartLatch);        int srcFrameRate = VideoUtil.getFrameRate(processor.input);        if (srcFrameRate <= 0) {            srcFrameRate = (int) Math.ceil(VideoUtil.getAveFrameRate(processor.input));        }        VideoDecodeThread decodeThread = new VideoDecodeThread(encodeThread, extractor, processor.startTimeMs, processor.endTimeMs, srcFrameRate,                processor.frameRate == null ? DEFAULT_FRAME_RATE : processor.frameRate, processor.speed, processor.dropFrames, videoIndex, decodeDone);        AudioProcessThread audioProcessThread = new AudioProcessThread(context, processor.input, mediaMuxer, processor.startTimeMs, audioEndTimeMs,                shouldChangeAudioSpeed ? processor.speed : null, muxerAudioTrackIndex, muxerStartLatch);        encodeThread.setProgressAve(progressAve);        audioProcessThread.setProgressAve(progressAve);        decodeThread.start();        encodeThread.start();        audioProcessThread.start();        try {            long s = System.currentTimeMillis();            decodeThread.join();            encodeThread.join();            long e1 = System.currentTimeMillis();            audioProcessThread.join();            long e2 = System.currentTimeMillis();            CL.w(String.format("编解码:%dms,音频:%dms", e1 - s, e2 - s));        } catch (InterruptedException e) {            e.printStackTrace();        }        try {            mediaMuxer.release();            extractor.release();        } catch (Exception e2) {            CL.e(e2);        }        if (encodeThread.getException() != null) {            throw encodeThread.getException();        } else if (decodeThread.getException() != null) {            throw decodeThread.getException();        } else if (audioProcessThread.getException() != null) {            throw audioProcessThread.getException();        }    }    public static void reverseVideoNoDecode(com.hw.videoprocessor.VideoProcessor.MediaSource input, String output, boolean reverseAudio) throws IOException {        reverseVideoNoDecode(input, output, reverseAudio, null, null);    }        @SuppressLint("WrongConstant")    public static void reverseVideoNoDecode(com.hw.videoprocessor.VideoProcessor.MediaSource input, String output, boolean reverseAudio, List videoFrameTimeStamps, @Nullable VideoProgressListener listener) throws IOException {        MediaMetadataRetriever retriever = new MediaMetadataRetriever();        input.setDataSource(retriever);        int durationMs = Integer.parseInt(retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION));        retriever.release();        MediaExtractor extractor = new MediaExtractor();        input.setDataSource(extractor);        int videoTrackIndex = VideoUtil.selectTrack(extractor, false);        int audioTrackIndex = VideoUtil.selectTrack(extractor, true);        boolean audioExist = audioTrackIndex >= 0;        final int MIN_FRAME_INTERVAL = 10 * 1000;        MediaMuxer mediaMuxer = new MediaMuxer(output, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
extractor.selectTrack(videoTrackIndex);        MediaFormat videoTrackFormat = extractor.getTrackFormat(videoTrackIndex);        long videoDurationUs = videoTrackFormat.getLong(MediaFormat.KEY_DURATION);        long audioDurationUs = 0;        int videoMuxerTrackIndex = mediaMuxer.addTrack(videoTrackFormat);        int audioMuxerTrackIndex = 0;        if (audioExist) {            MediaFormat audioTrackFormat = extractor.getTrackFormat(audioTrackIndex);            audioMuxerTrackIndex = mediaMuxer.addTrack(audioTrackFormat);            audioDurationUs = audioTrackFormat.getLong(MediaFormat.KEY_DURATION);        }        mediaMuxer.start();        int maxBufferSize = videoTrackFormat.getInteger(MediaFormat.KEY_MAX_INPUT_SIZE);        ByteBuffer buffer = ByteBuffer.allocateDirect(maxBufferSize);        VideoUtil.seekToLastFrame(extractor, videoTrackIndex, durationMs);        long lastFrameTimeUs = -1;        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();        try {            //写视频帧            if (videoFrameTimeStamps != null && videoFrameTimeStamps.size() > 0) {                for (int i = videoFrameTimeStamps.size() - 1; i >= 0; i--) {                    extractor.seekTo(videoFrameTimeStamps.get(i), MediaExtractor.SEEK_TO_CLOSEST_SYNC);                    long sampleTime = extractor.getSampleTime();                    if (lastFrameTimeUs == -1) {                        lastFrameTimeUs = sampleTime;                    }                    info.presentationTimeUs = lastFrameTimeUs - sampleTime;                    info.size = extractor.readSampleData(buffer, 0);                    info.flags = extractor.getSampleFlags();                    if (info.size < 0) {                        break;                    }                    mediaMuxer.writeSampleData(videoMuxerTrackIndex, buffer, info);                    if (listener != null) {                        float videoProgress = info.presentationTimeUs / (float) videoDurationUs;                        videoProgress = videoProgress > 1 ? 1 : videoProgress;                        videoProgress *= 0.7f;                        listener.onProgress(videoProgress);                    }                }            } else {                while (true) {                    long sampleTime = extractor.getSampleTime();                    if (lastFrameTimeUs == -1) {                        lastFrameTimeUs = sampleTime;                    }                    info.presentationTimeUs = lastFrameTimeUs - sampleTime;                    info.size = extractor.readSampleData(buffer, 0);                    info.flags = extractor.getSampleFlags();                    if (info.size < 0) {                        break;                    }                    mediaMuxer.writeSampleData(videoMuxerTrackIndex, buffer, info);                    if (listener != null) {                        float videoProgress = info.presentationTimeUs / (float) videoDurationUs;                        videoProgress = videoProgress > 1 ? 
1 : videoProgress;                        videoProgress *= 0.7f;                        listener.onProgress(videoProgress);                    }                    long seekTime = sampleTime - MIN_FRAME_INTERVAL;                    if (seekTime <= 0) {                        break;                    }                    extractor.seekTo(seekTime, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);                }            }            //写音频帧            if (audioExist) {                extractor.unselectTrack(videoTrackIndex);                extractor.selectTrack(audioTrackIndex);                if (reverseAudio) {                    List audioFrameStamps = getFrameTimeStampsList(extractor);                    lastFrameTimeUs = -1;                    for (int i = audioFrameStamps.size() - 1; i >= 0; i--) {                        extractor.seekTo(audioFrameStamps.get(i), MediaExtractor.SEEK_TO_CLOSEST_SYNC);                        long sampleTime = extractor.getSampleTime();                        if (lastFrameTimeUs == -1) {lastFrameTimeUs = sampleTime;                        }                        info.presentationTimeUs = lastFrameTimeUs - sampleTime;                        info.size = extractor.readSampleData(buffer, 0);                        info.flags = extractor.getSampleFlags();                        if (info.size < 0) {break;                        }                        mediaMuxer.writeSampleData(audioMuxerTrackIndex, buffer, info);                        if (listener != null) {float audioProgress = info.presentationTimeUs / (float) audioDurationUs;audioProgress = audioProgress > 1 ? 1 : audioProgress;audioProgress = 0.7f + audioProgress * 0.3f;listener.onProgress(audioProgress);                        }                    }                } else {                    extractor.seekTo(0, MediaExtractor.SEEK_TO_CLOSEST_SYNC);                    while (true) {                        long sampleTime = extractor.getSampleTime();                        if (sampleTime == -1) {break;                        }                        info.presentationTimeUs = sampleTime;                        info.size = extractor.readSampleData(buffer, 0);                        info.flags = extractor.getSampleFlags();                        if (info.size < 0) {break;                        }                        mediaMuxer.writeSampleData(audioMuxerTrackIndex, buffer, info);                        if (listener != null) {float audioProgress = info.presentationTimeUs / (float) audioDurationUs;audioProgress = audioProgress > 1 ? 
1 : audioProgress;audioProgress = 0.7f + audioProgress * 0.3f;listener.onProgress(audioProgress);                        }                        extractor.advance();                    }                }            }            if (listener != null) {                listener.onProgress(1f);            }        } catch (Exception e) {            CL.e(e);        } finally {            extractor.release();            mediaMuxer.release();        }    }    static List getFrameTimeStampsList(MediaExtractor extractor){        List frameTimeStamps = new ArrayList<>();        while (true) {            long sampleTime = extractor.getSampleTime();            if (sampleTime < 0) {                break;            }            frameTimeStamps.add(sampleTime);            extractor.advance();        }        return frameTimeStamps;    }        @SuppressLint("WrongConstant")    public static void adjustVideoVolume(Context context, final com.hw.videoprocessor.VideoProcessor.MediaSource mediaSource, final String output, int videoVolume, float faceInSec, float fadeOutSec) throws IOException {        if (videoVolume == 100 && faceInSec == 0f && fadeOutSec == 0f) {            AudioUtil.copyFile(String.valueOf(mediaSource), output);            return;        }        File cacheDir = new File(context.getCacheDir(), "pcm");        cacheDir.mkdir();        MediaExtractor oriExtrator = new MediaExtractor();        mediaSource.setDataSource(oriExtrator);        int oriAudioIndex = VideoUtil.selectTrack(oriExtrator, true);        if (oriAudioIndex < 0) {            CL.e("no audio stream!");            AudioUtil.copyFile(String.valueOf(mediaSource), output);            return;        }        long time = System.currentTimeMillis();        final File videoPcmFile = new File(cacheDir, "video_" + time + ".pcm");        final File videoPcmAdjustedFile = new File(cacheDir, "video_" + time + "_adjust.pcm");        final File videoWavFile = new File(cacheDir, "video_" + time + ".wav");        AudioUtil.decodeToPCM(mediaSource, videoPcmFile.getAbsolutePath(), null, null);        AudioUtil.adjustPcmVolume(videoPcmFile.getAbsolutePath(), videoPcmAdjustedFile.getAbsolutePath(), videoVolume);        MediaFormat audioTrackFormat = oriExtrator.getTrackFormat(oriAudioIndex);        final int sampleRate = audioTrackFormat.getInteger(MediaFormat.KEY_SAMPLE_RATE);        int channelCount = audioTrackFormat.containsKey(MediaFormat.KEY_CHANNEL_COUNT) ? audioTrackFormat.getInteger(MediaFormat.KEY_CHANNEL_COUNT) : 1;        int channelConfig = AudioFormat.CHANNEL_IN_MONO;        if (channelCount == 2) {            channelConfig = AudioFormat.CHANNEL_IN_STEREO;        }        if (faceInSec > 0 || fadeOutSec > 0) {            AudioFadeUtil.audioFade(videoPcmAdjustedFile.getAbsolutePath(), sampleRate, channelCount, faceInSec, fadeOutSec);        }        new PcmToWavUtil(sampleRate, channelConfig, channelCount, AudioFormat.ENCODING_PCM_16BIT).pcmToWav(videoPcmAdjustedFile.getAbsolutePath(), videoWavFile.getAbsolutePath());        final int TIMEOUT_US = 2500;        //重新将速率变化过后的pcm写入        int audioBitrate = getAudioBitrate(audioTrackFormat);        int oriVideoIndex = VideoUtil.selectTrack(oriExtrator, false);        MediaFormat oriVideoFormat = oriExtrator.getTrackFormat(oriVideoIndex);        int rotation = oriVideoFormat.containsKey(MediaFormat.KEY_ROTATION) ? 
oriVideoFormat.getInteger(MediaFormat.KEY_ROTATION) : 0;        MediaMuxer mediaMuxer = new MediaMuxer(output, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);        mediaMuxer.setOrientationHint(rotation);        int muxerVideoIndex = mediaMuxer.addTrack(oriVideoFormat);        int muxerAudioIndex = mediaMuxer.addTrack(audioTrackFormat);        //重新写入音频        mediaMuxer.start();        MediaExtractor pcmExtrator = new MediaExtractor();        pcmExtrator.setDataSource(videoWavFile.getAbsolutePath());        int audioTrack = VideoUtil.selectTrack(pcmExtrator, true);        pcmExtrator.selectTrack(audioTrack);        MediaFormat pcmTrackFormat = pcmExtrator.getTrackFormat(audioTrack);        int maxBufferSize = AudioUtil.getAudioMaxBufferSize(pcmTrackFormat);        ByteBuffer buffer = ByteBuffer.allocateDirect(maxBufferSize);        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();        MediaFormat encodeFormat = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, sampleRate, channelCount);//参数对应-> mime type、采样率、声道数        encodeFormat.setInteger(MediaFormat.KEY_BIT_RATE, audioBitrate);//比特率        encodeFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);        encodeFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, maxBufferSize);        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_AUDIO_AAC);        encoder.configure(encodeFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);        encoder.start();        boolean encodeInputDone = false;        boolean encodeDone = false;        long lastAudioFrameTimeUs = -1;        final int AAC_FRAME_TIME_US = 1024 * 1000 * 1000 / sampleRate;        boolean detectTimeError = false;        try {            while (!encodeDone) {                int inputBufferIndex = encoder.dequeueInputBuffer(TIMEOUT_US);                if (!encodeInputDone && inputBufferIndex >= 0) {                    long sampleTime = pcmExtrator.getSampleTime();                    if (sampleTime < 0) {                        encodeInputDone = true;                        encoder.queueInputBuffer(inputBufferIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);                    } else {                        int flags = pcmExtrator.getSampleFlags();                        buffer.clear();                        int size = pcmExtrator.readSampleData(buffer, 0);                        ByteBuffer inputBuffer = encoder.getInputBuffer(inputBufferIndex);                        inputBuffer.clear();                        inputBuffer.put(buffer);                        inputBuffer.position(0);                        CL.it(TAG, "audio queuePcmBuffer " + sampleTime / 1000 + " size:" + size);                        encoder.queueInputBuffer(inputBufferIndex, 0, size, sampleTime, flags);                        pcmExtrator.advance();                    }                }                while (true) {                    int outputBufferIndex = encoder.dequeueOutputBuffer(info, TIMEOUT_US);                    if (outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {                        break;                    } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {                        MediaFormat newFormat = encoder.getOutputFormat();                        CL.it(TAG, "audio decode newFormat = " + newFormat);                    } else if (outputBufferIndex < 0) {                        //ignore                        CL.et(TAG, "unexpected result from audio 
decoder.dequeueOutputBuffer: " + outputBufferIndex);                    } else {                        if (info.flags == MediaCodec.BUFFER_FLAG_END_OF_STREAM) {encodeDone = true;break;                        }                        ByteBuffer encodeOutputBuffer = encoder.getOutputBuffer(outputBufferIndex);                        CL.it(TAG, "audio writeSampleData " + info.presentationTimeUs + " size:" + info.size + " flags:" + info.flags);                        if (!detectTimeError && lastAudioFrameTimeUs != -1 && info.presentationTimeUs < lastAudioFrameTimeUs + AAC_FRAME_TIME_US) {//某些情况下帧时间会出错,目前未找到原因(系统相机录得双声道视频正常,我录的单声道视频不正常)CL.et(TAG, "audio 时间戳错误,lastAudioFrameTimeUs:" + lastAudioFrameTimeUs + " " +        "info.presentationTimeUs:" + info.presentationTimeUs);detectTimeError = true;                        }                        if (detectTimeError) {info.presentationTimeUs = lastAudioFrameTimeUs + AAC_FRAME_TIME_US;CL.et(TAG, "audio 时间戳错误,使用修正的时间戳:" + info.presentationTimeUs);detectTimeError = false;                        }                        if (info.flags != MediaCodec.BUFFER_FLAG_CODEC_CONFIG) {lastAudioFrameTimeUs = info.presentationTimeUs;                        }                        mediaMuxer.writeSampleData(muxerAudioIndex, encodeOutputBuffer, info);                        encodeOutputBuffer.clear();                        encoder.releaseOutputBuffer(outputBufferIndex, false);                    }                }            }            //重新将视频写入            if (oriAudioIndex >= 0) {                oriExtrator.unselectTrack(oriAudioIndex);            }            oriExtrator.selectTrack(oriVideoIndex);            oriExtrator.seekTo(0, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);            maxBufferSize = oriVideoFormat.getInteger(MediaFormat.KEY_MAX_INPUT_SIZE);            int frameRate = oriVideoFormat.containsKey(MediaFormat.KEY_FRAME_RATE) ? 
oriVideoFormat.getInteger(MediaFormat.KEY_FRAME_RATE) : (int) Math.ceil(VideoUtil.getAveFrameRate(mediaSource));            buffer = ByteBuffer.allocateDirect(maxBufferSize);            final int VIDEO_FRAME_TIME_US = (int) (1000 * 1000f / frameRate);            long lastVideoFrameTimeUs = -1;            detectTimeError = false;            while (true) {                long sampleTimeUs = oriExtrator.getSampleTime();                if (sampleTimeUs == -1) {                    break;                }                info.presentationTimeUs = sampleTimeUs;                info.flags = oriExtrator.getSampleFlags();                info.size = oriExtrator.readSampleData(buffer, 0);                if (info.size < 0) {                    break;                }                //写入视频                if (!detectTimeError && lastVideoFrameTimeUs != -1 && info.presentationTimeUs < lastVideoFrameTimeUs + VIDEO_FRAME_TIME_US) {                    //某些视频帧时间会出错                    CL.et(TAG, "video 时间戳错误,lastVideoFrameTimeUs:" + lastVideoFrameTimeUs + " " +"info.presentationTimeUs:" + info.presentationTimeUs + " VIDEO_FRAME_TIME_US:" + VIDEO_FRAME_TIME_US);                    detectTimeError = true;                }                if (detectTimeError) {                    info.presentationTimeUs = lastVideoFrameTimeUs + VIDEO_FRAME_TIME_US;                    CL.et(TAG, "video 时间戳错误,使用修正的时间戳:" + info.presentationTimeUs);                    detectTimeError = false;                }                if (info.flags != MediaCodec.BUFFER_FLAG_CODEC_CONFIG) {                    lastVideoFrameTimeUs = info.presentationTimeUs;                }                CL.wt(TAG, "video writeSampleData:" + info.presentationTimeUs + " type:" + info.flags + " size:" + info.size);                mediaMuxer.writeSampleData(muxerVideoIndex, buffer, info);                oriExtrator.advance();            }        } finally {            videoPcmFile.delete();            videoPcmAdjustedFile.delete();            videoWavFile.delete();            try {                pcmExtrator.release();                oriExtrator.release();                mediaMuxer.release();                encoder.stop();                encoder.release();            } catch (Exception e) {                CL.e(e);            }        }    }        @SuppressLint("WrongConstant")    public static void mixAudioTrack(Context context, final com.hw.videoprocessor.VideoProcessor.MediaSource videoInput, final com.hw.videoprocessor.VideoProcessor.MediaSource audioInput, final String output,         Integer startTimeMs, Integer endTimeMs,         int videoVolume,         int aacVolume,         float fadeInSec, float fadeOutSec) throws IOException {        File cacheDir = new File(context.getCacheDir(), "pcm");        cacheDir.mkdir();        final File videoPcmFile = new File(cacheDir, "video_" + System.currentTimeMillis() + ".pcm");        File aacPcmFile = new File(cacheDir, "aac_" + System.currentTimeMillis() + ".pcm");        final Integer startTimeUs = startTimeMs == null ? 0 : startTimeMs * 1000;        final Integer endTimeUs = endTimeMs == null ? 
null : endTimeMs * 1000;        final int videoDurationMs;        if (endTimeUs == null) {            MediaMetadataRetriever retriever = new MediaMetadataRetriever();            videoInput.setDataSource(retriever);            videoDurationMs = Integer.parseInt(retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION));        } else {            videoDurationMs = (endTimeUs - startTimeUs) / 1000;        }        MediaMetadataRetriever retriever = new MediaMetadataRetriever();        audioInput.setDataSource(retriever);        final int aacDurationMs = Integer.parseInt(retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION));        retriever.release();        MediaExtractor oriExtrator = new MediaExtractor();        videoInput.setDataSource(oriExtrator);        int oriAudioIndex = VideoUtil.selectTrack(oriExtrator, true);        MediaExtractor audioExtractor = new MediaExtractor();        audioInput.setDataSource(audioExtractor);        int aacAudioIndex = VideoUtil.selectTrack(audioExtractor, true);        File wavFile;        int sampleRate;        File adjustedPcm;        int channelCount;        int audioBitrate;        final int TIMEOUT_US = 2500;        //重新将速率变化过后的pcm写入        int oriVideoIndex = VideoUtil.selectTrack(oriExtrator, false);        MediaFormat oriVideoFormat = oriExtrator.getTrackFormat(oriVideoIndex);        int rotation = oriVideoFormat.containsKey(MediaFormat.KEY_ROTATION) ? oriVideoFormat.getInteger(MediaFormat.KEY_ROTATION) : 0;        MediaMuxer mediaMuxer = new MediaMuxer(output, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);        mediaMuxer.setOrientationHint(rotation);        int muxerVideoIndex = mediaMuxer.addTrack(oriVideoFormat);        int muxerAudioIndex;        if (oriAudioIndex >= 0) {            long s1 = System.currentTimeMillis();            final CountDownLatch latch = new CountDownLatch(2);            //音频转化为PCM            new Thread(new Runnable() {                @Override                public void run() {                    try {                        AudioUtil.decodeToPCM(videoInput, videoPcmFile.getAbsolutePath(), startTimeUs, endTimeUs);                    } catch (IOException e) {                        throw new RuntimeException(e);                    } finally {                        latch.countDown();                    }                }            }).start();            final File finalAacPcmFile = aacPcmFile;            new Thread(new Runnable() {                @Override                public void run() {                    try {                        AudioUtil.decodeToPCM(audioInput, finalAacPcmFile.getAbsolutePath(), 0, aacDurationMs > videoDurationMs ? 
videoDurationMs * 1000 : aacDurationMs * 1000);                    } catch (IOException e) {                        throw new RuntimeException(e);                    } finally {                        latch.countDown();                    }                }            }).start();            try {                latch.await();            } catch (InterruptedException e) {                e.printStackTrace();            }            long s2 = System.currentTimeMillis();            //检查两段音频格式是否一致,不一致则统一转换为单声道+44100            Pair resultPair = AudioUtil.checkAndAdjustAudioFormat(videoPcmFile.getAbsolutePath(),                    aacPcmFile.getAbsolutePath(),                    oriExtrator.getTrackFormat(oriAudioIndex),                    audioExtractor.getTrackFormat(aacAudioIndex)            );            channelCount = resultPair.first;            sampleRate = resultPair.second;            audioExtractor.release();            long s3 = System.currentTimeMillis();            //检查音频长度是否需要重复填充            if (AUDIO_MIX_REPEAT) {                aacPcmFile = AudioUtil.checkAndFillPcm(aacPcmFile, aacDurationMs, videoDurationMs);            }            //混合并调整音量            adjustedPcm = new File(cacheDir, "adjusted_" + System.currentTimeMillis() + ".pcm");            AudioUtil.mixPcm(videoPcmFile.getAbsolutePath(), aacPcmFile.getAbsolutePath(), adjustedPcm.getAbsolutePath()                    , videoVolume, aacVolume);            wavFile = new File(context.getCacheDir(), adjustedPcm.getName() + ".wav");            long s4 = System.currentTimeMillis();            int channelConfig = AudioFormat.CHANNEL_IN_MONO;            if (channelCount == 2) {                channelConfig = AudioFormat.CHANNEL_IN_STEREO;            }            //淡入淡出            if (fadeInSec != 0 || fadeOutSec != 0) {                AudioFadeUtil.audioFade(adjustedPcm.getAbsolutePath(), sampleRate, channelCount, fadeInSec, fadeOutSec);            }            //PCM转WAV            new PcmToWavUtil(sampleRate, channelConfig, channelCount, AudioFormat.ENCODING_PCM_16BIT).pcmToWav(adjustedPcm.getAbsolutePath(), wavFile.getAbsolutePath());            long s5 = System.currentTimeMillis();            CL.et("hwLog", String.format("decode:%dms,resample:%dms,mix:%dms,fade:%dms", s2 - s1, s3 - s2, s4 - s3, s5 - s4));            MediaFormat oriAudioFormat = oriExtrator.getTrackFormat(oriAudioIndex);            audioBitrate = getAudioBitrate(oriAudioFormat);            oriAudioFormat.setString(MediaFormat.KEY_MIME, MediaFormat.MIMETYPE_AUDIO_AAC);            AudioUtil.checkCsd(oriAudioFormat,                    MediaCodecInfo.CodecProfileLevel.AACObjectLC,                    sampleRate,                    channelCount            );            muxerAudioIndex = mediaMuxer.addTrack(oriAudioFormat);        } else {            AudioUtil.decodeToPCM(audioInput, aacPcmFile.getAbsolutePath(), 0,                    aacDurationMs > videoDurationMs ? videoDurationMs * 1000 : aacDurationMs * 1000);            MediaFormat audioTrackFormat = audioExtractor.getTrackFormat(aacAudioIndex);            audioBitrate = getAudioBitrate(audioTrackFormat);            channelCount = audioTrackFormat.containsKey(MediaFormat.KEY_CHANNEL_COUNT) ?                    audioTrackFormat.getInteger(MediaFormat.KEY_CHANNEL_COUNT) : 1;            sampleRate = audioTrackFormat.containsKey(MediaFormat.KEY_SAMPLE_RATE) ?                    
audioTrackFormat.getInteger(MediaFormat.KEY_SAMPLE_RATE) : 44100;            int channelConfig = AudioFormat.CHANNEL_IN_MONO;            if (channelCount == 2) {                channelConfig = AudioFormat.CHANNEL_IN_STEREO;            }            AudioUtil.checkCsd(audioTrackFormat,                    MediaCodecInfo.CodecProfileLevel.AACObjectLC,                    sampleRate,                    channelCount);            audioTrackFormat.setString(MediaFormat.KEY_MIME, MediaFormat.MIMETYPE_AUDIO_AAC);            muxerAudioIndex = mediaMuxer.addTrack(audioTrackFormat);            sampleRate = audioTrackFormat.getInteger(MediaFormat.KEY_SAMPLE_RATE);            channelCount = audioTrackFormat.containsKey(MediaFormat.KEY_CHANNEL_COUNT) ? audioTrackFormat.getInteger(MediaFormat.KEY_CHANNEL_COUNT) : 1;            if (channelCount > 2) {                File tempFile = new File(aacPcmFile + ".channel");                AudioUtil.stereoToMonoSimple(aacPcmFile.getAbsolutePath(), tempFile.getAbsolutePath(), channelCount);                channelCount = 1;                aacPcmFile.delete();                aacPcmFile = tempFile;            }            if (aacVolume != 50) {                adjustedPcm = new File(cacheDir, "adjusted_" + System.currentTimeMillis() + ".pcm");                AudioUtil.adjustPcmVolume(aacPcmFile.getAbsolutePath(), adjustedPcm.getAbsolutePath(), aacVolume);            } else {                adjustedPcm = aacPcmFile;            }            channelConfig = AudioFormat.CHANNEL_IN_MONO;            if (channelCount == 2) {                channelConfig = AudioFormat.CHANNEL_IN_STEREO;            }            wavFile = new File(context.getCacheDir(), adjustedPcm.getName() + ".wav");            //淡入淡出            if (fadeInSec != 0 || fadeOutSec != 0) {                AudioFadeUtil.audioFade(adjustedPcm.getAbsolutePath(), sampleRate, channelCount, fadeInSec, fadeOutSec);            }            //PCM转WAV            new PcmToWavUtil(sampleRate, channelConfig, channelCount, AudioFormat.ENCODING_PCM_16BIT).pcmToWav(adjustedPcm.getAbsolutePath(), wavFile.getAbsolutePath());        }        //重新写入音频        mediaMuxer.start();        MediaExtractor pcmExtrator = new MediaExtractor();        pcmExtrator.setDataSource(wavFile.getAbsolutePath());        int audioTrack = VideoUtil.selectTrack(pcmExtrator, true);        pcmExtrator.selectTrack(audioTrack);        MediaFormat pcmTrackFormat = pcmExtrator.getTrackFormat(audioTrack);        int maxBufferSize = AudioUtil.getAudioMaxBufferSize(pcmTrackFormat);        ByteBuffer buffer = ByteBuffer.allocateDirect(maxBufferSize);        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();        MediaFormat encodeFormat = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, sampleRate, channelCount);//参数对应-> mime type、采样率、声道数        encodeFormat.setInteger(MediaFormat.KEY_BIT_RATE, audioBitrate);//比特率        encodeFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);        encodeFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, maxBufferSize);        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_AUDIO_AAC);        encoder.configure(encodeFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);        encoder.start();        boolean encodeInputDone = false;        boolean encodeDone = false;        long lastAudioFrameTimeUs = -1;        final int AAC_FRAME_TIME_US = 1024 * 1000 * 1000 / sampleRate;        boolean detectTimeError = false;        try {            while 
(!encodeDone) {
    int inputBufferIndex = encoder.dequeueInputBuffer(TIMEOUT_US);
    if (!encodeInputDone && inputBufferIndex >= 0) {
        long sampleTime = pcmExtrator.getSampleTime();
        if (sampleTime < 0) {
            encodeInputDone = true;
            encoder.queueInputBuffer(inputBufferIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
        } else {
            int flags = pcmExtrator.getSampleFlags();
            buffer.clear();
            int size = pcmExtrator.readSampleData(buffer, 0);
            ByteBuffer inputBuffer = encoder.getInputBuffer(inputBufferIndex);
            inputBuffer.clear();
            inputBuffer.put(buffer);
            inputBuffer.position(0);
            CL.it(TAG, "audio queuePcmBuffer " + sampleTime / 1000 + " size:" + size);
            encoder.queueInputBuffer(inputBufferIndex, 0, size, sampleTime, flags);
            pcmExtrator.advance();
        }
    }
    while (true) {
        int outputBufferIndex = encoder.dequeueOutputBuffer(info, TIMEOUT_US);
        if (outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
            break;
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            MediaFormat newFormat = encoder.getOutputFormat();
            CL.it(TAG, "audio decode newFormat = " + newFormat);
        } else if (outputBufferIndex < 0) {
            // ignore
            CL.et(TAG, "unexpected result from audio decoder.dequeueOutputBuffer: " + outputBufferIndex);
        } else {
            if (info.flags == MediaCodec.BUFFER_FLAG_END_OF_STREAM) {
                encodeDone = true;
                break;
            }
            ByteBuffer encodeOutputBuffer = encoder.getOutputBuffer(outputBufferIndex);
            CL.it(TAG, "audio writeSampleData " + info.presentationTimeUs + " size:" + info.size + " flags:" + info.flags);
            if (!detectTimeError && lastAudioFrameTimeUs != -1 && info.presentationTimeUs < lastAudioFrameTimeUs + AAC_FRAME_TIME_US) {
                // In some cases the frame timestamps go wrong; the cause has not been found yet
                // (stereo videos recorded with the system camera are fine, the mono videos I recorded are not)
                CL.et(TAG, "audio timestamp error, lastAudioFrameTimeUs:" + lastAudioFrameTimeUs + " " +
                        "info.presentationTimeUs:" + info.presentationTimeUs);
                detectTimeError = true;
            }
            if (detectTimeError) {
                info.presentationTimeUs = lastAudioFrameTimeUs + AAC_FRAME_TIME_US;
                CL.et(TAG, "audio timestamp error, using corrected timestamp:" + info.presentationTimeUs);
                detectTimeError = false;
            }
            if (info.flags != MediaCodec.BUFFER_FLAG_CODEC_CONFIG) {
                lastAudioFrameTimeUs = info.presentationTimeUs;
            }
            mediaMuxer.writeSampleData(muxerAudioIndex, encodeOutputBuffer, info);
            encodeOutputBuffer.clear();
            encoder.releaseOutputBuffer(outputBufferIndex, false);
        }
    }
}
// Write the video track back into the muxer
if (oriAudioIndex >= 0) {
    oriExtrator.unselectTrack(oriAudioIndex);
}
oriExtrator.selectTrack(oriVideoIndex);
oriExtrator.seekTo(startTimeUs, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);
maxBufferSize = oriVideoFormat.getInteger(MediaFormat.KEY_MAX_INPUT_SIZE);
int frameRate = oriVideoFormat.containsKey(MediaFormat.KEY_FRAME_RATE) ? oriVideoFormat.getInteger(MediaFormat.KEY_FRAME_RATE) : (int) Math.ceil(VideoUtil.getAveFrameRate(videoInput));
buffer = ByteBuffer.allocateDirect(maxBufferSize);
final int VIDEO_FRAME_TIME_US = (int) (1000 * 1000f / frameRate);
long lastVideoFrameTimeUs = -1;
detectTimeError = false;
while (true) {
    long sampleTimeUs = oriExtrator.getSampleTime();
    if (sampleTimeUs == -1) {
        break;
    }
    if (sampleTimeUs < startTimeUs) {
        oriExtrator.advance();
        continue;
    }
    if (endTimeUs != null && sampleTimeUs > endTimeUs) {
        break;
    }
    info.presentationTimeUs = sampleTimeUs - startTimeUs;
    info.flags = oriExtrator.getSampleFlags();
    info.size = oriExtrator.readSampleData(buffer, 0);
    if (info.size < 0) {
        break;
    }
    // Write the video sample
    if (!detectTimeError && lastVideoFrameTimeUs != -1 && info.presentationTimeUs < lastVideoFrameTimeUs + VIDEO_FRAME_TIME_US) {
        // Some videos have broken frame timestamps
        CL.et(TAG, "video timestamp error, lastVideoFrameTimeUs:" + lastVideoFrameTimeUs + " " +
                "info.presentationTimeUs:" + info.presentationTimeUs + " VIDEO_FRAME_TIME_US:" + VIDEO_FRAME_TIME_US);
        detectTimeError = true;
    }
    if (detectTimeError) {
        info.presentationTimeUs = lastVideoFrameTimeUs + VIDEO_FRAME_TIME_US;
        CL.et(TAG, "video timestamp error, using corrected timestamp:" + info.presentationTimeUs);
        detectTimeError = false;
    }
    if (info.flags != MediaCodec.BUFFER_FLAG_CODEC_CONFIG) {
        lastVideoFrameTimeUs = info.presentationTimeUs;
    }
    CL.wt(TAG, "video writeSampleData:" + info.presentationTimeUs + " type:" + info.flags + " size:" + info.size);
    mediaMuxer.writeSampleData(muxerVideoIndex, buffer, info);
    oriExtrator.advance();
}
} finally {
    aacPcmFile.delete();
    videoPcmFile.delete();
    adjustedPcm.delete();
    wavFile.delete();
    try {
        pcmExtrator.release();
        oriExtrator.release();
        encoder.stop();
        encoder.release();
        mediaMuxer.release();
    } catch (Exception e) {
        CL.e(e);
    }
}
}

public static Processor processor(Context context) {
    return new Processor(context);
}

public static class Processor {
    private Context context;
    private com.hw.videoprocessor.VideoProcessor.MediaSource input;
    private String output;
    @Nullable
    private Integer outWidth;
    @Nullable
    private Integer outHeight;
    @Nullable
    private Integer startTimeMs;
    @Nullable
    private Integer endTimeMs;
    @Nullable
    private Float speed;
    @Nullable
    private Boolean changeAudioSpeed;
    @Nullable
    private Integer bitrate;
    @Nullable
    private Integer frameRate;
    @Nullable
    private Integer iFrameInterval;
    @Nullable
    private VideoProgressListener listener;
    private boolean dropFrames = true;

    public Processor(Context context) {
        this.context = context;
    }

    public Processor input(com.hw.videoprocessor.VideoProcessor.MediaSource input) {
        this.input = input;
        return this;
    }

    public Processor input(Uri input) {
        this.input = new com.hw.videoprocessor.VideoProcessor.MediaSource(context, input);
        return this;
    }

    public Processor input(String input) {
        this.input = new com.hw.videoprocessor.VideoProcessor.MediaSource(input);
        return this;
    }

    public Processor output(String output) {
        this.output = output;
        return this;
    }

    public Processor outWidth(int outWidth) {
        this.outWidth = outWidth;
        return this;
    }

    public Processor outHeight(int outHeight) {
        this.outHeight = outHeight;
        return this;
    }

    public Processor startTimeMs(int startTimeMs) {
        this.startTimeMs = startTimeMs;
        return this;
    }

    public Processor endTimeMs(int endTimeMs) {
        this.endTimeMs = endTimeMs;
        return this;
    }

    public Processor speed(float speed) {
        this.speed = speed;
        return this;
    }

    public Processor changeAudioSpeed(boolean changeAudioSpeed) {
        this.changeAudioSpeed = changeAudioSpeed;
        return this;
    }

    public Processor bitrate(int bitrate) {
        this.bitrate = bitrate;
        return this;
    }

    public Processor frameRate(int frameRate) {
        this.frameRate = frameRate;
        return this;
    }

    public Processor iFrameInterval(int iFrameInterval) {
        this.iFrameInterval = iFrameInterval;
        return this;
    }

    public Processor dropFrames(boolean dropFrames) {
        this.dropFrames = dropFrames;
        return this;
    }

    public Processor progressListener(VideoProgressListener listener) {
        this.listener = listener;
        return this;
    }

    public void process() throws Exception {
        processVideo(context, this);
    }
}
}
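
The two loops above are the reason the library stays stable on videos with broken metadata: whenever a sample's presentationTimeUs is smaller than the previous timestamp plus one frame interval, it is replaced with "previous timestamp + frame interval", so MediaMuxer never receives a timestamp that goes backwards. The video interval is 1,000,000 / frameRate microseconds, so at a frame rate of 10 that is 100,000 µs per frame. Below is a minimal Kotlin sketch of that rule, for illustration only; the function name and parameters are mine, not part of the library API.

// Illustration only: mirrors the timestamp clamp used by VideoProcessor above.
// clampTimestampUs, currentUs and lastUs are hypothetical names, not library API.
fun clampTimestampUs(currentUs: Long, lastUs: Long, frameRate: Int): Long {
    val frameTimeUs = 1_000_000L / frameRate                  // e.g. 100_000 µs at 10 fps
    return if (lastUs != -1L && currentUs < lastUs + frameTimeUs) {
        lastUs + frameTimeUs                                   // push a backwards/duplicate timestamp forward
    } else {
        currentUs
    }
}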

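The Processor builder reproduced above also shows options beyond width, height, bitrate and frame rate: startTimeMs/endTimeMs for trimming, speed and changeAudioSpeed, iFrameInterval, dropFrames and a progressListener, and process() is a blocking call declared to throw Exception. Below is a minimal Kotlin usage sketch under those assumptions; the paths, numbers and the plain Thread are placeholders, not values from the original project.

import android.content.Context
import android.util.Log
import com.hw.videoprocessor.VideoProcessor

// Hypothetical helper: parameters are placeholders for your own paths and target bitrate.
fun compressVideo(context: Context, inputPath: String, outputPath: String, bitrate: Int) {
    // process() blocks until compression finishes, so keep it off the main thread
    Thread {
        try {
            VideoProcessor.processor(context)
                .input(inputPath)
                .output(outputPath)
                .bitrate(bitrate)                 // e.g. a fraction of the source bitrate
                .frameRate(10)
                .iFrameInterval(1)                // placeholder keyframe interval
                .progressListener { progress ->   // VideoProgressListener callback from the builder above
                    Log.d("VideoCompress", "progress: $progress")
                }
                .process()
        } catch (e: Exception) {
            e.printStackTrace()                   // process() declares throws Exception
        }
    }.start()
}

Whether you also trim with startTimeMs/endTimeMs or change playback speed depends on the product requirement; the sketch only exercises options that the reproduced source confirms exist.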
Source: https://blog.csdn.net/As_thin/article/details/131536854
