
video - Android: Camera2 video recording without a preview; mp4 output file not fully playable

Reposted. Author: 行者123. Updated: 2023-11-28 21:39:09

I am trying to record video from the back-facing (rear) camera on my Samsung Galaxy S6, which supports 1920x1080 at about 30 fps. I don't want to use any Surface for previewing if I don't have to, since this is meant to happen purely in the background.

I seem to have it working, but the output file doesn't play back quite correctly. On my Windows 10 PC, Windows Media Player shows the first frame and then plays the audio; VLC doesn't show any frames at all. On my phone, the recorded file plays, but not entirely: it holds the first frame for 5-8 seconds, then near the end the remaining time drops to 0, the displayed total time changes, and only then do the actual video frames start playing. On my Mac (10.9.5), QuickTime shows no video (though it reports no errors), yet Google Picasa plays the file perfectly. I wanted to try Picasa on my PC to see whether it works there too, but I can no longer download it since Picasa has been discontinued.

I tried installing a codec pack I found for Windows, but that didn't fix anything. MediaInfo v0.7.85 reports the following about the file:

General
Complete name               : C:\...\1465655479915.mp4
Format                      : MPEG-4
Format profile              : Base Media / Version 2
Codec ID                    : mp42 (isom/mp42)
File size                   : 32.2 MiB
Duration                    : 15s 744ms
Overall bit rate            : 17.1 Mbps
Encoded date                : UTC 2016-06-11 14:31:50
Tagged date                 : UTC 2016-06-11 14:31:50
com.android.version         : 6.0.1

Video
ID                          : 1
Format                      : AVC
Format/Info                 : Advanced Video Codec
Format profile              : High@L4
Format settings, CABAC      : Yes
Format settings, ReFrames   : 1 frame
Format settings, GOP        : M=1, N=30
Codec ID                    : avc1
Codec ID/Info               : Advanced Video Coding
Duration                    : 15s 627ms
Bit rate                    : 16.2 Mbps
Width                       : 1 920 pixels
Height                      : 1 080 pixels
Display aspect ratio        : 16:9
Frame rate mode             : Variable
Frame rate                  : 0.000 (0/1000) fps
Minimum frame rate          : 0.000 fps
Maximum frame rate          : 30.540 fps
Color space                 : YUV
Chroma subsampling          : 4:2:0
Bit depth                   : 8 bits
Scan type                   : Progressive
Stream size                 : 0.00 Byte (0%)
Source stream size          : 31.7 MiB (98%)
Title                       : VideoHandle
Language                    : English
Encoded date                : UTC 2016-06-11 14:31:50
Tagged date                 : UTC 2016-06-11 14:31:50
mdhd_Duration               : 15627

Audio
ID                          : 2
Format                      : AAC
Format/Info                 : Advanced Audio Codec
Format profile              : LC
Codec ID                    : 40
Duration                    : 15s 744ms
Bit rate mode               : Constant
Bit rate                    : 256 Kbps
Channel(s)                  : 2 channels
Channel positions           : Front: L R
Sampling rate               : 48.0 KHz
Frame rate                  : 46.875 fps (1024 spf)
Compression mode            : Lossy
Stream size                 : 492 KiB (1%)
Title                       : SoundHandle
Language                    : English
Encoded date                : UTC 2016-06-11 14:31:50
Tagged date                 : UTC 2016-06-11 14:31:50

The code that I am using to create this is:

package invisiblevideorecorder;

import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.os.Environment;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;
import android.view.Surface;

import java.io.File;
import java.io.IOException;
import java.util.Arrays;

/**
* @author Mark
* @since 6/10/2016
*/
public class InvisibleVideoRecorder {
    private static final String TAG = "InvisibleVideoRecorder";
    private final CameraCaptureSessionStateCallback cameraCaptureSessionStateCallback = new CameraCaptureSessionStateCallback();
    private final CameraDeviceStateCallback cameraDeviceStateCallback = new CameraDeviceStateCallback();
    private MediaRecorder mediaRecorder;
    private CameraManager cameraManager;
    private Context context;

    private CameraDevice cameraDevice;

    private HandlerThread handlerThread;
    private Handler handler;

    public InvisibleVideoRecorder(Context context) {
        this.context = context;
        handlerThread = new HandlerThread("camera");
        handlerThread.start();
        handler = new Handler(handlerThread.getLooper());

        try {
            mediaRecorder = new MediaRecorder();

            mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
            mediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);

            final String filename = context.getExternalFilesDir(Environment.DIRECTORY_MOVIES).getAbsolutePath() + File.separator + System.currentTimeMillis() + ".mp4";
            mediaRecorder.setOutputFile(filename);
            Log.d(TAG, "start: " + filename);

            // by using the profile, I don't think I need to do any of these manually:
            // mediaRecorder.setVideoEncodingBitRate(16000000);
            // mediaRecorder.setVideoFrameRate(30);
            // mediaRecorder.setCaptureRate(30);
            // mediaRecorder.setVideoSize(1920, 1080);
            // mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
            // mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);

            // Log.d(TAG, "start: 1 " + CamcorderProfile.hasProfile(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_1080P));
            // true
            // Log.d(TAG, "start: 2 " + CamcorderProfile.hasProfile(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_HIGH_SPEED_1080P));
            // false
            // Log.d(TAG, "start: 3 " + CamcorderProfile.hasProfile(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_HIGH));
            // true

            CamcorderProfile profile = CamcorderProfile.get(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_1080P);
            Log.d(TAG, "start: profile " + ToString.inspect(profile));
            // start: 0 android.media.CamcorderProfile@114016694 {
            //     audioBitRate: 256000
            //     audioChannels: 2
            //     audioCodec: 3
            //     audioSampleRate: 48000
            //     duration: 30
            //     fileFormat: 2
            //     quality: 6
            //     videoBitRate: 17000000
            //     videoCodec: 2
            //     videoFrameHeight: 1080
            //     videoFrameRate: 30
            //     videoFrameWidth: 1920
            // }
            mediaRecorder.setOrientationHint(0);
            mediaRecorder.setProfile(profile);
            mediaRecorder.prepare();
        } catch (IOException e) {
            Log.d(TAG, "start: exception " + e.getMessage());
        }

    }

    public void start() {
        Log.d(TAG, "start: ");

        cameraManager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        try {
            cameraManager.openCamera(String.valueOf(CameraMetadata.LENS_FACING_BACK), cameraDeviceStateCallback, handler);
        } catch (CameraAccessException | SecurityException e) {
            Log.d(TAG, "start: exception " + e.getMessage());
        }

    }

    public void stop() {
        Log.d(TAG, "stop: ");
        mediaRecorder.stop();
        mediaRecorder.reset();
        mediaRecorder.release();
        cameraDevice.close();
        try {
            handlerThread.join();
        } catch (InterruptedException e) {
            // ignored
        }
    }

    private class CameraCaptureSessionStateCallback extends CameraCaptureSession.StateCallback {
        private final static String TAG = "CamCaptSessionStCb";

        @Override
        public void onActive(CameraCaptureSession session) {
            Log.d(TAG, "onActive: ");
            super.onActive(session);
        }

        @Override
        public void onClosed(CameraCaptureSession session) {
            Log.d(TAG, "onClosed: ");
            super.onClosed(session);
        }

        @Override
        public void onConfigured(CameraCaptureSession session) {
            Log.d(TAG, "onConfigured: ");
        }

        @Override
        public void onConfigureFailed(CameraCaptureSession session) {
            Log.d(TAG, "onConfigureFailed: ");
        }

        @Override
        public void onReady(CameraCaptureSession session) {
            Log.d(TAG, "onReady: ");
            super.onReady(session);
            try {
                CaptureRequest.Builder builder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
                builder.addTarget(mediaRecorder.getSurface());
                CaptureRequest request = builder.build();
                session.setRepeatingRequest(request, null, handler);
                mediaRecorder.start();
            } catch (CameraAccessException e) {
                Log.d(TAG, "onReady: " + e.getMessage());
            }
        }

        @Override
        public void onSurfacePrepared(CameraCaptureSession session, Surface surface) {
            Log.d(TAG, "onSurfacePrepared: ");
            super.onSurfacePrepared(session, surface);
        }
    }

    private class CameraDeviceStateCallback extends CameraDevice.StateCallback {
        private final static String TAG = "CamDeviceStateCb";

        @Override
        public void onClosed(CameraDevice camera) {
            Log.d(TAG, "onClosed: ");
            super.onClosed(camera);
        }

        @Override
        public void onDisconnected(CameraDevice camera) {
            Log.d(TAG, "onDisconnected: ");
        }

        @Override
        public void onError(CameraDevice camera, int error) {
            Log.d(TAG, "onError: ");
        }

        @Override
        public void onOpened(CameraDevice camera) {
            Log.d(TAG, "onOpened: ");
            cameraDevice = camera;
            try {
                camera.createCaptureSession(Arrays.asList(mediaRecorder.getSurface()), cameraCaptureSessionStateCallback, handler);
            } catch (CameraAccessException e) {
                Log.d(TAG, "onOpened: " + e.getMessage());
            }
        }
    }
}

I followed the Android source code (the tests and the apps), as well as several samples I found on GitHub, to piece this together, since the camera2 API isn't well documented yet.

Is there anything obvious that I'm doing wrong? Or am I simply missing a codec for QuickTime on my Mac, and for Windows Media Player and VLC on my PC? I haven't tried playing the files on Linux yet, so I don't know what happens there. Oh, and if I upload the mp4 files to photos.google.com, they also play completely correctly there.

Thanks! Mark

Best answer

My team ran into a similar problem while developing a Camera2 API-based plugin, but it only affected the Samsung Galaxy S7 (the S6 units we also had for testing did not show this behavior).

The problem appears to be caused by a bug in Samsung's camera firmware, triggered when the device wakes from deep sleep (the ultra-low-power mode in Android 6.0 Marshmallow). After resuming from deep sleep, the first frame of any video captured and encoded with a Camera2 MediaRecorder has an extremely long frame duration - sometimes as long as, or longer than, the total duration of the video itself.

So on playback, that first frame is displayed for that entire duration while the audio plays on. Once the first frame has run its course, the remaining frames play normally.

We found other people discussing the issue on GitHub:

The issue is a deep sleep problem on some devices running Marshmallow. It appears to be CPU related as an S7 on Verizon doesn't have the issue, but an S7 on AT&T does have the issue. I've seen this on an S6 Verizon phone when it updated to Marshmallow.

In order to replicate, reboot a device while connected to USB. Run the sample. All should be ok. Then, disconnect the device, let it go into deep sleep (screen off, no movement for 5? minutes), and try again. The issue will appear once the device has gone into deep sleep.

We ended up using cybaker's proposed workaround; that is, once the video file has been created, check the duration of its first frame. If it looks wrong, rewrite the video with a reasonable frame duration:

// Imports here assume the mp4parser (isoparser 1.x) package layout.
import com.coremedia.iso.IsoFile;
import com.coremedia.iso.boxes.Container;
import com.coremedia.iso.boxes.TimeToSampleBox;
import com.coremedia.iso.boxes.TrackBox;
import com.googlecode.mp4parser.DataSource;
import com.googlecode.mp4parser.FileDataSourceImpl;
import com.googlecode.mp4parser.authoring.Movie;
import com.googlecode.mp4parser.authoring.Mp4TrackImpl;
import com.googlecode.mp4parser.authoring.builder.DefaultMp4Builder;

import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.util.List;

DataSource channel = new FileDataSourceImpl(rawFile);
IsoFile isoFile = new IsoFile(channel);

List<TrackBox> trackBoxes = isoFile.getMovieBox().getBoxes(TrackBox.class);
boolean sampleError = false;
for (TrackBox trackBox : trackBoxes) {
    TimeToSampleBox.Entry firstEntry = trackBox.getMediaBox().getMediaInformationBox().getSampleTableBox().getTimeToSampleBox().getEntries().get(0);

    // Detect if the first sample is a problem and fix it in isoFile.
    // This is a hack. The audio deltas are 1024 for my files, and the video deltas about 3000.
    // 10000 seems sufficient, since for 30 fps the normal delta is about 3000.
    if (firstEntry.getDelta() > 10000) {
        sampleError = true;
        firstEntry.setDelta(3000);
    }
}

if (sampleError) {
    Movie movie = new Movie();
    for (TrackBox trackBox : trackBoxes) {
        movie.addTrack(new Mp4TrackImpl(channel.toString() + "[" + trackBox.getTrackHeaderBox().getTrackId() + "]", trackBox));
    }
    movie.setMatrix(isoFile.getMovieBox().getMovieHeaderBox().getMatrix());
    Container out = new DefaultMp4Builder().build(movie);

    // delete file first!
    FileChannel fc = new RandomAccessFile(rawFile.getName(), "rw").getChannel();
    out.writeContainer(fc);
    fc.close();
    Log.d(TAG, "Finished correcting raw video");
}
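For intuition about the magic numbers above: stts deltas are expressed in the track's timescale units, so their wall-clock meaning depends on the timescale. The sketch below illustrates that arithmetic; the 90 000 video timescale is an assumption for illustration (real code should read the timescale from the track's mdhd box), while 48 000 matches the audio sampling rate reported by MediaInfo.

```java
// Sketch: converts an stts sample delta into display time.
// The timescales used in main() are assumptions, not values read from the file.
public final class SttsDelta {
    // Duration in milliseconds of one sample with the given delta and timescale.
    public static double deltaToMillis(long delta, long timescale) {
        return delta * 1000.0 / timescale;
    }

    public static void main(String[] args) {
        // A normal video frame at an assumed 90 kHz timescale: ~33 ms, i.e. ~30 fps.
        System.out.println(deltaToMillis(3000, 90_000));
        // One AAC frame (1024 samples) at 48 kHz: ~21.3 ms.
        System.out.println(deltaToMillis(1024, 48_000));
        // A pathological first frame spanning ~15.6 s, roughly the whole clip.
        System.out.println(deltaToMillis(1_406_430, 90_000));
    }
}
```

This is also why the workaround's `> 10000` threshold is safe: at these timescales it is several times a normal frame duration, while the corrupted first-frame delta is orders of magnitude larger.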

Hopefully this points you in the right direction!

Regarding "video - Android: Camera2 video recording without a preview; mp4 output file not fully playable", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/37767511/
