
android - Touch to record like in Vine using javacv

Reposted · Author: 行者123 · Updated: 2023-11-29 00:05:32

I am trying to implement a touch-to-record feature like in Vine. The sample provided with javacv that handles plain recording (not touch-to-record) is https://github.com/bytedeco/javacv/blob/master/samples/RecordActivity.java. I am trying to modify it so that, in the onPreviewFrame method, frames are added to the buffer only while the user keeps a finger on the screen, and then to combine those frames into the final video in the stopRecording method.
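The buffering scheme inherited from RecordActivity.java preallocates RECORD_LENGTH * frameRate frames and lets the write index wrap around, overwriting the oldest slots. A minimal model of that ring-buffer index arithmetic (the class and method names here are illustrative, not from the sample):

```java
// Minimal model of the frame ring buffer used by the javacv sample:
// RECORD_LENGTH seconds of frames are preallocated and the write index
// wraps modulo the array length, overwriting the oldest frames.
public class FrameRing {
    final int capacity;
    int writeIndex = 0;

    FrameRing(int recordLengthSeconds, int frameRate) {
        capacity = recordLengthSeconds * frameRate;
    }

    // Returns the slot the next frame should be written into.
    int nextSlot() {
        return writeIndex++ % capacity;
    }
}
```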

The problem is that if I set the timestamp as in the code fragment below (in the stopRecording method),

if (t > recorder.getTimestamp()) {
    recorder.setTimestamp(t);
}

the behavior is as follows.

Case 1

If I tap the screen and record for 2 seconds, lift my finger off the screen for 3 seconds, then put it back on the screen and record for another 4 seconds, the resulting video looks like this:

The first 2 seconds of the video contain the recorded content. For the next 3 seconds (the time the finger was off the screen), the video just shows the last frame recorded before the finger was lifted. Then the video contains the next 4 seconds of recorded content. So there seems to be a problem with how the recording handles the time when the finger is off the screen.

Case 2

Next, I removed the timestamp-setting code (the fragment given above) from the stopRecording method.

Now the resulting video (for the same steps as in Case 1) no longer contains the intermediate 3 seconds when the finger was off the screen (which is what I want), but the video plays back at a faster speed. So it seems the timestamps need to be set for the video to play at the normal rate.
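The speed-up in Case 2 follows from what happens when no explicit timestamps are written: the frames end up spaced one nominal frame interval apart, so the pause gaps collapse, but the real capture spacing is lost too. What is actually needed is to keep the captured inter-frame spacing while collapsing only the pause gaps. A sketch of that rebasing, independent of the javacv API (the "pause = gap larger than two frame intervals" threshold is my assumption, not from the question):

```java
// Sketch (not the javacv API): compress the pause gaps out of a list of
// capture timestamps (microseconds) so that frames play back contiguously
// at their captured pace. A gap larger than two nominal frame intervals is
// treated as a pause and collapsed to exactly one frame interval.
public class TimestampRebase {
    public static long[] rebase(long[] captureMicros, int frameRate) {
        long frameInterval = 1000000L / frameRate;
        long[] out = new long[captureMicros.length];
        long t = 0;
        for (int i = 1; i < captureMicros.length; i++) {
            long delta = captureMicros[i] - captureMicros[i - 1];
            // Keep normal inter-frame spacing as captured; collapse pauses.
            t += (delta > 2 * frameInterval) ? frameInterval : delta;
            out[i] = t;
        }
        return out;
    }
}
```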

The complete code of my Activity is given below. (Note that video recording is mainly handled in the onPreviewFrame and stopRecording methods.)

public class TouchToRecordActivity extends Activity implements OnClickListener, View.OnTouchListener {

private final static String CLASS_LABEL = "TouchToRecordActivity";
private final static String LOG_TAG = CLASS_LABEL;

private String ffmpeg_link = "/mnt/sdcard/stream.mp4";

long startTime = 0;
boolean recording = false;
boolean rec = false;

private FFmpegFrameRecorder recorder;

private boolean isPreviewOn = false;

private int sampleAudioRateInHz = 44100;
private int imageWidth = 640;
private int imageHeight = 480;
private int destWidth = 480;
private int frameRate = 30;

/* audio data getting thread */
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;
private Thread audioThread;
volatile boolean runAudioThread = true;

/* video data getting thread */
private Camera cameraDevice;
private CameraView cameraView;

private Frame yuvImage = null;

/* layout setting */
private final int bg_screen_bx = 232;
private final int bg_screen_by = 128;
private final int bg_screen_width = 700;
private final int bg_screen_height = 500;
private final int bg_width = 1123;
private final int bg_height = 715;
private final int live_width = 640;
private final int live_height = 480;
private int screenWidth, screenHeight;
private Button btnRecorderControl;

/* The number of seconds in the continuous record loop (or 0 to disable loop). */
final int RECORD_LENGTH = 20;
Frame[] images;
long[] timestamps;
ShortBuffer[] samples;
int imagesIndex, samplesIndex;

long firstTime = 0;
long startPauseTime = 0;
long totalPauseTime = 0;
long pausedTime = 0;
long stopPauseTime = 0;
long totalTime = 0;

long totalRecordedTS = 0;

private TextView txtTimer;
private Handler mHandler = new Handler();

@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);

setContentView(R.layout.touch_main);

initLayout();
}

@Override
protected void onDestroy() {
super.onDestroy();

recording = false;

if (cameraView != null) {
cameraView.stopPreview();
}

if (cameraDevice != null) {
cameraDevice.stopPreview();
cameraDevice.release();
cameraDevice = null;
}
}


private void initLayout() {

/* get size of screen */
Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
screenWidth = display.getWidth();
screenHeight = display.getHeight();
RelativeLayout.LayoutParams layoutParam = null;
LayoutInflater myInflate = null;
myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
RelativeLayout topLayout = new RelativeLayout(this);
setContentView(topLayout);
LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(R.layout.touch_main, null);
layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
topLayout.addView(preViewLayout, layoutParam);

txtTimer = (TextView) preViewLayout.findViewById(R.id.txtTimer);

/* add control button: start and stop */
btnRecorderControl = (Button) findViewById(R.id.recorder_control);
btnRecorderControl.setText("Start");
btnRecorderControl.setOnClickListener(this);

/* add camera view */
int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
int prev_rw, prev_rh;
if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
prev_rh = display_height_d;
prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
} else {
prev_rw = display_width_d;
prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
}
layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);

cameraDevice = Camera.open();
Log.i(LOG_TAG, "camera open");
cameraView = new CameraView(this, cameraDevice);
topLayout.addView(cameraView, layoutParam);
topLayout.setOnTouchListener(this);
Log.i(LOG_TAG, "camera preview start: OK");
}

//---------------------------------------
// initialize ffmpeg_recorder
//---------------------------------------
private void initRecorder() {

Log.w(LOG_TAG, "init recorder");

if (RECORD_LENGTH > 0) {
imagesIndex = 0;
images = new Frame[RECORD_LENGTH * frameRate];
timestamps = new long[images.length];
for (int i = 0; i < images.length; i++) {
images[i] = new Frame(destWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
timestamps[i] = -1;
}
} else if (yuvImage == null) {
yuvImage = new Frame(destWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
Log.i(LOG_TAG, "create yuvImage");
}
Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
recorder = new FFmpegFrameRecorder(ffmpeg_link, destWidth, imageHeight, 1);
recorder.setFormat("mp4");
recorder.setVideoCodecName("libx264");
recorder.setSampleRate(sampleAudioRateInHz);
// Set in the surface changed method
recorder.setFrameRate(frameRate);

Log.i(LOG_TAG, "recorder initialize success");

audioRecordRunnable = new AudioRecordRunnable();
audioThread = new Thread(audioRecordRunnable);
runAudioThread = true;
}

public void startRecording() {

initRecorder();

mHandler.removeCallbacks(mUpdateTimeTask);
mHandler.postDelayed(mUpdateTimeTask, 100);

try {
recorder.start();
startTime = System.currentTimeMillis();
recording = true;
audioThread.start();

} catch (FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
}

public void stopRecording() {

runAudioThread = false;
try {
audioThread.join();
} catch (InterruptedException e) {
e.printStackTrace();
}
audioRecordRunnable = null;
audioThread = null;

if (recorder != null && recording) {
if (RECORD_LENGTH > 0) {
Log.v(LOG_TAG, "Writing frames");
try {
int firstIndex = imagesIndex % images.length;
int lastIndex = (imagesIndex - 1) % images.length;
if (imagesIndex <= images.length) {
firstIndex = 0;
lastIndex = imagesIndex - 1;
}
if ((startTime = timestamps[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
startTime = 0;
}
if (lastIndex < firstIndex) {
lastIndex += images.length;
}
int videoCounter = 0;
for (int i = firstIndex; i <= lastIndex; i++) {
if (timestamps[i] == -1) {
Log.v(LOG_TAG, "frame not recorded");
}
if (timestamps[i] != -1) {
long t = timestamps[i % timestamps.length] - startTime;
if (t >= 0) {

videoCounter++;

/*if (((i % images.length) != 0) && images[i % images.length] != images[(i % images.length) - 1]) {
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}*/
Log.v(LOG_TAG, "imageIndex=" + (i % images.length));
recorder.record(images[i % images.length]);
/* }*/
Log.v(LOG_TAG, "videoCounter=" + videoCounter);
}
}
}

firstIndex = samplesIndex % samples.length;
lastIndex = (samplesIndex - 1) % samples.length;
if (samplesIndex <= samples.length) {
firstIndex = 0;
lastIndex = samplesIndex - 1;
}
if (lastIndex < firstIndex) {
lastIndex += samples.length;
}
for (int i = firstIndex; i <= lastIndex; i++) {
if (timestamps[i] != -1) {
recorder.recordSamples(samples[i % samples.length]);
}
}
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG, e.getMessage());
e.printStackTrace();
}
}

recording = false;
Log.v(LOG_TAG, "Finishing recording, calling stop and release on recorder");
try {
recorder.stop();
recorder.release();
} catch (FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
recorder = null;

}
}

@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {

if (keyCode == KeyEvent.KEYCODE_BACK) {
if (recording) {
stopRecording();
}

finish();

return true;
}

return super.onKeyDown(keyCode, event);
}

@Override
public boolean onTouch(View view, MotionEvent motionEvent) {
switch (motionEvent.getAction()) {
case MotionEvent.ACTION_DOWN:
Log.v(LOG_TAG, "ACTION_DOWN" + recording);

if (!recording) {
startRecording();
} else {
stopPauseTime = System.currentTimeMillis();
totalPauseTime = stopPauseTime - startPauseTime - ((long) (1.0 / (double) frameRate) * 1000);
pausedTime += totalPauseTime;
}
rec = true;
setTotalVideoTime();
btnRecorderControl.setText(getResources().getString(R.string.stop));
break;
case MotionEvent.ACTION_MOVE:
rec = true;
setTotalVideoTime();
break;
case MotionEvent.ACTION_UP:
Log.v(LOG_TAG, "ACTION_UP");
rec = false;
startPauseTime = System.currentTimeMillis();
break;
}
return true;
}

private Runnable mUpdateTimeTask = new Runnable() {
public void run() {
if (recording) {
setTotalVideoTime();
}
mHandler.postDelayed(this, 500);
}
};

private synchronized void setTotalVideoTime() {
totalTime = System.currentTimeMillis() - firstTime - pausedTime - ((long) (1.0 / (double) frameRate) * 1000);
if (totalTime > 0)
txtTimer.setText(getRecordingTimeFromMillis(totalTime));
}

private String getRecordingTimeFromMillis(long millis) {
String strRecordingTime = null;
int seconds = (int) (millis / 1000);
int minutes = seconds / 60;
int hours = minutes / 60;

if (hours >= 0 && hours < 10)
strRecordingTime = "0" + hours + ":";
else
strRecordingTime = hours + ":";

if (hours > 0)
minutes = minutes % 60;

if (minutes >= 0 && minutes < 10)
strRecordingTime += "0" + minutes + ":";
else
strRecordingTime += minutes + ":";

seconds = seconds % 60;

if (seconds >= 0 && seconds < 10)
strRecordingTime += "0" + seconds;
else
strRecordingTime += seconds;

return strRecordingTime;

}


//---------------------------------------------
// audio thread, gets and encodes audio data
//---------------------------------------------
class AudioRecordRunnable implements Runnable {

@Override
public void run() {
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

// Audio
int bufferSize;
ShortBuffer audioData;
int bufferReadResult;

bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

if (RECORD_LENGTH > 0) {
samplesIndex = 0;
samples = new ShortBuffer[RECORD_LENGTH * sampleAudioRateInHz * 2 / bufferSize + 1];
for (int i = 0; i < samples.length; i++) {
samples[i] = ShortBuffer.allocate(bufferSize);
}
} else {
audioData = ShortBuffer.allocate(bufferSize);
}

Log.d(LOG_TAG, "audioRecord.startRecording()");
audioRecord.startRecording();

/* ffmpeg_audio encoding loop */
while (runAudioThread) {
if (RECORD_LENGTH > 0) {
audioData = samples[samplesIndex++ % samples.length];
audioData.position(0).limit(0);
}
//Log.v(LOG_TAG,"recording? " + recording);
bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
audioData.limit(bufferReadResult);
if (bufferReadResult > 0) {
Log.v(LOG_TAG, "bufferReadResult: " + bufferReadResult);
// If "recording" isn't true when start this thread, it never get's set according to this if statement...!!!
// Why? Good question...
if (recording && rec) {
Log.v(LOG_TAG, "Recording audio");
if (RECORD_LENGTH <= 0) try {
recorder.recordSamples(audioData);
//Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG, e.getMessage());
e.printStackTrace();
}
}
}
}
Log.v(LOG_TAG, "AudioThread Finished, release audioRecord");

/* encoding finish, release recorder */
if (audioRecord != null) {
audioRecord.stop();
audioRecord.release();
audioRecord = null;
Log.v(LOG_TAG, "audioRecord released");
}
}
}

//---------------------------------------------
// camera thread, gets and encodes video data
//---------------------------------------------
class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

private SurfaceHolder mHolder;
private Camera mCamera;

public CameraView(Context context, Camera camera) {
super(context);
Log.w("camera", "camera view");
mCamera = camera;
mHolder = getHolder();
mHolder.addCallback(CameraView.this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
mCamera.setPreviewCallback(CameraView.this);
}

@Override
public void surfaceCreated(SurfaceHolder holder) {
try {
stopPreview();
mCamera.setPreviewDisplay(holder);
} catch (IOException exception) {
mCamera.release();
mCamera = null;
}
}

public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
stopPreview();

Camera.Parameters camParams = mCamera.getParameters();
List<Camera.Size> sizes = camParams.getSupportedPreviewSizes();
// Sort the list in ascending order
Collections.sort(sizes, new Comparator<Camera.Size>() {

public int compare(final Camera.Size a, final Camera.Size b) {
return a.width * a.height - b.width * b.height;
}
});

camParams.setPreviewSize(imageWidth, imageHeight);

Log.v(LOG_TAG, "Setting imageWidth: " + imageWidth + " imageHeight: " + imageHeight + " frameRate: " + frameRate);

camParams.setPreviewFrameRate(frameRate);
Log.v(LOG_TAG, "Preview Framerate: " + camParams.getPreviewFrameRate());

mCamera.setParameters(camParams);

List<Camera.Size> videoSizes = mCamera.getParameters().getSupportedVideoSizes();

// Set the holder (which might have changed) again
try {
mCamera.setPreviewDisplay(holder);
mCamera.setPreviewCallback(CameraView.this);
startPreview();
} catch (Exception e) {
Log.e(LOG_TAG, "Could not set preview display in surfaceChanged");
}
}

@Override
public void surfaceDestroyed(SurfaceHolder holder) {
try {
mHolder.addCallback(null);
mCamera.setPreviewCallback(null);
} catch (RuntimeException e) {
// The camera has probably just been released, ignore.
}
}

public void startPreview() {
if (!isPreviewOn && mCamera != null) {
isPreviewOn = true;
mCamera.startPreview();
}
}

public void stopPreview() {
if (isPreviewOn && mCamera != null) {
isPreviewOn = false;
mCamera.stopPreview();
}
}

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
startTime = System.currentTimeMillis();
return;
}
if (RECORD_LENGTH > 0) {
int i = imagesIndex++ % images.length;
Log.v(LOG_TAG, "recording:" + recording + "rec:" + rec);
if (recording && rec) {
yuvImage = images[i];
timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
totalRecordedTS++;
} else {
Log.v(LOG_TAG, "recording is paused");
yuvImage = null;
timestamps[i] = -1;
}
}

/* get video data */
if (yuvImage != null && recording && rec) {
if (data.length != imageWidth * imageHeight) {
Camera.Size sz = camera.getParameters().getPreviewSize();
imageWidth = sz.width;
imageHeight = sz.height;
destWidth = imageHeight;
Log.v(LOG_TAG, "data length:" + data.length);
}

ByteBuffer bb = (ByteBuffer) yuvImage.image[0].position(0); // resets the buffer
int start = 2 * ((imageWidth - destWidth) / 4); // this must be even
for (int row = 0; row < imageHeight * 3 / 2; row++) {
bb.put(data, start, destWidth);
start += imageWidth;
}

}
}
}

@Override
public void onClick(View v) {
if (!recording) {
startRecording();
Log.w(LOG_TAG, "Start Button Pushed");
btnRecorderControl.setText("Stop");
} else {
// This will trigger the audio recording loop to stop and then set isRecorderStart = false;
stopRecording();
Log.w(LOG_TAG, "Stop Button Pushed");
btnRecorderControl.setText("Start");
}
}}

Changes made based on Alex Cohn's suggestions

Suggestion 1 - Estimate the average frame rate

    public void stopRecording() {

..............................

if (((i % images.length) != 0) && images[i % images.length] != images[(i % images.length) - 1]) {
if (t > recorder.getTimestamp()) {
t += 1000000 / frameRate;
recorder.setTimestamp(t);
}

recorder.record(images[i % images.length]);
}
..........................................


}

The change made was adding t += 1000000 / frameRate; but this causes the video to get partially stuck while the finger is off the screen (as described in Case 1 above).

Suggestion 2 - Modify onPreviewFrame()

long[] timestampsForRecorder;
private void initRecorder() {

Log.w(LOG_TAG, "init recorder");

if (RECORD_LENGTH > 0) {
.......................................................
timestampsForRecorder = new long[images.length];
for (int i = 0; i < images.length; i++) {
images[i] = new Frame(destWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
timestamps[i] = -1;
timestampsForRecorder[i] = -1;
}
} else if (yuvImage == null) {
yuvImage = new Frame(destWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
Log.i(LOG_TAG, "create yuvImage");
}
...................................................
}

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
startTime = SystemClock.elapsedRealtime();
return;
}
if (RECORD_LENGTH > 0) {
int i = imagesIndex++ % images.length;
Log.v(LOG_TAG, "recording:" + recording + "rec:" + rec);
if (recording && rec) {
yuvImage = images[i];
long thisFrameTime = SystemClock.elapsedRealtime();
timestamps[i] = thisFrameTime;
long lastFrameTime = timestamps[(int) (imagesIndex == 0 ? startTime : ((imagesIndex-1) % images.length))];
Log.v(LOG_TAG, "lastFrameTime:" + lastFrameTime+",stopPauseTime:" + stopPauseTime);
if (lastFrameTime > stopPauseTime) {
timestampsForRecorder[i] = 1000 * (thisFrameTime - Math.max(stopPauseTime, lastFrameTime));
}
}
}

.....................................................
}

public void stopRecording() {

.......................................................

if (recorder != null && recording) {
if (RECORD_LENGTH > 0) {
Log.v(LOG_TAG, "Writing frames");
try {
int firstIndex = imagesIndex % images.length;
int lastIndex = (imagesIndex - 1) % images.length;
if (imagesIndex <= images.length) {
firstIndex = 0;
lastIndex = imagesIndex - 1;
}
if ((startTime = timestampsForRecorder[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
startTime = 0;
}
if (lastIndex < firstIndex) {
lastIndex += images.length;
}
for (int i = firstIndex; i <= lastIndex; i++) {

if (timestampsForRecorder[i] != -1) {
long t = timestampsForRecorder[i % timestampsForRecorder.length] - startTime;
if (t >= 0) {

if (((i % images.length) != 0) && images[i % images.length] != images[(i % images.length) - 1]) {
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}
Log.v(LOG_TAG, "imageIndex=" + (i % images.length));
recorder.record(images[i % images.length]);
}
}
}
}
.............................................
} catch (FFmpegFrameRecorder.Exception e) {
.................................
}
}

...........................................

}
}

The video recorded with this change has the Case 2 problem described above, i.e. it plays back at a faster speed.

Best Answer

The simple (but imprecise) solution would be to estimate the average frame rate and use t += 1000000 / average_fps; recorder.setTimestamp(t); instead of looking at the actual timestamps.
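This fixed-step approach can be sketched as follows: every written frame simply advances the timestamp by one nominal frame interval, regardless of when it was captured, so pauses vanish and playback runs at the nominal rate (the helper name is illustrative):

```java
// Sketch of the "average fps" approach from the answer: ignore real capture
// times and space every written frame exactly 1000000/frameRate microseconds
// apart. Frame i gets timestamp i * (1000000 / frameRate).
public class AverageFpsTimestamps {
    public static long[] fixedStep(int frameCount, int frameRate) {
        long[] ts = new long[frameCount];
        long t = 0;
        for (int i = 0; i < frameCount; i++) {
            ts[i] = t;
            t += 1000000L / frameRate;
        }
        return ts;
    }
}
```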

To be more precise, you can change onPreviewFrame() as follows:

long thisFrameTime = SystemClock.elapsedRealtime();
timestamps[i] = thisFrameTime;
long lastFrameTime = timestamps[imagesIndex < 2 ? startTime : (imagesIndex-2) % images.length];
if (lastFrameTime > stopPauseTime) {
timestampsForRecorder[i] = 1000 * (thisFrameTime - Math.max(stopPauseTime, lastFrameTime));
}

You can feed the second array, timestampsForRecorder, directly to the recorder.
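Since timestampsForRecorder holds per-frame deltas (microseconds elapsed since the previous recorded frame, or since the end of the last pause), feeding it to the recorder amounts to accumulating the deltas into absolute, monotonically increasing timestamps while skipping the -1 slots that were never recorded. A hypothetical sketch of that consumption step (not code from the answer):

```java
// Hypothetical sketch of consuming the per-frame deltas stored in
// timestampsForRecorder: accumulate them into absolute, monotonically
// increasing timestamps for the frames that were actually recorded,
// skipping the -1 sentinel slots.
public class DeltaAccumulator {
    public static long[] accumulate(long[] deltasMicros) {
        long[] absolute = new long[deltasMicros.length];
        long t = 0;
        int n = 0;
        for (long d : deltasMicros) {
            if (d == -1) continue;      // slot never recorded; skip it
            t += d;
            absolute[n++] = t;
        }
        return java.util.Arrays.copyOf(absolute, n);
    }
}
```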

Note that it is safer to use SystemClock.elapsedRealtime() everywhere:

This clock is guaranteed to be monotonic, and continues to tick even when the CPU is in power saving modes, so is the recommend basis for general purpose interval timing.

Regarding "android - Touch to record like in Vine using javacv", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/34063847/
