
android - Native window queueBuffer does not render the Stagefright decoder's output

Reposted · Author: 太空宇宙 · Updated: 2023-11-03 10:48:38

I pass a SurfaceView surface from Java to JNI, where I obtain the native window from that surface. Stagefright decodes h264 frames from an mp4 file. During decoding I call ANativeWindow::queueBuffer() to send the decoded frames for rendering. There are no errors during decoding or from calling queueBuffer(), but all I get is a black screen.

I strongly suspect I am not setting up the native window correctly, so that it never gets presented to the screen when queueBuffer() is called. However, I can render pixels to the native window directly via memcpy. Unfortunately, after I instantiate an OMXClient, attempting to draw pixels manually segfaults, so it seems I have to use queueBuffer().

Setting up my SurfaceView in onCreate():

protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    SurfaceView surfaceView = new SurfaceView(this);
    surfaceView.getHolder().addCallback(this);
    setContentView(surfaceView);
}

Once the surface has been created, I call my native init() function with it:

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    NativeLib.init(holder.getSurface(), width, height);
}

In JNI, I create the native window and start the decode thread:

nativeWindow = ANativeWindow_fromSurface(env, surface);
int ret = pthread_create(&decode_thread, NULL, &decode_frames, NULL);

I follow vec.io's Stagefright decoding example for the frame-decoding routine:

void* decode_frames(void*) {
    mNativeWindow = nativeWindow;
    sp<MediaSource> mVideoSource = new AVFormatSource();
    OMXClient mClient;
    mClient.connect();

    sp<MediaSource> mVideoDecoder = OMXCodec::Create(mClient.interface(), mVideoSource->getFormat(), false, mVideoSource, NULL, 0, mNativeWindow);
    mVideoDecoder->start();

    status_t err = OK;
    while (err != ERROR_END_OF_STREAM) {
        MediaBuffer *mVideoBuffer;
        MediaSource::ReadOptions options;
        err = mVideoDecoder->read(&mVideoBuffer, &options);

        if (err == OK) {
            if (mVideoBuffer->range_length() > 0) {
                sp<MetaData> metaData = mVideoBuffer->meta_data();
                int64_t timeUs = 0;
                metaData->findInt64(kKeyTime, &timeUs);
                status_t err1 = native_window_set_buffers_timestamp(mNativeWindow.get(), timeUs * 1000);
                // This line results in a black frame
                status_t err2 = mNativeWindow->queueBuffer(mNativeWindow.get(), mVideoBuffer->graphicBuffer().get(), -1);

                if (err2 == 0) {
                    metaData->setInt32(kKeyRendered, 1);
                }
            }
            mVideoBuffer->release();
        }
    }
    mVideoSource.clear();
    mVideoDecoder->stop();
    mVideoDecoder.clear();
    mClient.disconnect();
    return NULL;
}
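One easy-to-miss detail in the loop above: kKeyTime from the buffer metadata is in microseconds, while native_window_set_buffers_timestamp() expects nanoseconds, hence the timeUs * 1000. A trivial helper (hypothetical name, not part of the original code) makes that conversion explicit:

```java
public class Pts {
    // Stagefright's kKeyTime metadata is in microseconds;
    // ANativeWindow buffer timestamps are in nanoseconds.
    public static long usToNs(long timeUs) {
        return timeUs * 1000L;
    }
}
```

For example, a frame interval of 33333 µs (~30 fps) becomes 33333000 ns.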

Edit: Following Ganesh's suggestions, I hooked up the Awesome Renderer in order to change the color space. In the process it became clear that the color format was never being set in Stagefright:

08-06 00:56:32.842: A/SoftwareRenderer(7326): frameworks/av/media/libstagefright/colorconversion/SoftwareRenderer.cpp:42 CHECK(meta->findInt32(kKeyColorFormat, &tmp)) failed.
08-06 00:56:32.842: A/libc(7326): Fatal signal 11 (SIGSEGV) at 0xdeadbaad (code=1), thread 7340 (hieu.alloclient)

Trying to set the color space explicitly (kKeyColorFormat to a yuv420P color space) leads to a dequeue problem. That may make sense, since the color format I specified was arbitrary:

08-06 00:44:30.878: V/OMXCodec(6937): matchComponentName (null)
08-06 00:44:30.888: V/OMXCodec(6937): matching 'OMX.qcom.video.decoder.avc' quirks 0x000000a8
08-06 00:44:30.888: V/OMXCodec(6937): matchComponentName (null)
08-06 00:44:30.888: V/OMXCodec(6937): matching 'OMX.google.h264.decoder' quirks 0x00000000
08-06 00:44:30.888: V/OMXCodec(6937): Attempting to allocate OMX node 'OMX.qcom.video.decoder.avc'
08-06 00:44:30.918: V/OMXCodec(6937): Successfully allocated OMX node 'OMX.qcom.video.decoder.avc'
08-06 00:44:30.918: V/OMXCodec(6937): configureCodec protected=0
08-06 00:44:30.918: I/OMXCodec(6937): [OMX.qcom.video.decoder.avc] AVC profile = 66 (Baseline), level = 13
08-06 00:44:30.918: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] setVideoOutputFormat width=320, height=240
08-06 00:44:30.918: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] portIndex: 0, index: 0, eCompressionFormat=7 eColorFormat=0
08-06 00:44:30.918: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] found a match.
08-06 00:44:30.938: I/QCOMXCodec(6937): Decoder should be in arbitrary mode
08-06 00:44:30.958: I/OMXCodec(6937): [OMX.qcom.video.decoder.avc] video dimensions are 320 x 240
08-06 00:44:30.958: I/OMXCodec(6937): [OMX.qcom.video.decoder.avc] Crop rect is 320 x 240 @ (0, 0)
08-06 00:44:30.958: D/infoJNI(6937): before started
08-06 00:44:30.968: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocating 2 buffers of size 2097088 on input port
08-06 00:44:30.968: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocated buffer 0x417037d8 on input port
08-06 00:44:30.968: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocated buffer 0x41703828 on input port
08-06 00:44:30.978: V/OMXCodec(6937): native_window_set_usage usage=0x40000000
08-06 00:44:30.978: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocating 22 buffers from a native window of size 147456 on output port
08-06 00:44:30.978: E/OMXCodec(6937): dequeueBuffer failed: Invalid argument (22)

Best answer

I ended up solving this by switching to the low-level Java APIs. I set up a native read_frame function that parses video frames using FFmpeg. I call this function from a separate Java decoder thread, which hands each new frame of data to MediaCodec for decoding. Rendering that way is straightforward: just pass MediaCodec the Surface.
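A minimal sketch of that producer/consumer handoff, with the native FFmpeg read_frame call stubbed out so it runs anywhere (all class and method names here are hypothetical; the real consumer would feed MediaCodec input buffers instead of counting frames):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class FrameFeeder {
    private static final byte[] EOS = new byte[0]; // sentinel marking end of stream
    private final BlockingQueue<byte[]> frames = new ArrayBlockingQueue<>(4);

    // Stand-in for the native FFmpeg read_frame(); returns null at end of stream.
    // In the real app this would be a `native` method backed by av_read_frame().
    private byte[] readFrame(int index, int total) {
        return index < total ? new byte[] { (byte) index } : null;
    }

    // Producer: parse frames natively and hand them to the decoder thread.
    public Thread startReader(final int totalFrames) {
        Thread t = new Thread(() -> {
            try {
                int i = 0;
                byte[] frame;
                while ((frame = readFrame(i++, totalFrames)) != null) {
                    frames.put(frame); // blocks when the queue is full
                }
                frames.put(EOS);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        t.start();
        return t;
    }

    // Consumer: in the real app each frame would go into a MediaCodec input
    // buffer via queueInputBuffer(); here we just count frames until EOS.
    public int drain() {
        int decoded = 0;
        try {
            while (true) {
                byte[] f = frames.take();
                if (f == EOS) break;
                decoded++;
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return decoded;
    }
}
```

The bounded queue gives natural back-pressure: the native parser blocks when the decoder falls behind, instead of buffering unbounded frame data.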

Alternatively, I could have used MediaExtractor, but FFmpeg has some other functionality I need.

The original question, "Native window queueBuffer function not rendering Stagefright decoder's output", can be found on Stack Overflow: https://stackoverflow.com/questions/17960268/
