android - GLSurfaceView does not resume its OpenGL thread when "onResume" is called


My problem is the following: on "old" Android devices (v2.2 and 2.3), my GLSurfaceView stays blank after a rotation. I can see these calls in my logs:

- rotation detected! -
CTestApp(10669): entering onConfigurationChanged method.
MainActivity(10669): entering onPause method.
*WEBRTC*(10669): ViEAndroidGLES20::onPause
*WEBRTC*(10669): ContextFactory::destroyContext
*WEBRTC*(10669): ViEAndroidGLES20::onPause
*WEBRTC*(10669): ContextFactory::destroyContext
MainActivity(10669): end of onPause method.
MainActivity(10669): entering onStop method.
*WEBRTC*(10669): ViEAndroidGLES20::onDetachedFromWindow
*WEBRTC*(10669): ViEAndroidGLES20::onDetachedFromWindow
MainActivity(10669): end of onStop method.
MainActivity(10669): entering onDestroy method.
MainActivity(10669): end of onDestroy method.
MainActivity(10669): entering onCreate method.
MainActivity(10669): entering onStart method.
MainActivity(10669): end of onStart method.
MainActivity(10669): entering onResume method.
*WEBRTC*(10669): ViEAndroidGLES20::onResume
*WEBRTC*(10669): ViEAndroidGLES20::onResume
MainActivity(10669): end of onResume method.
*WEBRTC*(10669): ViEAndroidGLES20::onAttachedToWindow
*WEBRTC*(10669): ViEAndroidGLES20::onAttachedToWindow

On newer Android devices, rendering of the video streams resumes correctly after the device is rotated.

The log from a working device is similar to the previous (non-working) one, except that these traces appear after the "onAttachedToWindow" calls:

creating OpenGL ES 2.0 context
ViEAndroidGLES20::onSurfaceCreated

In the Eclipse debugger I noticed that the two OpenGL threads that were paused during the destruction of the activity are never resumed. There seems to be a difference in GLSurfaceView behavior between Android 2.3 and 4.0 that causes the OpenGL threads to be resumed only on the newer versions. Does anyone have a clue about this?

Here are the details of the devices I used for testing:

Working devices:

  • Galaxy Nexus running Android 4.1.1
  • Galaxy Tab 10.1 running Android 4.0.4

"Bad" devices:

  • HTC Desire running Android 2.3.5
  • Motorola Droid running Android 2.2

Here is some additional information about the code I use.

I have the following class, which extends GLSurfaceView:

import java.util.concurrent.locks.ReentrantLock;

import javax.microedition.khronos.egl.EGL10;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.egl.EGLContext;
import javax.microedition.khronos.egl.EGLDisplay;
import javax.microedition.khronos.opengles.GL10;

import android.app.ActivityManager;
import android.content.Context;
import android.content.pm.ConfigurationInfo;
import android.graphics.PixelFormat;
import android.opengl.GLSurfaceView;
import android.util.Log;

public class ViEAndroidGLES20 extends GLSurfaceView
        implements GLSurfaceView.Renderer {
    private static String TAG = "WEBRTC-JR";
    private static final boolean DEBUG = true;
    // True if onSurfaceCreated has been called.
    private boolean surfaceCreated = false;
    private boolean openGLCreated = false;
    // True if NativeFunctionsRegistered has been called.
    private boolean nativeFunctionsRegisted = false;
    private ReentrantLock nativeFunctionLock = new ReentrantLock();
    // Address of the native object that does the drawing.
    private long nativeObject = 0;
    private int viewWidth = 0;
    private int viewHeight = 0;

    public static boolean UseOpenGL2(Object renderWindow) {
        return ViEAndroidGLES20.class.isInstance(renderWindow);
    }

    public ViEAndroidGLES20(Context context) {
        super(context);
        init(false, 0, 0);
    }

    public ViEAndroidGLES20(Context context, boolean translucent,
            int depth, int stencil) {
        super(context);
        init(translucent, depth, stencil);
    }

    private void init(boolean translucent, int depth, int stencil) {
        // By default, GLSurfaceView() creates an RGB_565 opaque surface.
        // If we want a translucent one, we should change the surface's
        // format here. PixelFormat.TRANSLUCENT for GL surfaces is
        // interpreted by SurfaceFlinger as any 32-bit surface with alpha.
        if (translucent) {
            this.getHolder().setFormat(PixelFormat.TRANSLUCENT);
        }

        // Set up the context factory for 2.0 rendering.
        // See the ContextFactory class definition below.
        setEGLContextFactory(new ContextFactory());

        // We need to choose an EGLConfig that matches the format of
        // our surface exactly. This is done in our custom config
        // chooser. See the ConfigChooser class definition below.
        setEGLConfigChooser(translucent ?
                new ConfigChooser(8, 8, 8, 8, depth, stencil) :
                new ConfigChooser(5, 6, 5, 0, depth, stencil));

        // Set the renderer responsible for frame rendering.
        this.setRenderer(this);
        this.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
    }

    private static class ContextFactory implements GLSurfaceView.EGLContextFactory {
        private static int EGL_CONTEXT_CLIENT_VERSION = 0x3098;

        public EGLContext createContext(EGL10 egl, EGLDisplay display, EGLConfig eglConfig) {
            Log.w(TAG, "creating OpenGL ES 2.0 context");
            checkEglError("Before eglCreateContext", egl);
            int[] attrib_list = {EGL_CONTEXT_CLIENT_VERSION, 2, EGL10.EGL_NONE};
            EGLContext context = egl.eglCreateContext(display, eglConfig,
                    EGL10.EGL_NO_CONTEXT, attrib_list);
            checkEglError("After eglCreateContext", egl);
            return context;
        }

        public void destroyContext(EGL10 egl, EGLDisplay display, EGLContext context) {
            Log.d("*WEBRTC*", "ContextFactory::destroyContext");
            egl.eglDestroyContext(display, context);
        }
    }

    private static void checkEglError(String prompt, EGL10 egl) {
        int error;
        while ((error = egl.eglGetError()) != EGL10.EGL_SUCCESS) {
            Log.e("*WEBRTC*", String.format("%s: EGL error: 0x%x", prompt, error));
        }
    }

    private static class ConfigChooser implements GLSurfaceView.EGLConfigChooser {

        public ConfigChooser(int r, int g, int b, int a, int depth, int stencil) {
            mRedSize = r;
            mGreenSize = g;
            mBlueSize = b;
            mAlphaSize = a;
            mDepthSize = depth;
            mStencilSize = stencil;
        }

        // This EGL config specification is used to request 2.0 rendering.
        // We use a minimum size of 4 bits for red/green/blue, but will
        // perform the actual matching in chooseConfig() below.
        private static int EGL_OPENGL_ES2_BIT = 4;
        private static int[] s_configAttribs2 =
        {
            EGL10.EGL_RED_SIZE, 4,
            EGL10.EGL_GREEN_SIZE, 4,
            EGL10.EGL_BLUE_SIZE, 4,
            EGL10.EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
            EGL10.EGL_NONE
        };

        public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display) {
            // Get the number of minimally matching EGL configurations.
            int[] num_config = new int[1];
            egl.eglChooseConfig(display, s_configAttribs2, null, 0, num_config);

            int numConfigs = num_config[0];

            if (numConfigs <= 0) {
                throw new IllegalArgumentException("No configs match configSpec");
            }

            // Allocate then read the array of minimally matching EGL configs.
            EGLConfig[] configs = new EGLConfig[numConfigs];
            egl.eglChooseConfig(display, s_configAttribs2, configs, numConfigs, num_config);

            // Now return the "best" one.
            return chooseConfig(egl, display, configs);
        }

        public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display,
                EGLConfig[] configs) {
            for (EGLConfig config : configs) {
                int d = findConfigAttrib(egl, display, config,
                        EGL10.EGL_DEPTH_SIZE, 0);
                int s = findConfigAttrib(egl, display, config,
                        EGL10.EGL_STENCIL_SIZE, 0);

                // We need at least mDepthSize and mStencilSize bits.
                if (d < mDepthSize || s < mStencilSize)
                    continue;

                // We want an *exact* match for red/green/blue/alpha.
                int r = findConfigAttrib(egl, display, config,
                        EGL10.EGL_RED_SIZE, 0);
                int g = findConfigAttrib(egl, display, config,
                        EGL10.EGL_GREEN_SIZE, 0);
                int b = findConfigAttrib(egl, display, config,
                        EGL10.EGL_BLUE_SIZE, 0);
                int a = findConfigAttrib(egl, display, config,
                        EGL10.EGL_ALPHA_SIZE, 0);

                if (r == mRedSize && g == mGreenSize && b == mBlueSize && a == mAlphaSize)
                    return config;
            }
            return null;
        }

        private int findConfigAttrib(EGL10 egl, EGLDisplay display,
                EGLConfig config, int attribute, int defaultValue) {
            if (egl.eglGetConfigAttrib(display, config, attribute, mValue)) {
                return mValue[0];
            }
            return defaultValue;
        }

        // Subclasses can adjust these values:
        protected int mRedSize;
        protected int mGreenSize;
        protected int mBlueSize;
        protected int mAlphaSize;
        protected int mDepthSize;
        protected int mStencilSize;
        private int[] mValue = new int[1];
    }

    // IsSupported
    // Returns true if this device supports OpenGL ES 2.0 rendering.
    public static boolean IsSupported(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        ConfigurationInfo info = am.getDeviceConfigurationInfo();
        // reqGlEsVersion packs the major version in the upper 16 bits.
        return info.reqGlEsVersion >= 0x20000;
    }

    public void onDrawFrame(GL10 gl) {
        nativeFunctionLock.lock();
        try {
            if (!nativeFunctionsRegisted || !surfaceCreated) {
                return;
            }
            if (!openGLCreated) {
                if (0 != CreateOpenGLNative(nativeObject, viewWidth, viewHeight)) {
                    return; // Failed to create OpenGL.
                }
                openGLCreated = true; // Created OpenGL successfully.
            }
            DrawNative(nativeObject); // Draw the new frame.
        } finally {
            // Always release the lock, including on the early-return paths.
            nativeFunctionLock.unlock();
        }
    }

    public void onSurfaceChanged(GL10 gl, int width, int height) {
        if (DEBUG) {
            Log.d("*WEBRTC*", "ViEAndroidGLES20::onSurfaceChanged");
        }

        surfaceCreated = true;
        viewWidth = width;
        viewHeight = height;

        nativeFunctionLock.lock();
        if (nativeFunctionsRegisted) {
            if (CreateOpenGLNative(nativeObject, width, height) == 0) {
                openGLCreated = true;
            } else {
                Log.e("*WEBRTC*", "ViEAndroidGLES20::onSurfaceChanged - CreateOpenGLNative failed!");
            }
        }
        nativeFunctionLock.unlock();
    }

    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        if (DEBUG) {
            Log.d("*WEBRTC*", "ViEAndroidGLES20::onSurfaceCreated");
        }
    }

    public void ReDraw() {
        if (surfaceCreated) {
            // Request the renderer to redraw using the render thread context.
            this.requestRender();
        }
    }

    private native int CreateOpenGLNative(long nativeObject,
            int width, int height);
    private native void DrawNative(long nativeObject);

    @Override
    protected void onAttachedToWindow() {
        if (DEBUG) {
            Log.d("*WEBRTC*", "ViEAndroidGLES20::onAttachedToWindow");
        }
        super.onAttachedToWindow();
    }

    @Override
    protected void onDetachedFromWindow() {
        if (DEBUG) {
            Log.d("*WEBRTC*", "ViEAndroidGLES20::onDetachedFromWindow");
        }
        super.onDetachedFromWindow();
    }

    @Override
    public void onPause() {
        if (DEBUG) {
            Log.d("*WEBRTC*", "ViEAndroidGLES20::onPause");
        }
        super.onPause();
    }

    @Override
    public void onResume() {
        if (DEBUG) {
            Log.d("*WEBRTC*", "ViEAndroidGLES20::onResume");
        }
        super.onResume();
    }
}
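The selection rule implemented by ConfigChooser above can be isolated in a few lines of plain Java. In this hypothetical stand-in, an int[] of {red, green, blue, alpha, depth, stencil} sizes replaces the attributes that the real code queries with eglGetConfigAttrib:

```java
// Hypothetical stand-in for the ConfigChooser matching rule: each int[]
// holds {red, green, blue, alpha, depth, stencil} sizes in place of a
// real EGLConfig queried via eglGetConfigAttrib.
class ConfigMatch {
    public static int[] choose(int[][] configs, int r, int g, int b, int a,
                               int minDepth, int minStencil) {
        for (int[] c : configs) {
            // Depth and stencil only need to meet the requested minimum.
            if (c[4] < minDepth || c[5] < minStencil) {
                continue;
            }
            // Red/green/blue/alpha must match exactly.
            if (c[0] == r && c[1] == g && c[2] == b && c[3] == a) {
                return c;
            }
        }
        return null; // No exact match, like the original chooseConfig.
    }
}
```

The asymmetry matters: a config with more depth or stencil bits than requested is accepted, but a config with a deeper color format than requested is rejected, which is why the surface format and the chosen EGLConfig stay in sync.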

When I rotate the device, my main activity is destroyed, but my application keeps references to the ViEAndroidGLES20 instances (class members m_RemoteView1 and m_RemoteView2). These references are obtained in the activity's onStart() callback, as follows:

// The activity is about to become visible.
@Override
protected void onStart() {
    Log.d("MainActivity", "entering onStart method.");

    super.onStart();

    // The application is responsible for keeping valid references to the
    // surface views used for local capture and remote stream rendering.
    m_RemoteView1 = ((CTestApp) getApplication()).GetRemoteVideoView();
    m_RemoteView2 = ((CTestApp) getApplication()).GetRemoteVideoView2();

    if (m_RemoteView1 != null) {
        LinearLayout layout = (LinearLayout) findViewById(R.id.remoteVideoRenderLayout1);
        layout.addView(m_RemoteView1);
    }

    if (m_RemoteView2 != null) {
        LinearLayout layout = (LinearLayout) findViewById(R.id.remoteVideoRenderLayout2);
        layout.addView(m_RemoteView2);
    }
}

// The activity has become visible; it is now resumed.
@Override
protected void onResume() {
    Log.d("MainActivity", "entering onResume method.");

    super.onResume();

    // A GLSurfaceView must be notified when the activity is paused and
    // resumed: clients are required to call onPause() when the activity
    // pauses and onResume() when the activity resumes.
    ((GLSurfaceView) m_RemoteView1).onResume();
    ((GLSurfaceView) m_RemoteView2).onResume();
}

Note that I included the main activity's onResume() callback to show that I do call GLSurfaceView.onResume() when the activity resumes.

Best Answer

I finally found the cause of my problem. It comes from a difference in behavior of the android.opengl.GLSurfaceView class between Android 2.3 and 4.x. In the Android 4.x implementation of GLSurfaceView, the "onAttachedToWindow" callback causes the associated GLThread to be restarted.

This restart of the GLThread is missing from the Android 2.2 and 2.3 implementations. Because the OpenGL thread is never resumed, the rendering view stays blank after ViewGroup::removeView/addView is called, as happens in the rotation scenario.
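The GLThread inside GLSurfaceView is essentially a render loop gated on a paused flag. The following simplified plain-Java model of that gating (a sketch, not the actual Android source) shows why a thread that is paused and never resumed simply blocks instead of drawing:

```java
// Simplified, plain-Java model of the gating used by GLSurfaceView's
// internal render thread. This is NOT the actual Android source; it only
// illustrates why a thread that is paused and never resumed blocks in
// wait() and produces no further frames.
class PausableRenderLoop {
    private final Object lock = new Object();
    private boolean paused = false;
    private boolean quit = false;
    private int framesDrawn = 0;

    private final Thread thread = new Thread(() -> {
        while (true) {
            synchronized (lock) {
                // Block here until resumed -- the step that the 2.2/2.3
                // implementation never triggers again after re-attach.
                while (paused && !quit) {
                    try {
                        lock.wait();
                    } catch (InterruptedException ignored) {
                    }
                }
                if (quit) {
                    return;
                }
                framesDrawn++; // Stand-in for onDrawFrame().
            }
            try {
                Thread.sleep(1);
            } catch (InterruptedException ignored) {
            }
        }
    });

    public void start() { thread.start(); }

    public void onPause() {
        synchronized (lock) {
            paused = true;
            lock.notifyAll();
        }
    }

    public void onResume() {
        synchronized (lock) {
            paused = false;
            lock.notifyAll();
        }
    }

    public void quit() throws InterruptedException {
        synchronized (lock) {
            quit = true;
            lock.notifyAll();
        }
        thread.join();
    }

    public int frames() {
        synchronized (lock) {
            return framesDrawn;
        }
    }
}
```

In this model, onResume() supplies the notify that the 2.2/2.3 GLSurfaceView never delivers after a detach/re-attach cycle; without it, the loop stays parked in wait() and the view goes blank, matching the symptom described above.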

To work around this, I added a class newGLSurfaceView to my project, which is a copy of the GLSurfaceView.java class from the Android 4.1 source code.

Thanks,

For android - GLSurfaceView does not resume its OpenGL thread when "onResume" is called, see the similar question on Stack Overflow: https://stackoverflow.com/questions/14529126/
