I have a problem when I try to stream a custom View using the Twilio Video API together with ARCore: basically, it streams a black screen. I used the ViewCapturer class from the official example at https://github.com/twilio/video-quickstart-android/tree/master/exampleCustomVideoCapturer , but it does not work with ARCore, probably because of the SurfaceView inside the ArFragment.
Thanks for your support.
activity_camera.xml
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:id="@+id/container"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".CameraARActivity">

    <fragment
        android:id="@+id/ux_fragment"
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <android.support.v7.widget.RecyclerView
        android:id="@+id/recycler_view"
        android:layout_width="match_parent"
        android:layout_height="100dp"
        android:layout_alignParentBottom="true"
        android:background="#c100a5a0"
        android:visibility="gone" />

    <ImageButton
        android:id="@+id/btnCloseChat"
        android:layout_width="24dp"
        android:layout_height="24dp"
        android:layout_alignParentBottom="true"
        android:layout_alignParentEnd="true"
        android:layout_marginBottom="86dp"
        android:layout_marginEnd="13dp"
        android:background="@android:color/transparent"
        android:contentDescription="Close chat button"
        android:src="@drawable/ic_close_black_24dp"
        android:visibility="gone" />

</RelativeLayout>
The local video track is created with:
screenVideoTrack = LocalVideoTrack.create(CameraARActivity.this, true, new ViewCapturer(mArFragment.getArSceneView()));
And here is the ViewCapturer class:
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.os.Handler;
import android.os.Looper;
import android.os.SystemClock;
import android.view.View;

import com.twilio.video.VideoCapturer;
import com.twilio.video.VideoDimensions;
import com.twilio.video.VideoFormat;
import com.twilio.video.VideoFrame;
import com.twilio.video.VideoPixelFormat;

import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;

/**
 * ViewCapturer demonstrates how to implement a custom {@link VideoCapturer}. This class
 * captures the contents of a provided view and signals the {@link VideoCapturer.Listener} when
 * the frame is available.
 */
public class ViewCapturer implements VideoCapturer {
    private static final int VIEW_CAPTURER_FRAMERATE_MS = 100;

    private final View view;
    private Handler handler = new Handler(Looper.getMainLooper());
    private VideoCapturer.Listener videoCapturerListener;
    private AtomicBoolean started = new AtomicBoolean(false);

    private final Runnable viewCapturer = new Runnable() {
        @Override
        public void run() {
            boolean dropFrame = view.getWidth() == 0 || view.getHeight() == 0;

            // Only capture the view if the dimensions have been established
            if (!dropFrame) {
                // Draw view into bitmap backed canvas
                int measuredWidth = View.MeasureSpec.makeMeasureSpec(view.getWidth(),
                        View.MeasureSpec.EXACTLY);
                int measuredHeight = View.MeasureSpec.makeMeasureSpec(view.getHeight(),
                        View.MeasureSpec.EXACTLY);
                view.measure(measuredWidth, measuredHeight);
                view.layout(0, 0, view.getMeasuredWidth(), view.getMeasuredHeight());
                Bitmap viewBitmap = Bitmap.createBitmap(view.getWidth(), view.getHeight(),
                        Bitmap.Config.ARGB_8888);
                Canvas viewCanvas = new Canvas(viewBitmap);
                view.draw(viewCanvas);

                // Extract the frame from the bitmap
                int bytes = viewBitmap.getByteCount();
                ByteBuffer buffer = ByteBuffer.allocate(bytes);
                viewBitmap.copyPixelsToBuffer(buffer);
                byte[] array = buffer.array();
                final long captureTimeNs =
                        TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());

                // Create video frame
                VideoDimensions dimensions = new VideoDimensions(view.getWidth(), view.getHeight());
                VideoFrame videoFrame = new VideoFrame(array,
                        dimensions, VideoFrame.RotationAngle.ROTATION_0, captureTimeNs);

                // Notify the listener
                if (started.get()) {
                    videoCapturerListener.onFrameCaptured(videoFrame);
                }
            }

            // Schedule the next capture
            if (started.get()) {
                handler.postDelayed(this, VIEW_CAPTURER_FRAMERATE_MS);
            }
        }
    };

    public ViewCapturer(View view) {
        this.view = view;
    }

    /**
     * Returns the list of supported formats for this view capturer. Currently, only supports
     * capturing to RGBA_8888 bitmaps.
     *
     * @return list of supported formats.
     */
    @Override
    public List<VideoFormat> getSupportedFormats() {
        List<VideoFormat> videoFormats = new ArrayList<>();
        VideoDimensions videoDimensions = new VideoDimensions(view.getWidth(), view.getHeight());
        VideoFormat videoFormat = new VideoFormat(videoDimensions, 30, VideoPixelFormat.RGBA_8888);
        videoFormats.add(videoFormat);
        return videoFormats;
    }

    /**
     * Returns true because we are capturing screen content.
     */
    @Override
    public boolean isScreencast() {
        return true;
    }

    /**
     * This will be invoked when it is time to start capturing frames.
     *
     * @param videoFormat the video format of the frames to be captured.
     * @param listener capturer listener.
     */
    @Override
    public void startCapture(VideoFormat videoFormat, Listener listener) {
        // Store the capturer listener
        this.videoCapturerListener = listener;
        this.started.set(true);

        // Notify capturer API that the capturer has started
        boolean capturerStarted = handler.postDelayed(viewCapturer,
                VIEW_CAPTURER_FRAMERATE_MS);
        this.videoCapturerListener.onCapturerStarted(capturerStarted);
    }

    /**
     * Stop capturing frames. Note that the SDK cannot receive frames once this has been invoked.
     */
    @Override
    public void stopCapture() {
        this.started.set(false);
        handler.removeCallbacks(viewCapturer);
    }
}
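The frame-extraction step above allocates a ByteBuffer of viewBitmap.getByteCount() bytes; for an ARGB_8888 bitmap that is always width × height × 4, since each pixel occupies four bytes (R, G, B, A). A minimal, SDK-free sketch of that size calculation (the dimensions below are just example values, not ones from the post):

```java
public class BufferSizeDemo {
    // For Bitmap.Config.ARGB_8888, each pixel occupies 4 bytes (R, G, B, A),
    // so getByteCount() equals width * height * 4.
    static int rgbaBufferSize(int width, int height) {
        return width * height * 4;
    }

    public static void main(String[] args) {
        // A full-HD portrait frame needs about 8.3 MB per capture
        System.out.println(rgbaBufferSize(1080, 1920)); // 8294400
    }
}
```

This is why capturing at a high frame rate allocates significant memory per tick; reusing a single buffer would be a possible optimization.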
Solution
package com.bitdrome.dionigi.eragle.utils;

import android.graphics.Bitmap;
import android.os.Handler;
import android.os.Looper;
import android.os.SystemClock;
import android.view.PixelCopy;
import android.view.SurfaceView;
import android.view.View;

import com.twilio.video.VideoCapturer;
import com.twilio.video.VideoDimensions;
import com.twilio.video.VideoFormat;
import com.twilio.video.VideoFrame;
import com.twilio.video.VideoPixelFormat;

import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;

/**
 * ViewCapturer demonstrates how to implement a custom {@link VideoCapturer}. This class
 * captures the contents of a provided view and signals the {@link VideoCapturer.Listener} when
 * the frame is available.
 */
public class ViewCapturer implements VideoCapturer, PixelCopy.OnPixelCopyFinishedListener {
    private static int VIEW_CAPTURER_FRAMERATE_MS = 10;

    private final View view;
    private Bitmap viewBitmap;
    private Handler handler = new Handler(Looper.getMainLooper());
    private Handler handlerPixelCopy = new Handler(Looper.getMainLooper());
    private VideoCapturer.Listener videoCapturerListener;
    private AtomicBoolean started = new AtomicBoolean(false);

    public ViewCapturer(View view) {
        this(view, 24);
    }

    public ViewCapturer(View view, int framePerSecond) {
        if (framePerSecond <= 0)
            throw new IllegalArgumentException("framePerSecond must be greater than 0");
        this.view = view;
        float tmp = (1f / framePerSecond) * 1000;
        VIEW_CAPTURER_FRAMERATE_MS = Math.round(tmp);
    }

    private final Runnable viewCapturer = new Runnable() {
        @Override
        public void run() {
            boolean dropFrame = view.getWidth() == 0 || view.getHeight() == 0;

            // Only capture the view if the dimensions have been established
            if (!dropFrame) {
                // Draw view into bitmap backed canvas
                int measuredWidth = View.MeasureSpec.makeMeasureSpec(view.getWidth(),
                        View.MeasureSpec.EXACTLY);
                int measuredHeight = View.MeasureSpec.makeMeasureSpec(view.getHeight(),
                        View.MeasureSpec.EXACTLY);
                view.measure(measuredWidth, measuredHeight);
                view.layout(0, 0, view.getMeasuredWidth(), view.getMeasuredHeight());
                viewBitmap = Bitmap.createBitmap(view.getWidth(), view.getHeight(),
                        Bitmap.Config.ARGB_8888);
                try {
                    PixelCopy.request((SurfaceView) view, viewBitmap, ViewCapturer.this,
                            handlerPixelCopy);
                } catch (IllegalArgumentException e) {
                    // The surface may not be ready yet; skip this frame
                }
            }
        }
    };

    /**
     * Returns the list of supported formats for this view capturer. Currently, only supports
     * capturing to RGBA_8888 bitmaps.
     *
     * @return list of supported formats.
     */
    @Override
    public List<VideoFormat> getSupportedFormats() {
        List<VideoFormat> videoFormats = new ArrayList<>();
        VideoDimensions videoDimensions = new VideoDimensions(view.getWidth(), view.getHeight());
        VideoFormat videoFormat = new VideoFormat(videoDimensions, 30, VideoPixelFormat.RGBA_8888);
        videoFormats.add(videoFormat);
        return videoFormats;
    }

    /**
     * Returns true because we are capturing screen content.
     */
    @Override
    public boolean isScreencast() {
        return true;
    }

    /**
     * This will be invoked when it is time to start capturing frames.
     *
     * @param videoFormat the video format of the frames to be captured.
     * @param listener capturer listener.
     */
    @Override
    public void startCapture(VideoFormat videoFormat, Listener listener) {
        // Store the capturer listener
        this.videoCapturerListener = listener;
        this.started.set(true);

        // Notify capturer API that the capturer has started
        boolean capturerStarted = handler.postDelayed(viewCapturer,
                VIEW_CAPTURER_FRAMERATE_MS);
        this.videoCapturerListener.onCapturerStarted(capturerStarted);
    }

    /**
     * Stop capturing frames. Note that the SDK cannot receive frames once this has been invoked.
     */
    @Override
    public void stopCapture() {
        this.started.set(false);
        handler.removeCallbacks(viewCapturer);
    }

    @Override
    public void onPixelCopyFinished(int i) {
        // Extract the frame from the bitmap
        int bytes = viewBitmap.getByteCount();
        ByteBuffer buffer = ByteBuffer.allocate(bytes);
        viewBitmap.copyPixelsToBuffer(buffer);
        byte[] array = buffer.array();
        final long captureTimeNs = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());

        // Create video frame
        VideoDimensions dimensions = new VideoDimensions(view.getWidth(), view.getHeight());
        VideoFrame videoFrame = new VideoFrame(array,
                dimensions, VideoFrame.RotationAngle.ROTATION_0, captureTimeNs);

        // Notify the listener
        if (started.get()) {
            videoCapturerListener.onFrameCaptured(videoFrame);
        }

        // Schedule the next capture
        if (started.get()) {
            handler.postDelayed(viewCapturer, VIEW_CAPTURER_FRAMERATE_MS);
        }
    }
}
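The two-argument constructor above converts a target frame rate into the delay the Handler waits between captures: the interval in milliseconds is 1000 divided by the frames per second, rounded. A small SDK-free sketch of that same math:

```java
public class FrameIntervalDemo {
    // Mirrors the constructor math: milliseconds between captures for a target fps.
    static int frameIntervalMs(int framePerSecond) {
        if (framePerSecond <= 0)
            throw new IllegalArgumentException("framePerSecond must be greater than 0");
        float tmp = (1f / framePerSecond) * 1000;
        return Math.round(tmp);
    }

    public static void main(String[] args) {
        System.out.println(frameIntervalMs(24)); // 42 ms between frames
        System.out.println(frameIntervalMs(30)); // 33 ms
    }
}
```

So the default of 24 fps schedules a PixelCopy roughly every 42 ms, which is far less aggressive than the hard-coded 10 ms initial value of VIEW_CAPTURER_FRAMERATE_MS.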
Best answer
For anyone who needs to stream ARCore with Twilio Video:
In your ARCore renderer class:
@Override
public void onDrawFrame(GL10 gl) {
    ....
    this.takeLastFrame();
}

private byte[] takeLastFrame() {
    int height = this.mFrameHeight;
    int width = this.mFrameWidth;
    Mat input = new Mat(height, width, CvType.CV_8UC4);
    ByteBuffer buffer = ByteBuffer.allocate(input.rows() * input.cols() * input.channels());
    GLES20.glReadPixels(0, 0, width, height,
            GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buffer);
    input.put(0, 0, buffer.array());
    Core.rotate(input, input, Core.ROTATE_180);
    Core.flip(input, input, 1);
    return convertMatToBytes(input);
}

private byte[] convertMatToBytes(Mat image) {
    int bufferSize = image.channels() * image.cols() * image.rows();
    byte[] b = new byte[bufferSize];
    image.get(0, 0, b);
    return b;
}
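The rotate-180 followed by a horizontal mirror in takeLastFrame() is needed because glReadPixels returns pixel rows bottom-up, and the combination of those two OpenCV operations amounts to a vertical flip. A dependency-free sketch of that equivalent vertical flip on a raw RGBA byte array (not part of the original answer, which uses OpenCV's Mat instead):

```java
public class FlipDemo {
    // glReadPixels delivers rows bottom-up; swapping rows end-for-end restores
    // top-down order. Each row is width * 4 bytes for RGBA data.
    static byte[] flipVertical(byte[] rgba, int width, int height) {
        byte[] out = new byte[rgba.length];
        int rowBytes = width * 4;
        for (int y = 0; y < height; y++) {
            System.arraycopy(rgba, y * rowBytes, out, (height - 1 - y) * rowBytes, rowBytes);
        }
        return out;
    }

    public static void main(String[] args) {
        // 1x2 image: row of 1s arrives first (bottom), row of 2s second (top).
        byte[] fromGl = {1, 1, 1, 1, 2, 2, 2, 2};
        byte[] fixed = flipVertical(fromGl, 1, 2);
        System.out.println(fixed[0]); // 2: the top row now comes first
    }
}
```

Doing the flip on the byte array directly would avoid the OpenCV dependency, at the cost of the convenience Mat provides.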
In your custom capturer class:
byte[] array = view.takeLastFrame();
if (array != null && array.length > 0) {
    final long captureTimeNs = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());

    // Create video frame
    VideoDimensions dimensions = new VideoDimensions(view.getFrameWidth(), view.getFrameHeight());
    VideoFrame videoFrame = new VideoFrame(array,
            dimensions, VideoFrame.RotationAngle.ROTATION_0, captureTimeNs);

    // Notify the listener
    if (started.get()) {
        videoCapturerListener.onFrameCaptured(videoFrame);
    }
}
Regarding "android - Streaming a custom View with ARCore using Twilio Video", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/52439822/