I am trying to mix several RTP H.264 payload video streams into a single 15 FPS video stream.
Working pipeline that mixes two video streams on top of a 15 FPS videotestsrc pattern:
VIDEO_CAPS="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264"
gst-launch -vvvve videomixer2 name=mix ! ffmpegcolorspace ! xvimagesink \
udpsrc caps=$VIDEO_CAPS port=3030 ! .recv_rtp_sink_0 gstrtpbin ! rtph264depay ! ffdec_h264 ! videoscale ! video/x-raw-yuv , width=176, height=144 ! videobox top=0 left=0 ! video/x-raw-yuv,format=\(fourcc\)AYUV ! ffmpegcolorspace ! mix. \
udpsrc caps=$VIDEO_CAPS port=6666 ! .recv_rtp_sink_0 gstrtpbin ! rtph264depay ! ffdec_h264 ! videoscale ! video/x-raw-yuv , width=176, height=144 ! videobox top=0 left=-178 ! video/x-raw-yuv,format=\(fourcc\)AYUV ! ffmpegcolorspace ! mix. \
videotestsrc ! video/x-raw-yuv, framerate=15/1, width=640, height=360 ! mix.
The pipeline above has a strange problem: when mixing is attempted, the first video source goes blank (I am sure the first source is still streaming, it simply does not show up); only the second video stream is displayed. The verbose output is below.
Verbose output of the pipeline above:
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, color-matrix=(string)sdtv
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, color-matrix=(string)sdtv
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, color-matrix=(string)sdtv
/GstPipeline:pipeline0/GstVideoMixer2:mix.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstVideoMixer2:mix.GstVideoMixer2Pad:sink_0: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, color-matrix=(string)sdtv
/GstPipeline:pipeline0/GstRtpBin:rtpbin0/GstRtpSession:rtpsession0.GstPad:recv_rtp_sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpBin:rtpbin0.GstGhostPad:recv_rtp_sink_0: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpBin:rtpbin0.GstGhostPad:recv_rtp_sink_0.GstProxyPad:proxypad0: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpBin:rtpbin0/GstRtpSession:rtpsession0.GstPad:recv_rtp_src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpBin:rtpbin0/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpBin:rtpbin0/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpBin:rtpbin0/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpBin:rtpbin1/GstRtpSession:rtpsession1.GstPad:recv_rtp_sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpBin:rtpbin1.GstGhostPad:recv_rtp_sink_0: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpBin:rtpbin1.GstGhostPad:recv_rtp_sink_0.GstProxyPad:proxypad1: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpBin:rtpbin1/GstRtpSession:rtpsession1.GstPad:recv_rtp_src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpBin:rtpbin1/GstRtpSsrcDemux:rtpssrcdemux1.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpBin:rtpbin1/GstRtpJitterBuffer:rtpjitterbuffer1.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpBin:rtpbin1/GstRtpJitterBuffer:rtpjitterbuffer1.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpBin:rtpbin0/GstRtpPtDemux:rtpptdemux0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)99
/GstPipeline:pipeline0/GstRtpBin:rtpbin0.GstGhostPad:recv_rtp_src_0_4186542290_99.GstProxyPad:proxypad2: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)99
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal
/GstPipeline:pipeline0/GstRtpBin:rtpbin1/GstRtpPtDemux:rtpptdemux1.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay1.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay1.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)99
/GstPipeline:pipeline0/GstRtpBin:rtpbin1.GstGhostPad:recv_rtp_src_0_4186622237_99.GstProxyPad:proxypad3: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)99
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2641.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:src: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
/GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:src: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
/GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:sink: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
/GstPipeline:pipeline0/GstVideoBox:videobox0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)176, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
/GstPipeline:pipeline0/GstVideoBox:videobox0.GstPad:sink: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
/GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)176, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
/GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)176, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp1.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)176, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp1.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)176, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
/GstPipeline:pipeline0/GstVideoMixer2:mix.GstVideoMixer2Pad:sink_1: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)176, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2641.GstPad:src: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
/GstPipeline:pipeline0/GstVideoScale:videoscale1.GstPad:src: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
/GstPipeline:pipeline0/GstVideoScale:videoscale1.GstPad:sink: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
/GstPipeline:pipeline0/GstCapsFilter:capsfilter3.GstPad:src: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
/GstPipeline:pipeline0/GstCapsFilter:capsfilter3.GstPad:sink: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
/GstPipeline:pipeline0/GstVideoBox:videobox1.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)354, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
/GstPipeline:pipeline0/GstVideoBox:videobox1.GstPad:sink: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
/GstPipeline:pipeline0/GstCapsFilter:capsfilter4.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)354, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
/GstPipeline:pipeline0/GstCapsFilter:capsfilter4.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)354, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp2.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)354, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp2.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)354, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
/GstPipeline:pipeline0/GstVideoMixer2:mix.GstVideoMixer2Pad:sink_2: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)354, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstXvImageSink:xvimagesink0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1
WARNING: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2875): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
There may be a timestamping problem, or this computer is too slow.
I strongly suspect it is related to videobox.
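To check whether videobox alone causes this, independent of the RTP and decoding path, the two udpsrc branches can be replaced by test sources. A minimal local reproduction sketch, assuming only stock GStreamer 0.10 elements; the snow and ball patterns and the 176x144 sizes are arbitrary stand-ins for the two RTP streams:

# Reproduction sketch: same layout as above, but with videotestsrc branches
# in place of the RTP receivers, so the videobox behaviour can be observed on its own.
gst-launch -v videomixer2 name=mix ! ffmpegcolorspace ! xvimagesink \
videotestsrc pattern=snow ! video/x-raw-yuv, width=176, height=144, framerate=15/1 ! videobox top=0 left=0 ! video/x-raw-yuv,format=\(fourcc\)AYUV ! ffmpegcolorspace ! mix. \
videotestsrc pattern=ball ! video/x-raw-yuv, width=176, height=144, framerate=15/1 ! videobox top=0 left=-176 ! video/x-raw-yuv,format=\(fourcc\)AYUV ! ffmpegcolorspace ! mix. \
videotestsrc ! video/x-raw-yuv, framerate=15/1, width=640, height=360 ! mix.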
Best answer
(Answered in an edit to the question. Converted to a community wiki answer. See What is the appropriate action when the answer to a question is added to the question itself?)
The OP wrote:
As I said, the issue was with videobox; we have to add border-alpha=0 to videobox, or an alpha channel.
Fully functional pipeline: accepts three RTP H.264 video streams and mixes them over a video test pattern.
gst-launch -vvvve videomixer2 name=mix ! ffmpegcolorspace ! xvimagesink sync=false \
udpsrc caps=$VIDEO_CAPS port=3030 ! .recv_rtp_sink_0 gstrtpbin ! rtph264depay ! ffdec_h264 ! videoscale ! video/x-raw-yuv , width=176, height=144 ! videobox border-alpha=0 top=0 left=0 ! video/x-raw-yuv,format=\(fourcc\)AYUV ! ffmpegcolorspace ! mix. \
udpsrc caps=$VIDEO_CAPS port=6666 ! .recv_rtp_sink_0 gstrtpbin ! rtph264depay ! ffdec_h264 ! videoscale ! video/x-raw-yuv , width=176, height=144 ! videobox border-alpha=0 top=0 left=-176 ! video/x-raw-yuv,format=\(fourcc\)AYUV ! ffmpegcolorspace ! mix. \
udpsrc caps=$VIDEO_CAPS port=2222 ! .recv_rtp_sink_0 gstrtpbin ! rtph264depay ! ffdec_h264 ! videoscale ! video/x-raw-yuv , width=176, height=144 ! videobox border-alpha=0 top=-144 left=0 ! video/x-raw-yuv,format=\(fourcc\)AYUV ! mix. \
videotestsrc ! video/x-raw-yuv, framerate=15/1, width=352, height=288 ! mix.
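For testing this receiver pipeline without live cameras, a hypothetical local sender can feed each port. This is only a sketch, assuming x264enc and rtph264pay are installed; pt=99 matches the payload type seen in the verbose output above, and the host and port values are placeholders:

# Hypothetical test sender: run one instance per receiver port (3030, 6666, 2222),
# adjusting port= accordingly.
gst-launch videotestsrc ! video/x-raw-yuv, width=176, height=144, framerate=15/1 \
! x264enc byte-stream=true ! rtph264pay pt=99 \
! udpsink host=127.0.0.1 port=3030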
About "ffmpeg - GStreamer RTP video mixer: found a working pipeline, but it needs improvement", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/14117427/