
java - GLSL shaders with Processing Pi on a Pi 3 B+


This is my first time porting a GLSL shader from Processing on OSX to Processing Pi running on a Raspberry Pi 3 B+. I have a very basic shader that dissolves between two playing videos. It runs perfectly on my Mac, but once it's ported to Processing Pi and updated to use the GLVideo video library, it fails.

The shader was originally converted from a ShaderToy post, but I rewrote it as plain GLSL to make sure there were no compatibility issues hiding there. I've looked around and can't find anything specific that I think could be causing this, so any references, pointers or help would be greatly appreciated.

I've tried a few other things: I made the videos smaller, bumped the Pi's GPU memory up to 256 MB, and so on. I made sure the sketch still runs on OSX, but on the Raspberry Pi 3 B+ it just shows an empty white screen.
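(In case it's relevant, this is the kind of GPU memory setting I mean; it lives in /boot/config.txt on the Pi and can also be changed through raspi-config.)

# /boot/config.txt: raise the GPU memory split to 256 MB
gpu_mem=256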

I wondered whether the Pi handles GLSL differently, or whether there is a limit on sampler2D textures on the Pi's GPU? More on the Processing side, it crossed my mind that maybe setting a Processing PGraphics as a sampler2D texture isn't supported in shaders on Processing Pi, or that something goes wrong with the GLVideo image when it gets bound as a texture. It could also be that I'm confusing how fragment and color shaders work; at the moment I believe I'm using a Processing color shader.

The only output in the console is:

Final caps: video/x-raw(memory:GLMemory), format=(string)RGBA, width=(int)640, height=(int)360, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, texture-target=(string)2D
Final caps: video/x-raw(memory:GLMemory), format=(string)RGBA, width=(int)640, height=(int)360, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, texture-target=(string)2D

shaderDisolveGLSL.pde

//import processing.video.*;
import gohai.glvideo.*;

PShader mixShader;

PGraphics pg;
PGraphics pg2;

//Movie movie;
//Movie movie2;

GLMovie movie;
GLMovie movie2;

void setup() {
  size(640, 360, P2D);
  noSmooth();
  pg = createGraphics(640, 360, P2D);

  //movie = new Movie(this, "_sm/LabspaceDawnv1blur2.mp4");
  movie = new GLMovie(this, "_sm/LabspaceDawnv1blur2.mp4");
  movie.loop();

  //movie2 = new Movie(this, "_sm/LabspaceFireblur2.mp4");
  movie2 = new GLMovie(this, "_sm/LabspaceFireblur2.mp4");
  movie2.loop();

  pg = createGraphics(width, height, P2D);
  pg2 = createGraphics(width, height, P2D);

  mixShader = loadShader("fadeshader.glsl");
  mixShader.set("iResolution", float(width), float(height));
  mixShader.set("iTime", millis()/1000.);

  mixShader.set("iChannel0", pg);
  mixShader.set("iChannel1", pg2);
}

//void movieEvent(Movie m) {
void movieEvent(GLMovie m) {
  m.read();
  redraw();
}

void draw() {
  pg.beginDraw();
  pg.image(movie, 0, 0, width, height);
  pg.endDraw();

  pg2.beginDraw();
  pg2.image(movie2, 0, 0, width, height);
  pg2.endDraw();

  shader(mixShader);
  rect(0, 0, width, height);
}

fadeshader.glsl

// Type of shader expected by Processing
#define PROCESSING_COLOR_SHADER

uniform float iTime;
uniform sampler2D iChannel0;
uniform sampler2D iChannel1;
uniform vec2 iResolution;

void main() {
  vec2 uv = gl_FragCoord.xy / iResolution.xy;
  vec4 mixColor = vec4(0.0);
  vec4 color0 = vec4(uv.x, uv.y, 0.0, 1.0);
  vec4 color1 = vec4(uv.x, uv.y, 0.0, 1.0);

  color0 = texture2D(iChannel0, uv);
  color1 = texture2D(iChannel1, uv);

  float duration = 10.0;
  float t = mod(float(iTime), duration) / duration;

  mixColor = mix(color0, color1, t);
  gl_FragColor = mixColor;
}

In case anyone is curious, I've uploaded a new version of the example sketch with smaller videos: https://www.dropbox.com/sh/fu2plxmqhf7shtp/AADxqmW9zf73EsdzworCb5ECa?dl=0

Any suggestions or ideas on what might be going on, or where to start debugging further, would be greatly appreciated.

Thanks!

Best answer

I'm not 100% sure, but the error may be related to how the videos are encoded and what the GLVideo library (which relies on gstreamer) can decode on the Raspberry Pi.

I've already run into errors on OSX on an older system: the sketch froze on a grey screen for a few seconds and then crashed without any warnings or errors.

I'd recommend re-encoding the videos, dropping the audio channel if you don't need it, and using the same or a similar H.264 encoding as the video that ships with the Processing video library (e.g. Examples > Libraries > Video > Movie > Loop).
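As a rough sketch (the exact flags and the output file name are my assumption; adjust them to your ffmpeg build), a re-encode that drops the audio and targets H.264 Main profile with yuv420p, like the working transit.mov further down, could look like this:

ffmpeg -i LabspaceDawnv1blur2.mp4 -an -c:v libx264 -profile:v main -pix_fmt yuv420p LabspaceDawnv1blur2-h264main.mp4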

ffprobe -i /Users/George/Desktop/shaderDisolveGLSL/data/_sm/LabspaceDawn\ v1\ blur\ 2.mp4 
ffprobe version 3.3.3 Copyright (c) 2007-2017 the FFmpeg developers
built with Apple LLVM version 7.0.0 (clang-700.0.72)
configuration: --prefix=/usr/local/Cellar/ffmpeg/3.3.3 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-libmp3lame --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid --enable-opencl --disable-lzma --enable-vda
libavutil 55. 58.100 / 55. 58.100
libavcodec 57. 89.100 / 57. 89.100
libavformat 57. 71.100 / 57. 71.100
libavdevice 57. 6.100 / 57. 6.100
libavfilter 6. 82.100 / 6. 82.100
libavresample 3. 5. 0 / 3. 5. 0
libswscale 4. 6.100 / 4. 6.100
libswresample 2. 7.100 / 2. 7.100
libpostproc 54. 5.100 / 54. 5.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/Users/George/Desktop/shaderDisolveGLSL/data/_sm/LabspaceDawn v1 blur 2.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 0
    compatible_brands: isommp42
    creation_time   : 2020-03-15T14:03:48.000000Z
  Duration: 00:00:49.09, start: 0.000000, bitrate: 218 kb/s
    Stream #0:0(und): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p(tv, smpte170m/bt470bg/bt709), 640x360 [SAR 1:1 DAR 16:9], 119 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)
    Metadata:
      creation_time   : 2020-03-15T14:03:48.000000Z
      handler_name    : ISO Media file produced by Google Inc. Created on: 03/15/2020.
    Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 95 kb/s (default)
    Metadata:
      creation_time   : 2020-03-15T14:03:48.000000Z
      handler_name    : ISO Media file produced by Google Inc. Created on: 03/15/2020.

Transit video details:

ffprobe -i /Users/George/Desktop/shaderDisolveGLSL/data/transit.mov 
ffprobe version 3.3.3 Copyright (c) 2007-2017 the FFmpeg developers
built with Apple LLVM version 7.0.0 (clang-700.0.72)
configuration: --prefix=/usr/local/Cellar/ffmpeg/3.3.3 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-libmp3lame --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid --enable-opencl --disable-lzma --enable-vda
libavutil 55. 58.100 / 55. 58.100
libavcodec 57. 89.100 / 57. 89.100
libavformat 57. 71.100 / 57. 71.100
libavdevice 57. 6.100 / 57. 6.100
libavfilter 6. 82.100 / 6. 82.100
libavresample 3. 5. 0 / 3. 5. 0
libswscale 4. 6.100 / 4. 6.100
libswresample 2. 7.100 / 2. 7.100
libpostproc 54. 5.100 / 54. 5.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/Users/George/Desktop/shaderDisolveGLSL/data/transit.mov':
  Metadata:
    major_brand     : qt
    minor_version   : 537199360
    compatible_brands: qt
    creation_time   : 2012-08-31T20:17:39.000000Z
  Duration: 00:00:12.38, start: 0.000000, bitrate: 731 kb/s
    Stream #0:0(eng): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv, smpte170m/smpte170m/bt709), 640x360, 727 kb/s, 29.97 fps, 29.97 tbr, 600 tbn, 1200 tbc (default)
    Metadata:
      creation_time   : 2012-08-31T20:17:44.000000Z
      handler_name    : Apple Alias Data Handler
      encoder         : H.264

What seems to work:

Stream #0:0(eng): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv, smpte170m/smpte170m/bt709), 640x360, 727 kb/s, 29.97 fps, 29.97 tbr, 600 tbn, 1200 tbc (default)

What seems to crash:

Stream #0:0(und): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p(tv, smpte170m/bt470bg/bt709), 640x360 [SAR 1:1 DAR 16:9], 119 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)

I also ran into some other intermittent JOGL errors when using movieEvent: sneaking the .read() calls into draw() seems to solve the problem.

Here is a tweaked version of the code that runs for me on OSX:

import processing.video.*;
//import gohai.glvideo.*;

PShader mixShader;

PGraphics pg;
PGraphics pg2;

Movie movie;
Movie movie2;

//GLMovie movie;
//GLMovie movie2;

void setup() {
  size(640, 360, P2D);
  noSmooth();
  noStroke();

  //movie = new Movie(this, "_sm/LabspaceDawnv1blur2.mp4");
  movie = new Movie(this, "transit.mov");
  //movie = new GLMovie(this, "_sm/LabspaceDawnv1blur2.mp4");
  movie.loop();

  //movie2 = new Movie(this, "_sm/LabspaceFireblur2.mp4");
  movie2 = new Movie(this, "transit2.mov");
  //movie2 = new GLMovie(this, "_sm/LabspaceFireblur2.mp4");
  movie2.loop();

  pg = createGraphics(width, height, P2D);
  pg2 = createGraphics(width, height, P2D);

  mixShader = loadShader("fadeshader.glsl");
  mixShader.set("iResolution", float(width), float(height));
  mixShader.set("iChannel0", pg);
  mixShader.set("iChannel1", pg2);
}

//void movieEvent(Movie m) {
//void movieEvent(GLMovie m) {
//  m.read();
//  redraw();
//}

void draw() {
  if (movie.available())  { movie.read(); }
  if (movie2.available()) { movie2.read(); }

  pg.beginDraw();
  // for testing only since both movies are the same
  movie.filter(GRAY);
  pg.image(movie, 0, 0, width, height);
  pg.endDraw();

  pg2.beginDraw();
  pg2.image(movie2, 0, 0, width, height);
  pg2.endDraw();

  // don't forget to update time
  mixShader.set("iTime", millis() * 0.01);

  shader(mixShader);
  rect(0, 0, width, height);
}

Hopefully this works with the transit mov on the Raspberry Pi so you can test the codec. Once it runs smoothly, re-encode your videos (HandBrake may help) and try again.

@jshaw3 I made the changes to test on an RPI3. GLVideo does seem to have issues. If you don't need audio, you may be able to use image sequences instead: the Image Sequence Player library can make that easier. Bear in mind this should initialise in under 5 seconds to avoid a P3D/GL timeout (otherwise defer loading to the first frame):

import com.hirschandmann.image.*;

PShader mixShader;

PGraphics pg;
PGraphics pg2;

ISPlayer movie1;
ISPlayer movie2;

boolean loadTriggered = false;

void setup() {
  size(640, 360, P2D);
  noSmooth();
  pg = createGraphics(640, 360, P2D);

  pg = createGraphics(width, height, P2D);
  pg2 = createGraphics(width, height, P2D);

  mixShader = loadShader("fadeshader.glsl");
  mixShader.set("iResolution", float(width), float(height));
  mixShader.set("iTime", millis()/1000.);

  mixShader.set("iChannel0", pg);
  mixShader.set("iChannel1", pg2);
}

void draw() {
  if (!loadTriggered) {
    movie1 = new ISPlayer(this, dataPath("_sm/LabspaceDawnv1blur2Frames"));
    movie1.loop();

    movie2 = new ISPlayer(this, dataPath("_sm/LabspaceFireblur2Frames"));
    movie2.loop();

    loadTriggered = true;
  }

  pg.beginDraw();
  if (movie1 != null) pg.image(movie1, 0, 0, width, height);
  pg.endDraw();

  pg2.beginDraw();
  if (movie2 != null) pg2.image(movie2, 0, 0, width, height);
  pg2.endDraw();

  shader(mixShader);
  rect(0, 0, width, height);
}

Note that the above assumes you've converted the .mp4 files to image sequences (e.g. LabspaceDawnv1blur2.mp4 -> LabspaceDawnv1blur2Frames). Here is an ffmpeg example (use an fps value that matches your source frame rate, 25 fps in this case):

ffmpeg -i LabspaceDawnv1blur2.mp4 -vf fps=25 LabspaceDawnv1blur2Frames/frame_%04d.png

Regarding java - GLSL shaders with Processing Pi on a Pi 3 B+, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/60758549/
