
flutter - Rendering Flutter animations directly to video


Considering that Flutter uses its own graphics engine, is there a way to render Flutter animations directly to video, or to capture them as frame-by-frame screenshots?

One use case is that this would make it much easier to produce demos for an audience.

For example, an author wants to create a tutorial on Flutter animations in which they build a demo app and write an accompanying blog post, using animated GIFs/videos rendered directly from Flutter.

Another example: a developer outside the UI team notices a subtle bug in a complex animation. Without having to actually learn the animation code, they could render the animation to video, annotate the clip in an editor, and send it to the UI team for diagnosis.

Best Answer

It's not pretty, but I have managed to get a prototype working.
First, all of the animations need to be backed by one master animation controller, so that we can step to any part of the animation we want. Second, the widget tree we want to record has to be wrapped in a RepaintBoundary with a global key. The RepaintBoundary and its key can then produce a snapshot of the widget tree, as shown in the capture function after the setup sketch below.
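As a minimal sketch of that setup (the rotating FlutterLogo demo, the widget name, and the exact wiring of renderBoxKey and animationController are illustrative assumptions, not taken from the original answer):

import 'dart:math' as math;

import 'package:flutter/material.dart';

// Illustrative only: a RepaintBoundary with a global key wrapping a subtree
// whose animation is driven by a single master AnimationController.
class RecordableAnimation extends StatefulWidget {
  const RecordableAnimation({Key? key}) : super(key: key);

  @override
  State<RecordableAnimation> createState() => _RecordableAnimationState();
}

class _RecordableAnimationState extends State<RecordableAnimation>
    with SingleTickerProviderStateMixin {
  // Global key used later to look up the RepaintBoundary's render object.
  final GlobalKey renderBoxKey = GlobalKey();

  // The single master controller that every animation to be recorded hangs off of.
  late final AnimationController animationController =
      AnimationController(vsync: this, duration: const Duration(seconds: 2));

  @override
  void dispose() {
    animationController.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return RepaintBoundary(
      key: renderBoxKey,
      child: AnimatedBuilder(
        animation: animationController,
        builder: (context, _) => Transform.rotate(
          angle: animationController.value * 2 * math.pi,
          child: const FlutterLogo(size: 150),
        ),
      ),
    );
  }
}

With a tree like that in place, the snapshot can be taken like so: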

import 'dart:typed_data';
import 'dart:ui' as ui;

import 'package:flutter/rendering.dart';

Future<Uint8List> _capturePngToUint8List() async {
  // renderBoxKey is the global key of my RepaintBoundary
  RenderRepaintBoundary boundary = renderBoxKey.currentContext.findRenderObject();

  // pixelRatio allows you to render it at a higher resolution than the actual widget in the application.
  ui.Image image = await boundary.toImage(pixelRatio: 2.0);
  ByteData byteData = await image.toByteData(format: ui.ImageByteFormat.png);
  Uint8List pngBytes = byteData.buffer.asUint8List();

  return pngBytes;
}
The method above can then be used in a loop that captures the widget tree into pngBytes and steps the animation controller forward by a deltaT determined by your desired frame rate:
double t = 0;
int i = 1;

setState(() {
  animationController.value = 0.0;
});

Map<int, Uint8List> frames = {};
// Fraction of the animation covered by one frame at 60 fps,
// e.g. a 2-second animation gives dt = (1/60) / 2 ≈ 0.0083, i.e. ~120 frames.
double dt = (1 / 60) / animationController.duration.inSeconds.toDouble();

while (t <= 1.0) {
  print("Rendering... ${t * 100}%");
  var bytes = await _capturePngToUint8List();
  frames[i] = bytes;

  t += dt;
  setState(() {
    animationController.value = t;
  });
  i++;
}
Finally, all of these png frames can be piped into an ffmpeg subprocess to write out the video.
I haven't managed to get that part working well yet (update: scroll down for the solution), so what I did instead was write all of the png frames out to actual png files, and then run ffmpeg manually inside the folder they were written to. (Note: I am using Flutter desktop, so I can access the ffmpeg installed on my machine, but there is a package on pub.dev to get ffmpeg on mobile too.)
List<Future<File>> fileWriterFutures = [];

frames.forEach((key, value) {
  fileWriterFutures.add(_writeFile(bytes: value, location: r"D:\path\to\my\images\folder\" + "frame_$key.png"));
});

await Future.wait(fileWriterFutures);

_runFFmpeg();
Here is my file writer helper function:
Future<File> _writeFile({@required String location, @required Uint8List bytes}) async {
  File file = File(location);
  return file.writeAsBytes(bytes);
}
And here is my FFmpeg runner function:
// Uses Process from dart:io.
void _runFFmpeg() async {
  // ffmpeg -y -r 60 -start_number 1 -i frame_%d.png -c:v libx264 -preset medium -tune animation -pix_fmt yuv420p test.mp4
  var process = await Process.start(
      "ffmpeg",
      [
        "-y", // replace output file if it already exists
        "-r", "60", // framerate
        "-start_number", "1",
        "-i", r"./test/frame_%d.png", // <- Change to location of images
        "-an", // don't expect audio
        "-c:v", "libx264rgb", // H.264 encoding
        "-preset", "medium",
        "-crf", "10", // Ranges from 0 (lossless) to 51 (worst compression). Sane options are 0-30
        "-tune", "animation",
        "-pix_fmt", "yuv420p",
        r"./test/test.mp4" // <- Change to location of output
      ],
      mode: ProcessStartMode.inheritStdio // This mode causes some issues at times, so just remove it if it doesn't work. I use it mostly to debug the ffmpeg process' output
  );

  print("Done Rendering");
}
Update:

Since posting this answer, I have figured out how to pipe the images directly into ffmpeg without needing to write all the files out first. The following is the updated render function, taken from one of my widgets. A few variables exist in the widget's surrounding context, but I hope their values can be inferred from that context:
void render([double? pixelRatio]) async {
  // If already rendering, return
  if (isRendering) return;

  String outputFileLocation = "final.mp4";

  setState(() {
    isRendering = true;
  });

  timeline.stop();

  await timeline.animateTo(0.0, duration: const Duration(milliseconds: 700), curve: Curves.easeInOutQuad);
  setState(() {
    timeline.value = 0.0;
  });

  await Future.delayed(const Duration(milliseconds: 100));

  try {
    int width = canvasSize.width.toInt();
    int height = canvasSize.height.toInt();
    int frameRate = 60;
    int numberOfFrames = frameRate * (timeline.duration!.inSeconds);

    print("starting ffmpeg..");
    var process = await Process.start(
        "ffmpeg",
        [
          "-y", // replace output file if it already exists
          // "-f", "rawvideo",
          // "-pix_fmt", "rgba",
          "-s", "${width}x$height", // size
          "-r", "$frameRate", // framerate
          "-i", "-",
          "-frames", "$numberOfFrames",
          "-an", // don't expect audio
          "-c:v", "libx264rgb", // H.264 encoding
          "-preset", "medium",
          "-crf", "10", // Ranges from 0 (lossless) to 51 (worst compression). Sane options are 0-30
          "-tune", "animation",
          "-pix_fmt", "yuv420p",
          "-vf", "pad=ceil(iw/2)*2:ceil(ih/2)*2", // ensure width and height are divisible by 2
          outputFileLocation
        ],
        mode: ProcessStartMode.detachedWithStdio,
        runInShell: true);

    print("writing to ffmpeg...");
    RenderRepaintBoundary boundary = paintKey.currentContext!.findRenderObject()! as RenderRepaintBoundary;

    pixelRatio = pixelRatio ?? 1.0;
    print("Pixel Ratio: $pixelRatio");

    for (int i = 0; i <= numberOfFrames; i++) {
      Timeline.startSync("Render Video Frame");
      double t = (i.toDouble() / numberOfFrames.toDouble());
      // await timeline.animateTo(t, duration: Duration.zero);
      timeline.value = t;

      ui.Image image = await boundary.toImage(pixelRatio: pixelRatio);
      ByteData? rawData = await image.toByteData(format: ui.ImageByteFormat.png);
      var rawIntList = rawData!.buffer.asInt8List().toList();
      Timeline.finishSync();

      if (i % frameRate == 0) {
        print("${((t * 100.0) * 100).round() / 100}%");
      }

      process.stdin.add(rawIntList);

      image.dispose();
    }
    await process.stdin.flush();

    print("stopping ffmpeg...");
    await process.stdin.close();
    process.kill();
    print("done!");
  } catch (e) {
    print(e);
  } finally {
    await timeline.animateTo(beforeValue, duration: const Duration(milliseconds: 500), curve: Curves.easeInOutQuad);
    setState(() {
      isRendering = false;
    });
  }
}
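As a small usage illustration (not part of the original answer; the button wiring and the 2.0 pixel ratio are assumptions), the render function above could be triggered from the recording widget's UI like this:

// Hypothetical trigger for the export, e.g. somewhere in the recording widget's build method:
ElevatedButton(
  onPressed: isRendering ? null : () => render(2.0), // render at 2x the widget's resolution
  child: Text(isRendering ? "Rendering..." : "Export video"),
)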

Regarding flutter - rendering Flutter animations directly to video, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/52274511/
