node.js - NodeJS: How would I compress a stream before uploading to S3?


Currently, I upload image files from my mobile app to my server, and then from my server to S3. Like this:

updateProgressPics: combineResolvers(
  isAuthenticated,
  async (parent, { pics }, { models, currentUser }) => {
    // Returns an array of promises; the GraphQL server resolves them.
    return pics.map(async (pic) => {
      const { file, filename, progressPicId } = pic;
      // graphql-upload resolves `file` to an object exposing createReadStream.
      const { createReadStream } = await file;
      const stream = createReadStream();

      const { Location: url, Key: key, Bucket: bucket } = await S3.upload({
        stream,
        filename,
        folder: currentUser.id,
      });

      return models.ProgressPic.findOneAndUpdate(
        { _id: progressPicId },
        { $set: { url, key, bucket } },
        { new: true, useFindAndModify: false }
      );
    });
  }
),

My S3 file:

import AWS from "aws-sdk";

import { AWS_CONFIG, S3_BUCKET } from "../../config";

AWS.config.update(AWS_CONFIG);

const s3 = new AWS.S3();

export const upload = async ({ folder, filename, stream }) => {
  const params = {
    Bucket: S3_BUCKET,
    Key: `${folder}/${filename}`,
    Body: stream, // s3.upload accepts a readable stream as the Body
  };

  // Multipart upload tuning: 10 MB parts, one part in flight at a time.
  const options = { partSize: 10 * 1024 * 1024, queueSize: 1 };

  return s3.upload(params, options).promise();
};

export default {
  upload,
};

I'm using graphql-upload, which exposes the createReadStream function: https://github.com/jaydenseric/graphql-upload#readme

The image files are somewhat large (~3 MB), and I feel they take too long to load on the device even with CloudFront in front. I'd like to compress them before uploading, both to save storage and for performance. Is there a way to do this specifically with streams?

Best Answer

I think in this case a library called sharp (https://github.com/lovell/sharp) will be your friend. It has broad streaming support across all of its operations, so you should be able to use it to handle your compression step.
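
As a rough sketch of how that could look (not the asker's code): a sharp pipeline created with no input is itself a Node.js duplex stream, so the upload stream can be piped through it before being handed to S3. The compress helper name, the width cap, and the JPEG quality below are illustrative assumptions, not anything from the original post:

import sharp from "sharp";

// Hypothetical helper: pipes an incoming readable stream through a sharp
// pipeline that downscales and recompresses the image before it reaches S3.
export const compress = (stream) => {
  const transformer = sharp()
    .resize({ width: 1080, withoutEnlargement: true }) // assumed size cap
    .jpeg({ quality: 80 }); // assumed JPEG output at quality 80

  // sharp() with no input argument is a duplex stream, so pipe() works here.
  return stream.pipe(transformer);
};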

Regarding performance, the library claims:

Resizing an image is typically 4x-5x faster than using the quickest ImageMagick and GraphicsMagick settings due to its use of libvips.
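
If that approach fits your case, the only change in the resolver above would be wrapping the stream before passing it to S3.upload (again using the hypothetical compress helper from the sketch, so treat it as an outline rather than a drop-in fix):

const { createReadStream } = await file;
const stream = compress(createReadStream()); // compress in transit, no temp file

Since s3.upload accepts any readable stream as Body, the S3 module itself would not need to change.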

Regarding "node.js - NodeJS: How would I compress a stream before uploading to S3?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/58148139/
