
javascript - Chunked file upload produces a different file size / corrupted file

Reposted · Author: 行者123 · Updated: 2023-12-01 03:13:09

I am using jquery-file-upload with Python-Flask on the server side. Whenever I upload a large file over 100 MB, the uploaded copy is slightly larger than the original and cannot be opened (corrupted). I enabled chunking with 10 MB chunks for large files, I tried setting 'disableImageResize' to 'true', and I tried both single and multiple files, with the same result. Am I missing something in my code?

main.js

$(function () {
    'use strict';
    // Initialize the jQuery File Upload widget:
    $('#fileupload').fileupload({
        // Uncomment the following to send cross-domain cookies:
        //xhrFields: {withCredentials: true},
        url: 'rs_upload',
        disableImageResize: true,
        sequentialUploads: true,
        // redirect: 'home',
        maxChunkSize: 10000000, // 10 MB
        done: function (e, data) {
            console.log("uploaded: " + data.files[0].name);
        }
    }).bind('fileuploadstop', function (e, data) {
        if (data.loaded == data.total) {
            window.location.replace("rs_create");
        }
    });
});
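With maxChunkSize set, jQuery File Upload splits the file into sequential POSTs and labels each one with a Content-Range header, which the server must parse to know where the chunk belongs. A minimal sketch of that parsing (parse_content_range is a hypothetical helper, and the header value below is illustrative, assuming a 26214400-byte file):

```python
def parse_content_range(value):
    """Parse 'bytes <start>-<end>/<total>' into (start, end, total) ints."""
    span, total = value.split(' ')[1].split('/')
    start, end = span.split('-')
    return int(start), int(end), int(total)

# Second chunk of a 25 MB upload with 10 MB chunks:
start, end, total = parse_content_range('bytes 10000000-19999999/26214400')
print(start, end, total)   # 10000000 19999999 26214400
print(end + 1 == total)    # False: this is not the final chunk
```

Because the end byte and total size are both in the header, the server can also tell when the last chunk has arrived (end + 1 == total).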

View.py

@app.route("/rs_upload", methods=["GET", "POST"])
def rs_upload():
    if request.method == 'POST':
        files = request.files['file']
        fs = files
        handle_file(fs)
        fullpath = session.get('finalpath')
        if 'Content-Range' in request.headers:
            # extract starting byte from Content-Range header string
            range_str = request.headers['Content-Range']
            start_bytes = int(range_str.split(' ')[1].split('-')[0])

            # append chunk to the file on disk, or create new
            with open(fullpath, 'a') as f:
                f.seek(start_bytes)
                f.write(fs.stream.read())
        else:
            # this is not a chunked request, so just save the whole file
            fs.save(fullpath)

        return jsonify({"name": fs.filename,
                        "size": os.path.getsize(fullpath),
                        "url": 'uploads/' + fs.filename,
                        "thumbnail_url": None,
                        "delete_url": None,
                        "delete_type": None})

    return render_template('remote_sensing/upload.html')

Best Answer

Not sure if this is the cause, but try

with open(fullpath, 'ab') as f:

to open the file in binary append mode. Writing binary chunk data through a text-mode handle can alter the bytes (e.g. newline translation on some platforms), which would explain a slightly larger, corrupted result.
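Two properties of Python file modes are relevant here. A minimal sketch (a standalone demo using a temp file, not the question's handler) illustrating both:

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), 'upload.bin')

# 1) In append mode, seek() before write() is effectively ignored:
#    every write lands at the end of the file regardless of position,
#    so the handler's f.seek(start_bytes) does nothing.
with open(path, 'wb') as f:
    f.write(b'0123456789')      # pretend this is chunk 1
with open(path, 'ab') as f:
    f.seek(0)                   # try to write at offset 0...
    f.write(b'XXXX')            # ...but the bytes are appended anyway
with open(path, 'rb') as f:
    data = f.read()
print(data)                     # b'0123456789XXXX'

# 2) In Python 3, a text-mode handle ('a') rejects bytes outright:
rejected = False
try:
    with open(path, 'a') as f:
        f.write(b'\x00\xff')
except TypeError:
    rejected = True
print('text mode rejects bytes:', rejected)
```

Since the client sets sequentialUploads: true, chunks arrive in order and plain appending with 'ab' is sufficient; the seek() is a harmless no-op. The likely corruption source in the question is the text-mode 'a' handle (under Python 2, where it accepts byte strings but may translate newline bytes on some platforms, growing and corrupting the file).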

Regarding javascript - chunked file upload produces a different file size / corrupted file, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/45719907/
