
django - IOError: request data read error

Reposted · author: 行者123 · updated: 2023-12-04 05:40:38

When trying to return posted data as an Excel response, I get an IOError: request data read error.

    import datetime

    from django import forms
    from django.shortcuts import render

    def convert_to_excel(request):
        field = forms.CharField()
        try:
            data = field.clean(request.POST.get('exceldata', ''))
        except forms.ValidationError:
            data = u''
        # render() passes the request into the template context;
        # render_to_response() has no 'request' keyword argument.
        response = render(request, "spreadsheet.html", locals())
        filename = "%s%s.xls" % ("report_excel",
                                 datetime.datetime.today().strftime('%Y%m%d%H%M%S'))
        response['Content-Disposition'] = 'attachment; filename=' + filename
        response['Content-Type'] = 'application/vnd.ms-excel; charset=utf-8'
        return response
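A quick way to confirm the roughly 200 KB threshold is to POST an oversized `exceldata` field from a script. This is a minimal sketch using only the standard library (Python 3 syntax); the target URL and path are assumptions, not part of the original code.

```python
# Hypothetical reproduction helper: builds a form-encoded POST whose
# 'exceldata' field is roughly size_kb kilobytes. Point the URL at the
# deployed view to see at which size the server starts rejecting it.
import urllib.parse
import urllib.request

def build_large_post(url='http://example.com/convert-to-excel/', size_kb=300):
    body = urllib.parse.urlencode({'exceldata': 'x' * (size_kb * 1024)}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={'Content-Type': 'application/x-www-form-urlencoded'},
    )
```

Sending the request with `urllib.request.urlopen(build_large_post())` against the deployed server should then either succeed or trip the server-side size limit, depending on configuration.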

It works fine when the data is below 150 KB, but larger payloads fail at around 200 KB. I am running Django 1.4 with Python 2.7.3 on Apache/2.2.22 (Ubuntu) and mod_wsgi/3.3 in daemon mode.

This works fine on localhost, so I suspect a problem or misconfiguration of mod_wsgi in daemon mode. Does anyone know about this?

The exception I get is as follows:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/base.py", line 89, in get_response
    response = middleware_method(request)
  File "/usr/local/lib/python2.7/dist-packages/newrelic-1.2.1.265-py2.7.egg/newrelic/hooks/framework_django.py", line 191, in __call__
    result = self.__wrapped(*args, **kwargs)
  File "/home/core/mysite/src/task/tools/libs/pagination/middleware.py", line 8, in process_request
    request.page = int(request.REQUEST.get('page', 1) )
  File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/wsgi.py", line 166, in _get_request
    self._request = datastructures.MergeDict(self.POST, self.GET)
  File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/wsgi.py", line 180, in _get_post
    self._load_post_and_files()
  File "/usr/local/lib/python2.7/dist-packages/django/http/__init__.py", line 360, in _load_post_and_files
    self._post, self._files = self.parse_file_upload(self.META, data)
  File "/usr/local/lib/python2.7/dist-packages/django/http/__init__.py", line 320, in parse_file_upload
    return parser.parse()
  File "/usr/local/lib/python2.7/dist-packages/newrelic-1.2.1.265-py2.7.egg/newrelic/api/function_trace.py", line 82, in __call__
    return self._nr_next_object(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/django/http/multipartparser.py", line 161, in parse
    data = field_stream.read()
  File "/usr/local/lib/python2.7/dist-packages/django/http/multipartparser.py", line 301, in read
    out = ''.join(parts())
  File "/usr/local/lib/python2.7/dist-packages/django/http/multipartparser.py", line 285, in parts
    yield ''.join(self)
  File "/usr/local/lib/python2.7/dist-packages/django/http/multipartparser.py", line 316, in next
    output = self._producer.next()
  File "/usr/local/lib/python2.7/dist-packages/django/http/multipartparser.py", line 449, in next
    for bytes in stream:
  File "/usr/local/lib/python2.7/dist-packages/django/http/multipartparser.py", line 316, in next
    output = self._producer.next()
  File "/usr/local/lib/python2.7/dist-packages/django/http/multipartparser.py", line 377, in next
    data = self.flo.read(self.chunk_size)
  File "/usr/local/lib/python2.7/dist-packages/django/http/__init__.py", line 384, in read
    return self._stream.read(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/wsgi.py", line 104, in read
    result = self.buffer + self._read_limited(size - len(self.buffer))
  File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/wsgi.py", line 92, in _read_limited
    result = self.stream.read(size)
  File "/usr/local/lib/python2.7/dist-packages/newrelic-1.2.1.265-py2.7.egg/newrelic/api/web_transaction.py", line 349, in read
    data = self.__input.read(*args, **kwargs)
IOError: request data read error

Best answer

Check your Apache configuration for a LimitRequestBody directive. If it is present, raise it to a larger value; if it is absent, add the following under your site configuration:

LimitRequestBody 1024000000 

Also, if you run uWSGI, check its post-size limit by starting uWSGI with:

uwsgi --limit-post 1024000000 

This kind of error means the upload exceeds the maximum request-body size allowed by Apache/nginx/uWSGI. Since you appear to be running mod_wsgi rather than uWSGI, the uWSGI option does not apply to you; an Apache configuration that sets LimitRequestBody to a low value is the most likely cause.
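Independent of the server-level limits, the failure can also be made less opaque at the application layer. Below is a minimal sketch (an assumption, not part of the original setup; Python 3 syntax) of a WSGI wrapper that rejects oversized bodies with 413 before Django tries to read them, instead of failing mid-parse with an IOError:

```python
# Sketch of a WSGI wrapper that refuses requests whose declared body
# size exceeds a cap. Wrap the Django WSGI application with it, and
# pick max_body to match (or stay below) Apache's LimitRequestBody.
MAX_BODY = 1024 * 1024  # 1 MB cap; an illustrative value

def limit_body(app, max_body=MAX_BODY):
    def wrapper(environ, start_response):
        try:
            length = int(environ.get('CONTENT_LENGTH') or 0)
        except ValueError:
            length = 0  # malformed or missing header: let the app decide
        if length > max_body:
            start_response('413 Request Entity Too Large',
                           [('Content-Type', 'text/plain')])
            return [b'Request body too large']
        return app(environ, start_response)
    return wrapper
```

This only inspects the Content-Length header, so it cannot catch a connection that is cut off mid-upload, but it turns a limit violation into an explicit 413 response rather than a traceback deep inside the multipart parser.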

Regarding "django - IOError: request data read error", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/11287791/
