
python - multiprocessing + requests hangs with the exception AttributeError: 'file' object has no attribute 'out'


I am trying to build a class that uses multiprocessing + requests to issue several requests in parallel. I've run into a problem where it hangs and gives me a cryptic error message, and I'm not sure why.

Below is my code. It basically just uses a pool with a callback to put the results into a list. My requirement is a "hard timeout" for each URL: if a URL takes more than a few seconds to download its content, I just want to skip it. So I use the pool timeout and compare the URLs that were attempted against the URLs whose content came back; URLs that were attempted but never returned are assumed to have failed. Here is my code:

import time
import json
import requests
import sys
from urlparse import parse_qs
from urlparse import urlparse
from urlparse import urlunparse
from urllib import urlencode
from multiprocessing import Process, Pool, Queue, current_process
from multiprocessing.pool import ThreadPool
from multiprocessing import TimeoutError
import traceback
from sets import Set
from massweb.pnk_net.pnk_request import pnk_request_raw
from massweb.targets.fuzzy_target import FuzzyTarget
from massweb.payloads.payload import Payload

class MassRequest(object):

    def __init__(self, num_threads = 10, time_per_url = 10, request_timeout = 10, proxy_list = [{}]):

        self.num_threads = num_threads
        self.time_per_url = time_per_url
        self.request_timeout = request_timeout
        self.proxy_list = proxy_list

        self.results = []
        self.urls_finished = []
        self.urls_attempted = []

        self.targets_results = []
        self.targets_finished = []
        self.targets_attempted = []

    def add_to_finished(self, x):

        self.urls_finished.append(x[0])
        self.results.append(x)

    def add_to_finished_targets(self, x):

        self.targets_finished.append(x[0])
        self.targets_results.append(x)

    def get_urls(self, urls):

        # overall budget: time_per_url seconds for each URL submitted
        timeout = float(self.time_per_url * len(urls))
        pool = Pool(processes = self.num_threads)
        proc_results = []

        for url in urls:
            self.urls_attempted.append(url)
            proc_result = pool.apply_async(func = pnk_request_raw, args = (url, self.request_timeout, self.proxy_list), callback = self.add_to_finished)
            proc_results.append(proc_result)

        for pr in proc_results:

            try:
                pr.get(timeout = timeout)

            except:
                pool.terminate()
                pool.join()

        pool.terminate()
        pool.join()

        # any URL that was attempted but never reached the callback is treated as timed out
        list_diff = Set(self.urls_attempted).difference(Set(self.urls_finished))

        for url in list_diff:
            sys.stderr.write("URL %s got timeout" % url)
            self.results.append((url, "__PNK_GET_THREAD_TIMEOUT"))

if __name__ == "__main__":

    f = open("out_urls_to_fuzz_1mil")
    urls_to_request = []
    for line in f:
        url = line.strip()
        urls_to_request.append(url)

    mr = MassRequest()
    mr.get_urls(urls_to_request)

Here is the function called by the worker processes:

def pnk_request_raw(url_or_target, req_timeout = 5, proxy_list = [{}]):

    if proxy_list[0]:
        proxy = get_random_proxy(proxy_list)
    else:
        proxy = {}

    try:
        if isinstance(url_or_target, str):

            sys.stderr.write("Requesting: %s with proxy %s\n" % (str(url_or_target), str(proxy)))
            r = requests.get(url_or_target, proxies = proxy, timeout = req_timeout)
            return (url_or_target, r.text)

        if isinstance(url_or_target, FuzzyTarget):

            sys.stderr.write("Requesting: %s with proxy %s\n" % (str(url_or_target), str(proxy)))
            r = requests.get(url_or_target.url, proxies = proxy, timeout = req_timeout)
            return (url_or_target, r.text)

    except:
        #use this to mark failure on exception
        traceback.print_exc()
        #edit: this is the line that was breaking it all
        sys.stderr.out("A request failed to URL %s\n" % url_or_target)
        return (url_or_target, "__PNK_REQ_FAILED")

This seems to work fine for smaller sets of URLs, but then the output looks like this:

Requesting: http://www.sportspix.co.za/ with proxy {}
Requesting: http://www.sportspool.co.za/ with proxy {}
Requesting: http://www.sportspredict.co.za/ with proxy {}
Requesting: http://www.sportspro.co.za/ with proxy {}
Requesting: http://www.sportsrun.co.za/ with proxy {}
Requesting: http://www.sportsstuff.co.za/ with proxy {}
Requesting: http://sportsstuff.co.za/2011-rugby-world-cup with proxy {}
Requesting: http://www.sportstar.co.za/4-stroke-racing with proxy {}
Requesting: http://www.sportstats.co.za/ with proxy {}
Requesting: http://www.sportsteam.co.za/ with proxy {}
Requesting: http://www.sportstec.co.za/ with proxy {}
Requesting: http://www.sportstours.co.za/ with proxy {}
Requesting: http://www.sportstrader.co.za/ with proxy {}
Requesting: http://www.sportstravel.co.za/ with proxy {}
Requesting: http://www.sportsturf.co.za/ with proxy {}
Requesting: http://reimo.sportsvans.co.za/ with proxy {}
Requesting: http://www.sportsvans.co.za/4x4andmoreWindhoek.html with proxy {}
Handled exception:Traceback (most recent call last):
File "mass_request.py", line 87, in get_fuzzy_targets
pr.get(timeout = timeout)
File "/usr/lib/python2.7/multiprocessing/pool.py", line 528, in get
raise self._value
AttributeError: 'file' object has no attribute 'out'

After that last exception the program hangs and I have to kill it completely. As far as I know, I never try to access an "out" attribute on a file object. My question is... how do I fix this?! Am I doing something obviously wrong here? And why isn't the exception any clearer?
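For context on where that traceback comes from: an exception raised inside a multiprocessing.Pool worker is captured, shipped back to the parent process, and re-raised by AsyncResult.get(), so the parent-side traceback points at the pr.get(timeout = timeout) line rather than at the statement in the worker that actually failed. Below is a minimal, self-contained sketch of that behaviour; the worker function is hypothetical and the syntax is Python 2 to match the code above.

from multiprocessing import Pool

def worker(x):
    # deliberately touch an attribute that does not exist, much like sys.stderr.out above
    return x.does_not_exist

if __name__ == "__main__":
    pool = Pool(processes = 2)
    res = pool.apply_async(worker, (42,))
    try:
        res.get(timeout = 5)  # the worker's AttributeError is re-raised here, in the parent
    except AttributeError as e:
        print "re-raised in parent:", e
    pool.terminate()
    pool.join()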

Best Answer

I think sys.stderr.out("A request failed to URL %s\n" % url_or_target) should be sys.stderr.write("A request failed to URL %s\n" % url_or_target). sys.stderr is a file object, and file objects have no "out" attribute, which is exactly what the AttributeError is reporting.
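In other words, only the last write inside the except block of pnk_request_raw needs to change; a sketch of the corrected handler, with everything else left as in the question:

    except:
        #use this to mark failure on exception
        traceback.print_exc()
        #file objects expose write(), not out()
        sys.stderr.write("A request failed to URL %s\n" % url_or_target)
        return (url_or_target, "__PNK_REQ_FAILED")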

Regarding python - multiprocessing + requests hangs with the exception AttributeError: 'file' object has no attribute 'out', a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/23549836/
