I load my proxies into the proxies variable and try to make asynchronous requests to get the IP. It's simple:
import asyncio
import time

import aiohttp

async def get_ip(proxy):
    timeout = aiohttp.ClientTimeout(connect=5)
    async with aiohttp.ClientSession(timeout=timeout) as session:
        try:
            async with session.get('https://api.ipify.org?format=json', proxy=proxy, timeout=timeout) as response:
                json_response = await response.json()
                print(json_response)
        except:
            pass

if __name__ == "__main__":
    proxies = []
    start_time = time.time()
    loop = asyncio.get_event_loop()
    tasks = [asyncio.ensure_future(get_ip(proxy)) for proxy in proxies]
    loop.run_until_complete(asyncio.wait(tasks))
    print('time spent to work: {} sec --------------'.format(time.time() - start_time))
This code works fine when I run 100-200-300-400 requests, but as soon as the count exceeds 500 I always get the error:
Traceback (most recent call last):
File "async_get_ip.py", line 60, in <module>
loop.run_until_complete(asyncio.wait(tasks))
File "C:\Python37\lib\asyncio\base_events.py", line 571, in run_until_complete
self.run_forever()
File "C:\Python37\lib\asyncio\base_events.py", line 539, in run_forever
self._run_once()
File "C:\Python37\lib\asyncio\base_events.py", line 1739, in _run_once
event_list = self._selector.select(timeout)
File "C:\Python37\lib\selectors.py", line 323, in select
r, w, _ = self._select(self._readers, self._writers, [], timeout)
File "C:\Python37\lib\selectors.py", line 314, in _select
r, w, x = select.select(r, w, w, timeout)
ValueError: too many file descriptors in select()
I've been searching for a solution, but all I've found are OS-level limits. Can I solve this problem without using additional libraries?
Best Answer
Starting an unlimited number of requests simultaneously is not a good idea. Each started request consumes some resources, from CPU/RAM to the OS's select() capacity, which in your case will sooner or later lead to problems.
To avoid this, you should use asyncio.Semaphore, which lets you limit the maximum number of simultaneous connections.
I believe your code needs only minimal changes:
sem = asyncio.Semaphore(50)

async def get_ip(proxy):
    async with sem:
        # ...
Here's a full, more complex example of how to use a semaphore in general.
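As a self-contained sketch of the pattern, the snippet below replaces the real HTTP request with a short asyncio.sleep and counts how many workers run at once, so it can be executed without aiohttp or a network (the counters are illustrative additions, not part of the original code):

```python
import asyncio

peak = 0      # highest number of workers observed running at the same time
running = 0

async def worker(sem):
    global peak, running
    async with sem:                 # at most 5 workers pass this point concurrently
        running += 1
        peak = max(peak, running)
        await asyncio.sleep(0.01)   # stand-in for the real network call
        running -= 1

async def main():
    sem = asyncio.Semaphore(5)
    await asyncio.gather(*(worker(sem) for _ in range(50)))

asyncio.run(main())
print(peak)  # never exceeds 5
```

Even though 50 tasks are created up front, the semaphore ensures only 5 ever hold an "open connection" at a time, which is exactly what keeps the file-descriptor count below select()'s limit.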
P.S.

except:
    pass

You should never do things like this; it will break your code sooner or later. At the very least, use except Exception.
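A minimal sketch of that advice, where fetch is a hypothetical stand-in for the real aiohttp request (so the snippet runs without a network):

```python
import asyncio

async def fetch(proxy):
    # hypothetical stand-in for the real aiohttp request
    raise ConnectionError(f"cannot reach {proxy}")

async def get_ip(proxy):
    try:
        return await fetch(proxy)
    except Exception as exc:  # narrower than a bare except: failures stay visible
        print(f"{proxy}: {exc!r}")
        return None

result = asyncio.run(get_ip("http://10.0.0.1:8080"))
```

Unlike a bare except: pass, this still keeps one failing proxy from aborting the whole run, but the error is logged instead of silently swallowed, and system-exiting exceptions like KeyboardInterrupt are not caught.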
Regarding python-3.x - Why do I get ValueError: too many file descriptors in select()?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/57182009/