Coroutines nested inside multiple processes: why does the program hang when concurrency goes up?
import multiprocessing
import time
import asyncio
import aiohttp


def start_loop(turn):
    request_url = 'https://www.baidu.com'
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    tasks = []
    for num in range(2000):
        func = aio_request(request_url, turn, loop)
        task = asyncio.ensure_future(func)
        tasks.append(task)
    try:
        ret = loop.run_until_complete(asyncio.gather(*tasks))
        loop.close()
    except Exception as e:
        print('loop interrupted or finished', e)
    return turn


async def aio_request(url, turn, loop):
    async with aiohttp.ClientSession() as s:
        async with s.get(url) as response:
            print('requesting', url)
            # response.read() returns the raw bytes (response.content is the stream)
            result = await response.text()  # baidu returns HTML; response.json() would raise here
            print(url, 'fetch done')


def main():
    start = time.time()
    pool = multiprocessing.Pool(processes=30)
    for turn in range(10):
        pool.apply_async(start_loop, (turn,))
    pool.close()
    pool.join()
    print('total time:', time.time() - start)


if __name__ == '__main__':
    main()
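One thing worth noting about the example above: `pool.apply_async` discards the `AsyncResult`, so any exception raised inside `start_loop` in a child process is silently swallowed, and the run can look stuck or finish with nothing to show. Keeping the result handles and calling `.get()` re-raises child errors in the parent. A minimal sketch (the `worker` function and its fake failure are mine, just to illustrate):

```python
from multiprocessing import Pool


def worker(turn):
    # stand-in for start_loop; one turn fails on purpose
    if turn == 3:
        raise ValueError('boom in child %d' % turn)
    return turn * 10


def main():
    outcomes = []
    with Pool(processes=4) as pool:
        # keep the AsyncResult handles instead of discarding them
        results = [pool.apply_async(worker, (t,)) for t in range(5)]
        pool.close()
        pool.join()
        for r in results:
            try:
                outcomes.append(r.get())  # re-raises the child's exception here
            except ValueError as e:
                outcomes.append(str(e))
    return outcomes
```

With this pattern, a hang caused by a crashing child at least surfaces as a traceback instead of a silent `join()`.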
This is a simplified example. The main logic: multiple processes each run a batch of coroutine tasks, with each batch handling 2000+ URLs concurrently. The odd part is that this example runs fine, but inside my real project it hangs. Also, a concurrency of 20 works, while 2000 hangs. Running the coroutine part on its own (outside a process) also works, so I'm not sure whether something is blocking. I honestly have no prior concurrent-programming experience, so I don't know whether this is a very basic mistake...
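A common culprit for hangs that appear only at high fan-out is unbounded concurrency: launching 2000 requests at once (each with its own `ClientSession`, i.e. its own connection pool and DNS lookup) can exhaust file descriptors or sockets, at which point everything stalls. The usual mitigations are to share one `ClientSession` per batch, set an `aiohttp.TCPConnector(limit=...)` and a `ClientTimeout`, and cap in-flight tasks with an `asyncio.Semaphore`. A minimal stdlib-only sketch of the semaphore cap, assuming `asyncio.sleep` as a stand-in for the HTTP call (helper names `run_bounded` and `worker` are mine):

```python
import asyncio


async def run_bounded(n_tasks=2000, max_in_flight=100):
    sem = asyncio.Semaphore(max_in_flight)  # cap on concurrent tasks
    in_flight = 0
    peak = 0

    async def worker(i):
        nonlocal in_flight, peak
        async with sem:              # blocks once the cap is reached
            in_flight += 1
            peak = max(peak, in_flight)
            await asyncio.sleep(0)   # stand-in for the real aiohttp request
            in_flight -= 1
        return i

    results = await asyncio.gather(*(worker(i) for i in range(n_tasks)))
    return results, peak


if __name__ == '__main__':
    results, peak = asyncio.run(run_bounded())
    print('tasks done:', len(results), 'peak in-flight:', peak)
```

All 2000 coroutines still get created, but only `max_in_flight` of them run the request body at any moment; the rest wait at the semaphore instead of all opening sockets simultaneously, which may explain why 20 works and 2000 does not.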