Question:
I am using the Python 3 asyncio module to create a load-balancing application. I have two heavy I/O tasks:
- An SNMP polling module, which determines the best possible server
- A "proxy-like" module, which balances requests to the selected server.
Both processes are going to run forever, are independent from each other, and should not be blocked by the other one.
I can't use one event loop because they would block each other. Is there any way to have two event loops, or do I have to use multithreading/multiprocessing?
I tried using asyncio.new_event_loop() but haven't managed to make it work.
Answer 1:
Answering my own question to post my solution:
What I ended up doing was creating a thread, and a new event loop inside that thread, for the polling module, so now every module runs in a different loop. It is not a perfect solution, but it is the only one that made sense to me (I wanted to avoid threads, but since it is only one...). Example:
import asyncio
import threading

def worker():
    second_loop = asyncio.new_event_loop()
    execute_polling_coroutines_forever(second_loop)
    return

threads = []
t = threading.Thread(target=worker)
threads.append(t)
t.start()

loop = asyncio.get_event_loop()
execute_proxy_coroutines_forever(loop)
Asyncio requires that every loop run its coroutines in the same thread. With this method you have one event loop for each thread, and they are totally independent: every loop executes its coroutines in its own thread, so that is not a problem.
As I said, it's probably not the best solution, but it worked for me.
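A minimal runnable sketch of the same thread-per-loop pattern, with a hypothetical `poll` coroutine standing in for the polling module (the iterations are bounded here only so the example terminates):

```python
import asyncio
import threading

results = []

async def poll(n):
    # hypothetical polling coroutine: records one tick per interval
    for i in range(n):
        await asyncio.sleep(0.01)
        results.append(i)

def worker():
    # each thread gets its own event loop; asyncio binds a loop to one thread
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    try:
        loop.run_until_complete(poll(3))
    finally:
        loop.close()

t = threading.Thread(target=worker)
t.start()
t.join()
```

In the real application the worker's loop would run forever (e.g. `loop.run_forever()`), while the main thread keeps its own loop for the proxy.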
Answer 2:
The whole point of asyncio is that you can run many thousands of I/O-heavy tasks concurrently, so you don't need Threads at all; this is exactly what asyncio is made for. Just run the two coroutines (SNMP and proxy) in the same loop and that's it.
You have to make both of them available to the event loop BEFORE calling loop.run_forever(). Something like this:
import asyncio

async def snmp():
    print("Doing the snmp thing")
    await asyncio.sleep(1)

async def proxy():
    print("Doing the proxy thing")
    await asyncio.sleep(2)

async def main():
    while True:
        await snmp()
        await proxy()

loop = asyncio.get_event_loop()
loop.create_task(main())
loop.run_forever()
I don't know the structure of your code, so the different modules might have their own infinite loops; in that case you can run something like this:
import asyncio

async def snmp():
    while True:
        print("Doing the snmp thing")
        await asyncio.sleep(1)

async def proxy():
    while True:
        print("Doing the proxy thing")
        await asyncio.sleep(2)

loop = asyncio.get_event_loop()
loop.create_task(snmp())
loop.create_task(proxy())
loop.run_forever()
Remember, both snmp and proxy need to be coroutines (async def) written in an asyncio-aware manner. asyncio will not make simple blocking Python functions suddenly "async".
In your specific case, I suspect that you are confused a little bit (no offense!), because well-written async modules will never block each other in the same loop. If that is the case, you don't need asyncio at all and can simply run one of them in a separate Thread without dealing with any asyncio stuff.
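If one of the tasks really is a blocking function (say, a synchronous SNMP library call), a common middle ground is to offload it to a thread pool with run_in_executor so the event loop stays responsive. A sketch, with blocking_snmp_poll as a hypothetical stand-in for the blocking call:

```python
import asyncio
import time

def blocking_snmp_poll():
    # hypothetical blocking call (e.g. a synchronous SNMP library)
    time.sleep(0.05)
    return "server-a"

async def main():
    loop = asyncio.get_running_loop()
    # run the blocking call in the default thread pool; the loop stays free
    best = await loop.run_in_executor(None, blocking_snmp_poll)
    return best

result = asyncio.run(main())
```

While the poll runs in the pool, the same loop can keep serving proxy coroutines.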
Answer 3:
The asyncio event loop runs in a single thread, and it will not run anything in parallel; that is how it is designed. The closest thing I can think of is using asyncio.wait.
from asyncio import coroutine
import asyncio

@coroutine
def some_work(x, y):
    print("Going to do some heavy work")
    yield from asyncio.sleep(1.0)
    print(x + y)

@coroutine
def some_other_work(x, y):
    print("Going to do some other heavy work")
    yield from asyncio.sleep(3.0)
    print(x * y)

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    # asyncio.async() was renamed to asyncio.ensure_future()
    # (async is a reserved keyword in Python 3.7+)
    loop.run_until_complete(asyncio.wait([asyncio.ensure_future(some_work(3, 4)),
                                          asyncio.ensure_future(some_other_work(3, 4))]))
    loop.close()
An alternative is to use asyncio.gather(): it returns a single future aggregating the results of the given futures.
tasks = [asyncio.Task(some_work(3, 4)), asyncio.Task(some_other_work(3, 4))]
loop.run_until_complete(asyncio.gather(*tasks))
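In modern Python the same idea is usually written with async def and asyncio.run; a small sketch (work and other are illustrative names) that also shows gather preserving argument order in its result list:

```python
import asyncio

async def work(x, y):
    await asyncio.sleep(0.01)
    return x + y

async def other(x, y):
    await asyncio.sleep(0.02)
    return x * y

async def main():
    # both coroutines run concurrently; results come back in argument order
    return await asyncio.gather(work(3, 4), other(3, 4))

res = asyncio.run(main())
```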
Answer 4:
I used it like this, but it is still synchronous, not async:
import asyncio
import time

def main(*args):
    loop = asyncio.get_event_loop()
    coro = asyncio.start_server(handle_echo, '127.0.0.1', 50008, loop=loop)
    srv = loop.run_until_complete(coro)
    loop.run_forever()

@asyncio.coroutine
def handle_echo(reader, writer):
    data = yield from reader.read(500)
    message = data.decode(encoding='utf-8')
    nameindex = ('name="calculator2"' in message)
    if nameindex:
        time.sleep(5)  # blocking call inside a coroutine
        writer.write("Content-Length: 1\r\n\r\n2".encode())
        yield from writer.drain()
    else:
        writer.write("Content-Length: 1\r\n\r\n1".encode())
        yield from writer.drain()
    print("Close the client socket")
    writer.close()
If the received value contains name="calculator2", I wait for 5 seconds; if not, I answer and write the data immediately.
But when I test it, I first send data containing name="calculator2" and then data without it, yet the second request is only handled after the first one's 5 seconds are done. It is sequential. What is wrong with it?
Also, how should I get the connected client's IP and port?
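For the IP/port part: the StreamWriter exposes the remote address through get_extra_info('peername'). A self-contained sketch (the server binds port 0 so the OS picks a free port, and the handler simply echoes the peer address back):

```python
import asyncio

async def handle(reader, writer):
    # peername is the remote endpoint as a ('host', port) tuple
    host, port = writer.get_extra_info('peername')
    writer.write(f"{host}:{port}\n".encode())
    await writer.drain()
    writer.close()

async def main():
    server = await asyncio.start_server(handle, '127.0.0.1', 0)
    addr = server.sockets[0].getsockname()
    reader, writer = await asyncio.open_connection(*addr)
    line = (await reader.readline()).decode().strip()
    writer.close()
    server.close()
    await server.wait_closed()
    return line

peer = asyncio.run(main())
```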
Answer 5:
If the proxy server is running all the time, it cannot switch back and forth. The proxy listens for client requests and makes them asynchronous, but the other task cannot execute, because this one is serving forever.
If the proxy is a coroutine and is starving the SNMP poller (never awaits), aren't the client requests being starved as well?
Every coroutine will run forever; they will not end.
That should be fine, as long as they do await / yield from. The echo server will also run forever; that doesn't mean you can't run several servers (on different ports, though) in the same loop.
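The sequential behaviour in the echo server above comes from time.sleep(5), which blocks the whole loop; awaiting asyncio.sleep instead lets the loop serve other handlers during the pause. A small sketch demonstrating the difference (slow_reply and fast_reply are illustrative stand-ins for the two request paths):

```python
import asyncio
import time

async def slow_reply():
    # non-blocking pause: the loop is free to run other coroutines meanwhile
    await asyncio.sleep(0.05)
    return "2"

async def fast_reply():
    return "1"

async def main():
    start = time.monotonic()
    # both replies run concurrently; total time is about the slowest one,
    # not the sum (with time.sleep it would be the sum)
    a, b = await asyncio.gather(slow_reply(), fast_reply())
    return a, b, time.monotonic() - start

a, b, elapsed = asyncio.run(main())
```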