I am very new to Tornado. I was just looking at how to handle a request that blocks in Tornado, so I run the blocking code in a separate thread. However, the main thread still blocks until the threaded function finishes. I am not using gen.coroutine here, but I have tried that and the result is the same.
import time
from functools import wraps
from threading import Thread

import tornado.ioloop
import tornado.web
from tornado import web

counter = 0


def run_async(func):
    @wraps(func)
    def function_in_a_thread(*args, **kwargs):
        func_t = Thread(target=func, args=args, kwargs=kwargs)
        func_t.start()
    return function_in_a_thread


def long_blocking_function(index, sleep_time, callback):
    print "Entering run counter:%s" % (index,)
    time.sleep(sleep_time)
    print "Exiting run counter:%s" % (index,)
    callback('keyy' + index)


class FooHandler(tornado.web.RequestHandler):
    @web.asynchronous
    def get(self):
        global counter
        counter += 1
        current_counter = str(counter)
        print "ABOUT to spawn thread for counter:%s" % (current_counter,)
        long_blocking_function(
            index=current_counter,
            sleep_time=5, callback=self.done_waiting)
        print "DONE with the long function"

    def done_waiting(self, response):
        self.write("Whatever %s " % (response,))
        self.finish()


class Application(tornado.web.Application):
    def __init__(self):
        handlers = [(r"/foo", FooHandler),
                    ]
        settings = dict(
            debug=True,
        )
        tornado.web.Application.__init__(self, handlers, **settings)


def main():
    application = Application()
    application.listen(8888)
    tornado.ioloop.IOLoop.instance().start()


if __name__ == "__main__":
    main()
When I issue back-to-back requests, FooHandler blocks and does not receive any new requests until long_blocking_function finishes. So I end up seeing something like:
ABOUT to spawn thread for counter:1
Entering run counter:1
Exiting run counter:1
DONE with the long function
ABOUT to spawn thread for counter:2
Entering run counter:2
Exiting run counter:2
DONE with the long function
ABOUT to spawn thread for counter:3
Entering run counter:3
Exiting run counter:3
DONE with the long function
I was expecting something along these lines (as I am issuing multiple requests before the first call to long_blocking_function finishes), but I only ever see a trace like the one above:
ABOUT to spawn thread for counter:1
DONE with the long function
ABOUT to spawn thread for counter:2
DONE with the long function
ABOUT to spawn thread for counter:3
DONE with the long function
ABOUT to spawn thread for counter:4
DONE with the long function
I have looked at Tornado blocking asynchronous requests and tried both of the solutions there, but both of them also block when I issue back-to-back requests to the same handler. Can somebody figure out what I am doing wrong? I know Tornado doesn't do well with multithreading, but I should still be able to spawn a new thread from it in a non-blocking way.
Tornado plays well with the concurrent.futures library (there's a Python 2.x backport available), so you could hand off your long-running requests to a thread pool by using a ThreadPoolExecutor.
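A minimal sketch of that approach, assuming Tornado 3.x or newer (where a coroutine can yield a future produced by a thread pool); the handler name, pool size, and sleep below are illustrative placeholders modelled on the question's code, not the original answer's exact code:

import time

from concurrent.futures import ThreadPoolExecutor  # "pip install futures" on Python 2

import tornado.ioloop
import tornado.web
from tornado import gen
from tornado.concurrent import run_on_executor


class SlowHandler(tornado.web.RequestHandler):
    # One pool shared by all instances of the handler; the size is a placeholder.
    executor = ThreadPoolExecutor(max_workers=4)

    @run_on_executor
    def long_blocking_function(self, index, sleep_time):
        # Runs on a worker thread, so sleeping here does not stall the IOLoop.
        time.sleep(sleep_time)
        return 'keyy' + index

    @gen.coroutine
    def get(self):
        index = self.get_argument('index', '1')
        # The IOLoop keeps serving other requests while this future is pending.
        result = yield self.long_blocking_function(index=index, sleep_time=5)
        # Back on the IOLoop thread here; gen.coroutine finishes the request for us.
        self.write("Whatever %s " % (result,))


if __name__ == "__main__":
    app = tornado.web.Application([(r"/foo", SlowHandler)], debug=True)
    app.listen(8888)
    tornado.ioloop.IOLoop.instance().start()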
This technique works pretty well - we use it to handle long-running database operations. Of course, in the real world you'd also want to handle timeouts and other exceptions in a robust and graceful manner, but I hope this example is enough to illustrate the idea.
You simply forgot to actually apply the run_async decorator you defined to long_blocking_function. When you offload a task to another thread, you register a callback in the event loop that is called once the task is finished. The answers are in this thread: How to make a library asynchronous in python.
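Concretely, in the question's code the change would look roughly like this (a sketch: since Tornado's IOLoop and RequestHandler are not thread-safe, the worker thread hands the callback back to the loop with add_callback instead of invoking it directly):

from functools import partial

import tornado.ioloop

# Apply the decorator that was defined but never used, so the blocking work
# actually runs on a separate thread and get() returns immediately.
@run_async
def long_blocking_function(index, sleep_time, callback):
    print "Entering run counter:%s" % (index,)
    time.sleep(sleep_time)
    print "Exiting run counter:%s" % (index,)
    # Schedule the callback on the IOLoop thread rather than touching the
    # RequestHandler from the worker thread.
    tornado.ioloop.IOLoop.instance().add_callback(
        partial(callback, 'keyy' + index))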