I have a process which spawns two types of thread classes. The first thread class consumes a job_queue (around 100 threads of this class are usually running). The second is a killing thread. I am using a result_done flag which is set from thread 2, and the issue is that my type-1 threads wait for X seconds and only then check whether the result_done flag is set.
    def run(self):
        while True:
            try:
                # Blocks for up to self.maxtimeout seconds waiting for a job
                val = self.job_queue.get(True, self.maxtimeout)
            except queue.Empty:
                pass
            if self.result_done.is_set():
                return
Now, if maxtimeout is set to 500 seconds and I set the result_done flag from another thread, this thread will still wait for 500 seconds before exiting (if there is no data in the queue).
What I want to achieve is that all the threads die gracefully along with the current process, properly closing the DB, websocket, HTTP connections, etc., as soon as the result_done event is set from any of the threads of the process.
I am using the Python multiprocessing library to create the process which spawns these threads.
Update: All threads are daemon=True threads.
To avoid waiting the full maxtimeout, you could use the Event.wait() method: Event.wait(timeout) returns without waiting for the full timeout if the event is set during the call.
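A minimal sketch of how the consumer loop could be restructured around Event.wait(), assuming the same job_queue and result_done attributes from the question; the poll_interval value and the handle() method are placeholders for illustration:

    import queue
    import threading

    class Worker(threading.Thread):
        """Consumes job_queue until the shared result_done event is set."""

        def __init__(self, job_queue, result_done, poll_interval=1.0):
            super().__init__(daemon=True)
            self.job_queue = job_queue
            self.result_done = result_done
            self.poll_interval = poll_interval  # short idle wait, placeholder value

        def run(self):
            while not self.result_done.is_set():
                try:
                    # Don't block for maxtimeout; grab a job only if one is ready
                    val = self.job_queue.get_nowait()
                except queue.Empty:
                    # Idle: wait() returns True immediately when the event is set,
                    # otherwise False after poll_interval seconds
                    if self.result_done.wait(self.poll_interval):
                        return
                    continue
                self.handle(val)

        def handle(self, val):
            # Placeholder for the real job processing / cleanup logic
            pass

With this structure the killer thread simply calls result_done.set(), and every idle worker wakes up within poll_interval seconds instead of waiting for up to maxtimeout.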