Could you please point me to how a list or an array can be shared between processes so that they can access/append/delete data from it? Do I need to use a Manager for it? For example, I have code that pings several hosts using multiprocessing:
#!/usr/bin/env python
from multiprocessing import Pool
import os

def ping(ip):
    report = ("No response", "Partial Response", "Alive")
    pingaling = os.popen("ping -q -c2 " + str(ip), "r")
    while 1:
        line = pingaling.readline()
        try:
            result = line[line.find(','):].split()[1]
            output = report[int(result[0])]
        except:
            pass
        if not line: break
    print "Testing %s : %s!" % (ip, output)

if __name__ == '__main__':
    pool = Pool(processes=3)
    host = ['81.24.212.' + str(x) for x in range(10)]
    pool.map(ping, host, 1)
    pool.close()
    pool.join()
But the output comes back unsorted; I want to collect the output into an array and sort it. Currently I get:
Testing 81.24.212.1 : Alive!
Testing 81.24.212.2 : Alive!
Testing 81.24.212.6 : Alive!
Testing 81.24.212.0 : No response!
Testing 81.24.212.5 : No response!
Testing 81.24.212.3 : No response!
Testing 81.24.212.4 : No response!
Testing 81.24.212.9 : No response!
Testing 81.24.212.7 : No response!
Testing 81.24.212.8 : No response!
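On the Manager question specifically: a minimal sketch of a list shared through multiprocessing.Manager, which every worker can append to (the record function and the squared values are illustrative, not taken from the code above):

```python
from multiprocessing import Manager, Pool

def record(args):
    # Each worker receives the proxy list and appends its result to it.
    shared, n = args
    shared.append(n * n)

if __name__ == '__main__':
    manager = Manager()
    shared = manager.list()  # proxy list, safe to share across processes
    pool = Pool(processes=3)
    pool.map(record, [(shared, n) for n in range(5)])
    pool.close()
    pool.join()
    print(sorted(shared))  # the parent sees every worker's appends
```

Manager proxies are picklable, so unlike a plain multiprocessing.Queue they can be passed to pool workers as ordinary arguments.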
Note that pool.map acts a lot like the builtin map function: like it, it returns a list of the results of applying the given function to the elements of the given list. So you need only have ping() return what it found, and then do:

list_of_values = pool.map(ping, host, 1)

after which you can use list_of_values any way you like.

The data structure you're looking for is multiprocessing.Queue. You can pop values out of the queue into a list until you've got as many values as there were processes, then sort and print them. With your particular application, however, Dan D's pool.map answer has a better approach.

I strongly recommend that you use something like redis ( http://redis.io ), which is a data structures server - it exists to facilitate the sharing of data structures in a robust manner.
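Combining the pool.map advice above with the original script, a sketch of how it might look when ping() returns its report line instead of printing it, so the parent process can collect and sort the results (assuming the same ping flags as in the question; untested against real hosts):

```python
#!/usr/bin/env python
from multiprocessing import Pool
import os

def ping(ip):
    report = ("No response", "Partial Response", "Alive")
    output = report[0]  # default if no summary line is found
    pingaling = os.popen("ping -q -c2 " + str(ip), "r")
    while 1:
        line = pingaling.readline()
        if not line:
            break
        try:
            # Parse the "N packets transmitted, N received" summary line.
            result = line[line.find(','):].split()[1]
            output = report[int(result[0])]
        except (ValueError, IndexError):
            pass
    return "Testing %s : %s!" % (ip, output)

if __name__ == '__main__':
    pool = Pool(processes=3)
    host = ['81.24.212.' + str(x) for x in range(10)]
    # pool.map returns the workers' return values as a list, which the
    # parent is free to sort before printing.
    for line in sorted(pool.map(ping, host, 1)):
        print(line)
    pool.close()
    pool.join()
```

Since pool.map already returns results in the same order as the input list, sorted() is only needed if you want the report lines ordered by their text rather than by host order.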