Python multicore programming [duplicate]

Posted 2019-01-23 16:57


Please consider a class as follow:

class Foo:
    def __init__(self, data):
        self.data = data

    def do_task(self):
        # do something with data
        pass

In my application I have a list containing several instances of the Foo class. The aim is to execute do_task for all the Foo objects. A first implementation is simply:

# execute the task of every Foo object instantiated
for f_obj in my_foo_obj_list:
    f_obj.do_task()

I'd like to take advantage of the multi-core architecture and split the for loop across the 4 CPU cores of my machine.

What's the best way to do it?

2 Answers

成全新的幸福
#2 · 2019-01-23 17:34

You can use a process pool from the multiprocessing module.

from multiprocessing import Pool

def work(foo):
    # Pool needs a top-level (picklable) callable, so wrap the method call
    foo.do_task()

if __name__ == "__main__":
    # the guard is required on platforms that spawn worker processes (e.g. Windows)
    pool = Pool(4)  # 4 worker processes, one per core you want to use
    pool.map(work, my_foo_obj_list)
    pool.close()
    pool.join()
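
One caveat, based on how multiprocessing works in general rather than on anything stated in the question: each worker receives a pickled copy of the Foo object, so whatever do_task changes on self.data is lost in the parent process. If you need the results back, have the worker return them. A minimal sketch, assuming do_task is modified to return a value and my_foo_obj_list is defined as in the question:

def work(foo):
    # the worker operates on a copy of foo, so return the result explicitly
    return foo.do_task()

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # results come back to the parent in the same order as my_foo_obj_list
        results = pool.map(work, my_foo_obj_list)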
疯言疯语
#3 · 2019-01-23 17:42

Instead of going through all the multithreading/multicore basics, I would like to point to a post by Ryan W. Smith: Multi-Core and Distributed Programming in Python.

It goes into detail on how you can utilize multiple cores and apply those concepts. Be careful with this, though, if you are not familiar with general multithreading concepts.

Functional programming will also let you customize the algorithm/function that runs on each core.
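
For example (my own illustration, not something from the referenced post), you can keep the per-item work a pure function and bind extra parameters with functools.partial before handing it to a process pool:

from functools import partial
from multiprocessing import Pool

def transform(factor, value):
    # pure function: the output depends only on its inputs
    return value * factor

if __name__ == "__main__":
    values = [1, 2, 3, 4]
    # fix the first argument, leaving a one-argument function for map
    scale_by_three = partial(transform, 3)
    with Pool(processes=4) as pool:
        print(pool.map(scale_by_three, values))  # [3, 6, 9, 12]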
