Interprocess communication in Python

Posted 2019-01-03 02:09

What is a clean and elegant way to do interprocess communication between two different Python processes? I currently use named pipes in the OS, but it feels a bit hacky. I rewrote my stuff with dbus services, which worked, but it seems that when running the code remotely through an SSH session it now tries to initialise X11, which seems completely unnecessary for the things I want to do (they aren't GUI related). So maybe dbus is a bit too heavyweight. I was about to redesign again using sockets, but that seems quite low-level, so I thought there might be a higher-level module I could import and use whose name I simply don't know, and I figured I should ask on SO first.

My requirement is to be able to run python foo.py and have that process just do its thing there, like a daemon, and to be able to send messages to it with python foo.py --bar. The latter call should just send a message to the existing process and terminate, possibly with return code 0 for success or another value for failure (so some two-way communication will be required).
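
Roughly, the shape I am after looks like this sketch with a plain Unix domain socket (Unix-only; the socket path and the 'ok' reply are placeholders I made up — this is the low-level version I would like to replace with something nicer):

import argparse
import os
import socket
import sys

SOCK_PATH = '/tmp/foo.sock'  # hypothetical path for this sketch

parser = argparse.ArgumentParser()
parser.add_argument('--bar', help='message to send to the running daemon')
args = parser.parse_args()

if args.bar:
    # client mode: send one message, read the reply, exit with a status code
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s:
        s.connect(SOCK_PATH)
        s.sendall(args.bar.encode())
        reply = s.recv(1024)
    sys.exit(0 if reply == b'ok' else 1)
else:
    # daemon mode: accept messages forever and acknowledge each one
    if os.path.exists(SOCK_PATH):
        os.unlink(SOCK_PATH)
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as srv:
        srv.bind(SOCK_PATH)
        srv.listen(1)
        while True:
            conn, _ = srv.accept()
            with conn:
                msg = conn.recv(1024)
                # do something with msg here
                conn.sendall(b'ok')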

6 Answers
骚年 ilove
#2 · 2019-01-03 02:27

The multiprocessing library provides listeners and clients that wrap sockets and allow you to pass arbitrary Python objects.

Your server could listen for and receive Python objects:

from multiprocessing.connection import Listener

address = ('localhost', 6000)     # family is deduced to be 'AF_INET'
listener = Listener(address, authkey=b'secret password')
conn = listener.accept()
print('connection accepted from', listener.last_accepted)
while True:
    msg = conn.recv()
    # do something with msg
    if msg == 'close':
        conn.close()
        break
listener.close()

Your client could send commands as objects:

from multiprocessing.connection import Client

address = ('localhost', 6000)
conn = Client(address, authkey=b'secret password')
conn.send('close')
# can also send arbitrary objects:
# conn.send(['a', 2.5, None, int, sum])
conn.close()
叛逆
#3 · 2019-01-03 02:29

I would use sockets, but use Twisted to give you some abstraction, and to make things easy. Their Simple Echo Client / Server example is a good place to start.

You would just have to combine the files and instantiate and run either the client or server depending on the passed argument(s).
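
For illustration, a minimal sketch along those lines (the port number and the argument-based dispatch are my own assumptions, not part of the linked example):

import sys
from twisted.internet import reactor, protocol

PORT = 8007  # arbitrary port chosen for this sketch

class EchoServer(protocol.Protocol):
    def dataReceived(self, data):
        # echo the received bytes straight back to the client
        self.transport.write(data)

class EchoClient(protocol.Protocol):
    def connectionMade(self):
        # send the command-line message as soon as we are connected
        self.transport.write(' '.join(sys.argv[1:]).encode())

    def dataReceived(self, data):
        print('server replied:', data.decode())
        self.transport.loseConnection()
        reactor.stop()

if len(sys.argv) > 1:
    # client mode: send one message, print the echo, then exit
    factory = protocol.ClientFactory()
    factory.protocol = EchoClient
    reactor.connectTCP('127.0.0.1', PORT, factory)
else:
    # server mode: echo everything back, run until killed
    factory = protocol.ServerFactory()
    factory.protocol = EchoServer
    reactor.listenTCP(PORT, factory)

reactor.run()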

走好不送
#4 · 2019-01-03 02:30

In my experience, rpyc is by far the simplest and most elegant way to go about it.

(I know this is an old question, but I've just stumbled upon it.)
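
For reference, a minimal sketch of what that can look like (the service name, the exposed method and port 18861 are assumptions for illustration, not anything rpyc prescribes):

import rpyc
from rpyc.utils.server import ThreadedServer

class FooService(rpyc.Service):
    def exposed_bar(self, msg):
        # handle the message and return a status to the caller
        return 0

if __name__ == '__main__':
    ThreadedServer(FooService, port=18861).start()

A one-shot client then just connects and calls the exposed method:

import rpyc

conn = rpyc.connect('localhost', 18861)
print(conn.root.bar('hello'))   # prints 0 on success
conn.close()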

做自己的国王
#5 · 2019-01-03 02:42

I would use sockets; local communication is heavily optimized, so you shouldn't have performance problems, and it gives you the ability to distribute your application to different physical nodes if the need should arise.

With regard to the "low-level" concern, you're right. But you can always use a higher-level wrapper depending on your needs. XML-RPC could be a good candidate, but it may be overkill for the task you're trying to perform.

Twisted offers some good, simple protocol implementations, such as LineReceiver (for simple line-based messages) or the more elegant AMP (which, by the way, has been standardized and implemented in different languages).
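
To illustrate the XML-RPC option with only the standard library (port 8000 and the bar function are made up for this sketch):

from xmlrpc.server import SimpleXMLRPCServer

def bar(msg):
    # handle the message; return 0 for success
    return 0

server = SimpleXMLRPCServer(('localhost', 8000), allow_none=True)
server.register_function(bar)
server.serve_forever()

and the corresponding one-shot client:

import xmlrpc.client

proxy = xmlrpc.client.ServerProxy('http://localhost:8000/')
print(proxy.bar('hello'))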

我欲成王,谁敢阻挡
#6 · 2019-01-03 02:47

Nah, zeromq is the way to go. Delicious, isn't it?

import argparse
import zmq

parser = argparse.ArgumentParser(description='zeromq server/client')
parser.add_argument('--bar')
args = parser.parse_args()

if args.bar:
    # client
    context = zmq.Context()
    socket = context.socket(zmq.REQ)
    socket.connect('tcp://127.0.0.1:5555')
    socket.send_string(args.bar)
    msg = socket.recv_string()
    print(msg)
else:
    # server
    context = zmq.Context()
    socket = context.socket(zmq.REP)
    socket.bind('tcp://127.0.0.1:5555')
    while True:
        msg = socket.recv_string()
        if msg == 'zeromq':
            socket.send_string('ah ha!')
        else:
            socket.send_string('...nah')
爷的心禁止访问
#7 · 2019-01-03 02:48

Check out a cross-platform library/server called RabbitMQ. It might be too heavy for two-process communication, but if you need multi-process or multi-codebase communication (with various messaging patterns, e.g. one-to-many, queues, etc.), it is a good option.

Requirements:

$ pip install pika
$ pip install bson # for sending binary content
$ sudo apt-get install rabbitmq-server # ubuntu, see rabbitmq installation instructions for other platforms

Publisher (sends data):

import pika, time, bson, os

connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()
channel.exchange_declare(exchange='logs', exchange_type='fanout')

i = 0
while True:
    data = {'msg': 'Hello %s' % i, b'data': os.urandom(2), 'some': bytes(bytearray(b'\x00\x0F\x98\x24'))}
    channel.basic_publish(exchange='logs', routing_key='', body=bson.dumps(data))
    print("Sent", data)
    i = i + 1
    time.sleep(1)

connection.close()

Subscriber (receives data, can be multiple):

import pika, bson

connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()

channel.exchange_declare(exchange='logs', exchange_type='fanout')

result = channel.queue_declare(queue='', exclusive=True)
queue_name = result.method.queue

channel.queue_bind(exchange='logs', queue=queue_name)

def callback(ch, method, properties, body):
    data = bson.loads(body)
    print("Received", data)

channel.basic_consume(queue=queue_name, on_message_callback=callback, auto_ack=True)  # pika >= 1.0 signature
channel.start_consuming()

Examples based on https://www.rabbitmq.com/tutorials/tutorial-two-python.html
