We are developing a realtime app and are using the nginx push stream module for the websockets part. First, data is sent from a client to a PHP script that does some authentication, stores the needed information in a database, and then pushes the information to nginx, which later sends it to subscribed users on specific sockets. Quite often there will be situations where this script makes more than 30 HTTP requests to the local nginx (which I am not sure is a bad thing?).
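For reference, the nginx side looks roughly like the following minimal sketch, assuming the standard publisher/subscriber locations of the push stream module (channel naming, ports, and paths are illustrative, not our exact config):

```nginx
http {
    push_stream_shared_memory_size 32m;

    server {
        listen 80;

        # The PHP script publishes with e.g. POST http://127.0.0.1/pub?id=<channel>
        location /pub {
            push_stream_publisher admin;
            push_stream_channels_path $arg_id;
        }

        # Browsers subscribe over websockets on /ws/<channel>
        location ~ /ws/(.*) {
            push_stream_subscriber websocket;
            push_stream_channels_path $1;
        }
    }
}
```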
Question
Is it possible to send information from PHP to nginx without HTTP requests? Is there any way my PHP script can communicate with nginx? What is the best practice for handling this kind of communication? Is sending 30+ HTTP requests per PHP script a good practice?
I have read about some AMQP solutions but haven't found any information on nginx being a consumer of messages from RabbitMQ.
I will gladly provide any additional information if something is not clear.
Let me answer step by step:
It's not a problem as long as you are satisfied with the speed. You have two potential problems with that solution:
As you said, the best practice is to use a queue interface. But I am not sure whether there is a way to handle it on the nginx side (I am not clear on the technology you are using there).
You can also use a long-polling connection to send requests to nginx, which will decrease the latency from problem (a), but it may introduce some new problems.
I am assuming the following:
Current work flow:
OP concern:
Efficiency of the command-line PHP script communicating with the Nginx server-side script over HTTP, which may be overkill since the communication happens within the same server.
Proposal 1
Proposal 2
Combine your command-line PHP script into the Nginx server-side script and create a web interface for it. Current command-line users will log in to the webpage to control the process they used to run with the command-line tool.
Pro: No more inter-script/inter-process communication. The whole workflow is in one process. This may also be more scalable in the future, as multiple users can log in through the web interface and handle the process remotely. Additionally, they do not require OS-level accounts.
Con: May need more development time. (But you only have to maintain one code base instead of two.)
What about using PHP-FPM connected to Nginx over Unix domain sockets using the FastCGI protocol? That's the fastest way to do IPC between Nginx and PHP; there is very little I/O overhead compared to an Internet socket.
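A minimal sketch of that wiring, assuming PHP-FPM is configured to listen on a Unix socket (the socket path and document root are illustrative):

```nginx
upstream php_backend {
    # PHP-FPM pool configured with: listen = /run/php/php-fpm.sock
    server unix:/run/php/php-fpm.sock;
}

server {
    listen 80;
    root /var/www/app;

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        # FastCGI over the Unix domain socket instead of 127.0.0.1:9000
        fastcgi_pass php_backend;
    }
}
```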
Why don't you consider using socket.io and Amazon SNS?
In our infrastructure, when we want to send a notification to a specific client subscribed on a socket.io channel, we send a payload to an Amazon SNS topic. This payload has a "channel" attribute and the "message" to send to the client. Here is just a snippet from our code that's easy to understand:
We have a node.js script that creates an endpoint on port 8002 (http://your_ip:8002/receive). When Amazon SNS receives a payload from the PHP backends, it forwards that payload to this endpoint, and then the only thing left to do is process the payload and send the message to the corresponding client via socket.io. Here is the node.js script:
Maybe it seems complicated, but I think the idea is clear.
Another solution we tried before was deploying an ejabberd server (you can customize it) and writing a small JavaScript client using the Strophe library. http://blog.wolfspelz.de/2010/09/website-chat-made-easy-with-xmpp-and.html?m=1 is a good blog post about the topic. If you want to develop a chat application, I would go for this option.
Another advantage: your users can also use XMPP clients to connect to your chat platform.