We are developing a realtime app and we are using the nginx push stream module for the WebSockets part. First, data is sent from a client to a PHP script that does some authentication, stores the needed information in a database and then pushes the information to nginx, which later sends it to the subscribed users on specific sockets. Quite often there will be situations where more than 30 HTTP requests are made from this script to the local nginx (which I am not exactly sure is a bad thing?).
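Roughly, each publish is a plain HTTP POST from the PHP script to the push stream module's publish location, something like this (simplified sketch; the /pub location and the id parameter depend on our push_stream_publisher configuration, and $channels/$data stand in for the real values):

<?php
// Simplified sketch of the current publish step.
function push_to_nginx($channel, array $payload)
{
    $ch = curl_init('http://127.0.0.1/pub?id=' . urlencode($channel));
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($payload));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}

// Called 30+ times per run, once per subscriber channel.
foreach ($channels as $channel) {
    push_to_nginx($channel, $data);
}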
Question
Is it possible to send information from PHP to nginx without HTTP requests? Is there any way my PHP script can communicate with nginx? What is the best practice to handle this kind of communication? Is sending 30+ HTTP requests per PHP script run a good practice?
I have read about some AMQP solutions but haven't found information on nginx acting as a consumer of messages from RabbitMQ.
I will gladly provide any additional information if something is not clear.
I am assuming the following:
Current work flow:
- The user runs a PHP script from the command line, which communicates with a server-side script/CGI set up in Nginx using HTTP requests
- The server-side script/CGI in Nginx takes the incoming data, processes it and puts it in the database, or sends it out to the end user
OP concern:
Efficiency of the command-line PHP script communicating with the Nginx server-side script over HTTP, which may be overkill since the communication happens within the same server.
Proposal 1
- The command-line PHP script writes all information into file(s), then sends one HTTP request to the Nginx server-side CGI script (see the sketch after this list)
- The Nginx server-side CGI script, upon receiving the request, picks up all information from the file(s), then processes it
- ramfs (a RAM disk) can be used to minimize I/O to the physical HD
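A minimal sketch of Proposal 1, assuming the batch is written as JSON to a file under a ramfs mount; the paths and the /process endpoint are only illustrative:

<?php
// --- Command-line side: write everything to one file on the ramfs mount,
// --- then notify Nginx with a single HTTP request.
$batchFile = '/mnt/ramdisk/batch_' . uniqid() . '.json';   // illustrative ramfs path
file_put_contents($batchFile, json_encode($messages));     // $messages collected earlier

$ch = curl_init('http://127.0.0.1/process');               // illustrative endpoint
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('file' => $batchFile));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);

// --- Server side (the script behind /process): read the batch and handle each entry.
$entries = json_decode(file_get_contents($_POST['file']), true);
foreach ($entries as $entry) {
    // store in database / push to subscribers here
}
unlink($_POST['file']);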
Proposal 2
Combine your command-line PHP script into the Nginx server-side script, and create a web interface for it. Current command-line users would log in to a web page to control the process they used to drive with the command-line tool.
Pro: No more inter-script/inter-process communication. The whole workflow runs in one process. This may also be more scalable in the future, as multiple users can log in through the web interface and handle the process remotely. Additionally, they do not require OS-level accounts.
Con: May need more development time. (But you only have to maintain one code base instead of two.)
Why don't you consider using socket.io and Amazon SNS?
In our infrastructure, when we want to send a notification to a specific client subscribed to a socket.io channel, we publish a payload to an Amazon SNS topic. This payload has a "channel" attribute and the "message" to send to the client. Here is a snippet from our code that should be easy to follow:
$msg = array(
    'channel' => $receiver->getCometChannel(), // channel id of the client to send the message to
    'data'    => json_encode($payload)         // the message to send to the client
);

$client = $this->getSNSObject();
$client->publish(array(
    'TopicArn' => $topicArn,
    'Message'  => json_encode($msg)
));
We have a node.js script that creates an endpoint on port 8002 (http://your_ip:8002/receive). When Amazon SNS receives a payload from the PHP backends, it forwards the payload to that endpoint, and then the only thing left to do is process the payload and send the message to the corresponding client via socket.io. Here is the node.js script:
var fs = require('fs');
var options = {
    pfx: fs.readFileSync('/etc/ssl/certificate.pfx') // optional, for SSL support for socket.io (unused below)
};

var io = require('socket.io')(8001);

// open the socket connection
io.sockets.on('connection', function(socket) {
    socket.on('subscribe', function(data) { socket.join(data.channel); });
    socket.on('unsubscribe', function(data) { socket.leave(data.channel); });
    socket.on('message', function(data) {
        io.sockets.in(data.channel).emit('message', data.message);
    });
});

// HTTP endpoint that Amazon SNS posts to
var http = require('http');
http.createServer(function(req, res) {
    if (req.method === 'POST' && req.url === '/receive') {
        return client(req, res);
    }
    res.writeHead(404);
    res.end('Not found.');
}).listen(8002);

// parse the SNS payload and relay it to the corresponding socket.io channel
var SNSClient = require('aws-snsclient');
var client = SNSClient(function(err, message) {
    try {
        var body = JSON.parse(message.Message);
        var channel = body.channel, data = body.data;
        console.log(channel);
        io.sockets.in(channel).emit('message', {channel: channel, data: data});
    } catch (e) {
        console.log(e);
    }
});
Maybe it seems complicated, but I think the idea is clear.
Let me answer step by step:
- Is sending 30+ HTTP requests per PHP script a good practice?
It's not a problem as long as you are satisfied with the speed. There are two potential problems with that solution:
a. the overhead of re-establishing the HTTP connection for each request;
b. when the number of concurrent requests reaches its maximum, nginx may drop some of your requests;
- What is the best practice to handle this kind of communication?
As you said, the best practice is to use a queue interface. But I am not sure whether there is a way to handle it on the nginx side (I am not familiar with the technology you are using there).
You can also use a long-polling connection to send requests to nginx, which will reduce the latency from problem (a), but it can introduce some new problems of its own.
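If you stay with plain HTTP publishing, you can at least soften problem (a) by reusing a single keep-alive connection for all 30+ publishes instead of opening a new one per request. A rough sketch, assuming the usual /pub?id= publish location of the push stream module:

<?php
// Reuse one curl handle so all publishes share the same keep-alive connection.
$ch = curl_init();
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

foreach ($messages as $channel => $payload) {   // $messages is illustrative: channel => payload
    curl_setopt($ch, CURLOPT_URL, 'http://127.0.0.1/pub?id=' . urlencode($channel));
    curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($payload));
    curl_exec($ch);                              // the TCP connection is kept open between calls
}

curl_close($ch);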
What about using PHP-FPM connected to Nginx over Unix domain sockets using the FastCGI protocol? That's the fastest way to do IPC between Nginx and PHP; there is very little I/O overhead compared to an Internet socket.
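For illustration, the Nginx side of that setup is just a fastcgi_pass pointing at the PHP-FPM Unix socket; the socket path below is only an example and has to match the listen directive in your PHP-FPM pool configuration:

# Example location block; /var/run/php-fpm.sock must match PHP-FPM's "listen" setting.
location ~ \.php$ {
    include        fastcgi_params;
    fastcgi_param  SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass   unix:/var/run/php-fpm.sock;
}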
Another solution we tried before is deploying an ejabberd server (you can customize it) and writing a small JavaScript client using the Strophe library. http://blog.wolfspelz.de/2010/09/website-chat-made-easy-with-xmpp-and.html?m=1 is a good blog post about the topic. If you want to develop a chat application, I would go for this option.
Another advantage: your users can also use XMPP clients to connect to your chat platform.