What is the most efficient library/method for communication between node.js processes?

We have a few node.js processes that need to be able to pass messages to each other. What is the most efficient way to do this? What about using node_redis pub/sub?

EDIT: processes can run on different machines

+42
javascript redis interprocess
6 answers

If you want to send messages from one machine to another and don't care about delivery callbacks, then Redis pub/sub is the best solution. It is really easy to implement, and Redis is very fast.

First you need to install Redis on one of your machines.

It is really easy to connect to Redis:

```javascript
var client = require('redis').createClient(redis_port, redis_host);
```

But do not forget to open the Redis port in your firewall!

Then you need to subscribe each machine to some channel:

```javascript
client.on('ready', function() {
  return client.subscribe('your_namespace:machine_name');
});

client.on('message', function(channel, json_message) {
  var message = JSON.parse(json_message);
  // do whatever you want with the message
});
```

You can skip your_namespace and use the global namespace, but you'll regret it soon.

It is also very easy to send messages:

```javascript
var send_message = function(machine_name, message) {
  return client.publish('your_namespace:' + machine_name, JSON.stringify(message));
};
```

If you want to send different types of messages, you can use pmessages instead of messages:

```javascript
client.on('ready', function() {
  return client.psubscribe('your_namespace:machine_name:*');
});

client.on('pmessage', function(pattern, channel, json_message) {
  // pattern === 'your_namespace:machine_name:*'
  // channel === 'your_namespace:machine_name:' + message_type
  var message = JSON.parse(json_message);
  var message_type = channel.split(':')[2];
  // do whatever you want with the message and message_type
});

send_message = function(machine_name, message_type, message) {
  return client.publish(
    ['your_namespace', machine_name, message_type].join(':'),
    JSON.stringify(message)
  );
};
```

It is best practice to name your processes (or machines) after their functionality (for example, 'send_email'). In that case a process (or machine) can subscribe to more than one channel if it implements several functions.

In fact, you can build bidirectional communication on top of Redis. But it is more complicated, because each request has to carry a unique reply channel name so that the callback can be delivered without losing context.

So my conclusion is this: use Redis if you need "fire and forget" messaging; explore other solutions if you need full bidirectional communication.

+34
Sep 16 '12 at 15:51

Why not use ZeroMQ/0MQ for IPC? Redis (a database) is overkill for something as simple as IPC.

Quoting the guide:

ØMQ (also known as ZeroMQ, 0MQ, or zmq) looks like an embeddable networking library but acts like a concurrency framework. It gives you sockets that carry atomic messages across various transports like in-process, inter-process, TCP, and multicast. You can connect sockets N-to-N with patterns like fan-out, pub-sub, task distribution, and request-reply. It's fast enough to be the fabric for clustered products. Its asynchronous I/O model gives you scalable multicore applications, built as asynchronous message-processing tasks.

The advantage of using 0MQ (or even vanilla sockets via the net library in Node core, minus all the features a 0MQ socket provides) is the absence of a master process. Its brokerless setup is the best fit for the scenario you describe. If you are just pushing messages to different nodes from one central process, you can use the PUB/SUB socket in 0mq (it also supports IP multicast via PGM/EPGM). Apart from that, 0mq also provides various other socket types (PUSH/PULL/XREP/XREQ/ROUTER/DEALER) with which you can create custom devices.
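If you go the vanilla-socket route mentioned above, note that one thing 0MQ gives you for free is message boundaries: raw TCP is a byte stream, so you have to add framing yourself. A minimal newline-delimited JSON sketch (function names are illustrative, not from any library):

```javascript
// Turn a message object into one framed line on the wire.
function frame(message) {
  return JSON.stringify(message) + '\n';
}

// Feed received chunks in; returns the complete messages parsed so far.
// `state.buffer` carries a partial line across chunk boundaries.
function unframe(state, chunk) {
  state.buffer += chunk;
  var lines = state.buffer.split('\n');
  state.buffer = lines.pop(); // last element is the incomplete remainder
  return lines.map(JSON.parse);
}

// With a real socket from Node's core `net` module you would write:
//   socket.write(frame({ type: 'ping' }));
//   socket.on('data', function(chunk) {
//     unframe(state, chunk).forEach(handleMessage);
//   });
```

This is the kind of bookkeeping that 0MQ's atomic-message sockets make unnecessary.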

Start with this excellent guide: http://zguide.zeromq.org/page:all

For 0MQ 2.x:

http://github.com/JustinTulloss/zeromq.node

For 0MQ 3.x (fork of the above module that supports PUBLISHER-side filtering for PUBSUB):

http://github.com/shripadk/zeromq.node

+29
Sep 17 '12 at 17:16

More than 4 years after the question was asked, there is now a module for inter-process communication called node-ipc. It supports unix/windows sockets for communication on the same machine, as well as TCP, TLS, and UDP, and claims that at least the sockets, TCP, and UDP transports are stable.

Here is a small example taken from the documentation in the github repository:

Server for Unix Sockets, Windows Sockets, and TCP Sockets

```javascript
var ipc = require('node-ipc');

ipc.config.id = 'world';
ipc.config.retry = 1500;

ipc.serve(function() {
  ipc.server.on('message', function(data, socket) {
    ipc.log('got a message : '.debug, data);
    ipc.server.emit(socket, 'message', data + ' world!');
  });
});

ipc.server.start();
```

Client for Unix and TCP Sockets

```javascript
var ipc = require('node-ipc');

ipc.config.id = 'hello';
ipc.config.retry = 1500;

ipc.connectTo('world', function() {
  ipc.of.world.on('connect', function() {
    ipc.log('## connected to world ##'.rainbow, ipc.config.delay);
    ipc.of.world.emit('message', 'hello');
  });
  ipc.of.world.on('disconnect', function() {
    ipc.log('disconnected from world'.notice);
  });
  ipc.of.world.on('message', function(data) {
    ipc.log('got a message from world : '.debug, data);
  });
});
```

I'm currently evaluating this module as a replacement for our old local IPC solution via stdin/stdout (it may handle remote IPC for us in the future as well). Maybe I will expand my answer when I am done, with more information about how and how well this module works.

+22
Nov 26 '15 at 21:45

I would start with the built-in functionality provided by node.
You can use process signals, for example:

```javascript
process.on('SIGINT', function() {
  console.log('Got SIGINT. Press Control-D to exit.');
});
```

This event is:

Emitted when the process receives a signal. See sigaction(2) for a list of standard POSIX signal names such as SIGINT, SIGUSR1, etc.

Beyond signals, you can spawn a child process and attach to its message event to send and receive messages. When using child_process.fork() you can write to the child using child.send(message, [sendHandle]), and messages are received by the child's 'message' event.

Alternatively, you can use cluster. The cluster module allows you to easily create a network of processes that all share server ports.

```javascript
var cluster = require('cluster');
var http = require('http');
var numCPUs = require('os').cpus().length;

if (cluster.isMaster) {
  // Fork workers.
  for (var i = 0; i < numCPUs; i++) {
    cluster.fork();
  }
  cluster.on('exit', function(worker, code, signal) {
    console.log('worker ' + worker.process.pid + ' died');
  });
} else {
  // Workers can share any TCP connection
  // In this case it is an HTTP server
  http.createServer(function(req, res) {
    res.writeHead(200);
    res.end("hello world\n");
  }).listen(8000);
}
```

For third-party services, you can check out hook.io, signals, and bean.

+4
Sep 12 '12 at 12:11

Take a look at node-messenger:

https://github.com/weixiyen/messenger.js

It will easily fit most needs (pub/sub ... fire and forget ... send/request) with automatically maintained connections.

+2
May 27 '13 at 11:48

We are working on a multi-process node application that needs to handle a large number of cross-process messages in real time.

First we tried redis-pub-sub, which did not meet the requirements.

Then we tried TCP sockets, which were better, but still not the best.

So we switched to UDP datagrams, which are much faster.

Here is the code repo, just a few lines of code: https://github.com/SGF-Games/node-udpcomm

+1
Sep 29


