Scaling Socket.IO with Redis

I am currently creating a horizontally scalable socket.io server that looks like this:

LoadBalancer (nginx) → Proxy1, Proxy2, Proxy3, …, Proxy{N} → BackEnd1, BackEnd2, BackEnd3, BackEnd4, …, BackEnd{N}

My question is: with the socket.io-redis module, can I send a message to a specific socket connected to one of the proxy servers from one of the backend servers, if they are all connected to the same Redis server? If so, how do I do it?

2 answers

Since you want to scale the socket.io server and you are using nginx as the load balancer, don't forget to set up sticky load balancing. Without it, the load balancer may route a single client's successive requests to different servers, which breaks the socket.io connection handshake. So sticky load balancing is strongly recommended.
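As a sketch, sticky balancing in nginx can be done with the `ip_hash` directive; the upstream names, port, and location are assumptions for illustration, not taken from the question:

```nginx
upstream socket_nodes {
    # ip_hash pins each client IP to the same backend, which is
    # what socket.io's multi-request handshake needs
    ip_hash;
    server backend1:3000;
    server backend2:3000;
}

server {
    listen 80;
    location /socket.io/ {
        proxy_pass http://socket_nodes;
        # headers required to proxy the WebSocket upgrade
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```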

With the socket.io-redis adapter, you can send and receive messages across one or more socket.io servers using Redis' Pub/Sub implementation.
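A minimal sketch of the adapter setup, assuming the classic `socket.io-redis` adapter API and a Redis instance on localhost; the host, port, event name, and helper function are illustrative assumptions:

```javascript
const io = require("socket.io")(3000);
const redisAdapter = require("socket.io-redis");

// Every socket.io server process points at the same Redis instance;
// the adapter relays emits between processes over Redis Pub/Sub.
io.adapter(redisAdapter({ host: "localhost", port: 6379 }));

// Each socket id is also a room, so emitting to that room reaches
// the specific client no matter which server it is connected to.
function sendToSocket(socketId, message) {
  io.to(socketId).emit("private-message", message);
}
```

Any backend process attached to the same Redis instance can call `sendToSocket`, and the adapter delivers the event to whichever server actually holds that client's connection.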

If you tell me which technology is used for Proxy and Backend, I will give you more information about this.


Using the socket.io-redis module, all of your backend servers will share the same pool of connected users. You can emit from Backend1, and if the client is connected to Backend4, it will receive the message.

The key to making this work with Socket.IO, though, is to use sticky sessions on nginx so that once a client connects, it stays on the same machine. This is because a socket.io session starts with several HTTP long-polling requests before upgrading to a WebSocket, and all of those requests must reach the same backend server for the handshake to work properly.

Instead of using sticky sessions, you can change the client's connection options to use only WebSockets. This avoids the problem of requests landing on multiple servers, since there will be only one connection, a single WebSocket. The trade-off is that your application loses the ability to fall back to long polling when WebSockets are unavailable.
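On the client side, forcing WebSocket-only transport is a one-line option; the server URL below is a placeholder:

```javascript
// Skip the long-polling handshake entirely and connect straight
// over WebSocket. Note: clients behind proxies that block
// WebSockets will fail to connect, since there is no polling fallback.
const socket = io("https://example.com", {
  transports: ["websocket"]
});
```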


Source: https://habr.com/ru/post/1237903/
