Parallel socket.io?

Half of the site's functionality runs over socket.io, and a user is connected to socket.io wherever he is on the site. But since it all ends up running in a single thread, performance drops sharply with a large number of connections.

Is there a way to spread the users across several threads/processes? (Say, 6 workers, each handling 1/6 of the users.)
All of them would need access to shared memory (a handful of arrays: one with constant data, one that changes on its own over time, and one that changes with the connections (sockets)).
July 2nd 19 at 13:32
2 answers
July 2nd 19 at 13:34
Solution
Maybe just run Node as multiple processes using Cluster, or a good wrapper around it, PM2.

That is, if I understood the task correctly.
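For illustration, a minimal sketch of that approach (the port, the worker count of 6, and the socket.io v2-style initialization are assumptions, not from the thread): the master forks the workers, and each worker runs the existing socket.io code.

```js
const cluster = require('cluster');

if (cluster.isMaster) {
  // Master process: fork a fixed number of workers (6, as in the question).
  for (let i = 0; i < 6; i++) {
    cluster.fork();
  }
  cluster.on('exit', () => cluster.fork()); // restart a worker if it dies
} else {
  // Worker process: the existing socket.io code goes here.
  const server = require('http').createServer();
  const io = require('socket.io')(server);

  io.on('connection', (socket) => {
    // existing event handlers...
  });

  server.listen(3000); // the master shares this port among the workers
}
```

Keep in mind that each worker is a separate process with its own memory, so the arrays from the question are not automatically shared between them; that part is what the Redis suggestion in the second answer addresses.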
Thank you, my good man. I didn't know Cluster existed.

As I understand it, you take the code from the cluster example, put all the socket code into the "else" branch (the worker part), and then just start it with pm2, for example with a config like the sketch below.
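For reference, PM2 can also fork the workers itself in cluster mode via an ecosystem file; the file name, app name, and entry script below are assumptions, not from the thread.

```js
// ecosystem.config.js (hypothetical names) -- started with `pm2 start ecosystem.config.js`
module.exports = {
  apps: [{
    name: 'site',
    script: 'app.js',      // the entry file containing the socket.io code
    instances: 6,          // six processes, one per 1/6 of the users
    exec_mode: 'cluster'   // PM2 uses Node's cluster module under the hood
  }]
};
```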

Just one question remains. Will the workers share one memory space for the third array? (It's an array of users from the database, queried once when a user logs on to the site; after that the user's socket.id is simply added to that clients array, and on disconnect the socket.id is removed again. The user's data from the array is then sent to the sockets listed for that client. This is done to reduce the number of database queries and the memory consumed, and to avoid the bad case of re-requesting stale data from the database when one user has several tabs open.) - Birdie.Heathcote commented on July 2nd 19 at 13:37
About that first question, see Stack Overflow. - emmet.Langosh35 commented on July 2nd 19 at 13:40
Thank you - Birdie.Heathcote commented on July 2nd 19 at 13:43
July 2nd 19 at 13:36
About the processes, the previous answer said it well.
About sharing data between child processes: read up on Redis, for example.
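As a rough sketch of that idea, assuming the ioredis client and a clients:<userId> key scheme (both assumptions, not from the thread): each worker reads and writes the shared user/socket data in Redis instead of keeping a per-process array.

```js
// Hypothetical sketch: keep the per-user socket list and cached user data in
// Redis so every worker process sees the same state. Uses the ioredis package.
const Redis = require('ioredis');
const redis = new Redis(); // defaults to 127.0.0.1:6379

// When a socket connects, remember its id under the user's key.
async function addSocket(userId, socketId) {
  await redis.sadd(`clients:${userId}:sockets`, socketId);
}

// When a socket disconnects, remove it; drop the cached data when no tabs remain.
async function removeSocket(userId, socketId) {
  await redis.srem(`clients:${userId}:sockets`, socketId);
  const remaining = await redis.scard(`clients:${userId}:sockets`);
  if (remaining === 0) {
    await redis.del(`clients:${userId}:data`);
  }
}

// Cache the user's DB record once, on login, to avoid repeated queries.
async function cacheUserData(userId, data) {
  await redis.set(`clients:${userId}:data`, JSON.stringify(data));
}

async function getUserData(userId) {
  const raw = await redis.get(`clients:${userId}:data`);
  return raw ? JSON.parse(raw) : null;
}
```

If events also have to reach sockets held by other worker processes, the socket.io-redis adapter is commonly used alongside this kind of setup.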

Find more questions by tags: Socket.io, Node.js, JavaScript