Using Node.js as the TCP server, I'm going to manage a relatively large number of GPS devices (~3000 devices). As a first step I just want to store the incoming data in the database, but even at this point I expect some performance problems, and I would like to catch them before they bite me.
1 - Looking at similar servers written in languages such as Java or Ruby, I see code like the following:
Java
Thread serverThread = new Thread(() -> {
    System.out.println("Listening to server port 9000");
    while (true) {
        try {
            Socket socket = serverSocket.accept();
            ...
Ruby

require 'socket'

server = TCPServer.new("127.0.0.1", 8080)
loop do
  Thread.start(server.accept) do |client|
    ...
It seems that they spawn a separate thread for each device (socket) that connects to the TCP server. Since Node.js is single-threaded and works asynchronously, should I worry about handling incoming connections, or will a simple approach like the following satisfy a large number of simultaneous connections?
net.createServer(function(device) {
  device.on('data', function(data) {
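To make this more concrete, my understanding so far is that a single Node.js process can hold thousands of mostly idle sockets, so the server I have in mind would look roughly like the sketch below. The port, the ASCII encoding, and the newline-delimited framing are assumptions just for illustration; real trackers usually define their own binary framing.

const net = require('net');

const server = net.createServer((device) => {
  device.setKeepAlive(true, 60000); // notice devices that drop off silently

  let buffer = '';
  device.on('data', (chunk) => {
    buffer += chunk.toString('ascii');
    let end;
    // split the raw byte stream into complete frames before parsing
    while ((end = buffer.indexOf('\n')) !== -1) {
      const frame = buffer.slice(0, end);
      buffer = buffer.slice(end + 1);
      // parse `frame` and hand it off for storage here
    }
  });

  device.on('error', (err) => {
    // an unhandled 'error' event on any socket would crash the whole process
    console.error('device socket error:', err.message);
  });
});

server.listen(9000, () => console.log('Listening on port 9000'));

My assumption is that the event loop only becomes a problem if the per-message work is CPU-heavy; in that case I could presumably use the cluster module to spread connections across cores.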
2 - Should I limit connections to the database by using a connection pool? Given that the database is also queried on the other side for GIS and monitoring, how large should the pool be?
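The way I picture it, a pool would mean that 3000 devices never turn into 3000 database connections; writes just queue up over a small fixed set of connections. Here is a minimal sketch, assuming PostgreSQL/PostGIS and the node-postgres (pg) package; the pool size, table name, and column names are placeholders I made up for illustration.

const { Pool } = require('pg');

// one pool per process; `max` is a tuning knob, not a magic number:
// start near the database's core count and measure
const pool = new Pool({
  host: '127.0.0.1',        // assumed connection details
  database: 'tracking',     // hypothetical database name
  max: 10,                  // upper bound on simultaneous connections from this process
  idleTimeoutMillis: 30000,
});

// hypothetical helper, called once per parsed GPS frame
async function storePosition(deviceId, lat, lon, recordedAt) {
  await pool.query(
    `INSERT INTO positions (device_id, geom, recorded_at)
     VALUES ($1, ST_SetSRID(ST_MakePoint($3, $2), 4326), $4)`,
    [deviceId, lat, lon, recordedAt]
  );
}

From what I have read, the right pool size depends more on the database's capacity and on the other GIS/monitoring clients than on the number of devices, so I would start small and batch inserts rather than simply enlarging the pool.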
3 - How can I use caching (for example, with Redis) in such a system?
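My current thinking is to cache each device's last known position in Redis so that dashboards and map views do not have to hit the database on every refresh, while the full history still goes to the database. A minimal sketch, assuming the node-redis (v4) client; the key layout and the TTL are assumptions.

const { createClient } = require('redis');

const redis = createClient({ url: 'redis://127.0.0.1:6379' }); // assumed address
redis.on('error', (err) => console.error('redis error:', err));
redis.connect().catch((err) => console.error('redis connect failed:', err));

// keep only the latest known position per device in the cache
async function cacheLastPosition(deviceId, position) {
  await redis.set(
    `device:${deviceId}:last`,   // hypothetical key layout
    JSON.stringify(position),
    { EX: 3600 }                 // let the key expire if the device goes silent
  );
}

// monitoring / map views could read this instead of querying the database
async function getLastPosition(deviceId) {
  const raw = await redis.get(`device:${deviceId}:last`);
  return raw ? JSON.parse(raw) : null;
}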
It would be great if someone could shed light on these questions. I would also appreciate hearing about any other performance issues you have run into or expect when implementing such systems. Thanks.