Scalable socket event queue processing

My C# class needs to handle a large volume of events received over a TCP stream socket connection. The rate of event messages the class's socket receives from the TCP server is highly variable. For example, sometimes it receives only one event message every ten seconds, while at other times it receives sixty event messages per second.

I am using Socket.ReceiveAsync to receive messages. ReceiveAsync returns true if the receive operation is pending, or false if data was already available and the operation completed synchronously. When the operation is pending, the Socket calls my callback on an I/O completion port (IOCP) thread; otherwise I invoke my callback myself on the current thread. Mixed in with the event messages, I also receive responses to commands sent to this TCP server. Response messages are processed immediately and individually, each one dispatched to a worker thread.
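
Roughly, this is the receive path I have (a minimal sketch, not my actual code; the class name, the 8 KB buffer, and the HandleBytes helper are placeholders):

```csharp
using System;
using System.Net.Sockets;

// Sketch of the ReceiveAsync pattern described above. Receiver, HandleBytes
// and the 8 KB buffer are illustrative placeholders.
class Receiver
{
    private readonly Socket _socket;

    public Receiver(Socket connectedSocket)
    {
        _socket = connectedSocket;
        var args = new SocketAsyncEventArgs();
        args.SetBuffer(new byte[8192], 0, 8192);
        args.Completed += ReceiveCompleted;     // fires only for async completions
        StartReceive(args);
    }

    private void StartReceive(SocketAsyncEventArgs args)
    {
        // true  = operation is pending; Completed will fire on an IOCP thread.
        // false = data was already available; the call completed synchronously,
        //         Completed will NOT fire, so we handle it here ourselves.
        bool pending = _socket.ReceiveAsync(args);
        if (!pending)
            ProcessReceive(args);
    }

    private void ReceiveCompleted(object sender, SocketAsyncEventArgs args)
    {
        // Called by the Socket on an I/O completion port thread.
        ProcessReceive(args);
    }

    private void ProcessReceive(SocketAsyncEventArgs args)
    {
        if (args.SocketError != SocketError.Success || args.BytesTransferred == 0)
        {
            _socket.Close();                    // peer closed or error
            return;
        }

        HandleBytes(args.Buffer, args.Offset, args.BytesTransferred);
        StartReceive(args);                     // post the next receive
    }

    private void HandleBytes(byte[] buffer, int offset, int count)
    {
        // Split the bytes into event messages / command responses here.
    }
}
```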

However, I would like to queue the event messages until I have enough of them (N), or until there are no more waiting on the wire, and then dispatch a worker thread to process that batch of event messages. In addition, I want all events to be processed sequentially, so only one worker thread should be working on them at a time.
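
Something along these lines is what I have in mind for the batching (just a sketch; EventBatcher, BatchThreshold, and the moreDataPending flag are names I made up for illustration):

```csharp
using System.Collections.Concurrent;
using System.Threading;

// Queue messages and drain them in batches, on at most one worker at a time.
// BatchThreshold ("N") and all member names are illustrative.
class EventBatcher
{
    private const int BatchThreshold = 32;
    private readonly ConcurrentQueue<byte[]> _queue = new ConcurrentQueue<byte[]>();
    private int _workerRunning;                 // 0 = idle, 1 = draining

    // moreDataPending would be true when the previous ReceiveAsync completed
    // synchronously (data was already buffered), and false when it went
    // asynchronous (the wire has gone quiet for the moment).
    public void Enqueue(byte[] message, bool moreDataPending)
    {
        _queue.Enqueue(message);

        // Dispatch when the batch is full, or when nothing more is waiting
        // on the wire.
        if (_queue.Count >= BatchThreshold || !moreDataPending)
            TryStartWorker();
    }

    private void TryStartWorker()
    {
        // Only one drain loop runs at a time, so events stay in order.
        if (Interlocked.CompareExchange(ref _workerRunning, 1, 0) == 0)
            ThreadPool.QueueUserWorkItem(_ => Drain());
    }

    private void Drain()
    {
        try
        {
            while (_queue.TryDequeue(out byte[] message))
                ProcessEvent(message);          // strictly sequential
        }
        finally
        {
            Interlocked.Exchange(ref _workerRunning, 0);

            // A message may have been enqueued after the last TryDequeue but
            // before the flag was cleared; re-check so it is not stranded.
            if (!_queue.IsEmpty)
                TryStartWorker();
        }
    }

    private void ProcessEvent(byte[] message)
    {
        // Copy into the event object, raise the event, return the buffer
        // to the ring buffer pool.
    }
}
```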

Processing an event message only involves copying the message buffer into an object, raising the event, and then returning the message buffer to the ring buffer pool. So my question is: what do you think is the best strategy for this?

Do you need more information? Let me know. Thanks!!

+4
1 answer

I would not call 60 events per second high volume. At such a low level of activity, just about any approach to socket processing will be fine. I have handled 5,000 events per second on a single thread, using hardware far less capable than what is available today, just using select.

I will say that if you want to scale, handing messages between threads one at a time will be a disaster. You need to batch them, or the context switches will kill your performance.
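
For example, rather than queuing one work item per message, let a single consumer wake up and drain everything that has accumulated (a rough sketch; the names and the batch size of 64 are illustrative):

```csharp
using System.Collections.Concurrent;
using System.Collections.Generic;

// One consumer thread pulls messages in batches, so a single wake-up
// (one context switch) covers many messages instead of one each.
class BatchedConsumer
{
    private readonly BlockingCollection<byte[]> _messages = new BlockingCollection<byte[]>();

    public void Post(byte[] message) => _messages.Add(message);

    // Run this on one dedicated thread.
    public void Run()
    {
        var batch = new List<byte[]>(64);
        foreach (byte[] first in _messages.GetConsumingEnumerable())
        {
            batch.Add(first);

            // Grab whatever else is already queued without blocking.
            while (batch.Count < 64 && _messages.TryTake(out byte[] next))
                batch.Add(next);

            ProcessBatch(batch);
            batch.Clear();
        }
    }

    private void ProcessBatch(List<byte[]> batch)
    {
        // Handle the messages sequentially here.
    }
}
```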

+5

Source: https://habr.com/ru/post/1276914/
