Processing a work queue asynchronously in C#

I am trying to create a system that processes a work queue. The system has the following characteristics:

  • The system consists of two components: a work assigner and workers.
  • There is a fixed upper limit on the number of workers running simultaneously, and this limit is greater than one.
  • To avoid problems with the same task being performed twice, there is only one work assignor.

What design would you use to create such a system? That's what I think:

  • Create a collection of queues, one queue per worker.
  • Create a timer for the work assigner; its job is to fill the queues.
  • Create a timer for each worker, passing its queue as the state object to represent its workload.
  • Lock the queues while adding to or removing from them.
  • Use a counter, incremented and decremented under a lock, to ensure that no more than the allowed number of work items run at a time.

I feel that there must be a better way to do this. What would you suggest? Should I switch from timers to worker threads? Should the threads just spin/wait while the queue is empty? Should the threads shut down, and should the assigner conditionally create new ones?
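For what it's worth, the counter-based design described above can be sketched roughly as follows. This is only an illustration of the idea in the question (a single lock-protected queue plus a capped counter); the class and member names are invented for the example:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Sketch of the question's idea: a lock-protected queue plus a counter
// that caps how many work items are in flight at once.
class BoundedDispatcher
{
    private readonly Queue<Action> _queue = new Queue<Action>();
    private readonly object _lock = new object();
    private readonly int _maxWorkers;
    private int _active;

    public BoundedDispatcher(int maxWorkers) { _maxWorkers = maxWorkers; }

    public void Enqueue(Action work)
    {
        lock (_lock) { _queue.Enqueue(work); }
        TryDispatch();
    }

    private void TryDispatch()
    {
        Action work = null;
        lock (_lock)
        {
            // Only dispatch if we are under the concurrency cap.
            if (_active < _maxWorkers && _queue.Count > 0)
            {
                _active++;
                work = _queue.Dequeue();
            }
        }
        if (work == null) return;
        ThreadPool.QueueUserWorkItem(_ =>
        {
            try { work(); }
            finally
            {
                lock (_lock) { _active--; }
                TryDispatch(); // see whether more work is waiting
            }
        });
    }
}
```

As the answers below point out, much of this bookkeeping can be replaced by the concurrent collections that ship with .NET 4.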

+4
3 answers

I don't know how long your tasks will run, but it sounds like the ThreadPool would be the best fit. Moreover, I would use (and actually do use) just one central queue; that alone eliminates some of the complexity. I have a single thread that services the queue and performs an action on each item; in your case, that would be your task queue.

As for making the queue safe for concurrent access, there is a ConcurrentQueue in System.Collections.Concurrent for exactly this purpose (msdn).

Now wrap it in a BlockingCollection (msdn) and you have everything you need.

    BlockingCollection<Packet> sendQueue =
        new BlockingCollection<Packet>(new ConcurrentQueue<Packet>());

    while (true)
    {
        var packet = sendQueue.Take(); // this blocks if there are no items in the queue
        ThreadPool.QueueUserWorkItem(state =>
        {
            var data = (Packet)state;
            // do whatever you have to do
        }, packet);
    }

and somewhere else there is something that calls sendQueue.Add(packet);
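To round out the answer's snippet, here is one way the producer side and a clean shutdown might look. This is a sketch, not part of the original answer: Packet's contents are invented, and GetConsumingEnumerable/CompleteAdding are the standard BlockingCollection idioms for looping until the producer signals it is done:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Packet { public int Id; }

class Program
{
    public static void Main()
    {
        var sendQueue = new BlockingCollection<Packet>(new ConcurrentQueue<Packet>());

        // Consumer: GetConsumingEnumerable blocks while the queue is empty
        // and exits cleanly once CompleteAdding has been called.
        var consumer = Task.Run(() =>
        {
            foreach (var packet in sendQueue.GetConsumingEnumerable())
            {
                Console.WriteLine("processing packet {0}", packet.Id);
            }
        });

        // Producer: elsewhere in the code, items are added.
        for (int i = 0; i < 3; i++)
            sendQueue.Add(new Packet { Id = i });

        sendQueue.CompleteAdding(); // no more items; lets the consumer finish
        consumer.Wait();
    }
}
```

Using GetConsumingEnumerable instead of a bare `while (true) { Take(); }` loop avoids the InvalidOperationException that Take throws once the collection is marked complete.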

Summarizing,

  • One queue for all "workers".
  • One thread that takes items from the queue and hands them to the ThreadPool.

I think that's it.

P.S. If you need to control the number of threads, use the Smart Thread Pool, as suggested by josh3736.
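If you want a hard cap on concurrency without bringing in a third-party pool, one common alternative (a sketch of my own, not from the answer) is to start exactly N consumer threads on the same BlockingCollection. Take and GetConsumingEnumerable are thread-safe, so each item is handed to exactly one worker and nothing is processed twice:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

class Program
{
    public static int Handled; // exposed only so the example is observable

    public static void Main()
    {
        const int maxWorkers = 4; // the hard upper limit on concurrent work
        var queue = new BlockingCollection<string>(new ConcurrentQueue<string>());

        // Start exactly maxWorkers consumer threads on the same collection.
        var workers = new Thread[maxWorkers];
        for (int i = 0; i < maxWorkers; i++)
        {
            workers[i] = new Thread(() =>
            {
                // GetConsumingEnumerable is thread-safe: each item goes
                // to exactly one worker, so no task runs twice.
                foreach (var item in queue.GetConsumingEnumerable())
                    Interlocked.Increment(ref Handled);
            });
            workers[i].Start();
        }

        for (int i = 0; i < 10; i++)
            queue.Add("task " + i);

        queue.CompleteAdding(); // lets the workers drain the queue and exit
        foreach (var w in workers) w.Join();
    }
}
```

This keeps the "only one assigner" property from the question (whoever calls Add) while the worker count is bounded by construction rather than by a counter.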

+10

Use a thread pool. Here is one that preserves the order of work items and dispatches them through the pool.

+3

You are on the right track. You could use MSMQ and a multithreaded C# service. I got started writing multithreaded C# services with this article. The article is showing its age, but the principles haven't changed, so it is still relevant. It's easy to follow and, better still, it explores both of the approaches you suggest. Feel free to email me if you need more help.

0

Source: https://habr.com/ru/post/1392833/
