C# multithreading

Let's say I have an event that fires about 10 times per second.

    void Session_OnEvent(object sender, CustomEventArgs e)
    {
        // DoStuff
        DoLongOperation(e);
    }

I need the DoLongOperation(e) method to run on a separate thread each time the event is triggered.

I could do something like:

 new Thread(DoLongOperation).Start(e); 

but I have a feeling this is bad for performance. I want the best performance possible; what is the best approach?

Thanks in advance.

Edit: when I said "long," I didn't mean an operation that takes more than 1 second at most. I just don't want the event handler to block for that time, so I want to run the work on a separate thread.

+4
5 answers

Use a single worker thread to process the requests, and enqueue work items for that thread from your event handler.

Namely:

  • copy e
  • create a List<CustomEventArgs> and append the copy to the end
  • synchronize access to that list from both the worker thread and the event handler

As class members, declare:

    List<CustomEventArgs> _argsqueue;
    Thread _processor;
    volatile bool _shouldEnd;

In the class constructor:

    _argsqueue = new List<CustomEventArgs>();
    _processor = new Thread(ProcessorMethod);
    _processor.Start();

Define ProcessorMethod:

    void ProcessorMethod()
    {
        while (!_shouldEnd)
        {
            CustomEventArgs e = null;
            lock (_argsqueue)
            {
                if (_argsqueue.Count > 0)
                {
                    e = _argsqueue[0];
                    _argsqueue.RemoveAt(0);
                }
            }
            if (e != null)
            {
                DoLongOperation(e);
            }
            else
            {
                Thread.Sleep(100);
            }
        }
    }

And in your event handler:

    lock (_argsqueue)
    {
        _argsqueue.Add(e.Clone());
    }

You will need to handle the details yourself; for example, when the form closes or the object in question is disposed, you need to:

    _shouldEnd = true;
    _processor.Join();
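On .NET 4 and later, a BlockingCollection<T> can replace the hand-rolled list, lock, and Sleep polling described in this answer. The following is only a sketch of that alternative, not part of the original answer; the QueueWorker name, the contents of CustomEventArgs, and the processed counter are illustrative stand-ins:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Placeholder for the question's CustomEventArgs type.
class CustomEventArgs { public int Value; }

class QueueWorker
{
    private readonly BlockingCollection<CustomEventArgs> _queue =
        new BlockingCollection<CustomEventArgs>();
    private readonly Thread _processor;
    private int _processed;

    public int ProcessedCount { get { return _processed; } }

    public QueueWorker()
    {
        _processor = new Thread(() =>
        {
            // Blocks until an item arrives; the loop ends cleanly
            // once CompleteAdding() is called and the queue drains.
            foreach (var e in _queue.GetConsumingEnumerable())
                DoLongOperation(e);
        });
        _processor.IsBackground = true;
        _processor.Start();
    }

    public void Enqueue(CustomEventArgs e) { _queue.Add(e); }

    public void Shutdown()
    {
        _queue.CompleteAdding();
        _processor.Join();
    }

    private void DoLongOperation(CustomEventArgs e)
    {
        // Stand-in for the real long operation.
        Interlocked.Increment(ref _processed);
    }
}
```

GetConsumingEnumerable blocks the worker when the queue is empty, so there is no need for the Sleep(100) polling loop, and CompleteAdding plus Join gives the same orderly shutdown as _shouldEnd.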
+2

A direct answer to your question: use the managed thread pool, via ThreadPool.QueueUserWorkItem, to push your operations onto it. (You might want to look at the answer to the question "When should I use a thread pool versus my own threads?")
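A minimal sketch of that approach (the handler signature comes from the question; the Completed counter and the Payload field are illustrative additions so the effect is observable):

```csharp
using System;
using System.Threading;

// Placeholder for the question's CustomEventArgs type.
class CustomEventArgs : EventArgs
{
    public string Payload;
}

class Session
{
    // Counts completed operations so the effect can be observed.
    public static int Completed;

    public static void Session_OnEvent(object sender, CustomEventArgs e)
    {
        // Queue the work on a pool thread; the handler returns immediately,
        // so the event source is never blocked by DoLongOperation.
        ThreadPool.QueueUserWorkItem(
            state => DoLongOperation((CustomEventArgs)state), e);
    }

    static void DoLongOperation(CustomEventArgs e)
    {
        // Stand-in for the real long operation.
        Interlocked.Increment(ref Completed);
    }
}
```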

However, look at the bigger picture: if the operations you start each take more than 100 ms to complete, you are mathematically generating more work than you can handle. That will not end well no matter how you cut it: if you create a separate thread each time, your process will run out of threads; if you use a thread pool, you will swamp it with work it can never finish; and so on.

If only some of your operations run long and most complete immediately, then you may have a chance at a practical solution. Otherwise, you need to rethink your program's design.

+6

If you are using C# 4.0, you might want to use a task scheduler. Since the name DoLongOperation implies it runs for a long time, you should consider the following:

Long term tasks

You can explicitly prevent a task from being placed on a local queue. For example, you may know that a particular work item will run relatively long and would probably block all other work items on the local queue. In this case, you can specify the LongRunning option, which provides a hint to the scheduler that an additional thread might be required for the task, so that it does not block the forward progress of other threads or work items on the local queue. By using this option you avoid the ThreadPool entirely, including the global and local queues.
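A long operation could be scheduled with that hint roughly like this (a sketch; the sleep, the console output, and the class name are placeholders):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class LongRunningDemo
{
    static void Main()
    {
        // The LongRunning hint tells the default scheduler this task may run
        // (or block) for a long time, so it typically gets its own dedicated
        // thread instead of occupying a ThreadPool worker.
        Task t = Task.Factory.StartNew(() =>
        {
            Thread.Sleep(200); // stand-in for the long operation
            Console.WriteLine("long operation finished");
        }, TaskCreationOptions.LongRunning);

        t.Wait();
    }
}
```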

Another nice thing about using a TaskScheduler is that it has a MaximumConcurrencyLevel. That makes it easy to tune your concurrency after the kind of testing John recommended.

Here is an example from MSDN that does just that:

    namespace System.Threading.Tasks.Schedulers
    {
        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Threading;

        class Program
        {
            static void Main()
            {
                LimitedConcurrencyLevelTaskScheduler lcts = new LimitedConcurrencyLevelTaskScheduler(1);
                TaskFactory factory = new TaskFactory(lcts);

                factory.StartNew(() =>
                {
                    for (int i = 0; i < 500; i++)
                    {
                        Console.Write("{0} on thread {1}", i, Thread.CurrentThread.ManagedThreadId);
                    }
                });

                Console.ReadKey();
            }
        }

        /// <summary>
        /// Provides a task scheduler that ensures a maximum concurrency level while
        /// running on top of the ThreadPool.
        /// </summary>
        public class LimitedConcurrencyLevelTaskScheduler : TaskScheduler
        {
            /// <summary>Whether the current thread is processing work items.</summary>
            [ThreadStatic]
            private static bool _currentThreadIsProcessingItems;

            /// <summary>The list of tasks to be executed.</summary>
            private readonly LinkedList<Task> _tasks = new LinkedList<Task>(); // protected by lock(_tasks)

            /// <summary>The maximum concurrency level allowed by this scheduler.</summary>
            private readonly int _maxDegreeOfParallelism;

            /// <summary>Whether the scheduler is currently processing work items.</summary>
            private int _delegatesQueuedOrRunning = 0; // protected by lock(_tasks)

            /// <summary>
            /// Initializes an instance of the LimitedConcurrencyLevelTaskScheduler class with the
            /// specified degree of parallelism.
            /// </summary>
            /// <param name="maxDegreeOfParallelism">The maximum degree of parallelism provided by this scheduler.</param>
            public LimitedConcurrencyLevelTaskScheduler(int maxDegreeOfParallelism)
            {
                if (maxDegreeOfParallelism < 1)
                    throw new ArgumentOutOfRangeException("maxDegreeOfParallelism");
                _maxDegreeOfParallelism = maxDegreeOfParallelism;
            }

            /// <summary>Queues a task to the scheduler.</summary>
            /// <param name="task">The task to be queued.</param>
            protected sealed override void QueueTask(Task task)
            {
                // Add the task to the list of tasks to be processed. If there aren't enough
                // delegates currently queued or running to process tasks, schedule another.
                lock (_tasks)
                {
                    _tasks.AddLast(task);
                    if (_delegatesQueuedOrRunning < _maxDegreeOfParallelism)
                    {
                        ++_delegatesQueuedOrRunning;
                        NotifyThreadPoolOfPendingWork();
                    }
                }
            }

            /// <summary>
            /// Informs the ThreadPool that there's work to be executed for this scheduler.
            /// </summary>
            private void NotifyThreadPoolOfPendingWork()
            {
                ThreadPool.UnsafeQueueUserWorkItem(_ =>
                {
                    // Note that the current thread is now processing work items.
                    // This is necessary to enable inlining of tasks into this thread.
                    _currentThreadIsProcessingItems = true;
                    try
                    {
                        // Process all available items in the queue.
                        while (true)
                        {
                            Task item;
                            lock (_tasks)
                            {
                                // When there are no more items to be processed,
                                // note that we're done processing, and get out.
                                if (_tasks.Count == 0)
                                {
                                    --_delegatesQueuedOrRunning;
                                    break;
                                }

                                // Get the next item from the queue
                                item = _tasks.First.Value;
                                _tasks.RemoveFirst();
                            }

                            // Execute the task we pulled out of the queue
                            base.TryExecuteTask(item);
                        }
                    }
                    // We're done processing items on the current thread
                    finally { _currentThreadIsProcessingItems = false; }
                }, null);
            }

            /// <summary>Attempts to execute the specified task on the current thread.</summary>
            /// <param name="task">The task to be executed.</param>
            /// <param name="taskWasPreviouslyQueued"></param>
            /// <returns>Whether the task could be executed on the current thread.</returns>
            protected sealed override bool TryExecuteTaskInline(Task task, bool taskWasPreviouslyQueued)
            {
                // If this thread isn't already processing a task, we don't support inlining
                if (!_currentThreadIsProcessingItems) return false;

                // If the task was previously queued, remove it from the queue
                if (taskWasPreviouslyQueued) TryDequeue(task);

                // Try to run the task.
                return base.TryExecuteTask(task);
            }

            /// <summary>Attempts to remove a previously scheduled task from the scheduler.</summary>
            /// <param name="task">The task to be removed.</param>
            /// <returns>Whether the task could be found and removed.</returns>
            protected sealed override bool TryDequeue(Task task)
            {
                lock (_tasks) return _tasks.Remove(task);
            }

            /// <summary>Gets the maximum concurrency level supported by this scheduler.</summary>
            public sealed override int MaximumConcurrencyLevel
            {
                get { return _maxDegreeOfParallelism; }
            }

            /// <summary>Gets an enumerable of the tasks currently scheduled on this scheduler.</summary>
            /// <returns>An enumerable of the tasks currently scheduled.</returns>
            protected sealed override IEnumerable<Task> GetScheduledTasks()
            {
                bool lockTaken = false;
                try
                {
                    Monitor.TryEnter(_tasks, ref lockTaken);
                    if (lockTaken) return _tasks.ToArray();
                    else throw new NotSupportedException();
                }
                finally
                {
                    if (lockTaken) Monitor.Exit(_tasks);
                }
            }
        }
    }
+2

Performance will largely depend on several factors:

  • How many threads will be launched at once?
  • What will they do?
  • How long will they work? (Minimum lead time, maximum, average)
  • What happens if one of the threads goes down?

Ten times per second is a fairly high activity rate. Depending on how long each operation runs, it may make sense to move the work into a separate process, such as a service. Obviously the activity must be thread-safe, which means (in part) that there is no contention for resources. If two threads may need to update the same resource (a file, a memory location), you will need to use a lock, and that can hurt efficiency if not handled well.
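Such a lock might look like the following sketch (the SharedCounter type is hypothetical; the pattern is the standard C# lock statement):

```csharp
using System.Threading;

class SharedCounter
{
    private readonly object _sync = new object();
    private int _count;

    // Any thread may call this; the lock serializes access to _count.
    // Keep the locked region small to limit the contention warned about above.
    public void Increment()
    {
        lock (_sync) { _count++; }
    }

    public int Count
    {
        get { lock (_sync) { return _count; } }
    }
}
```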

+1

Yes, you can do it. But if your event fires 10 times per second and you start 10 long operations per second, you will quickly run out of threads.

0

Source: https://habr.com/ru/post/1332911/

