Task vs BackgroundWorker in C#

Is there any chance that several BackgroundWorkers perform better than Tasks on 5-second-long units of work? I remember reading in a book that Task is meant for short-running work.

I ask because of the following:

I have an operation that takes 5 seconds to complete, and there are 4,000 such operations to run. First I did:

    for (int i = 0; i < 4000; i++)
    {
        Task.Factory.StartNew(action);
    }

and it performed poorly (after the first minute, only 3-4 tasks had completed, and the console application was using 35 threads). It may have been naive, but I thought the thread pool would handle this situation: queue all the actions, and whenever a thread is free, take an action and execute it.

The second step was to manually create Environment.ProcessorCount BackgroundWorkers and put all the actions into a ConcurrentQueue. The code looks something like this:

    var workers = new List<BackgroundWorker>();
    // initialize workers
    workers.ForEach((bk) =>
    {
        bk.DoWork += (s, e) =>
        {
            while (toDoActions.Count > 0)
            {
                Action a;
                if (toDoActions.TryDequeue(out a))
                {
                    a();
                }
            }
        };
        bk.RunWorkerAsync();
    });
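The initialization itself is elided above; roughly, it amounts to something like this, with toDoActions being a ConcurrentQueue<Action> filled up front and action standing for one 5-second unit of work:

    // Rough sketch of the elided "initialize workers" step:
    // fill the queue with the 4,000 actions, then add one worker per core.
    for (int i = 0; i < 4000; i++)
    {
        toDoActions.Enqueue(action); // 'action' is the 5-second unit of work
    }
    for (int i = 0; i < Environment.ProcessorCount; i++)
    {
        workers.Add(new BackgroundWorker());
    }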

And this performed much better. It did much better than the tasks, even when I had 30 background workers (roughly as many threads as in the first case).

Later edit:

These are the tasks I run:

    public static Task IndexFile(string file)
    {
        Action<object> indexAction = new Action<object>((f) =>
        {
            Index((string)f);
        });
        return Task.Factory.StartNew(indexAction, file);
    }

And the Index method is this:

    private static void Index(string file)
    {
        AudioDetectionServiceReference.AudioDetectionServiceClient client =
            new AudioDetectionServiceReference.AudioDetectionServiceClient();
        client.IndexCompleted += (s, e) =>
        {
            if (e.Error != null)
            {
                if (FileError != null)
                {
                    FileError(client,
                        new FileIndexErrorEventArgs((string)e.UserState, e.Error));
                }
            }
            else
            {
                if (FileIndexed != null)
                {
                    FileIndexed(client, new FileIndexedEventArgs((string)e.UserState));
                }
            }
        };
        using (IAudio proxy = new BassProxy())
        {
            List<int> max = new List<int>();
            if (proxy.ReadFFTData(file, out max))
            {
                while (max.Count > 0 && max.First() == 0)
                {
                    max.RemoveAt(0);
                }
                while (max.Count > 0 && max.Last() == 0)
                {
                    max.RemoveAt(max.Count - 1);
                }
                client.IndexAsync(max.ToArray(), file, file);
            }
            else
            {
                throw new CouldNotIndexException(file,
                    "The audio proxy did not return any data for this file.");
            }
        }
    }

These methods read some data from an mp3 file using the Bass.Net library and then send that data to a WCF service via its async method. The IndexFile(string file) method, which creates the tasks, is called 4,000 times in a for loop. The two events, FileIndexed and FileError, are not handled, so they are never raised.
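The calling code is just a plain loop over the file list, along these lines (files holds the 4,000 mp3 paths):

    // The calling loop; 'files' holds the 4,000 mp3 paths.
    for (int i = 0; i < files.Count; i++)
    {
        IndexFile(files[i]);
    }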

3 answers

The reason performance was so poor with tasks is that you started too many small tasks (4,000). Remember that the CPU must schedule the tasks as well, so firing off a lot of short-lived tasks puts extra load on the scheduler. More details can be found in the second paragraph of the TPL documentation:

Starting with the .NET Framework 4, the TPL is the preferred way to write multithreaded and parallel code. However, not all code is appropriate for parallelization; for example, if a loop performs only a small amount of work on each iteration, or it doesn't run for many iterations, then the overhead of parallelization can cause the code to run more slowly.

When you used the background workers, you limited the number of live threads to ProcessorCount. This reduced the scheduling overhead.
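To get the same effect while staying with tasks, one option is to split the 4,000 actions into a few large chunks, one per core, so the scheduler only ever sees a handful of long-lived tasks. A sketch under that assumption, where actions stands for an IList<Action> holding all 4,000 items:

    // Sketch: one large task per core instead of 4,000 tiny ones.
    // 'actions' is assumed to be an IList<Action> with the 4,000 items.
    int workerCount = Environment.ProcessorCount;
    var tasks = new List<Task>();
    for (int w = 0; w < workerCount; w++)
    {
        int start = w * actions.Count / workerCount;
        int end = (w + 1) * actions.Count / workerCount;
        tasks.Add(Task.Factory.StartNew(() =>
        {
            for (int i = start; i < end; i++)
            {
                actions[i](); // each chunk runs sequentially on one thread
            }
        }));
    }
    Task.WaitAll(tasks.ToArray());

Declaring start and end inside the loop body matters here: each task captures its own pair rather than the shared loop variable.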


Given that you have a strictly defined list of things to do, I would use the Parallel class (either For or ForEach, depending on what suits you better). In addition, you can pass a configuration parameter to either of these methods to control how many tasks actually run at the same time:

    System.Threading.Tasks.Parallel.For(0, 20000,
        new ParallelOptions() { MaxDegreeOfParallelism = 5 },
        i =>
        {
            // do something
        });

The code above will perform 20,000 operations, but will not perform more than 5 operations at a time.
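Applied to the scenario in the question, that could look something like this (a sketch; files and Index are the names used in the question):

    // Sketch: throttled parallel indexing of the 4,000 files.
    System.Threading.Tasks.Parallel.ForEach(files,
        new ParallelOptions() { MaxDegreeOfParallelism = Environment.ProcessorCount },
        file =>
        {
            Index(file); // the Index method from the question
        });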

I suspect the reason the background workers did better for you is that you created and started them once at the beginning, whereas in your Task sample code it seems you create a new Task object for every operation.

Alternatively, have you thought about using a fixed number of Task objects created at the start, combined with a ConcurrentQueue, as you did with the background workers? That should also be quite efficient.
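A minimal sketch of that approach, mirroring the BackgroundWorker version (toDoActions is again a ConcurrentQueue<Action> filled up front):

    // Sketch: a fixed set of long-running tasks draining a shared queue.
    var tasks = new Task[Environment.ProcessorCount];
    for (int i = 0; i < tasks.Length; i++)
    {
        tasks[i] = Task.Factory.StartNew(() =>
        {
            Action a;
            while (toDoActions.TryDequeue(out a))
            {
                a(); // each task pulls work until the queue is empty
            }
        }, TaskCreationOptions.LongRunning); // hint: dedicate a thread per task
    }
    Task.WaitAll(tasks);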


Have you considered using the ThreadPool directly?

http://msdn.microsoft.com/en-us/library/system.threading.threadpool.aspx

If your performance is slower when using threads, it can only be due to thread overhead (allocating and destroying individual threads).
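A minimal sketch of that idea, assuming files and Index from the question (the CountdownEvent is used here only so the caller can wait for all 4,000 items):

    // Sketch: queue each file directly onto the ThreadPool.
    var done = new CountdownEvent(files.Count);
    foreach (string file in files)
    {
        ThreadPool.QueueUserWorkItem(state =>
        {
            try { Index((string)state); }
            finally { done.Signal(); }
        }, file);
    }
    done.Wait(); // block until every queued item has finished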


Source: https://habr.com/ru/post/916598/

