How to ensure that Task.Factory.StartNew does not slow down the main thread?

There is a bunch of compressed chunks of data that should be decompressed asynchronously - without blocking or slowing down the main thread in any way, shape or form.

The decompressed chunks are to be consumed by the main thread as soon as they become available.

I am currently doing it like this:

foreach (var chunkPair in compressedChunkData)
{                        
    var task = Task.Factory.StartNew<Chunk>(() =>
    {
        var compressedBytes = Convert.FromBase64String(chunkPair.Value);
        var chunk = Decompress(compressedBytes);
        return chunk;
    }).ContinueWith((finishedTask) =>
    {
        var chunk = finishedTask.Result;
        TaskFinishActions.Enqueue(() =>
        {
            chunk.PostSerialize();
            document.Chunks.Add(chunkPair.Key, chunk);
        });
    });
}
// By the time we get here 20ms has passed!!!

The problem is that the tasks seem to hijack the core that the main thread is running on, which hurts performance.

Is there a way to tell TaskFactory to dedicate a thread per core and to context-switch away from the main thread only in those brief moments when the main thread is blocked?

EDIT: it is the foreach loop itself that costs the 20 ms; the decompression work inside the tasks is not what slows down the mainthread.

EDIT2: some numbers:

  • there are around 250 entries in compressedChunkData
  • the loop takes around 10 ms on a 12-core machine, while overall CPU usage stays at roughly 0-2%

You can write a custom TaskScheduler that runs its worker threads at a lower priority than the main thread. The Windows thread scheduler is strictly priority-based, so those workers will only get CPU time when the main thread does not need it.

Another option is to simply limit how many worker threads you create: on a machine with N cores, keep at most N - 1 of them busy with decompression, so that one core is always left free for the main thread.

Also keep in mind that compression/decompression is pure CPU-bound work, so the workers will happily saturate every core they are allowed to run on. And starting a task does not start a thread: tasks are queued to the thread pool, which decides how many threads actually run. Task != Thread.
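
A minimal sketch of that idea (the class name, its constructor parameter and the BelowNormal priority are my choices, not an existing API): a TaskScheduler that executes queued tasks on a fixed number of dedicated low-priority threads, so the OS gives a core back to the normal-priority main thread whenever it is runnable.

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

public sealed class LowPriorityTaskScheduler : TaskScheduler, IDisposable
{
    private readonly BlockingCollection<Task> _queue = new BlockingCollection<Task>();
    private readonly List<Thread> _workers;

    public LowPriorityTaskScheduler(int workerCount)
    {
        _workers = Enumerable.Range(0, workerCount).Select(_ =>
        {
            var worker = new Thread(() =>
            {
                // Pull queued tasks until CompleteAdding is called and execute them here.
                foreach (var task in _queue.GetConsumingEnumerable())
                    TryExecuteTask(task);
            })
            {
                IsBackground = true,
                // Below-normal priority: the main thread wins any contention for a core.
                Priority = ThreadPriority.BelowNormal
            };
            worker.Start();
            return worker;
        }).ToList();
    }

    protected override void QueueTask(Task task) => _queue.Add(task);

    // Never run a task inline on the thread that started it (i.e. the main thread).
    protected override bool TryExecuteTaskInline(Task task, bool taskWasPreviouslyQueued) => false;

    protected override IEnumerable<Task> GetScheduledTasks() => _queue.ToArray();

    public void Dispose() => _queue.CompleteAdding();
}

You would then pass it to the StartNew overload that accepts a scheduler, e.g. Task.Factory.StartNew(work, CancellationToken.None, TaskCreationOptions.None, scheduler), constructed with something like Environment.ProcessorCount - 1 workers.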


Why not just use a parallel for loop and let the TPL partition the work for you?

With Parallel.ForEach you don't have to create and continue a task per chunk yourself, and you can cap the concurrency through the ParallelOptions class:

Parallel.ForEach(compressedChunkData, chunkPair => {
    var compressedBytes = Convert.FromBase64String(chunkPair.Value);
    var chunk = Decompress(compressedBytes);
    TaskFinishActions.Enqueue(() => {
        chunk.PostSerialize();
        document.Chunks.Add(chunkPair.Key, chunk);
    });
});
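
For example, to cap the degree of parallelism (leaving one core free for the main thread is my assumption here; tune the value to your workload):

var options = new ParallelOptions
{
    // Keep one core available for the main thread.
    MaxDegreeOfParallelism = Math.Max(1, Environment.ProcessorCount - 1)
};

Parallel.ForEach(compressedChunkData, options, chunkPair => {
    var compressedBytes = Convert.FromBase64String(chunkPair.Value);
    var chunk = Decompress(compressedBytes);
    TaskFinishActions.Enqueue(() => {
        chunk.PostSerialize();
        document.Chunks.Add(chunkPair.Key, chunk);
    });
});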

For background on how tasks relate to threads, see Jon Skeet's writing on the subject. Note also that, unlike async await, Parallel.ForEach blocks the calling thread until every iteration has finished.

EDIT:

If, as your edit says, it is the loop itself (scheduling ~250 tasks) that costs the 20 ms rather than the decompression, then move the scheduling off the main thread as well: start a single background task and do all the fan-out inside it, so the main thread only pays for starting one task.

The main thread then keeps draining TaskFinishActions whenever it has time, exactly as it does now.
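
A sketch of that, reusing the variables from the question (the exact shape is my assumption, not code from the original answer; Task.Run is available from .NET 4.5, on 4.0 use Task.Factory.StartNew instead):

// Pay for starting only one task on the main thread; everything else,
// including the per-chunk fan-out, happens on thread-pool threads.
var decompressionTask = Task.Run(() =>
{
    Parallel.ForEach(compressedChunkData, chunkPair =>
    {
        var compressedBytes = Convert.FromBase64String(chunkPair.Value);
        var chunk = Decompress(compressedBytes);
        TaskFinishActions.Enqueue(() =>
        {
            chunk.PostSerialize();
            document.Chunks.Add(chunkPair.Key, chunk);
        });
    });
});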


Source: https://habr.com/ru/post/1625736/

