Can the distribution of a byte array be performance critical?

On my small file transfer website (the one running .NET 4.5.1) I follow Microsoft Knowledge Base article 812406 for sending previously downloaded files from the server to the browser.

While doing some performance optimization, I was surprised to find that the line

var buffer = new byte[10000];

takes a significant percentage of the time (measured with the Red Gate ANTS Performance Profiler). The buffer is allocated only once per download/client.

My questions:

  • Is it reasonable to allocate a buffer this way and at this size?
  • Are there alternatives when allocating a ~10 KB buffer?

Update 1:

Thanks to your comments, I saw that memory is also allocated inside the loop.

However, ANTS Profiler still reports that the allocation outside the loop takes that much time, which I honestly don’t understand yet. I removed the (pointless) allocation inside the loop.

Update 2:

Having implemented the proposed BufferManager, and also reduced the buffer size from 10,000 to 4,096 bytes (just in case ...), my site has been working very smoothly since then.

+4
3 answers

Yes. In fact, WCF uses a “buffer manager” to avoid exactly this problem.

Every time you allocate a new Byte[], the runtime has to zero the memory before handing it to you. On top of that, frequent allocations put pressure on the GC, which costs additional time. Pooling and reusing buffers avoids both costs.

Take a look at BufferManager, which pools buffers so they can be reused instead of reallocated.
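A minimal sketch of what pooling with WCF’s System.ServiceModel.Channels.BufferManager could look like in this scenario (the pool sizes and the SendFile method are illustrative assumptions, not from the original post; requires a reference to System.ServiceModel):

```csharp
using System.IO;
using System.ServiceModel.Channels;
using System.Web;

public static class DownloadHelper
{
    // One shared pool for the application: keep up to 1 MB pooled,
    // with individual buffers up to 64 KB.
    private static readonly BufferManager Pool =
        BufferManager.CreateBufferManager(maxBufferPoolSize: 1024 * 1024,
                                          maxBufferSize: 64 * 1024);

    public static void SendFile(Stream iStream, HttpResponse response)
    {
        // TakeBuffer returns a pooled array of at least the requested size,
        // avoiding a fresh (zeroed) allocation per download.
        byte[] buffer = Pool.TakeBuffer(4096);
        try
        {
            int length;
            while ((length = iStream.Read(buffer, 0, 4096)) > 0)
            {
                // Write only the bytes actually read.
                response.OutputStream.Write(buffer, 0, length);
            }
        }
        finally
        {
            // Return the buffer so the next request can reuse it.
            Pool.ReturnBuffer(buffer);
        }
    }
}
```

Note that TakeBuffer may hand back an array larger than requested, so the read length, not buffer.Length, must drive the write.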

+3

In .NET, unlike in C, newly allocated arrays are always initialized: C# sets every element to 0, and that zeroing takes time proportional to the array’s size. (That alone, however, should be cheap for a single 10 KB allocation.)

Also note that the Microsoft sample code you followed works with exact byte counts. On each pass of the loop:

// Gets the exact number of bytes read
length = iStream.Read(buffer, 0, 10000);

// Writes only 'length' bytes to the output
Response.OutputStream.Write(buffer, 0, length);

"" , . .

The important point: keep buffer = new Byte[10000]; outside the while loop!
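Put together, a sketch of the KB 812406-style download loop with the allocation hoisted out of the loop (iStream and Response as in the KB sample; error handling omitted):

```csharp
using System.IO;

// Allocate once per download, before the loop.
byte[] buffer = new byte[10000];
long dataToRead = iStream.Length;

while (dataToRead > 0 && Response.IsClientConnected)
{
    // Read returns the exact number of bytes read (may be < 10000).
    int length = iStream.Read(buffer, 0, 10000);
    if (length == 0)
        break; // end of stream

    // Write only the bytes that were actually read.
    Response.OutputStream.Write(buffer, 0, length);
    Response.Flush();
    dataToRead -= length;
}
```

Because Write is bounded by length, stale bytes at the tail of the buffer never reach the client.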

+4

It is hard to say without more context why that line dominates your profile, but two things are worth keeping in mind:

  • Allocation itself should be cheap. A 10 KB array is small, and allocating it once per download should barely be measurable. If the profiler blames that line, the time is probably coming from somewhere else.

  • Profilers can misattribute time. Line-level timings are often skewed: the cost of JIT-compiling the method on its first call, or of a garbage collection (gen 0 or higher) that happens to be triggered by the allocation, along with any finalizers that run during it, can all be “charged” to the allocation line even though the allocation itself is fast.

+1

Source: https://habr.com/ru/post/1535176/
