How can I efficiently write incoming data to disk?

I am writing a program that reads a short line of data from each of a (potentially large) number of sources, as often as once per second. I need to write this data to separate files, and I was hoping for recommendations on the most efficient way to do that.

My current implementation queues the data and discards it when the queue exceeds a certain size.

Is there a better approach? In C#, are there any I/O constructs that are especially efficient for this?

EDIT: I believe a "reasonable" maximum is around 100 data sources, but in the worst case it could be several hundred.

1 answer

You will need to define "large" to get a better answer. You may not need your own queue at all; the .NET Framework BufferedStream is quite efficient:

http://msdn.microsoft.com/en-us/library/3dsccbf4.aspx

http://msdn.microsoft.com/en-us/library/system.io.bufferedstream.write.aspx
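
For example, here is a minimal sketch of wrapping a FileStream in a BufferedStream so that many small line writes are coalesced into fewer physical writes. The file name, buffer size, and sample line are illustrative choices, not from the original question:

```csharp
using System;
using System.IO;

class BufferedWriteExample
{
    static void Main()
    {
        using (var file = new FileStream("source1.log", FileMode.Append,
                                         FileAccess.Write, FileShare.Read))
        using (var buffered = new BufferedStream(file, 64 * 1024))
        using (var writer = new StreamWriter(buffered))
        {
            // Each incoming line is a small write; BufferedStream coalesces
            // them into larger physical writes.
            writer.WriteLine($"{DateTime.UtcNow:o} sample reading");
        } // Dispose flushes any remaining buffered bytes to the file.
    }
}
```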

If the โ€œbigโ€ one does not fit the maximum number of files that the OS permits, you can simply leave the files open (set up sharing as needed if other processes should access them while they are being written). This avoids the overhead of opening each file once per second.

Whatever buffering approach you use, make sure you do not buffer more data than you are prepared to lose in the event of a power outage or other system failure.

If you cannot tolerate any data loss, write each line to the file immediately (without any buffering) and rely instead on a disk controller with a battery-backed write cache.
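
A minimal sketch of that no-buffering approach, assuming you want each line pushed past the in-process and OS caches as soon as it is written. FileOptions.WriteThrough and the explicit Flush(true) are my additions here; the battery-backed controller cache is the hardware side the answer refers to:

```csharp
using System;
using System.IO;

class ImmediateWriter : IDisposable
{
    private readonly FileStream _stream;
    private readonly StreamWriter _writer;

    public ImmediateWriter(string path)
    {
        // WriteThrough asks the OS to skip its own write cache; a minimal
        // bufferSize keeps FileStream from accumulating data in memory.
        _stream = new FileStream(path, FileMode.Append, FileAccess.Write,
                                 FileShare.Read, bufferSize: 1,
                                 FileOptions.WriteThrough);
        _writer = new StreamWriter(_stream) { AutoFlush = true };
    }

    public void WriteLine(string line)
    {
        _writer.WriteLine(line);
        _stream.Flush(flushToDisk: true); // push the line through to the device
    }

    public void Dispose() => _writer.Dispose(); // also closes the FileStream
}
```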

UPDATE

100 data sources is far below the maximum number of open files on any OS that runs .NET. You should be fine simply opening the files and leaving them open until you are done with them.

For fun, see Mark Russinovich's post on handle limits in Windows:

http://blogs.technet.com/b/markrussinovich/archive/2009/09/29/3283844.aspx


Source: https://habr.com/ru/post/1437896/

