I work mainly on Windows and Windows CE based systems, where CreateFile, ReadFile and WriteFile are the workhorses, regardless of whether I'm in native Win32 land or in a managed .NET environment.
So far I have never had any obvious problem with writing or reading large files in one chunk, as opposed to looping over several smaller chunks. I usually delegate the I/O to a background thread that notifies me when it's done.
But looking at file I/O textbooks and "textbook examples", I often find the "loop over small chunks" used without any explanation of why it's preferred over the (dare I say!) more obvious single read or write.
Are there any downsides to the way I do it that I haven't understood?
Clarification:
I'm comparing a single chunk against several smaller chunks for a large file. The chunked examples I mentioned often use block sizes of 1024 bytes on Windows CE and ten times that on the desktop. My "large" files are typically binary files such as photos taken on mobile phones, so around 2-10 MB in size. In other words, nowhere near 1 GB.