This question is about working with the Large Object Heap (LOH) and trying to minimize the number of byte[] instances I create. Basically, I am getting OutOfMemoryExceptions, and I believe it is because we are creating too many byte arrays. The program works fine when processing a few files, but it needs to scale, and currently it cannot.
In a nutshell, I have a loop that retrieves documents from a database. It currently pulls one document at a time and then processes that document. Documents can range from under a megabyte to 400+ megabytes (hence why I process one at a time). Below is the pseudocode, before optimization.
So, the steps I am taking are:

First, I make a call to the database to find the largest file size, and multiply it by 1.1:
var maxDataSize = new BiztalkBinariesData().GetMaxFileSize();
long bufferSize = (maxDataSize != null && maxDataSize > 0)
    ? (long)(maxDataSize * 1.1)
    : 0;
var FileToProcess = new byte[bufferSize];
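For a sense of scale, and why the Large Object Heap matters here: in .NET, any allocation larger than 85,000 bytes goes straight onto the LOH, so a buffer sized for files of 400+ megabytes will always live there. A quick illustration (the 400 MB figure is just my example from above, not a measurement):

using System;

const int LohThresholdBytes = 85_000;               // .NET's cutoff for LOH allocations
long exampleMaxFileSize = 400L * 1024 * 1024;       // assume the largest file is ~400 MB
long bufferSize = (long)(exampleMaxFileSize * 1.1); // the same 10% headroom as above

Console.WriteLine(bufferSize > LohThresholdBytes);  // True: this buffer is an LOH allocation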
Then I make another database call, pulling all of the documents (without their data) from the database into an IEnumerable:
UnprocessedDocuments =
    claimDocumentData.Select(StatusCodes.CurrentStatus.WaitingToBeProcessed);

foreach (var currentDocument in UnprocessedDocuments)
{
Then, inside the loop, I populate the byte[] array from an external source:
FileToProcess = new BiztalkBinariesData()
    .Get(currentDocument.SubmissionSetId, currentDocument.FullFileName);
Here is the question. It would be much easier to pass the current document (IClaimDocument) to the other processing methods. So, if I set the data property of the current document to the pre-allocated array, will this reuse the existing reference? Or will it create a new array on the Large Object Heap?
currentDocument.Data = FileToProcess;
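To make the question concrete, here is a minimal, standalone sketch of what I mean; Document is a hypothetical stand-in for IClaimDocument, since only the property assignment matters here:

using System;

var buffer = new byte[1024];   // stands in for the pre-allocated FileToProcess
var doc = new Document();

// The assignment in question:
doc.Data = buffer;

// Does doc.Data now refer to the same array instance as buffer,
// or has a second copy been allocated along the way?
Console.WriteLine(ReferenceEquals(doc.Data, buffer));

// Hypothetical stand-in for IClaimDocument.
class Document
{
    public byte[] Data { get; set; }
}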
At the end of the loop, I clear out FileToProcess:
Array.Clear(FileToProcess, 0, FileToProcess.Length);
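For what it is worth, my understanding is that Array.Clear only zeroes the elements in place; it does not release the array or allocate a new one. A tiny sketch of what I mean:

using System;

var buffer = new byte[] { 1, 2, 3 };
Array.Clear(buffer, 0, buffer.Length);

// Same array instance, same length; the elements are simply reset to 0.
// No memory is handed back to the runtime by this call.
Console.WriteLine(buffer.Length);   // prints 3
Console.WriteLine(buffer[0]);       // prints 0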
Is this clear? If not, I will try to clarify.