Huge base64 strings are not a problem per se; .NET supports object sizes of up to about 2 GB, see the answer here. Of course, that does not mean you can actually fit 2 GB of data into a single object!

However, I get the feeling that the byte[] is the problem.

If the byte[] has to hold too many elements, it does not matter whether you download the result or read it from a file on your hard drive.

So, just for testing purposes, can you try changing the type of that property from byte[] to string, or perhaps to a List<byte>? This is neither elegant nor necessarily appropriate in the end, but it may point the way to a better solution. A minimal sketch of what I mean follows below.
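For example, something along these lines; the class name AttachmentDto and the property names are invented for illustration, not taken from your code:

```csharp
// Hypothetical DTO for the test; the names are placeholders.
public class AttachmentDto
{
    public string FileName { get; set; }

    // Original property: Json.NET decodes the base64 value into a byte[].
    // public byte[] Content { get; set; }

    // Test variant: leave the base64 payload as a plain string so the
    // deserializer does not also have to decode a huge byte array.
    public string Content { get; set; }
}
```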
Edit:
Another test case to try: instead of calling DeserializeObject, just save the jsonContent string to a file and see how big it is.
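Something as simple as this would do (the path is arbitrary; jsonContent is the string you already have):

```csharp
// Dump the raw JSON to disk and check its size instead of deserializing it.
var path = @"C:\temp\payload.json";
System.IO.File.WriteAllText(path, jsonContent);
Console.WriteLine(new System.IO.FileInfo(path).Length + " bytes");
```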
Also, why do you need all of this in memory? What kind of data is it? It seems to me that if you have to process it in memory you are going to have a bad time: the object is simply too large for the CLR.
That said, I had a bit of inspiration: how about trying another deserializer? Maybe RestSharp, or you could use HttpClient.ReadAsAsync<T>. It is possible that Newtonsoft itself is the problem, especially when the content is around 400 MB in size.
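A rough sketch of the HttpClient.ReadAsAsync<T> route (the extension method comes from the Microsoft.AspNet.WebApi.Client NuGet package; the URL and the ResultDto type below are placeholders, not your actual endpoint or model):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // Placeholder URL; substitute the endpoint you are actually calling.
            var response = await client.GetAsync("https://example.com/api/bigpayload");
            response.EnsureSuccessStatusCode();

            // ReadAsAsync<T> deserializes the response content for you,
            // so you never build the intermediate jsonContent string yourself.
            var result = await response.Content.ReadAsAsync<ResultDto>();
            Console.WriteLine(result.FileName);
        }
    }
}

// Hypothetical result type, for illustration only.
class ResultDto
{
    public string FileName { get; set; }
    public string Content { get; set; }
}
```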