Json.Net deserialize out of memory issue

I have a JSON object that contains, among other things, a data field that stores a base64-encoded string. This JSON is serialized and sent to the client.

On the client side, the Newtonsoft Json.NET deserializer is used to deserialize the JSON. However, once the data field becomes large (~400 MB), the deserializer throws an OutOfMemoryException: "Array dimensions exceeded supported range". I can also see in Task Manager that memory consumption grows very rapidly.

Any idea why this happens? Is there a maximum size for JSON fields, or is it something else?

Sample code (simplified):

    HttpResponseMessage responseTemp = client.PostAsJsonAsync(client.BaseAddress, message).Result;
    string jsonContent = responseTemp.Content.ReadAsStringAsync().Result;
    result = JsonConvert.DeserializeObject<Result>(jsonContent);

Result Class:

    public class Result
    {
        public string Message { get; set; }
        public byte[] Data { get; set; }
    }

UPDATE:

I think my problem is not the serializer, but simply trying to process such a huge string in memory. The moment I read the string into memory, the application's memory consumption explodes. Every operation on that string does the same. At this point, I think I need to find a way to work with streams and stop reading all the content into memory at once.

+5
3 answers

I assume you are running as a 64-bit process. If not, switch to 64-bit.

Having done that, if you are on .NET 4.5 or later, enable gcAllowVeryLargeObjects. It allows arrays with up to int.MaxValue elements, even if that causes the underlying memory buffer to be larger than 2 GB. You still cannot read a single JSON token longer than 2^31 characters, however, because JsonTextReader buffers the full contents of each individual token in a private char[] _chars field, and in .NET an array can hold at most int.MaxValue elements.
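For reference, the switch goes in the application's configuration file; this is the documented element for .NET 4.5+:

    <configuration>
      <runtime>
        <gcAllowVeryLargeObjects enabled="true" />
      </runtime>
    </configuration>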

+4
source

Reading a large JSON string with JsonConvert.DeserializeObject consumes a lot of memory, because the whole document must first be held as a string. One way around this is to deserialize from a stream using a JsonSerializer instance, as shown below.

    using (StreamReader r = new StreamReader(filePath))
    using (JsonReader reader = new JsonTextReader(r))
    {
        JsonSerializer serializer = new JsonSerializer();
        T lstObjects = serializer.Deserialize<T>(reader);
    }

Here filePath is the path to your JSON file and T is your target object type.
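The same approach applies to the HTTP scenario in the question. A minimal sketch (assuming the Result class and the responseTemp variable from the question): read the response as a stream, so the raw JSON is never materialized as one giant string.

    using (Stream s = responseTemp.Content.ReadAsStreamAsync().Result)
    using (StreamReader sr = new StreamReader(s))
    using (JsonReader reader = new JsonTextReader(sr))
    {
        JsonSerializer serializer = new JsonSerializer();
        // Json.NET pulls tokens from the stream instead of from a 400 MB string.
        Result result = serializer.Deserialize<Result>(reader);
    }

Note that, as the first answer explains, each individual token is still buffered internally, so the huge base64 value itself must still fit in a single char[].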

+5

Huge base64 strings are not a problem per se; .NET supports object sizes of about 2 GB, see the answer here. Of course, that does not mean you can comfortably keep 2 GB of data in a single object!

However, I get the feeling that the byte[] is the problem.

If the byte[] would have to hold too many elements, it does not matter whether you stream the result or read it from a file on your hard drive.

So, just for testing purposes, can you try changing the type of that field from byte[] to string, or perhaps even to a list (see the sketch below)? It is not elegant, and may not even be appropriate, but it may point the way to a better solution.
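A minimal sketch of that diagnostic change, assuming the Result class from the question:

    public class Result
    {
        public string Message { get; set; }

        // For testing only: keep the payload as raw base64 text instead of
        // letting Json.NET decode it into one huge byte[].
        public string Data { get; set; }
    }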

Edit:

Another test case to try: instead of calling DeserializeObject, save the jsonContent string to a file and see how big it actually is.
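For example (the file path here is purely illustrative):

    // Dump the raw JSON to disk so its actual size can be inspected.
    File.WriteAllText(@"C:\temp\payload.json", jsonContent);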

Also, why do you need this in memory? What kind of data is it? It seems to me that if you have to process all of this in memory, you are going to have a bad time: the object is simply too large for the CLR.

However, one more idea: how about trying another deserializer? Maybe RestSharp, or HttpContent.ReadAsAsync<T>. It is possible that Newtonsoft itself has a problem here, especially when the content is around 400 MB.
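A sketch of the latter (assuming the responseTemp variable from the question and the Microsoft.AspNet.WebApi.Client package, which provides the ReadAsAsync extension method on HttpContent):

    // Let the media-type formatter deserialize directly from the response content.
    Result result = responseTemp.Content.ReadAsAsync<Result>().Result;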
+2

Source: https://habr.com/ru/post/1235021/

