My application serializes an object using Json.NET, compresses the resulting JSON, and then saves it to a file. The application can also load an object back from one of these files. These objects can be tens of MB in size, and I am worried about memory usage, because the existing code creates large strings and byte arrays:
public void Save(MyClass myObject, string filename)
{
    var json = JsonConvert.SerializeObject(myObject); // entire JSON held in one string
    var bytes = Compress(json);                       // entire compressed payload in one byte[]
    File.WriteAllBytes(filename, bytes);
}

public MyClass Load(string filename)
{
    var bytes = File.ReadAllBytes(filename);
    var json = Decompress(bytes);
    var myObject = JsonConvert.DeserializeObject<MyClass>(json);
    return myObject;
}

private static byte[] Compress(string s)
{
    // Encoding.Unicode is UTF-16, so this allocates roughly two bytes per character.
    var bytes = Encoding.Unicode.GetBytes(s);
    using (var ms = new MemoryStream())
    {
        using (var gs = new GZipStream(ms, CompressionMode.Compress))
        {
            gs.Write(bytes, 0, bytes.Length);
        } // disposing the GZipStream flushes the remaining compressed data into ms
        return ms.ToArray();
    }
}

private static string Decompress(byte[] bytes)
{
    using (var msi = new MemoryStream(bytes))
    using (var mso = new MemoryStream())
    using (var gs = new GZipStream(msi, CompressionMode.Decompress))
    {
        gs.CopyTo(mso);
        return Encoding.Unicode.GetString(mso.ToArray());
    }
}
I was wondering whether the Save / Load methods can be rewritten to use streams instead. I have found examples of using streams with Json.NET, but I'm struggling to figure out how to fit the extra compression step in.
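For concreteness, this is roughly what I have in mind: chain the streams together so that neither the full JSON string nor the full compressed byte array is ever materialized in memory. This is only my guess at how the pieces fit, built on Json.NET's JsonSerializer with JsonTextWriter / JsonTextReader over a GZipStream; I'm not sure the details are right:

using System.IO;
using System.IO.Compression;
using System.Text;
using Newtonsoft.Json;

public void Save(MyClass myObject, string filename)
{
    // FileStream <- GZipStream <- StreamWriter <- JsonTextWriter:
    // JSON is compressed and written to disk as it is produced.
    using (var fs = File.Create(filename))
    using (var gz = new GZipStream(fs, CompressionMode.Compress))
    using (var writer = new StreamWriter(gz, Encoding.UTF8))
    using (var jsonWriter = new JsonTextWriter(writer))
    {
        var serializer = new JsonSerializer();
        serializer.Serialize(jsonWriter, myObject);
    }
}

public MyClass Load(string filename)
{
    // FileStream -> GZipStream -> StreamReader -> JsonTextReader:
    // JSON is decompressed and deserialized as it is read.
    using (var fs = File.OpenRead(filename))
    using (var gz = new GZipStream(fs, CompressionMode.Decompress))
    using (var reader = new StreamReader(gz, Encoding.UTF8))
    using (var jsonReader = new JsonTextReader(reader))
    {
        var serializer = new JsonSerializer();
        return serializer.Deserialize<MyClass>(jsonReader);
    }
}

One thing I noticed while sketching this: I used UTF-8 here rather than the Encoding.Unicode (UTF-16) of my current code, so files written by the old Save would not round-trip through the new Load unchanged. Is this the right general shape, or am I missing something about how the compression should be wired in?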