I am currently working on a project in which, at some point, I deal with a NumPy array of measurements with shape (165L, 653L, 1024L, 1L) (about 100 MB of data).
For JSON compatibility reasons, I need to turn it into a regular list, so I used the standard method:
array.tolist()
The problem is that this single line consumes 10 GB of RAM. Something seems wrong here; should I avoid tolist() on large arrays?
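The blow-up is expected rather than a leak: a NumPy array stores raw values contiguously, while tolist() turns every element into a full Python object plus a pointer slot in each enclosing list. A rough back-of-the-envelope sketch (the 1-byte dtype is an assumption, inferred from ~100 MB for this shape; the per-object sizes are typical 64-bit CPython figures, not exact for every value):

```python
import sys
import numpy as np

# Shape from the question; roughly 110 million elements.
shape = (165, 653, 1024, 1)
n = int(np.prod(shape))

# ~100 MB total suggests a 1-byte dtype (e.g. uint8) -- an assumption.
array_bytes = n * 1

# After tolist(), each element becomes a standalone Python object
# (sys.getsizeof(0) is typically 28 bytes on 64-bit CPython) plus an
# 8-byte pointer in its enclosing list. This is an order-of-magnitude
# estimate only; actual usage varies (e.g. CPython caches small ints).
per_element = sys.getsizeof(0) + 8

list_bytes = n * per_element

print(f"{n:,} elements")
print(f"array:  ~{array_bytes / 1e6:.0f} MB")
print(f"tolist: ~{list_bytes / 1e9:.1f} GB or more")
```

The estimate already lands in the multi-gigabyte range before counting the nested list objects themselves, which is consistent with the 10 GB observed.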
I searched the web a bit and found some reports suggesting that tolist() has a memory leak, notably "An explicit memory leak with numpy tolist() in a long process" and https://mail.python.org/pipermail/matrix-sig/1998-October/002368.html, but that does not seem to be my problem.
Instead of converting the entire 165 x 653 x 1024 x 1 array to a list in order to serialize it to JSON, do 165 separate conversions, one per inner 653 x 1024 x 1 slice, and write each to the file with your own explicit delimiters. That way only one slice's worth of Python objects is alive at any moment.
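A minimal sketch of that idea, writing the JSON array delimiters by hand so only one outer slice is ever converted at a time (the function name and file path are my own, not from the question):

```python
import json
import numpy as np

def dump_array_as_json(arr, path):
    """Stream a NumPy array to a JSON file one outer slice at a time.

    Only arr[i].tolist() exists in memory during each iteration; the
    temporary list is freed before the next slice is converted.
    """
    with open(path, "w") as f:
        f.write("[")
        for i in range(arr.shape[0]):
            if i:
                f.write(",")  # explicit delimiter between slices
            json.dump(arr[i].tolist(), f)
        f.write("]")

# Small stand-in for the (165, 653, 1024, 1) measurement array:
a = np.arange(24).reshape(2, 3, 4, 1)
dump_array_as_json(a, "measurements.json")
```

The output is byte-for-byte valid JSON, identical in content to `json.dump(arr.tolist(), f)`, so any consumer can load it normally; peak memory is reduced by roughly a factor of `arr.shape[0]`.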