Large XML file in a DataSet (OutOfMemoryException)

I am trying to load a fairly large XML file into a DataSet. The file is about 700 MB, and every time I read it the call takes a long time and eventually throws an OutOfMemoryException.

DataSet ds = new DataSet();
ds.ReadXml(pathtofile);

The main problem is that I actually need these DataSets: I use them to import the data from the XML file into a Sybase database (foreach table, foreach row, foreach column), and I do not have a schema for the file.

I have already searched Google, but the solutions I found do not apply to my case.

Additional info: I am using a Sybase database (ASA 9), but my C# application crashes before it ever gets to the database work. The error occurs after reading the XML into the DataSet, as soon as I start working with ds. I have read that this is a known issue when a DataSet holds very large content. I need the data in a DataSet at least once, because I have to import it into the database.
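The import itself is the nested loop mentioned above. Roughly, it looks like the sketch below (simplified; the OdbcConnection, table names and SQL building are illustrative placeholders, not the real code):

using System.Data;
using System.Data.Odbc;
using System.Linq;

// Walk every table, row and column of the DataSet and insert it into Sybase.
static void ImportDataSet(DataSet ds, OdbcConnection connection)
{
    foreach (DataTable table in ds.Tables)
    {
        // Build "INSERT INTO <table> (col1, col2, ...) VALUES (?, ?, ...)"
        string columns = string.Join(", ",
            table.Columns.Cast<DataColumn>().Select(c => c.ColumnName).ToArray());
        string placeholders = string.Join(", ",
            Enumerable.Repeat("?", table.Columns.Count).ToArray());
        string sql = string.Format("INSERT INTO {0} ({1}) VALUES ({2})",
            table.TableName, columns, placeholders);

        foreach (DataRow row in table.Rows)
        {
            using (var cmd = new OdbcCommand(sql, connection))
            {
                foreach (DataColumn column in table.Columns)
                {
                    // ODBC parameters are positional; the name is just a label.
                    cmd.Parameters.AddWithValue("@" + column.ColumnName, row[column]);
                }
                cmd.ExecuteNonQuery();
            }
        }
    }
}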

3 answers

You may be able to work around this by using a different ReadXml overload: pass in a buffered stream instead of the file path and see if that speeds things up for you.

Here is the code:

DataSet ds = new DataSet();
// Read through a BufferedStream rather than handing ReadXml the file path directly.
using (FileStream filestream = File.OpenRead(pathtofile))
using (BufferedStream buffered = new BufferedStream(filestream))
{
    ds.ReadXml(buffered);
}

If that still does not help, another option is to split the XML into smaller files, say 500 rows each, and process them one at a time. Shortening verbose tag names can also make a big difference (for example, replacing <Version></Version> with <V></V> can shrink the file by more than 60%).
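A minimal sketch of that batched approach, assuming the file has a flat layout such as <Data><Row>...</Row><Row>...</Row></Data> (the element names, the batch size of 500 and the LoadBatch body are placeholders):

using System.Data;
using System.IO;
using System.Text;
using System.Xml;

static void ImportInBatches(string pathtofile)
{
    var settings = new XmlReaderSettings { IgnoreWhitespace = true };
    using (XmlReader reader = XmlReader.Create(pathtofile, settings))
    {
        reader.MoveToContent();
        reader.ReadStartElement();               // step inside the root element

        var batch = new StringBuilder();
        int rows = 0;

        while (reader.NodeType == XmlNodeType.Element)
        {
            batch.Append(reader.ReadOuterXml()); // copy one row and advance past it
            if (++rows == 500)
            {
                LoadBatch(batch.ToString());
                batch.Length = 0;
                rows = 0;
            }
        }
        if (rows > 0)
            LoadBatch(batch.ToString());
    }
}

static void LoadBatch(string rowsXml)
{
    // Wrap the copied rows in a root element so ReadXml can infer a table,
    // import this small DataSet into the database, then let it be collected.
    var ds = new DataSet();
    ds.ReadXml(new StringReader("<Data>" + rowsXml + "</Data>"));
    // ... insert ds into Sybase here ...
}

This way only a few hundred rows are held in memory at any time instead of the whole 700 MB file.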

Hope this helps, good luck!


First, figure out which part is actually running out of memory. Is it the database? Your C# application? Something else?

The basic solution would be to give the part that throws the out-of-memory exception (I suspect your C# application) more memory via a parameter. At least that is what I would do if it were a Java program.
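For a Java program that would be the JVM heap switch, for example (MyImporter is a placeholder class name):

java -Xmx2g MyImporter

A .NET executable has no equivalent command-line switch; the closest practical step is running the process as 64-bit so it is not limited to the 2 GB address space of a 32-bit process.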


Source: https://habr.com/ru/post/1741778/

