My project has a reporting module that pulls data from a database as XML and runs it through XSLT to produce whatever report format the user wants. The options at this stage are HTML and CSV.
We use Java and Xalan for the transformation.
The bad part is that one of the reports a user can request is 143 MB of XML (about 430,000 records), and that is only part of the data. When this is converted to HTML, I run out of heap space even with the maximum heap size set to 4096 MB. This is unacceptable.
It seems the problem is simply too much data, but I can't help thinking there must be a better way to handle this than limiting the client and failing to meet the functional requirements.
I am happy to provide additional information as needed, but I can't disclose too much about the project, as I'm sure most of you understand. Also, the answer is yes: I need all the data at once; I can't paginate it.
Thanks.
EDIT
All the transformation classes I use are in the javax.xml.transform package. The relevant code is:
    final Transformer transformer = TransformerFactory.newInstance()
            .newTransformer(new StreamSource(new StringReader(xsl)));
    final StringWriter outWriter = new StringWriter();
    transformer.transform(
            new StreamSource(new StringReader(xml)),
            new StreamResult(outWriter));
    return outWriter.toString();
If possible, I would like to leave the XSLT itself as it is. Using StreamSource should, in principle, allow input data to be garbage-collected as it is processed, but I'm not sure which XSLT constructs (functions, axes, etc.) prevent that and force the whole document into memory. If someone can point me to a resource with a detailed description of these restrictions, that would be very helpful.
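For reference, one direction I'm experimenting with is compiling the stylesheet once into a Templates object and streaming the result straight to an OutputStream instead of buffering it in a StringWriter. This is only a sketch; the class and method names (StreamingReport, compile, transform) are illustrative, not part of my actual code:

```java
import java.io.File;
import java.io.OutputStream;
import javax.xml.transform.Templates;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class StreamingReport {

    // Compile the stylesheet once and reuse it. Templates is
    // thread-safe and can be shared; Transformer instances are not.
    public static Templates compile(File xslFile) throws Exception {
        return TransformerFactory.newInstance()
                .newTemplates(new StreamSource(xslFile));
    }

    // Transform directly from the source file to the output stream,
    // so neither the input XML nor the generated report is ever held
    // in memory as a single String.
    public static void transform(Templates templates, File xmlFile,
                                 OutputStream out) throws Exception {
        Transformer transformer = templates.newTransformer();
        transformer.transform(new StreamSource(xmlFile),
                              new StreamResult(out));
    }
}
```

One caveat I'm aware of: even with a StreamSource, Xalan still builds an in-memory model (its DTM) of the entire source document before transforming, so this only removes the result-side buffering. Truly streaming the input would need an XSLT 3.0 streamable mode (e.g. Saxon-EE), which restricts the stylesheet to roughly a single downward pass over the document.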