SQL Server 2008 Error - XML parsing: too much memory required to parse a document

We've hit a wall in SQL Server 2008 with the maximum number of attributes a single XML node can have before the XML parser fails.

The error we get is:

Msg 6303, Level 16, State 1, Line 1 XML parsing: Document parsing required too much memory 

Which is a little misleading. The problem occurs when converting a string to the XML data type (or into an XML table column).

 SELECT CONVERT(XML, '<DataNode><Data attr1="a" attr2="b" XXXXX /></DataNode>') 

Where XXXXX stands for another 8,191 attributes.
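A minimal sketch to reproduce this (the attribute names and values below are placeholders; in our tests only the attribute count matters):

 -- Builds a single node with @attrCount attributes and attempts the same CONVERT
 DECLARE @attrCount INT = 8193;   -- 8192 succeeds, 8193 fails
 DECLARE @attrs NVARCHAR(MAX) = N'';
 DECLARE @i INT = 1;
 WHILE @i <= @attrCount
 BEGIN
     SET @attrs = @attrs + N' attr' + CAST(@i AS NVARCHAR(10)) + N'="a"';
     SET @i = @i + 1;
 END;
 DECLARE @doc NVARCHAR(MAX) = N'<DataNode><Data' + @attrs + N' /></DataNode>';
 -- Raises Msg 6303 once the attribute count crosses the boundary
 SELECT CONVERT(XML, @doc);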

In total, our dataset contains 10,066 attributes to start with. When we reduce the number of attributes to 8,192, it works fine. With 8,193 attributes, however, it fails.

It does not seem to be related to the data size itself (100 MB or 60 KB makes no difference - we get the same failure/success result based purely on the attribute count).

So, is there anything that can be done in SQL Server to change this restriction?

Our C# application has no such restriction, so perfectly valid XML documents parsed in C# cannot be stored in SQL Server's XML data columns.

Any help anyone can offer would be greatly appreciated. The data structure cannot be changed at this point, since that would require rewriting the data-processing functionality across the entire application infrastructure, which has hundreds of components.

PS: I'm well aware of how ridiculous it is that the application stores data as attributes of a single node rather than as a tree, but that is what I have to work with :-(

Edit: we tried this on both 32-bit and 64-bit versions of SQL Server 2008, on a server with 2 GB of RAM and a server with 32 GB of RAM - all versions and environments show the same problem.

UPDATE:

I tried this on SQL Server 2012, and it fails with 8,193 attributes when the string also exceeds a certain size (the test string is 833 KB long), but it works when a string of the same length contains only 8,192 attributes.

However, I have a much shorter string (193 KB) with 12,000 attributes, and it works in both SQL Server 2012 and SQL Server 2008.

So it is apparently a combination of the attribute count and the string exceeding a certain size. This is getting more interesting!

Thanks for the feedback!

Update 2:

After further testing with shorter strings (around 270 KB), I still hit an attribute limit, this time at 16,384: 16,385 attributes failed! So it definitely steps up in increments of 8K attributes, depending on the string length.
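A rough probe sketch for anyone who wants to explore the combined limit (the counts, padding length, and output format here are illustrative, not our exact test harness):

 -- For a fixed per-attribute value length, report where CONVERT(XML, ...) starts failing
 DECLARE @padLen INT = 10;          -- grow this to push the total string length up
 DECLARE @count INT = 16380;
 WHILE @count <= 16390
 BEGIN
     DECLARE @doc NVARCHAR(MAX) = N'<DataNode><Data';
     DECLARE @i INT = 1;
     WHILE @i <= @count
     BEGIN
         SET @doc = @doc + N' attr' + CAST(@i AS NVARCHAR(10))
                  + N'="' + REPLICATE(N'a', @padLen) + N'"';
         SET @i = @i + 1;
     END;
     SET @doc = @doc + N' /></DataNode>';
     BEGIN TRY
         DECLARE @x XML = CONVERT(XML, @doc);
         PRINT CAST(@count AS NVARCHAR(10)) + N' attributes, '
             + CAST(LEN(@doc) / 1024 AS NVARCHAR(10)) + N' KB: OK';
     END TRY
     BEGIN CATCH
         PRINT CAST(@count AS NVARCHAR(10)) + N' attributes, '
             + CAST(LEN(@doc) / 1024 AS NVARCHAR(10)) + N' KB: ' + ERROR_MESSAGE();
     END CATCH;
     SET @count = @count + 1;
 END;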

+6
1 answer

8,191 attributes sounds like far too many. If you are trying to store actual data in attributes, you are clearly hitting a limitation of the SQL Server parser.

See the article here: http://blogs.msdn.com/b/sqlprogrammability/archive/2006/05/23/605299.aspx?Redirected=true

If a type is required for validation, the validator loads its definition from metadata and compiles it into a format suitable for quick validation. To prevent any one type from using too much memory, SQL Server caps the size of the compiled type at one megabyte. SQL Server compiles all types and performs this check when the schema is imported, to avoid accepting types that exceed the limit.

I would suggest changing the way you use XML and storing the information inside elements instead.
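For example, a rough sketch of what an element-centric version of the sample document could look like (the element names are just illustrative):

 DECLARE @elementDoc NVARCHAR(MAX) = N'
 <DataNode>
   <Data>
     <attr1>a</attr1>
     <attr2>b</attr2>
     <!-- ...thousands more values as child elements rather than attributes of one node... -->
   </Data>
 </DataNode>';
 SELECT CONVERT(XML, @elementDoc);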

See http://www.ibm.com/developerworks/library/x-eleatt/index.html and also http://www.w3schools.com/xml/xml_attributes.asp for the elements-versus-attributes discussion.

+2

Source: https://habr.com/ru/post/951781/
