Given:
A very large XML file that we load into a table column of type nvarchar(max). The data roughly doubles in size there (presumably because SQL Server stores nvarchar as two-byte Unicode), and we then read the file back out of the table, parse it, and bulk insert the results into other tables in the database.
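For context, the flow looks roughly like this (a minimal sketch only; the connection string, the XmlStaging/StagingXml staging table, TargetTable, and the file path are placeholders, and the XML parsing is elided):

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class LoadAndBulkInsert
    {
        static void Main()
        {
            const string connStr = "Server=...;Database=...;Integrated Security=true;";

            // 1. Stage the raw XML in an nvarchar(max) column. nvarchar is
            //    UTF-16, so an ANSI file roughly doubles in size at this point.
            string xml = File.ReadAllText(@"C:\data\huge.xml");  // placeholder path
            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(
                "INSERT INTO XmlStaging (StagingXml) VALUES (@xml)", conn))
            {
                cmd.Parameters.Add("@xml", SqlDbType.NVarChar, -1).Value = xml;
                conn.Open();
                cmd.ExecuteNonQuery();
            }

            // 2. Parse the staged XML into rows and bulk insert them
            //    into the real tables.
            DataTable rows = ParseXml(xml);
            using (var bulk = new SqlBulkCopy(connStr))
            {
                bulk.DestinationTableName = "TargetTable";
                bulk.WriteToServer(rows);  // this is the call that throws in production
            }
        }

        static DataTable ParseXml(string xml)
        {
            // Real parsing elided; returns one row per record in the XML.
            return new DataTable();
        }
    }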
Problem:
On our development server this works fine, no problems at all. But when we try the bulk insert on the production server, we get the following error:
    Exception: System.InvalidOperationException: The given value of type String from the data source cannot be converted to type nvarchar of the specified target column. ---> System.InvalidOperationException: String or binary data would be truncated.
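This exception means some source value is longer than its target column allows. A quick way to compare the target table's declared column widths on the two servers (TargetTable and the connection strings here are placeholders):

    using System;
    using System.Data.SqlClient;

    class SchemaCheck
    {
        // Placeholder connection strings -- substitute the real servers.
        const string DevConnStr  = "Server=DEV;Database=MyDb;Integrated Security=true;";
        const string ProdConnStr = "Server=PROD;Database=MyDb;Integrated Security=true;";

        static void Main()
        {
            const string sql = @"
                SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
                FROM INFORMATION_SCHEMA.COLUMNS
                WHERE TABLE_NAME = 'TargetTable'
                ORDER BY ORDINAL_POSITION";

            foreach (var connStr in new[] { DevConnStr, ProdConnStr })
            {
                using (var conn = new SqlConnection(connStr))
                using (var cmd = new SqlCommand(sql, conn))
                {
                    conn.Open();
                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                            Console.WriteLine("{0}: {1}({2})",
                                reader[0], reader[1], reader[2]);
                    }
                }
            }
        }
    }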
Some unusual things we noticed: when we FTP the ANSI version of the XML file (for later reading by the web application), a few bytes get added to the file, and the data then DOUBLES in size when inserted into our table. When we FTP the Unicode version, the byte count stays the same, but the data still DOUBLES in size and then fails, because the data starts to look like "t h i s" (stray bytes wedged between the characters).
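That spaced-out look is consistent with UTF-16 text being read back with a single-byte encoding. A small repro of the symptom (my own sketch, not our production code):

    using System;
    using System.IO;
    using System.Text;

    class EncodingDemo
    {
        static void Main()
        {
            string path = Path.GetTempFileName();

            // Write "this" as UTF-16LE (Encoding.Unicode): a 2-byte BOM
            // plus 2 bytes per character, 10 bytes in all.
            File.WriteAllText(path, "this", Encoding.Unicode);

            // Read it back as if it were a single-byte ANSI file.
            string wrong = File.ReadAllText(path, Encoding.GetEncoding("ISO-8859-1"));

            // Prints 10, not 4: the BOM plus a NUL after every letter, which
            // is exactly a doubling in size plus "spaced out" looking text.
            Console.WriteLine(wrong.Length);
        }
    }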
We ruled out bad data by cutting the XML down to a single record under the root element. Development processed it; production still failed.
Something MUST be different in the configuration of our development and production servers, but we can't figure out what. Everything we've compared so far is identical.
Any help would be greatly appreciated!
EDIT: Update: We tried loading the file into an XmlDocument object directly from the server, bypassing the step of storing it in the db first. No change in behavior.
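That is, roughly (the path is a placeholder):

    using System.Xml;

    // Load the XML straight off the file share, skipping the staging table.
    var doc = new XmlDocument();
    doc.Load(@"\\server\share\huge.xml");  // placeholder path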
SECOND UPDATE: We've ruled out the FTP process (maybe?) by copying the file over and then BACK (the file shrinks by a few bytes on the way over, but we get those bytes back when we copy it back).
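The size check itself is trivial; a sketch of what we're comparing (paths are placeholders, and the actual transfer was FTP rather than a plain copy):

    using System;
    using System.IO;
    using System.Linq;

    class RoundTripCheck
    {
        static void Main()
        {
            var sent     = new FileInfo(@"C:\data\huge.xml");           // placeholder
            var returned = new FileInfo(@"C:\data\huge.returned.xml");  // placeholder

            Console.WriteLine("{0} bytes out, {1} bytes back",
                sent.Length, returned.Length);

            // A byte-by-byte comparison catches changes that leave the size equal.
            bool identical = File.ReadAllBytes(sent.FullName)
                                 .SequenceEqual(File.ReadAllBytes(returned.FullName));
            Console.WriteLine("Identical: {0}", identical);
        }
    }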