I'm getting OutOfMemoryException errors when trying to load an 800 MB text file into a DataTable via StreamReader. I was wondering if there is a way to load the DataTable from the stream in batches, that is, read the first 10,000 lines of the text file via StreamReader, create a DataTable, do something with it, then read the next 10,000 lines into a fresh DataTable, and so on.
My Google searches so far haven't turned up much, but it seems like there should be an easy way to do this. Ultimately I will be writing the DataTables to an MS SQL database using SqlBulkCopy, so if there is a simpler approach than the one I described, I would be grateful for a quick pointer in the right direction.
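To make the idea concrete, here is a rough sketch of the batching loop I have in mind (the connection string, the destination table dbo.MyTable, and BatchSize are placeholders; the per-line parsing would be the same as in my code below):

public static void BulkLoadInBatches(DataTable dt, string txtSource, string connectionString)
{
    const int BatchSize = 10000;
    using (StreamReader sr = new StreamReader(txtSource))
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))
    {
        bulkCopy.DestinationTableName = "dbo.MyTable"; // placeholder
        string input;
        while ((input = sr.ReadLine()) != null)
        {
            // ...parse input into a DataRow and add it to dt, as in the code below...
            if (dt.Rows.Count >= BatchSize)
            {
                bulkCopy.WriteToServer(dt);
                dt.Clear(); // drop the batch so dt never holds more than BatchSize rows
            }
        }
        if (dt.Rows.Count > 0)
            bulkCopy.WriteToServer(dt); // flush the final partial batch
    }
}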
Edit - Here is the code I'm running:
public static DataTable PopulateDataTableFromText(DataTable dt, string txtSource)
{
    // Stream the tab-delimited source file and append a row to dt for each line.
    using (StreamReader sr = new StreamReader(txtSource))
    {
        int dtCount = dt.Columns.Count;
        string input;
        while ((input = sr.ReadLine()) != null)
        {
            try
            {
                string[] stringRows = input.Split(new char[] { '\t' });
                DataRow dr = dt.NewRow();
                for (int a = 0; a < dtCount; a++)
                {
                    string dataType = dt.Columns[a].DataType.ToString();
                    // Empty integer fields are defaulted to "0" so ChangeType doesn't throw.
                    if (stringRows[a] == "" && (dataType == "System.Int32" || dataType == "System.Int64"))
                    {
                        stringRows[a] = "0";
                    }
                    dr[a] = Convert.ChangeType(stringRows[a], dt.Columns[a].DataType);
                }
                dt.Rows.Add(dr);
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.ToString());
            }
        }
    }
    return dt;
}
And here is the error it returns:
"System.OutOfMemoryException: 'System.OutOfMemoryException'.
System.String.Split(Char [] , Int32, StringSplitOptions)
System.String.Split(Char [] separator}
Harvester.Config.PopulateDataTableFromText(DataTable dt, String txtSource) C:...."
Have you considered loading the data directly into SQL Server rather than going through C#? SqlBulkCopy.WriteToServer will take a DataTable, but do you need to build the DataTable in C# at all? Why not let SQL do the work?
Edit: if the data has to be transformed first, another option is to bulk-load the raw file into a staging table in SQL Server and do the transformation there, deriving B from A inside the database. Have you looked at bcp?
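For instance, a minimal sketch of handing the load to the server (dbo.Staging, the file path, and the delimiters are assumptions for your setup; note that BULK INSERT resolves the path on the SQL Server machine, not the client):

using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    string sql = @"BULK INSERT dbo.Staging
                   FROM 'C:\data\input.txt'
                   WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')";
    using (SqlCommand cmd = new SqlCommand(sql, conn))
    {
        cmd.ExecuteNonQuery(); // the server streams the file itself; nothing is buffered in C#
    }
}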