How to efficiently write to a file from a SQL data reader in C#?

I have a remote SQL connection in C# that needs to execute a query and save its results to the user's local hard drive. The query can return a fairly large amount of data, so I need to think about an efficient way to store it. I've read that pulling the whole result set into memory first and then writing it out is not a good idea, so if someone could help, that would be great!

I am currently storing the SQL results in a DataTable, although I suspect it would be better to do the writing inside a while (myReader.Read()) { ... } loop. The following is the code that gets the results:

 DataTable t = new DataTable();
 string myQuery = QueryLoader.ReadQueryFromFileWithBdateEdate(
     @"Resources\qrs\qryssysblo.q", newdate, newdate);

 using (SqlDataAdapter a = new SqlDataAdapter(myQuery, sqlconn.myConnection))
 {
     a.Fill(t);
 }

 var result = string.Empty;
 for (int i = 0; i < t.Rows.Count; i++)
 {
     for (int j = 0; j < t.Columns.Count; j++)
     {
         result += t.Rows[i][j] + ",";
     }
     result += "\r\n";
 }

So at the end I have one huge result string on top of the whole DataTable in memory. Surely there is a much better way to do this?

Thanks.

+6
7 answers

You are on the right track yourself. Use a loop with while (myReader.Read()) { ... } and write each record to the text file inside the loop. The .NET framework and the operating system will take care of flushing the buffers to disk efficiently.

 using (SqlConnection conn = new SqlConnection(connectionString))
 using (SqlCommand cmd = conn.CreateCommand())
 {
     conn.Open();
     cmd.CommandText = QueryLoader.ReadQueryFromFileWithBdateEdate(
         @"Resources\qrs\qryssysblo.q", newdate, newdate);

     using (SqlDataReader reader = cmd.ExecuteReader())
     // Note the @: without it, \t in "c:\temp\file.txt" is a tab character.
     using (StreamWriter writer = new StreamWriter(@"c:\temp\file.txt"))
     {
         while (reader.Read())
         {
             // Using Name and Phone as example columns.
             writer.WriteLine("Name: {0}, Phone: {1}",
                 reader["Name"], reader["Phone"]);
         }
     }
 }
+18

I agree that it is best to use a SqlDataReader. Something like this:

 SqlConnection YourConnection = new SqlConnection(YourConnectionString);
 SqlCommand YourCommand = new SqlCommand();
 YourCommand.Connection = YourConnection;
 YourCommand.CommandText = myQuery;

 using (YourConnection)
 {
     YourConnection.Open();
     using (SqlDataReader sdr = YourCommand.ExecuteReader())
     using (StreamWriter YourWriter = new StreamWriter(@"c:\testfile.txt"))
     {
         while (sdr.Read())
             YourWriter.WriteLine(sdr[0].ToString() + sdr[1].ToString() + ",");
     }
 }

Note that inside the while loop you can write the line to the text file in whatever format best suits the column data coming from the SqlDataReader.
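For instance, a comma-separated line covering all columns of the current record can be built without hard-coding indexes (a sketch, assuming `sdr` and `YourWriter` are the open reader and writer from the snippet above):

```csharp
// Copy every column of the current record in one call,
// then join the values with commas.
object[] values = new object[sdr.FieldCount];
sdr.GetValues(values);
YourWriter.WriteLine(string.Join(",", values));
```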

+2

Rob Sedgwick's answer is the closest, but it can be improved and simplified. Here is how I did it:

 string separator = ";";
 string fieldDelimiter = "";
 bool useHeaders = true;
 bool first;
 string line;
 string connectionString = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";

 using (SqlConnection conn = new SqlConnection(connectionString))
 {
     using (SqlCommand cmd = conn.CreateCommand())
     {
         conn.Open();
         string query = @"SELECT whatever";
         cmd.CommandText = query;

         using (SqlDataReader reader = cmd.ExecuteReader())
         {
             if (!reader.Read())
             {
                 return;
             }

             List<string> columnNames = GetColumnNames(reader);

             // Write headers if required
             if (useHeaders)
             {
                 first = true;
                 foreach (string columnName in columnNames)
                 {
                     response.Write(first ? string.Empty : separator);
                     line = string.Format("{0}{1}{2}",
                         fieldDelimiter, columnName, fieldDelimiter);
                     response.Write(line);
                     first = false;
                 }
                 response.Write("\n");
             }

             // Write all records (response here is the HttpResponse
             // the file is being streamed to)
             do
             {
                 first = true;
                 foreach (string columnName in columnNames)
                 {
                     response.Write(first ? string.Empty : separator);
                     string value = reader[columnName] == null
                         ? string.Empty
                         : reader[columnName].ToString();
                     line = string.Format("{0}{1}{2}",
                         fieldDelimiter, value, fieldDelimiter);
                     response.Write(line);
                     first = false;
                 }
                 response.Write("\n");
             } while (reader.Read());
         }
     }
 }

You also need the GetColumnNames helper:

 List<string> GetColumnNames(IDataReader reader)
 {
     List<string> columnNames = new List<string>();
     for (int i = 0; i < reader.FieldCount; i++)
     {
         columnNames.Add(reader.GetName(i));
     }
     return columnNames;
 }
+2

Keeping the original approach, here is a quick win:

Instead of using a string as a temporary buffer, use a StringBuilder. That lets you call Append(string) for concatenation instead of relying on the += operator.

The += operator is particularly inefficient: each use allocates a brand-new string and copies the old contents into it. Put that in a loop that repeats (potentially) millions of times and it will seriously hurt performance.

Append, by contrast, writes into a growable internal buffer rather than creating a new object on every call, which is why it is so much faster.
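Applied to the code from the question, the quick win looks roughly like this (a sketch; `myReader` is assumed to be the open SqlDataReader from the question):

```csharp
using System.Text;

// Build the CSV payload with StringBuilder instead of += on a string.
// Append writes into a growable buffer; += would allocate a new string
// and copy everything accumulated so far on every single iteration.
var sb = new StringBuilder();
while (myReader.Read())
{
    for (int i = 0; i < myReader.FieldCount; i++)
    {
        sb.Append(myReader[i]);
        sb.Append(i < myReader.FieldCount - 1 ? "," : "\r\n");
    }
}
string result = sb.ToString();
```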

+2

I came up with this; I think it is a better CSV writer than the other answers:

 public static class DataReaderExtension
 {
     public static void ToCsv(this IDataReader dataReader, string fileName, bool includeHeaderAsFirstRow)
     {
         const string Separator = ",";
         using (StreamWriter streamWriter = new StreamWriter(fileName))
         {
             StringBuilder sb;

             if (includeHeaderAsFirstRow)
             {
                 sb = new StringBuilder();
                 for (int index = 0; index < dataReader.FieldCount; index++)
                 {
                     if (dataReader.GetName(index) != null)
                         sb.Append(dataReader.GetName(index));
                     if (index < dataReader.FieldCount - 1)
                         sb.Append(Separator);
                 }
                 streamWriter.WriteLine(sb.ToString());
             }

             while (dataReader.Read())
             {
                 sb = new StringBuilder();
                 // All columns except the last, with CSV quoting/escaping.
                 for (int index = 0; index < dataReader.FieldCount - 1; index++)
                 {
                     if (!dataReader.IsDBNull(index))
                     {
                         string value = dataReader.GetValue(index).ToString();
                         if (dataReader.GetFieldType(index) == typeof(string))
                         {
                             if (value.IndexOf("\"") >= 0)
                                 value = value.Replace("\"", "\"\"");
                             if (value.IndexOf(Separator) >= 0)
                                 value = "\"" + value + "\"";
                         }
                         sb.Append(value);
                     }
                     sb.Append(Separator);
                 }
                 // Last column, without a trailing separator.
                 if (!dataReader.IsDBNull(dataReader.FieldCount - 1))
                     sb.Append(dataReader.GetValue(dataReader.FieldCount - 1)
                         .ToString().Replace(Separator, " "));
                 streamWriter.WriteLine(sb.ToString());
             }
             dataReader.Close();
         }
     }
 }

Usage: mydataReader.ToCsv("myfile.csv", true)

+1

Using the response object without response.Close() causes, at least in some cases, the HTML of the page that writes the data to end up in the file. If you do use response.Close(), the connection may be closed prematurely and produce an error while the file is being generated.

The recommended approach is HttpApplication.CompleteRequest(); however, in my experience this still results in the HTML being appended to the end of the file.

I tried using a stream in combination with the response object and had success in the development environment. I have not tried it in production yet.
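For reference, a minimal sketch of streaming CSV through the response object in ASP.NET Web Forms; the content type, file name, and column names here are illustrative assumptions, not taken from the answers above:

```csharp
// Inside a Page or handler; reader is an open SqlDataReader.
Response.Clear();
Response.ContentType = "text/csv";
Response.AddHeader("Content-Disposition", "attachment; filename=export.csv");

while (reader.Read())
{
    Response.Write(reader["Name"] + "," + reader["Phone"] + "\n");
}

Response.Flush();
// Skips the remaining page lifecycle (so the page's HTML is not
// rendered into the file) without the thread abort that
// Response.End() can cause.
HttpContext.Current.ApplicationInstance.CompleteRequest();
```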

+1

I used a DataReader to export data from a database to a .csv file in my project. I read the datareader in a loop and, for each row, append the cell values to a result string, using "," to separate columns and "\n" to separate rows. Finally, I save the result string as result.csv.

I suggest this high-performance extension. I tested it, and it exported 600,000 rows to .csv quickly.
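The loop described above might look roughly like this (a sketch under the stated conventions; `cmd` is assumed to be a prepared SqlCommand on an open connection):

```csharp
using System.IO;
using System.Text;

// Read each row from the data reader, joining columns with ","
// and rows with "\n", then save the result as result.csv.
var sb = new StringBuilder();
using (SqlDataReader reader = cmd.ExecuteReader())
{
    while (reader.Read())
    {
        for (int i = 0; i < reader.FieldCount; i++)
        {
            sb.Append(reader.GetValue(i));
            if (i < reader.FieldCount - 1) sb.Append(",");
        }
        sb.Append("\n");
    }
}
File.WriteAllText("result.csv", sb.ToString());
```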

+1

Source: https://habr.com/ru/post/907168/

