I need to extract data from a .dbf file and convert it to XML. I wrote a routine that does this just fine. However, we now have to handle very large .dbf files - 2 GB and up - and this code throws an OutOfMemoryException for them.
Public Function GetData() As DataTable
    Dim dt As New DataTable(Name)
    Dim sqlCommand As String = "Select * From MyTable"
    ' Using blocks dispose the connection and command even when an
    ' exception is thrown, replacing the original Try/Finally. The
    ' DataTable itself must not be disposed here, since it is returned.
    Using cn As New OleDbConnection(myconnectionstring)
        cn.Open()
        Using cmd As New OleDbCommand(sqlCommand, cn)
            ' Loads the entire result set into memory at once -
            ' this is the line that fails on the 2 GB files.
            dt.Load(cmd.ExecuteReader())
        End Using
    End Using
    Return dt
End Function
Oddly, if I run the same code from Visual Studio in debug mode against the same 2 GB .dbf file, no exception is thrown. It is almost as if Visual Studio manages memory differently than the application does on its own.
Is something wrong with how I'm handling memory here? I tried using a DataAdapter instead, with similar results. Is the behavior I see under Visual Studio expected / by design?
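For context, I assume the root cause is that DataTable.Load buffers every row at once, so one alternative I'm considering is streaming rows straight from an OleDbDataReader into an XmlWriter. This is only a rough, untested sketch - the table name, connection string, output path, and element names are placeholders carried over from my example:

Imports System.Data.OleDb
Imports System.Xml

' Hypothetical streaming variant: each row is written to the XML file
' as soon as it is read, so only one row is held in memory at a time.
Public Sub StreamToXml(connectionString As String, outputPath As String)
    Using cn As New OleDbConnection(connectionString)
        cn.Open()
        Using cmd As New OleDbCommand("Select * From MyTable", cn)
            Using reader As OleDbDataReader = cmd.ExecuteReader()
                Using writer As XmlWriter = XmlWriter.Create(outputPath)
                    writer.WriteStartElement("MyTable")
                    While reader.Read()
                        writer.WriteStartElement("Row")
                        For i As Integer = 0 To reader.FieldCount - 1
                            ' One element per column, named after the column.
                            writer.WriteElementString(reader.GetName(i), reader(i).ToString())
                        Next
                        writer.WriteEndElement()
                    End While
                    writer.WriteEndElement()
                End Using
            End Using
        End Using
    End Using
End Sub

Even so, I'd still like to understand why the debugger build behaves differently.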