(This assumes you are working in PowerShell, but these methods apply to any language.)
I recommend checking the file sizes first; do that check first because it is fast:
if ((gci $file1).Length -ne (gci $file2).Length) { Write-Host "Files are different!" }
If the sizes match, you can fall back to a full comparison. In PowerShell, look at Compare-Object (alias diff). For example:
if (diff (gc $file1) (gc $file2)) { Write-Host "Files are different!" }
It might be faster to do a buffered byte-by-byte comparison, as shown here: http://keestalkstech.blogspot.com/2010/11/comparing-two-files-in-powershell.html
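Here is a minimal sketch of that approach (the function name Test-FilesEqual and the 64KB buffer size are my own choices, not taken from the linked post): it streams both files in chunks and returns as soon as it finds a mismatch.

# Sketch only: compares two files in 64KB chunks, stopping at the first difference.
function Test-FilesEqual($path1, $path2) {
    $f1 = [System.IO.File]::OpenRead($path1)
    $f2 = [System.IO.File]::OpenRead($path2)
    try {
        if ($f1.Length -ne $f2.Length) { return $false }   # cheap size check first
        $bufferSize = 64KB
        $buf1 = New-Object byte[] $bufferSize
        $buf2 = New-Object byte[] $bufferSize
        while ($true) {
            $read1 = $f1.Read($buf1, 0, $bufferSize)
            $read2 = $f2.Read($buf2, 0, $bufferSize)
            if ($read1 -ne $read2) { return $false }
            if ($read1 -eq 0) { return $true }             # both streams exhausted, no difference found
            for ($i = 0; $i -lt $read1; $i++) {
                if ($buf1[$i] -ne $buf2[$i]) { return $false }
            }
        }
    }
    finally {
        $f1.Dispose()
        $f2.Dispose()
    }
}

if (-not (Test-FilesEqual $file1 $file2)) { Write-Host "Files are different!" }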
Alternatives:
An MD5 comparison can actually be slower than a byte comparison: you not only have to read both files, you also have to compute the hashes. You can at least optimize by caching the hash of the old file, saving half the I/O.
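A minimal sketch of that caching idea, assuming PowerShell 4+ for Get-FileHash; the cache file name old.md5 is just an illustration:

# Sketch only: compare hashes, reusing a cached hash of the old file when available.
$oldHashFile = "old.md5"
$newHash = (Get-FileHash $file2 -Algorithm MD5).Hash

if (Test-Path $oldHashFile) {
    $oldHash = Get-Content $oldHashFile                      # reuse the cached hash; no need to re-read $file1
} else {
    $oldHash = (Get-FileHash $file1 -Algorithm MD5).Hash
}

if ($oldHash -ne $newHash) { Write-Host "Files are different!" }
$newHash | Set-Content $oldHashFile                          # cache the new hash for the next run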
Since you are exporting a database table, note that most databases append new rows at the end. Make sure that is your case and that rows are only added, never updated. If so, you can compare just the end of the file, for example the last 4K, or however large your rows are, as in the sketch below.
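A minimal sketch of that tail comparison; Get-FileTail and the 4KB size are illustrative choices, not a standard cmdlet, and the right size depends on how large your rows are.

# Sketch only: read the last $tailSize bytes of each file and compare them.
$tailSize = 4KB
function Get-FileTail($path, $size) {
    $fs = [System.IO.File]::OpenRead($path)
    try {
        $size = [Math]::Min([long]$size, $fs.Length)         # handle files shorter than the tail size
        $buf = New-Object byte[] $size
        $fs.Seek(-$size, [System.IO.SeekOrigin]::End) | Out-Null
        $fs.Read($buf, 0, [int]$size) | Out-Null
        return $buf
    }
    finally { $fs.Dispose() }
}

$tail1 = Get-FileTail $file1 $tailSize
$tail2 = Get-FileTail $file2 $tailSize
if ([Convert]::ToBase64String($tail1) -ne [Convert]::ToBase64String($tail2)) {
    Write-Host "Files are different!"
}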