Deleting files vs. deleting and recreating the directory: performance

What is the best way to delete files?

  • Delete each file individually, or
  • Delete the entire directory, files and all, in one operation and then recreate the directory

Just note that the root directory itself still needs to exist afterwards, so I can do:

var photo_files = Directory.EnumerateFiles(item_path, "*.jpg", SearchOption.TopDirectoryOnly);
foreach (var photo in photo_files)
{
    File.Delete(photo);
}

Or delete the entire directory and then create it again.
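For comparison, a minimal sketch of that second option (assuming item_path is the same directory as above) would be:

// Option 2 (sketch): remove the whole directory recursively, then recreate it.
Directory.Delete(item_path, true);
Directory.CreateDirectory(item_path);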

How much will the performance difference be for 10,000 or even 100,000 files?

P.S. To clarify: .NET does not have a single function that deletes all files in a folder at once while leaving the directory itself in place.

+6
5 answers

When you delete a directory, that is a single entry in the drive's master file table, whereas if you delete each file there is a write operation for every file. So it is more efficient to delete the directory and recreate it.

Following the discussion with @Mr Disappointment below, I would suggest the following amendment to my answer:

If you need to do this "a lot", you can create an extension method for yourself that looks like this:

 public static class IOExtension
 {
     public static void PurgeDirectory(this DirectoryInfo d)
     {
         string path = d.FullName;
         Directory.Delete(d.FullName, true); // delete with recursion
         Directory.CreateDirectory(path);
     }
 }

so that you can just call it on any DirectoryInfo instance, for example:

 DirectoryInfo di = new DirectoryInfo(path);
 di.PurgeDirectory();
+5

In terms of performance, deleting the entire directory and recreating it is the way to go. The only extra step is that you then have to create the directory again afterwards.

+3

I would say that deleting the directory directly will give better performance, but that assumption rests on a single point: the permission checks performed on every call to File.Delete, versus one initial check when using Directory.Delete. Beyond that, I am sure there are even more devils in the details.

Both approaches loop through the files and ultimately boil down to invoking native Windows functions to do the work, once for each file.

Keep in mind that the biggest bottleneck when doing IO is usually the hardware of the disk being read from or written to.

Have you benchmarked this to see what the real results are in your situation?

+3

Something, somewhere has to delete all the files. If you do not do it directly, it has to be done for you indirectly (for example, by shelling out and calling "RmDir /S"). So the system as a whole will do roughly the same amount of work. Your application's performance may vary depending on whether it has to wait for all the files to be deleted first.
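For illustration only, shelling out from C# could look roughly like the sketch below; item_path is borrowed from the question, and the cmd.exe flags shown are my assumption of what "RmDir /S" means here.

 using System.Diagnostics;

 // Sketch: hand the recursive delete off to the shell instead of doing it in-process.
 // /S removes all files and subdirectories, /Q suppresses the confirmation prompt.
 var psi = new ProcessStartInfo("cmd.exe", $"/C rmdir /S /Q \"{item_path}\"")
 {
     UseShellExecute = false,
     CreateNoWindow = true
 };
 using (var proc = Process.Start(psi))
 {
     proc.WaitForExit(); // the files are only guaranteed gone once this returns
 }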

0

This question is similar to: "How hot is the center of the sun?"

The only way to really know is to go there. So build a test harness and go there.

Create a folder and copy the same image into it 10,000 times, giving each copy a unique name. You can write a simple program to do this.

Then run your delete code and time it for both cases. Repeat if necessary to confirm the results.

There is a .NET Stopwatch class that you can use for the timing. You can also use Environment.TickCount to get timings.
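A rough harness along those lines might look like the sketch below; the paths, file count, and sample image are placeholders (my assumptions) that you would point at your own test data.

 using System;
 using System.Diagnostics;
 using System.IO;

 class DeleteBenchmark
 {
     const string TestDir = @"C:\temp\delete-test"; // placeholder path
     const string SampleImage = @"C:\temp\sample.jpg"; // placeholder source file
     const int FileCount = 10000;

     static void Main()
     {
         // Case 1: delete the files one by one.
         Populate();
         var sw = Stopwatch.StartNew();
         foreach (var file in Directory.EnumerateFiles(TestDir, "*.jpg"))
             File.Delete(file);
         sw.Stop();
         Console.WriteLine($"File.Delete loop:      {sw.ElapsedMilliseconds} ms");

         // Case 2: delete the whole directory and recreate it.
         Populate();
         sw.Restart();
         Directory.Delete(TestDir, true);
         Directory.CreateDirectory(TestDir);
         sw.Stop();
         Console.WriteLine($"Delete + recreate dir: {sw.ElapsedMilliseconds} ms");
     }

     // Fill the test directory with FileCount copies of the same image.
     static void Populate()
     {
         Directory.CreateDirectory(TestDir);
         for (int i = 0; i < FileCount; i++)
             File.Copy(SampleImage, Path.Combine(TestDir, $"img_{i}.jpg"), true);
     }
 }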

-1

Source: https://habr.com/ru/post/885823/

