What is the best way to calculate directory size in .NET?

I wrote the following routine to manually walk a directory tree and calculate its size in C#/.NET:

 protected static float CalculateFolderSize(string folder)
 {
     float folderSize = 0.0f;
     try
     {
         // Check if the path is valid
         if (!Directory.Exists(folder))
             return folderSize;
         else
         {
             try
             {
                 foreach (string file in Directory.GetFiles(folder))
                 {
                     if (File.Exists(file))
                     {
                         FileInfo finfo = new FileInfo(file);
                         folderSize += finfo.Length;
                     }
                 }
                 foreach (string dir in Directory.GetDirectories(folder))
                     folderSize += CalculateFolderSize(dir);
             }
             catch (NotSupportedException e)
             {
                 Console.WriteLine("Unable to calculate folder size: {0}", e.Message);
             }
         }
     }
     catch (UnauthorizedAccessException e)
     {
         Console.WriteLine("Unable to calculate folder size: {0}", e.Message);
     }
     return folderSize;
 }

I have an application that runs this routine repeatedly for a large number of folders. Is there a more efficient way to calculate folder size in .NET? I did not see anything specific in the framework. Should I use P/Invoke and the Win32 API? What is the most efficient way to calculate folder size in .NET?

+66
c# windows
Jan 22 '09 at 5:21
20 answers

I do not believe there is a Win32 API for calculating the space consumed by a directory, though I stand to be corrected on this. If there were, I would expect Explorer to use it. If you get the Properties of a large folder in Explorer, the time it takes to show you the folder size is proportional to the number of files/subdirectories it contains.

Your routine seems fairly neat and simple. Keep in mind that you are calculating the sum of the file lengths, not the actual space consumed on disk. Space consumed by slack at the end of clusters, by alternate file streams, etc., is being ignored.
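For illustration only (not part of the original answer), here is a small sketch of that cluster slack: a file's on-disk footprint is roughly its length rounded up to the next cluster boundary. The 4096-byte cluster size is an assumption; the real value varies per volume and must be queried.

```csharp
using System;

// Approximate "size on disk" for one file by rounding its length up to
// the next cluster boundary. 4096 bytes is an assumed cluster size;
// NTFS commonly uses 4 KB, but the actual value is per-volume.
long SizeOnDisk(long length, long clusterSize = 4096) =>
    (length + clusterSize - 1) / clusterSize * clusterSize;

Console.WriteLine(SizeOnDisk(10));   // a 10-byte file still occupies a full cluster: 4096
Console.WriteLine(SizeOnDisk(8192)); // an exact multiple occupies exactly itself: 8192
```

This is why summing FileInfo.Length under-reports the space a directory actually consumes on disk.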

+25
Jan 22 '09 at 5:42

No, this looks like the recommended way to calculate a directory's size; the relevant method is below:

 public static long DirSize(DirectoryInfo d)
 {
     long size = 0;
     // Add file sizes.
     FileInfo[] fis = d.GetFiles();
     foreach (FileInfo fi in fis)
     {
         size += fi.Length;
     }
     // Add subdirectory sizes.
     DirectoryInfo[] dis = d.GetDirectories();
     foreach (DirectoryInfo di in dis)
     {
         size += DirSize(di);
     }
     return size;
 }

You would invoke it with the root as:

 Console.WriteLine("The size is {0} bytes.", DirSize(new DirectoryInfo(targetFolder)));

... where targetFolder is the folder whose size you want to calculate.

+55
Jan 22 '09 at 5:28

The best and shortest one-liner might be this:

 long length = Directory.GetFiles(directoryPath, "*", SearchOption.AllDirectories)
                        .Sum(t => new FileInfo(t).Length);
+21
Mar 01 '14 at 6:54

The real question is: what do you intend to use the size for?

Your first problem is that there are at least four definitions of "file size":

  • The "end of file" offset, which is the number of bytes you have to skip to go from the beginning to the end of the file.
    In other words, it is the number of bytes logically in the file (from a usage perspective).


  • The "valid data length", which is equal to the offset of the first byte that is not actually stored.
    It is always less than or equal to the "end of file", and it is a multiple of the cluster size.
    For example, a 1 GB file can have a valid data length of 1 MB. If you ask Windows to read the first 8 MB, it will read the first 1 MB and pretend the rest of the data was there, returning it as zeros.

  • The "allocated size" of a file. It is always greater than or equal to the "end of file".
    It is the number of clusters the OS has allocated for the file, multiplied by the cluster size.
    Unlike the case where the "end of file" is larger than the "valid data length", the excess bytes are not considered part of the file's data, so the OS does not zero-fill a buffer if you try to read into the allocated region beyond the end of the file.

  • The "compressed size" of a file, which is valid only for compressed (and sparse?) files.
    It is equal to the cluster size multiplied by the number of clusters on the volume that are actually allocated to this file.
    For non-compressed and non-sparse files there is no notion of "compressed size"; you would use the "allocated size" instead.

Your second problem is that a "file" such as C:\Foo can actually have multiple data streams.
The name by itself just refers to the default stream. A file can have alternate streams, such as C:\Foo:Bar , whose size does not even show up in Explorer!

Your third problem is that a "file" can have multiple names ("hard links").
For example, C:\Windows\notepad.exe and C:\Windows\System32\notepad.exe are two names for the same file. Either name can be used to open any stream of the file.

Your fourth problem is that a "file" (or directory) might not actually be a file (or directory):
it can be a soft link (a "symbolic link", or a "reparse point") to some other file (or directory).
That other file might not even be on the same drive. It might even point to something on the network, or it might even be recursive! Should the size be infinity if it is recursive?

Your fifth problem is that there are "filter" drivers that make certain files or directories look like actual files or directories, even though they are not. For example, Microsoft's WIM image files (which are compressed) can be "mounted" onto a folder using the ImageX tool, and those do not look like reparse points or links. They look just like directories, except that they are not actually directories, and the notion of "size" does not really make sense for them.

Your sixth problem is that every file requires metadata.
For example, having 10 names for the same file requires more metadata, which requires space. If the file names are short, having 10 names might be as cheap as having 1 name; if they are long, having multiple names can use more disk space for the metadata. (Same story with multiple streams, etc.)
Do you count these, too?
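Tying this back to the question's P/Invoke angle: the "compressed size" above can be read per file with the Win32 GetCompressedFileSize function. This is a hedged, Windows-only sketch (not part of the original answer); on other platforms the call is simply skipped.

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;

// Windows-only sketch: FileInfo.Length reports the logical ("end of file")
// size, while GetCompressedFileSize reports the compressed/sparse size
// actually occupied on the volume. The two differ for compressed files.
[DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
static extern uint GetCompressedFileSizeW(string lpFileName, out uint lpFileSizeHigh);

long GetCompressedSize(string path)
{
    uint low = GetCompressedFileSizeW(path, out uint high);
    // INVALID_FILE_SIZE (0xFFFFFFFF) only signals an error when the
    // last Win32 error is non-zero; otherwise it is a valid low word.
    if (low == uint.MaxValue && Marshal.GetLastWin32Error() != 0)
        throw new IOException("GetCompressedFileSize failed for " + path);
    return ((long)high << 32) | low;
}

if (OperatingSystem.IsWindows())
{
    string sample = Path.GetTempFileName();
    File.WriteAllBytes(sample, new byte[100]);
    Console.WriteLine($"Logical: {new FileInfo(sample).Length}, on volume: {GetCompressedSize(sample)}");
    File.Delete(sample);
}
```

A directory-size routine built on this would answer "how much volume space does this tree consume?" rather than "how many bytes are logically in these files?".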

+15
Sep 05 '13 at 3:04
 public static long DirSize(DirectoryInfo dir)
 {
     return dir.GetFiles().Sum(fi => fi.Length) +
            dir.GetDirectories().Sum(di => DirSize(di));
 }
+14
May 24 '12 at 19:26
 var size = new DirectoryInfo("E:\\").GetDirectorySize(); 

And here is the code for this extension method:

 public static long GetDirectorySize(this System.IO.DirectoryInfo directoryInfo, bool recursive = true)
 {
     var startDirectorySize = default(long);
     if (directoryInfo == null || !directoryInfo.Exists)
         return startDirectorySize; // Return 0 when the directory does not exist.

     // Add the size of the files in the current directory to the total.
     foreach (var fileInfo in directoryInfo.GetFiles())
         System.Threading.Interlocked.Add(ref startDirectorySize, fileInfo.Length);

     if (recursive) // Loop over subdirectories of the current directory and add their file sizes.
         System.Threading.Tasks.Parallel.ForEach(directoryInfo.GetDirectories(), (subDirectory) =>
             System.Threading.Interlocked.Add(ref startDirectorySize, GetDirectorySize(subDirectory, recursive)));

     return startDirectorySize; // Return the full size of this directory.
 }
+6
Sep 02 '15 at 23:40

Faster! Add a COM reference to "Windows Script Host Object Model":

 public double GetWSHFolderSize(string Fldr)
 {
     // Reference "Windows Script Host Object Model" on the COM tab.
     IWshRuntimeLibrary.FileSystemObject FSO = new IWshRuntimeLibrary.FileSystemObject();
     double FldrSize = (double)FSO.GetFolder(Fldr).Size;
     Marshal.FinalReleaseComObject(FSO);
     return FldrSize;
 }

 private void button1_Click(object sender, EventArgs e)
 {
     string folderPath = @"C:\Windows";
     Stopwatch sWatch = new Stopwatch();
     sWatch.Start();
     double sizeOfDir = GetWSHFolderSize(folderPath);
     sWatch.Stop();
     MessageBox.Show("Directory size in Bytes : " + sizeOfDir +
                     ", Time: " + sWatch.ElapsedMilliseconds.ToString());
 }
+5
Aug 28 '13 at 6:47

That is the best way to calculate the size of a directory. The only other way would still use recursion internally, but it is a little easier to use and not as flexible:

 float folderSize = 0.0f;
 // Note: Directory.GetFiles returns string[], not FileInfo[].
 string[] files = Directory.GetFiles(folder, "*", SearchOption.AllDirectories);
 foreach (string file in files)
     folderSize += new FileInfo(file).Length;
+4
Jan 22 '09 at 5:49

I have been playing with VS2008 and LINQ until recently, and this compact, short method works fine for me (the example is in VB.NET; it requires LINQ / .NET FW 3.5+, of course):

 Dim size As Int64 = (From strFile In My.Computer.FileSystem.GetFiles(strFolder, _
                     FileIO.SearchOption.SearchAllSubDirectories) _
                     Select New System.IO.FileInfo(strFile).Length).Sum()

It is short, it searches subdirectories, and it is easy to understand if you know LINQ syntax. You can even specify wildcards to search for specific files using the third parameter of the .GetFiles function.

I am not an expert in C#, but you can add the My namespace in C# this way.

I think this way of obtaining the folder size is not only shorter and more modern than the method described in Hao's link; it basically uses the same loop-of-FileInfo approach described at the end of it.
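For reference, a rough C# counterpart of the VB.NET query above (no My namespace required) might look like this; the folder path and names are placeholders:

```csharp
using System;
using System.IO;
using System.Linq;

// Hedged sketch: the C# equivalent of the VB.NET LINQ query, summing
// the lengths of every file under strFolder, subdirectories included.
long FolderSize(string strFolder) =>
    Directory.GetFiles(strFolder, "*", SearchOption.AllDirectories)
             .Sum(f => new FileInfo(f).Length);

Console.WriteLine(FolderSize(Directory.GetCurrentDirectory()));
```

The second argument is the same wildcard filter the VB version exposes as its third parameter.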

+4
Jun 13 '09 at 8:18

I extended @Hao's answer using the same counting principle, but with a richer return value, so you get size, recursive size, directory count, and recursive directory count, N levels deep.

 public class DiskSizeUtil
 {
     /// <summary>
     /// Calculate disk space usage under <paramref name="root"/>. If <paramref name="levels"/> is provided,
     /// then return subdirectory disk usages as well, up to <paramref name="levels"/> levels deep.
     /// If levels is not provided or is 0, return a list with a single element representing the
     /// directory specified by <paramref name="root"/>.
     /// </summary>
     public static FolderSizeInfo GetDirectorySize(DirectoryInfo root, int levels = 0)
     {
         var currentDirectory = new FolderSizeInfo();

         // Add file sizes.
         FileInfo[] fis = root.GetFiles();
         currentDirectory.Size = 0;
         foreach (FileInfo fi in fis)
         {
             currentDirectory.Size += fi.Length;
         }

         // Add subdirectory sizes.
         DirectoryInfo[] dis = root.GetDirectories();

         currentDirectory.Path = root;
         currentDirectory.SizeWithChildren = currentDirectory.Size;
         currentDirectory.DirectoryCount = dis.Length;
         currentDirectory.DirectoryCountWithChildren = dis.Length;
         currentDirectory.FileCount = fis.Length;
         currentDirectory.FileCountWithChildren = fis.Length;

         if (levels >= 0)
             currentDirectory.Children = new List<FolderSizeInfo>();

         foreach (DirectoryInfo di in dis)
         {
             var dd = GetDirectorySize(di, levels - 1);
             if (levels >= 0)
                 currentDirectory.Children.Add(dd);

             currentDirectory.SizeWithChildren += dd.SizeWithChildren;
             currentDirectory.DirectoryCountWithChildren += dd.DirectoryCountWithChildren;
             currentDirectory.FileCountWithChildren += dd.FileCountWithChildren;
         }

         return currentDirectory;
     }

     public class FolderSizeInfo
     {
         public DirectoryInfo Path { get; set; }
         public long SizeWithChildren { get; set; }
         public long Size { get; set; }
         public int DirectoryCount { get; set; }
         public int DirectoryCountWithChildren { get; set; }
         public int FileCount { get; set; }
         public int FileCountWithChildren { get; set; }
         public List<FolderSizeInfo> Children { get; set; }
     }
 }
+4
Jan 22 '15 at 17:16

It looks like the following method performs the task faster than a recursive function:

 long size = 0;
 DirectoryInfo dir = new DirectoryInfo(folder);
 foreach (FileInfo fi in dir.GetFiles("*.*", SearchOption.AllDirectories))
 {
     size += fi.Length;
 }

A simple console application test shows that this loop sums the files faster than the recursive function and gives the same result. You would probably want to use LINQ methods (e.g. Sum()) to shorten this code.

+4
Jun 17 '15 at 10:45

This solution works very well; it collects all subfolders:

 Directory.GetFiles(@"MainFolderPath", "*", SearchOption.AllDirectories)
          .Sum(t => new FileInfo(t).Length);
+4
Aug 22 '16 at 19:18
 public static long GetDirSize(string path)
 {
     try
     {
         return Directory.EnumerateFiles(path).Sum(x => new FileInfo(x).Length)
              + Directory.EnumerateDirectories(path).Sum(x => GetDirSize(x));
     }
     catch
     {
         return 0L;
     }
 }
+3
Sep 11 '14 at 12:35
 Directory.GetFiles(@"C:\Users\AliBayat", "*", SearchOption.AllDirectories)
     .Select(d => new FileInfo(d))
     .Select(d => new { Directory = d.DirectoryName, FileSize = d.Length })
     .ToLookup(d => d.Directory)
     .Select(d => new
     {
         Directory = d.Key,
         TotalSizeInMB = Math.Round(d.Select(x => x.FileSize).Sum() / Math.Pow(1024.0, 2), 2)
     })
     .OrderByDescending(d => d.TotalSizeInMB).ToList();

Calling GetFiles with SearchOption.AllDirectories returns the fully qualified names of all files in all subdirectories of the specified directory. The OS represents file sizes in bytes; you can get a file's size from its Length property. Dividing it by 1024 squared and rounding to 2 decimal places gives you the size in megabytes. Since a directory/folder can contain many files, d.Select(x => x.FileSize) returns a collection of file sizes. The final call to Sum() finds the total size of the files in the specified directory.

Update: filterMask = "*.*" does not work with files that have no extension

+3
Aug 21 '18 at 5:41

As for the best algorithm, you probably have it right. I would recommend unwinding the recursive function and using a stack of your own (remember that a stack overflow is the end of the world in a .NET 2.0+ application; the exception cannot be caught, IIRC).

Much more importantly, if you are using this in any kind of UI, put it in a worker thread that signals the UI thread with updates.
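A minimal sketch of that unwound traversal, using an explicit Stack<string> instead of call-stack recursion (the names are illustrative, not from the answer):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Walk the tree with an explicit stack so arbitrarily deep hierarchies
// cannot trigger an uncatchable StackOverflowException.
long IterativeFolderSize(string root)
{
    long total = 0;
    var pending = new Stack<string>();
    pending.Push(root);

    while (pending.Count > 0)
    {
        string current = pending.Pop();
        try
        {
            foreach (string file in Directory.GetFiles(current))
                total += new FileInfo(file).Length;
            foreach (string dir in Directory.GetDirectories(current))
                pending.Push(dir);
        }
        catch (UnauthorizedAccessException)
        {
            // Skip unreadable folders, mirroring the original routine.
        }
    }
    return total;
}

Console.WriteLine(IterativeFolderSize(Directory.GetCurrentDirectory()));
```

The depth of the data structure grows on the heap instead of the thread stack, which is the point of the suggestion above.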

+2
Jan 22 '09 at 5:51

To improve performance, you could use the Task Parallel Library (TPL). Here is a good example: Calculating directory file size - how to make it faster?

I have not tested it, but the author says it is 3 times faster than the non-multithreaded method...
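This is not the linked TPL example itself, but a minimal parallel sketch in the same spirit, using PLINQ to spread the FileInfo lookups across threads (whether it is actually faster depends heavily on the disk):

```csharp
using System;
using System.IO;
using System.Linq;

// Hedged sketch: AsParallel() partitions the enumerated file paths and
// sums their lengths concurrently. I/O-bound workloads may not speed up.
long ParallelFolderSize(string root) =>
    Directory.EnumerateFiles(root, "*", SearchOption.AllDirectories)
             .AsParallel()
             .Sum(f => new FileInfo(f).Length);

Console.WriteLine(ParallelFolderSize(Directory.GetCurrentDirectory()));
```

Benchmark against the sequential version on your own data before committing to it.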

+2
Aug 08 '13 at 9:21

The fastest way I have come up with is using EnumerateFiles with SearchOption.AllDirectories. This method also makes it possible to update the UI while going through the files and counting the size. Long path names do not cause problems, since no FileInfo or DirectoryInfo has to be created for the long path name. Even when a file name is long, the FileInfo returned by EnumerateFiles causes no problems while enumerating, as long as the starting directory name is not too long. There is still a problem with UnauthorizedAccessException.

  private void DirectoryCountEnumTest(string sourceDirName)
  {
      // Get the subdirectories for the specified directory.
      long dataSize = 0;
      long fileCount = 0;
      string prevText = richTextBox1.Text;
      if (Directory.Exists(sourceDirName))
      {
          DirectoryInfo dir = new DirectoryInfo(sourceDirName);
          foreach (FileInfo file in dir.EnumerateFiles("*", SearchOption.AllDirectories))
          {
              fileCount++;
              try
              {
                  dataSize += file.Length;
                  richTextBox1.Text = prevText + ("\nCounting size: " + dataSize.ToString());
              }
              catch (Exception e)
              {
                  richTextBox1.AppendText("\n" + e.Message);
              }
          }
          richTextBox1.AppendText("\n files:" + fileCount.ToString());
      }
  }
+1
Sep 09 '16 at 6:55

This .NET Core command-line application calculates directory sizes for a given path:

https://github.com/garethrbrown/folder-size

The key method is the one that recursively checks subdirectories to arrive at the total size.

 private static long DirectorySize(SortDirection sortDirection, DirectoryInfo directoryInfo, DirectoryData directoryData)
 {
     long directorySizeBytes = 0;

     // Add file sizes for the current directory
     FileInfo[] fileInfos = directoryInfo.GetFiles();
     foreach (FileInfo fileInfo in fileInfos)
     {
         directorySizeBytes += fileInfo.Length;
     }

     directoryData.Name = directoryInfo.Name;
     directoryData.SizeBytes += directorySizeBytes;

     // Recursively add subdirectory sizes
     DirectoryInfo[] subDirectories = directoryInfo.GetDirectories();
     foreach (DirectoryInfo di in subDirectories)
     {
         var subDirectoryData = new DirectoryData(sortDirection);
         directoryData.DirectoryDatas.Add(subDirectoryData);
         directorySizeBytes += DirectorySize(sortDirection, di, subDirectoryData);
     }

     directoryData.SizeBytes = directorySizeBytes;

     return directorySizeBytes;
 }
+1
Feb 26 '19 at 9:16

An alternative to Trikaldarshi's one-line solution (it uses EnumerateFiles, so the file list is streamed rather than materialized as an array first):

 long sizeInBytes = Directory.EnumerateFiles("{path}", "*", SearchOption.AllDirectories)
                             .Sum(fileName => new FileInfo(fileName).Length);
0
Sep 06

I know this is not a .NET solution, but here it comes anyway. Maybe it is useful for people who have Windows 10 and want a faster result. For example, if you run this command on the command line, or by pressing WinKey + R:

 bash -c "du -sh /mnt/c/Users/; sleep 5" 

The sleep 5 is there so you have time to see the result before the window closes.

On my computer it displays:

(screenshot of the du command output)

Notice how it shows 85G (85 gigabytes). That is very fast compared to doing it in .NET. If you want to see the size more precisely, remove the h, which stands for human-readable.

So just do something like Process.Start("bash", ...arguments). This is not exact code, but you get the idea.
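A hedged sketch of that Process.Start idea (assuming bash and du are available, e.g. under WSL; the method name is made up for illustration):

```csharp
using System;
using System.Diagnostics;

// Shell out to du via bash and capture its summary line,
// which has the form "<size-in-KB>\t<path>".
string DuSummary(string path)
{
    var psi = new ProcessStartInfo
    {
        FileName = "bash",
        Arguments = $"-c \"du -s '{path}'\"",
        RedirectStandardOutput = true,
        UseShellExecute = false
    };
    using var p = Process.Start(psi);
    string output = p.StandardOutput.ReadToEnd();
    p.WaitForExit();
    return output.Trim();
}

Console.WriteLine(DuSummary("."));
```

Parsing the leading number out of the returned line gives you the size in kilobytes; error handling for a missing bash is omitted.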

-1
May 17 '19 at 22:25


