The following PowerShell script counts the files and totals the file sizes for each unique file extension in the current folder, recursively.
How would you speed up this script for a folder with over 170,000 entries?
The bottleneck is that the following two statements are repeated for each unique file extension, re-scanning $AllFiles (over 170,000 entries) on every iteration:
    $Files = $AllFiles | Where { $_.Extension -eq $Ext.Extension }
    $FilesSum = ($Files | Measure-Object Length -Sum).Sum
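One way to avoid the repeated scans is to group the files by extension once with Group-Object, so each entry in $AllFiles is visited a single time. A minimal sketch, assuming $AllFiles is already populated as in the script below (output format is illustrative, not the script's HTML):

```powershell
# Group all files by extension in one pass over $AllFiles.
$Groups = $AllFiles | Group-Object Extension | Sort-Object Name

foreach ($Group in $Groups) {
    # $Group.Group holds the files for this extension; $Group.Count is already computed.
    $FilesSum = ($Group.Group | Measure-Object Length -Sum).Sum
    Write-Host "$($Group.Name) = $($Group.Count) files, $FilesSum bytes"
}
```

Each group exposes the same pieces the original loop derives per extension ($Files, the count, and the size sum), so the HTML-building body can be moved inside this loop largely unchanged.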
*** BEGIN ***
    Write-Host "Reading directory ... $Path"
    $AllFiles = Get-ChildItem $Path -Include * -Recurse -OutBuffer 2048 |
        Where { $_.PSIsContainer -eq $false }
    Write-Host "Number of Files = $(@($AllFiles).Count)"
    $AllSum = ($AllFiles | Measure-Object Length -Sum).Sum

    If ($AllFiles.Length) {
        Write-Host "Counting unique file extensions = " -NoNewLine
        $Extensions = $AllFiles | Select Extension -Unique | Sort Extension
        Write-Host ($Extensions).Count

        # Debug output: the 'exit' below stops the script after listing the extensions.
        foreach ($x in $Extensions) {
            Write-Host $x.Extension
        }
        exit

        ForEach ($Ext in $Extensions) {
            Write-Host "Counting ... $Ext = " -NoNewline
            $Files = $AllFiles | Where { $_.Extension -eq $Ext.Extension }
            $FilesSum = ($Files | Measure-Object Length -Sum).Sum
            Write-Host $FilesSum
            $Percent = "{0:N0}" -f (($FilesSum / $AllSum) * 100)
            $Body += "  <tr><td>$($Ext.Extension)</td><td>$(@($Files).Count)</td><td><div class=""green"" style=""width:$Percent%"">$('{0:N2}MB' -f ($FilesSum / 1mb))</div></td></tr>"
        }
        $HTML = $HeaderHTML + $Body + $FooterHTML
        Write-Host "Writing file $OutputPath\FilesByExtension.html"
        $HTML | Out-File $OutputPath\FilesByExtension.html
        Write-Host "Done!"
    }
    Else {
        Write-Host "`nNo files found in $Path"
    }
*** END ***
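Going a step further, the per-extension Measure-Object calls can also be dropped by accumulating counts and byte totals in a hashtable during a single enumeration of $AllFiles. A sketch under the same assumption that $AllFiles is populated as above; $Stats and its property names are illustrative:

```powershell
# One pass over $AllFiles: accumulate per-extension file count and byte total.
$Stats = @{}
foreach ($File in $AllFiles) {
    $Ext = $File.Extension
    if (-not $Stats.ContainsKey($Ext)) {
        $Stats[$Ext] = [pscustomobject]@{ Count = 0; Bytes = 0L }
    }
    $Stats[$Ext].Count++
    $Stats[$Ext].Bytes += $File.Length
}

# Report in extension order; the HTML row construction could go here instead.
foreach ($Ext in ($Stats.Keys | Sort-Object)) {
    Write-Host "$Ext = $($Stats[$Ext].Count) files, $($Stats[$Ext].Bytes) bytes"
}
```

This replaces the O(extensions × files) scanning of the original with a single O(files) loop and constant-time hashtable lookups, which should matter most when the extension count is large.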