Use PowerShell to print file names, line numbers and matching lines

We have a bunch of commented-out code in our source; instead of removing it immediately, we just left it in. Now I would like to do some cleanup.

So, assuming I have a regex good enough to find comments (the regex below is simple; I can extend it based on our coding standards), how can I take the results from the files I read and output the following:

  • File name
  • Line number
  • Actual line of code

I think I have the basis of the answer, but I don't know how to take the file I read, run the regex over it, and spit the results out in this format.

I'm not looking for a perfect solution - I just want to find large blocks of commented-out code. If the results show a bunch of entries with the same file name and consecutive line numbers, I have what I'm after.

 $Location = "c:\codeishere"
 [regex]$Regex = "//.*;" # simple example - will expand on this...
 $Files = Get-ChildItem $Location -Include *cs -Recurse
 foreach ($File in $Files) {
     $contents = Get-Content $File
     $Regex.Matches($contents) | # WHAT GOES HERE?
 }
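For completeness, one literal way to fill in the "WHAT GOES HERE?" gap is to keep a manual line counter and test each line against the regex. This is only a sketch - the function name Find-CommentLines is made up for illustration, and the answers below show a much shorter route via Select-String:

```powershell
# Sketch: walk each file line by line, keeping our own line counter.
# Find-CommentLines is a hypothetical helper name, not a built-in cmdlet.
function Find-CommentLines($Location, [regex]$Regex = "//.*;") {
    $Files = Get-ChildItem $Location -Include *.cs -Recurse
    foreach ($File in $Files) {
        $lineNumber = 0
        foreach ($line in (Get-Content $File)) {
            $lineNumber++
            if ($Regex.IsMatch($line)) {
                # file name, line number, actual line of code
                "{0}:{1}: {2}" -f $File.FullName, $lineNumber, $line
            }
        }
    }
}

# usage:
# Find-CommentLines "c:\codeishere"
```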
4 answers

You can do:

 dir c:\codeishere -filter *.cs -recurse | select-string -Pattern '//.*;' | select Line,LineNumber,Filename 
 gci c:\codeishere *.cs -r | select-string "//.*;" 

The Select-String cmdlet already does exactly what you are asking for, although it displays the file name as a relative path.
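If the relative name bothers you: the MatchInfo objects emitted by Select-String also carry a Path property holding the full path. A sketch of the same pipeline (c:\codeishere is the asker's location):

```powershell
# Path holds the full path; LineNumber and Line are as before.
Get-ChildItem c:\codeishere -Filter *.cs -Recurse |
    Select-String -Pattern '//.*;' |
    Select-Object Path, LineNumber, Line
```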


I would personally go even further and count runs of consecutive commented lines, then print the file name, the number of lines, and the lines themselves. You can sort the result by line count (candidates for deletion?). Note that my code does not cope with empty lines between comments, so this is treated as two separate blocks of commented code:

 // int a = 10;
 // int b = 20;

 // DoSomething()
 // SomethingAgain()

Here is my code.

 $Location = "c:\codeishere"
 $occurences = Get-ChildItem $Location *cs -Recurse | select-string '//.*;'
 $grouped = $occurences | group FileName

 function Compute([Microsoft.PowerShell.Commands.MatchInfo[]]$lines) {
     $local:lastLineNum = $null
     $local:lastLine = $null
     $local:blocks = @()
     $local:newBlock = $null
     $lines | % {
         if (!$lastLineNum) { # first line
             $lastLineNum = -2 # some number so that the following if is $true (-2 and lower)
         }
         if ($_.LineNumber - $lastLineNum -gt 1) { # new block of commented code
             if ($newBlock) { $blocks += $newBlock }
             $newBlock = $null
         } else { # two consecutive lines of commented code
             if (!$newBlock) {
                 $newBlock = '' | select File,StartLine,CountOfLines,Lines
                 $newBlock.File, $newBlock.StartLine, $newBlock.CountOfLines, $newBlock.Lines =
                     $_.Filename, ($_.LineNumber-1), 2, @($lastLine,$_.Line)
             } else {
                 $newBlock.CountOfLines += 1
                 $newBlock.Lines += $_.Line
             }
         }
         $lastLineNum = $_.LineNumber
         $lastLine = $_.Line
     }
     if ($newBlock) { $blocks += $newBlock }
     $blocks
 }

 # for each GroupInfo object from the group cmdlet,
 # take its Group collection and compute
 $result = $grouped | % { Compute $_.Group }

 # how to print
 $result | % {
     write-host "`nFile $($_.File), line $($_.StartLine), count of lines: $($_.CountOfLines)" -foreground Green
     $_.Lines | % { write-host $_ }
 }

 # you may sort it by count of lines:
 $result2 = $result | sort CountOfLines -desc
 $result2 | % {
     write-host "`nFile $($_.File), line $($_.StartLine), count of lines: $($_.CountOfLines)" -foreground Green
     $_.Lines | % { write-host $_ }
 }

If you have an idea how to improve the code, post it! I have a feeling this could be done with some standard cmdlets and the code could be shorter.
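One possible shortening using only standard cmdlets - a sketch, with Find-CommentBlocks as a made-up name. The trick: within one file, a run of consecutive matching line numbers has a constant difference between LineNumber and the match's running index, so Group-Object on that difference collects the blocks:

```powershell
# Sketch: group consecutive matches by (LineNumber - running index).
# Find-CommentBlocks is a hypothetical name, not a built-in cmdlet.
function Find-CommentBlocks($Location, $Pattern = '//.*;') {
    Get-ChildItem $Location -Include *.cs -Recurse |
        Select-String -Pattern $Pattern |
        Group-Object Path |
        ForEach-Object {
            $i = 0
            $_.Group |
                ForEach-Object { [pscustomobject]@{ Key = $_.LineNumber - $i++; Match = $_ } } |
                Group-Object Key |
                Where-Object { $_.Count -ge 2 } |   # only blocks of 2+ consecutive lines
                ForEach-Object {
                    [pscustomobject]@{
                        File         = $_.Group[0].Match.Path
                        StartLine    = $_.Group[0].Match.LineNumber
                        CountOfLines = $_.Count
                        Lines        = @($_.Group | ForEach-Object { $_.Match.Line })
                    }
                }
        }
}

# usage:
# Find-CommentBlocks "c:\codeishere" | Sort-Object CountOfLines -Descending
```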


I would look at something like:

 dir $location -inc *.cs -rec |
     % { $file = $_; $n = 0; Get-Content $_ } |
     % { $_ | Add-Member NoteProperty FileName $file -PassThru |
              Add-Member NoteProperty Line (++$n) -PassThru } |
     ? { $_ -match $regex } |
     % { "{0}:{1}: {2}" -f $_.FileName, $_.Line, $_ }

I.e., attach extra properties to each line carrying the file name and line number, so they are still available further down the pipeline after matching the regular expression.

(Using ForEach-Object -Begin / -End script blocks, it should be possible to simplify this.)
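For instance, a sketch of that idea (Get-CommentLines is a made-up name): a nested ForEach-Object whose -Begin block resets the line counter once per file, which avoids grafting properties onto the strings at all:

```powershell
# Sketch: the -Begin block initializes the counter once per Get-Content
# pipeline, so no properties need to be attached to the strings.
# Get-CommentLines is a hypothetical name for illustration.
function Get-CommentLines($Location, $Pattern = '//.*;') {
    Get-ChildItem $Location -Include *.cs -Recurse | ForEach-Object {
        $file = $_
        Get-Content $file |
            ForEach-Object -Begin { $n = 0 } -Process {
                $n++
                if ($_ -match $Pattern) {
                    "{0}:{1}: {2}" -f $file.FullName, $n, $_
                }
            }
    }
}

# usage:
# Get-CommentLines "c:\codeishere"
```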


Source: https://habr.com/ru/post/1285982/

