Why does grep freeze when run on the / directory?

My question has two parts:

1) Why does grep freeze when I grep all the files under the "/" directory?

e.g.:

grep -r 'h' ./ 

(Note: right before the freeze/crash, I see several "No such device or address" messages about sockets ...)

Of course, I know that grep can't meaningfully search a socket, but since sockets are just files on Unix, I would expect it to report no match rather than fail.

2) Now, my next question is: how can I grep the entire file system anyway? Are there certain *nix directories that should be ignored? In particular, I am looking for all recently written log files.

+4
3 answers

As @ninjalj said, if you don't use -D skip, grep will try to read every file, including device files, socket files, and FIFOs. In particular, on a Linux system (and many Unix systems), it will try to read /dev/zero, which appears to be infinitely long.
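You can see the effect for yourself with a small, purely illustrative experiment (timeout is from GNU coreutils and simply kills the command after the given number of seconds):

 timeout 5 grep -c h /dev/zero   # never finishes on its own; killed after 5 seconds
 echo $?                         # prints 124, the status timeout uses when it had to kill the command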

You will wait a while.

If you are looking for system log files, your best bet is /var/log.
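In that case a recursive grep limited to /var/log is usually enough. A rough sketch (the --include option assumes GNU grep, 'pattern' is a placeholder, and the rotated-log file names are just examples; zgrep handles gzip-compressed rotated logs):

 grep -rn --include='*.log' 'pattern' /var/log     # search only *.log files, show line numbers
 zgrep -H 'pattern' /var/log/syslog.*.gz           # compressed, rotated logs (Debian-style names)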

If you are looking for something that can really be anywhere on your file system, you can do something like this:

 find / -xdev -type f -print0 | xargs -0 grep -H pattern 

The -xdev argument tells find to stay within a single file system; this avoids /proc and /dev (as well as any other mounted file systems). -type f restricts the search to ordinary files. -print0 prints the file names separated by null characters rather than newlines; this avoids problems with files whose names contain spaces or other funny characters.

xargs reads a list of file names (or other arguments) from its standard input and invokes the specified command on everything in the list. The -0 option tells it to expect null-separated input, matching find's -print0.

The -H option tells grep to prefix each match with the file name. By default, grep does this only when there are two or more file names on its command line. Since xargs invokes its command in batches, it is possible that the last batch will contain only one file, which would give you inconsistent output without -H.

Consider using find ... -name '*.log' to limit the search to files with names ending in .log (assuming your log files have such names), and/or using grep -I ... to skip binary files.
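Putting those suggestions together might look like this (a sketch only; 'pattern' and the *.log name are placeholders, and -mtime -1 restricts the search to files modified within the last 24 hours, which matches the "recently written" part of the question):

 find / -xdev -type f -name '*.log' -mtime -1 -print0 | xargs -0 grep -IH 'pattern'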

Note that all of this depends on GNU-specific features. Some of these options may not be available on macOS (which is based on BSD) or on other Unix systems. Consult your local documentation, and consider installing GNU findutils (for find and xargs) and/or GNU grep.
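If -print0 and xargs -0 turn out not to be available, a roughly equivalent form uses find's -exec ... {} + instead; passing /dev/null as an extra file name is an old trick that forces grep to print file names even when a batch happens to contain a single file. This is a sketch, not a drop-in replacement:

 find / -xdev -type f -exec grep 'pattern' /dev/null {} +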

Before trying any of this, use df to see how big your root file system is. Mine is currently 268 gigabytes; searching all of it would probably take several hours. Spending a few minutes (a) narrowing down which files you need to search and (b) making sure the command is correct will be time well spent.
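For example, to check just the root file system:

 df -h /        # human-readable size and usage of the root file system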

+13

By default, grep tries to read every file. Use -D skip to skip device files, socket files, and FIFO files.
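Applied to the command from the question, that would be something like the following (assuming a grep that supports --devices, such as GNU grep):

 grep -r -D skip 'h' /        # -D skip: silently skip device, FIFO, and socket files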

+8

If you keep seeing error messages, then grep is not hanging. Keep iotop open in a second window to see how hard your system is working to pull all that content off its storage media into main memory, piece by piece. That is bound to be slow unless you have a very small system.
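For example (iotop usually needs root privileges; -o limits the display to processes that are actually doing I/O):

 sudo iotop -o        # show only processes currently performing disk I/O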

Now, my next question is: how can I grep the entire file system anyway? Are there certain *nix directories that we should ignore? In particular, I'm looking for all recently written log files.

Grepping the entire file system is very rarely a good idea. Try grepping the directory the log files are supposed to be written to, probably /var/log. Even better, if you know anything about the names of the files you're looking for (for example, that they end in .log), run find or locate and grep the files those programs report.
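A sketch of that approach, assuming the logs live under /var/log, have names ending in .log, and "recently" means the last hour (-mmin -60):

 find /var/log -type f -name '*.log' -mmin -60 -exec grep -H 'pattern' {} +

Note that locate relies on a database that is typically refreshed only once a day, so it may miss very recently created files.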

+1
