How to grep for lines containing specific words in a log file?

I have a large log file that I need to scan for specific words. In general, I will have a few words to grep for in this log file, and I want to print every line that contains any of them.

I know how to do a simple grep on a file. Suppose my file is abc.log and I need to find lines containing the word "hello"; I do it like this and it prints the matching lines for me:

 grep -i "hello" abc.log 

But I do not know how to grep for a combination of words. I would have a list of words, and I want to scan my abc.log file for all of them and print any line that contains at least one of those words.

 #!/bin/bash
 data="hello,world,tester"
 # find all the lines which contain the word hello, world, or tester

So, in my shell script, I would split the data variable and search abc.log for the word hello, printing any line that contains it, and then do the same for world and tester.

I am trying to keep this general, so that I only need to change the list of words in the data variable without touching the actual grep logic.
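To make the goal concrete, here is a rough sketch of the naive per-word loop I have in mind (data and abc.log are the same placeholders as above):

 #!/bin/bash
 # rough sketch: loop over the comma-separated words and grep for each one
 data="hello,world,tester"
 IFS=',' read -ra words <<< "$data"
 for w in "${words[@]}"; do
     grep -i "$w" abc.log
 done

The drawback is that a line matching more than one word is printed more than once, and the output is grouped per word rather than in file order, which is why I am looking for a single-grep solution.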

3 answers

If you store patterns in a file, one per line, you can use grep -f file-with-patterns file-to-search.log

From the man page:

 -f FILE, --file=FILE
        Obtain patterns from FILE, one per line. The empty file
        contains zero patterns, and therefore matches nothing.
        (-f is specified by POSIX.)
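If the words live in a comma-separated variable as in the question, one possible way to wire this up is to turn the list into one pattern per line on the fly. A minimal sketch, assuming bash (for process substitution) and the data variable from the question:

 #!/bin/bash
 data="hello,world,tester"
 # convert "hello,world,tester" into one pattern per line and feed it to grep -f;
 # <(...) is bash process substitution, so no temporary file is needed
 grep -i -f <(tr ',' '\n' <<< "$data") abc.log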

I would use a regular expression, for example:

 grep -E 'hello|world|tester' abc.log 
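To keep the word list in a single variable as the question asks, the alternation pattern could be built from it. A minimal sketch, assuming the words are comma-separated and contain no regex metacharacters:

 #!/bin/bash
 data="hello,world,tester"
 # replace every comma with | to get the ERE pattern hello|world|tester
 pattern="${data//,/|}"
 grep -i -E "$pattern" abc.log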

In addition to bruchowski's answer, you can also use:

 grep -i -e "hello" -e "world" -e "tester" abc.log 

OR

 grep 'hello\|world\|tester' abc.log 

OR

 egrep 'hello|world|tester' abc.log 
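For the -e form, the options can likewise be generated from the data variable in the question. A sketch, assuming comma-separated words:

 #!/bin/bash
 data="hello,world,tester"
 # build one -e option per word, then pass them all to a single grep call
 args=()
 IFS=',' read -ra words <<< "$data"
 for w in "${words[@]}"; do
     args+=(-e "$w")
 done
 grep -i "${args[@]}" abc.log

Adding -F as well would make grep treat each word as a fixed string rather than a regular expression, which is usually what you want for plain words.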

Source: https://habr.com/ru/post/1204396/

