AWK: do a CURL on each line and parse the result

Given an input stream with one number per line:

 123
 456
 789
 098
 ...

I would call

 curl -s http://foo.bar/some.php?id=xxx 

where xxx is the number from each line, and each time have an awk script extract some information from the curl output and write it to the output stream. I am wondering whether this is possible without resorting to awk's system() call, as in:

 cat lines | grep "^[0-9]*$" | awk '{ system("curl -s http://foo.bar/some.php?id=" $0 " | awk \"{ print }\"") }'   # inner awk would do the parsing
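For reference, one way to get a curl call per line without any awk system() at all is xargs. This is only a sketch: since http://foo.bar is a placeholder from the question, echo stands in for the curl invocation so the plumbing can be run as-is.

```shell
# Each digit-only line becomes one command invocation via xargs.
# Real use: replace the echo with  curl -s "http://foo.bar/some.php?id={}"
out=$(printf '123\nabc\n456\n' | grep '^[0-9][0-9]*$' | xargs -I{} echo "would-fetch id={}")
echo "$out"
```

With curl substituted back in, each matched line triggers one HTTP request, and the output of all requests can be piped into a single awk for parsing.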
3 answers

You can use bash and avoid the awk system() call:

 grep "^[0-9]*$" lines | while read -r line; do
     curl -s "http://foo.bar/some.php?id=$line" | awk '...do your parsing...'
 done
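The loop above can be exercised end-to-end without a live server. In this sketch, a hypothetical fake_page function simulates a curl response body (the id=... name=... format is an assumption, not anything from the question), and awk pulls a field out of it exactly where the real parsing would go.

```shell
# Stubbed end-to-end run: fake_page stands in for one curl response,
# and awk extracts the second field from it (hypothetical page format).
fake_page() { printf 'id=%s name=item%s\n' "$1" "$1"; }
names=$(printf '7\nx\n42\n' | grep '^[0-9][0-9]*$' | while read -r line; do
  # real use: curl -s "http://foo.bar/some.php?id=$line" | awk '{ ... }'
  fake_page "$line" | awk '{ print $2 }'
done)
echo "$names"
```

Note that the while body runs in a pipeline subshell, so any variables set inside it are not visible after done; capturing the whole pipeline with $( ... ), as here, sidesteps that.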

A shell script will achieve a similar result:

 #!/bin/bash
 for f in $(cat lines | grep "^[0-9]*$"); do
     curl -s "http://foo.bar/some.php?id=$f" | awk '{....}'
 done

Alternative methods to accomplish such tasks include using Perl or Python with an HTTP client.


If your file has identifiers appended to it dynamically, you can spin up a small while loop that keeps checking the file for more data, for example:

 while IFS= read -r a || sleep 1; do [[ -n "$a" ]] && curl -s "http://foo.bar/some.php?id=${a}"; done < lines.txt

Otherwise, if the file is static, change sleep 1 to break : the loop will read the file and exit once no data is left. It is a handy pattern to know.
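A minimal check of the static variant (with sleep 1 replaced by break), using a throwaway file and echo standing in for the curl call, since the URL in the question is a placeholder:

```shell
# The loop drains a static file and exits at EOF, because the `break`
# in the condition fires when `read` fails; blank lines are skipped.
tmp=$(mktemp)
printf '5\n\n8\n' > "$tmp"
seen=$(while IFS= read -r a || break; do
  # stand-in for: curl -s "http://foo.bar/some.php?id=${a}"
  [ -n "$a" ] && echo "fetch id=$a"
done < "$tmp")
rm -f "$tmp"
echo "$seen"
```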


Source: https://habr.com/ru/post/893014/

