Find - grep takes too long

First of all, I am new to bash scripting, so forgive me if I make minor mistakes.

Here is my problem. I needed to download my company's website. I am doing this with wget with no problems, but since some file names contain the character ? and Windows does not allow ? in file names, I had to create a script that renames those files and also updates, in the source code of all the files, every reference to each renamed file.

For this, I use the following code:

find . -type f -name '*\?*' | while read -r file ; do
 SUBSTRING=$(echo $file | rev | cut -d/ -f1 | rev)
 NEWSTRING=$(echo $SUBSTRING | sed 's/?/-/g')
 mv "$file" "${file//\?/-}"
 grep -rl "$SUBSTRING" * | xargs sed -i '' "s/$SUBSTRING/$NEWSTRING/g"
done
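
(For reference: the rev | cut -d/ -f1 | rev pipeline extracts the last path component of $file, the same thing basename does. The path below is only an illustration, not one of the real site's files.)

# both commands print "page.php?id=1"
file='./some/dir/page.php?id=1'
echo "$file" | rev | cut -d/ -f1 | rev
basename "$file"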

This has two problems:

  • It takes too long: I waited more than 5 hours and it was still running.
  • It seems to mangle the source code: when I stop the script and look at the changes, the replaced URL is repeated 4 times (or more).

There are several problems here (in order):

  • you do not need grep at all: sed alone can do the work, and files without a match simply come out unchanged;
  • be careful with sed and the / delimiter: if the strings to substitute contained /, you would need another delimiter (here they are basenames, so they cannot);
  • the script runs sed over all the data once per renamed file; instead, collect every substitution into one big sed script and apply it to the data in a single pass.

Something like this:

sedfile=/tmp/tmp.sed
data=data
rm -f "$sedfile"
# locate ourselves in the subdir to preserve the naming logic
cd "$data" || exit 1

# rename the files and compose the big sedfile

find . -type f -name '*\?*' | while read -r file ; do
 SUBSTRING=$(echo "$file" | rev | cut -d/ -f1 | rev)   # basename of the file
 NEWSTRING=$(echo "$SUBSTRING" | sed 's/?/-/g')        # same name with ? -> -
 mv "$file" "${file//\?/-}"
 echo "s/$SUBSTRING/$NEWSTRING/g" >> "$sedfile"
done

# now apply the big sedfile once to all the files:
# if you need to go recursive:
find . -type f | xargs sed -i -f "$sedfile"
# if you don't:
sed -i -f "$sedfile" *
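
One caveat (my addition, not something this answer spells out): the basenames contain the regex metacharacter . (? is literal in basic regular expressions, but . is not), so a pattern like index.php?id=1 would also match indexXphp?id=1. For strict matching, the strings can be escaped before being written into the sed script:

# hypothetical hardening of the loop body above: escape BRE metacharacters
# in the search string, and & and \ in the replacement
old_esc=$(printf '%s' "$SUBSTRING" | sed 's/[][\.*^$]/\\&/g')
new_esc=$(printf '%s' "$NEWSTRING" | sed 's/[&\]/\\&/g')
echo "s/$old_esc/$new_esc/g" >> "$sedfile"

Neither string can contain /, since both are basenames, so the default delimiter stays safe.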

Hold on: why are there ? characters in the file names at all? In a URL, ? marks the start of the query string. When wget downloads a site, dynamically generated pages (php scripts and the like) are saved with the query string embedded in the file name, and the same ?-style URLs appear throughout the downloaded source code.

So the problem is created at download time, not at rename time.

It is better to solve it with wget itself.

As for the script: the xargs pipeline re-runs grepping over the whole tree on every iteration, and sed is then applied across all matching files each time as well. That is where the hours go.
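
A sketch of the kind of wget invocation meant here (the exact command was not preserved in this answer, and example.com stands in for the real site): --restrict-file-names=windows makes wget itself replace characters such as ? when saving files, and --convert-links rewrites the references inside the downloaded pages to match, which removes the need for the rename-and-sed script entirely.

# --mirror                       recursive download suitable for a local copy
# --page-requisites              also fetch the CSS/JS/images the pages need
# --convert-links                rewrite links in the sources to the local file names
# --restrict-file-names=windows  escape ?, *, etc. in the saved file names
wget --mirror --page-requisites --convert-links \
     --restrict-file-names=windows \
     https://www.example.com/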

Instead of finding the files with grep, you can simply list them with ls and run sed over everything; files that contain no match come out unchanged.

For example:

ls -1 /path/to/files/* | xargs sed -i '' "s/$SUBSTRING/$NEWSTRING/g"

This skips the repeated grep over the whole tree, which is the expensive part.
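
If the file names can contain spaces, the ls | xargs pipeline above will split them apart; a NUL-delimited variant built on standard find and xargs options (my addition, not part of this answer) avoids that:

# -print0 and -0 pass NUL-separated names, so whitespace in names is safe
# (GNU sed shown; on macOS/BSD use sed -i '' instead of sed -i)
find /path/to/files -type f -print0 | xargs -0 sed -i "s/$SUBSTRING/$NEWSTRING/g"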

Source: https://habr.com/ru/post/1656969/

