Here is a version that does not use `ls`. It should be less vulnerable to strange characters in file names:
```
find . -maxdepth 1 -type f -name '*.html' -print0 | xargs -0 stat --printf "%Y\t%n\n" | sort -n | tail -n 10 | cut -f 2 | xargs -d '\n' cp -t ../Test/
```

(The final `xargs` uses `-d '\n'` so that file names containing spaces are not split apart at the copy stage; the earlier stages are already newline-delimited.)
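Broken out stage by stage, the same pipeline reads like this (a sketch; it assumes GNU coreutils and findutils for `stat --printf`, `cp -t`, and `xargs -d`):

```
find . -maxdepth 1 -type f -name '*.html' -print0 |  # NUL-separated list of .html files
  xargs -0 stat --printf "%Y\t%n\n" |                # prefix each name with its mtime (epoch seconds)
  sort -n |                                          # oldest first, newest last
  tail -n 10 |                                       # keep the 10 newest
  cut -f 2 |                                         # drop the timestamp, keep the name
  xargs -d '\n' cp -t ../Test/                       # copy, splitting on newlines only
```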
I used `find` for two reasons:
1) If there are too many files in the directory, expanding the `*` glob can exceed the argument-length limit, and the command fails with "Argument list too long".*
2) Using the `-print0` argument to `find` (paired with `xargs -0`) avoids the shell problem of file names containing spaces being split into multiple tokens; see the sketch after this list.
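A quick way to see the difference (a minimal sketch with a hypothetical test file name):

```
# Create a file whose name contains a space.
touch 'index page.html'

# Default word-splitting xargs passes two broken tokens and stat fails:
find . -maxdepth 1 -name '*.html' -print | xargs stat --printf "%Y\t%n\n"

# NUL separation keeps the name intact:
find . -maxdepth 1 -name '*.html' -print0 | xargs -0 stat --printf "%Y\t%n\n"
```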
* Strictly speaking, the limit is not on the number of file names but on total size: the kernel caps the combined length of the argument list and the environment (ARG_MAX), so it is the total length of the expanded file names plus the environment that matters. Too many or too large environment variables => the expanded command line no longer fits.
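You can inspect the limit on a given system; GNU xargs will also report the buffer sizes it actually uses:

```
getconf ARG_MAX                                # kernel limit on argv + environment, in bytes
xargs --show-limits --no-run-if-empty </dev/null   # GNU xargs: effective command-line limits
```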
EDIT: Incorporated some improvements from @glennjackman. The initial use of `find` is fixed so that it avoids a wildcard expansion that could fail in a large directory.