
bash - How to iterate LIVE through a directory? - Stack Overflow


I am running a shell script in bash that iterates through a folder's files via

for file in ./folder/*.xml; do ...

However, it can happen that some files are changed or deleted while the script is running which results in an error.

Basically, I want to run the same script multiple times at the same time to make the process faster. The script imports files and deletes them afterwards.

Is there a way to write a while loop that reads the file list fresh on each iteration?


asked Feb 15 at 12:22 by dominikweber, edited Feb 15 at 14:14 by Ted Lyngmo
  • But if it starts fresh every time, when should the script finish? – Arkadiusz Drabczyk Commented Feb 15 at 12:44
  • 4 Would it be enough to skip deleted files by putting [[ -e $file ]] || continue at the start of the loop body? If not, please explain why not. It would also help if you could explain why changed files cause problems. – pjh Commented Feb 15 at 13:25
  • @pjh yes that would be sufficient I think. I will try it out. Background: there are multiple tasks removing files from that directory and those should be skipped then. – dominikweber Commented Feb 15 at 13:36
  • 2 A file could change or be deleted while you are working on it, never mind while you are choosing a file to work on. You should think more about how your script interacts with the other processes working on the same files. – chepner Commented Feb 15 at 16:32
  • 1 Consider using inotifywait, incron, or similar tools to run your script only when a new file has finished writing, for that new file only, and at no other time. systemd has similar support for triggering a service from a path unit, by the way. – Charles Duffy Commented Feb 15 at 19:49
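Putting pjh's suggestion in context: the existence test goes at the top of the loop body, so files that a concurrent run has already imported and deleted are simply skipped. A minimal sketch of the guarded loop (process_file is a hypothetical stand-in for the actual import step):

#!/bin/bash
shopt -s nullglob               # an empty glob expands to nothing instead of the literal pattern

for file in ./folder/*.xml; do
    # another instance may have imported and deleted this file already
    [[ -e $file ]] || continue

    process_file "$file"        # hypothetical import step
    rm -f -- "$file"            # -f: stay quiet if a racer removed it first
done

Note that the guard only narrows the window: the file can still disappear between the test and the import, so the import step itself should tolerate a vanishing file.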

1 Answer


I suggest a slight change to what you are currently doing. Instead of running the whole script multiple times over the same set of files, I'd make it work on one file at a time, in parallel. Here's a version where the script uses xargs -P to re-invoke itself with one of the .xml files in folder as the argument:

#!/bin/bash

if [[ $# -eq 1 ]]; then
    echo "$$: working with: $1"

    # do the work with the file $1 here

    # remove the file or move it to a "done" folder:
    if [[ -d folder/done ]]; then
        mv -f "$1" folder/done
    fi
else
    mkdir -p folder/done
    shopt -s nullglob
    files=( folder/*.xml )
    if (( ${#files[@]} > 0 )); then  # number of elements, not the length of files[0]
        # NUL-delimit the list so filenames with spaces survive the pipe:
        printf '%s\0' "${files[@]}" | xargs -0 -n1 -P0 "$0"
    else
        echo "no files to process"
    fi
fi
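If several copies of this script (or other consumers) can run against folder at the same time, a worker can claim a file before touching it by renaming it into a private directory; rename(2) is atomic within one filesystem, so exactly one claimant wins. A sketch of such a claim step at the top of the worker branch (the claimed directory is an assumption, not part of the script above):

mkdir -p folder/claimed
if ! mv -- "$1" folder/claimed/ 2>/dev/null; then
    exit 0                      # another worker already claimed (moved) this file
fi
file=folder/claimed/${1##*/}    # work with the claimed copy from here on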

If new .xml files may be added to folder while the script is running, then run the script in a loop. Below is an alternative to xargs -P that spawns the worker processes manually and loops to pick up any files added while the processing is going on.

#!/bin/bash

work () {
    echo "$$: working with: $1"

    # do the work with the file $1 here

    # remove the file or move it to a "done" folder:
    if [[ -d folder/done ]]; then
        mv -f "$1" folder/done
    fi
}

mkdir -p folder/done
shopt -s nullglob

processed=1  # seed the counter so the loop body runs at least once
while (( processed > 0 )); do
    processed=0
    for file in folder/*.xml
    do
        (( ++processed ))
        # spawn a worker process:
        work "$file" &
    done

    wait # for all the started worker processes

    echo "processed $processed files"
done
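One caveat with the manual variant: it starts one background worker per file with no upper bound, so a directory holding thousands of files spawns thousands of processes at once. If that is a concern, the spawn loop can be capped with bash's wait -n (bash 4.3 or newer); max_jobs below is an arbitrary example value:

max_jobs=4                      # assumed limit; tune to the machine
for file in folder/*.xml
do
    (( ++processed ))
    work "$file" &
    # block while max_jobs workers are already running
    while (( $(jobs -rp | wc -l) >= max_jobs )); do
        wait -n                 # returns as soon as any one worker exits
    done
done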