Comment by ndsipa_pomu

1 month ago

Yep, you can chain multiple commands with find's "-exec", but I'm not a fan of it myself. I suspect setting variables in the current process is trickier though.
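
For the record, chaining looks something like this (a minimal sketch; echo and test are just placeholder actions):

```shell
# Two -exec actions chained: find runs them in order for each file.
# -exec also acts as a test, so if the first command exits nonzero,
# the second is skipped for that file (implicit AND between them).
find . -type f -name '*.mp3' \
  -exec echo found: {} \; \
  -exec test -r {} \;
```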

(Very minor nitpick, it should be 'a\nb.mp3' to be included, but that does work fine)

Incidentally, ShellCheck isn't happy with that, although I don't follow their reasoning:

  find . -type f -name '*.mp3' -exec bash -c "
                                           ^-- SC2156 (warning): Injecting filenames is fragile and insecure. Use parameters.

https://www.shellcheck.net/wiki/SC2156

> although I don't follow their reasoning

I think it is sound. Imagine what happens when the filename contains:

    ' && shutdown now && '.mp3
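
The fix SC2156 suggests is to pass the filename as a positional argument rather than splicing it into the script text; a sketch (the printf body is just a placeholder):

```shell
# The filename arrives as "$1" (with _ filling $0), so bash never
# parses the filename itself as code -- quoting tricks in it are inert.
find . -type f -name '*.mp3' -exec bash -c 'printf "file: %s\n" "$1"' _ {} \;
```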

  • Of course that makes sense now.

    Anyhow, here's an example of how I would use a while loop and process substitution in a Bash script:

      declare -i file_count=0
      # NUL-delimited read: IFS= keeps whitespace intact, -r leaves
      # backslashes alone, and -d '' splits on the NULs from -print0
      while IFS= LC_ALL=C read -r -d '' file; do
        file_count+=1
        printf "file: %s\n" "${file}"
      done < <(find . -type f -name '*.mp3' -print0)
      printf "Processed %d files\n" "${file_count}"
    

    I think that'd be tricky to do using just a find/-exec command.

    • I see, but now you are essentially operating on multiple files at once, so the serialization makes some sense. Although for just this, I wouldn't write the operation in bash at all:

          find . -type f -name '*.mp3' | wc -l
      
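(A side note, not claimed above: since filenames with embedded newlines came up earlier in the thread, it's worth remembering that wc -l counts lines rather than files. A NUL-delimited variant avoids that, assuming tr supports the \0 escape as GNU and BSD tr do:)

```shell
# Count NUL terminators instead of newlines, so a filename that
# contains a newline is still counted exactly once.
find . -type f -name '*.mp3' -print0 | tr -dc '\0' | wc -c
```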

      Honestly, I don't really view the shell / filesystem interface as a security boundary. I use the shell mainly for (automation of) interactive use, so any screwup due to e.g. quoting issues is my own fault, maybe even for using stupid filenames. Shell is a great language for connecting the streams of different programs to each other, not so much for doing any real work. If I need that, I reach for C.
