Just wanted to share an alias I have in use and found useful again. It’s a simple wrapper around xargs, which I always forget how to use properly, so I set up an alias for it. All it does is run a command on each line piped into it.

The arguments are interpreted as the command to execute. The only thing to remember is to use {} as a placeholder for the input line. Look at the examples to see how it’s used.

# Pipe each line and execute a command. The "{}" will be replaced by the line.
#
# Example:
#   cat url.txt | foreach echo download {} to directory
#   ls -1 | foreach echo {}
#   find . -maxdepth 2 -type f -name 'M*' | foreach grep "USB" {}
alias foreach='xargs -d "\n" -I{}'

Useful for quickly operating on each line of a file (for example, to download from a list of URLs) or doing something with any command’s output line by line, without having to remember or type out a for loop in the terminal.
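
For comparison, this is roughly the long-hand loop the alias saves you from typing (reusing the url.txt example from above):

# long-hand equivalent of: cat url.txt | foreach echo download {} to directory
while IFS= read -r line; do
    echo download "$line" to directory
done < url.txt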

  • thevoidzero@lemmy.world

    I recommend GNU parallel. It does similar things, but runs the commands in parallel. And it’s way easier to pipe to than xargs. If you really need it to run one command at a time, you can set the number of jobs to 1. It also has progress bars, colors to differentiate the stdout of different commands, etc.

    Basic example: to echo each line

    parallel echo < somefile.txt

    To download all links, number of jobs 4, show progress

    parallel -j 4 --bar ''curl -O" < links.txt

    You can do a lot more with inputs, like placing them wherever with {}, numbered placeholders ({1} is the first) that allow multiple unique arguments, transformers like removing the extension or the parent path, etc. Worth learning.
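
    A rough sketch of those placeholders (the flac-to-ogg conversion is just a made-up illustration, not something from the post):

    # {} is the whole input line, {.} strips the extension
    ls *.flac | parallel 'ffmpeg -i {} {.}.ogg'

    # {1} and {2} pick from separate input sources given with :::
    parallel echo {1}-{2} ::: a b ::: 1 2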

    • thingsiplay@lemmy.mlOP

      I am actually aware of parallel and use it for a different tool / script I built. The purpose of parallel is different from xargs, right? I mean xargs works on each line of another command’s output, which is what I was using it for. I never thought of parallel as an alternative to xargs and need to look into this idea more. Thanks.

  • non_burglar@lemmy.world

    Be careful.

    Because it only formats stdin streams into string(s), xargs can be very dangerous, depending on the command to which the arguments are being passed.

    Xargs used to be a practical way to get around bash globbing issues and parenthetical clause behavior, but most commands have alternate and safer ways of handling passed arguments.

    find -exec is preferable to xargs to avoid file expansion “bombs”, plus find doesn’t involve the shell, so it doesn’t care about whitespace problems.
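
    Using the find example from the post above, the -exec form would look something like this:

    # run grep on the matches without xargs; "{} +" batches file names much like xargs would
    find . -maxdepth 2 -type f -name 'M*' -exec grep "USB" {} +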

  • bizdelnick@lemmy.ml

    I almost never use xargs. The most common case for it is find, but it is easier to use its -exec option. Also, with find your example is incorrect. You forgot that file names can contain special characters, the newline character in particular. That’s why you need to pass the -print0 option to find and the -0 option to xargs.
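
    Applied to the find example from the post, that would be something like:

    # NUL-delimited hand-off stays safe even if a file name contains a newline
    find . -maxdepth 2 -type f -name 'M*' -print0 | xargs -0 grep "USB"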

    • thingsiplay@lemmy.mlOP

      The example itself is not incorrect. It is just an example to show how the foreach works, not meant to be a full command on its own. Usually I don’t have newline characters in file names either, so that is not a concern for me. If I wanted to be sure, then yes, I would use the zero option. But it’s good to point that out.