python - Bash: Tokenize string using shell rules without eval'ing it?


I'm writing a wrapper script. The original program's arguments live in a separate file, args. The script needs to split the contents of args using shell word-splitting and quoting rules and then run the program. A partial solution (set + eval) is offered in Splitting a string into tokens according to shell parameter rules without eval:

#!/usr/bin/env bash
stdout="$1"
stderr="$2"
(
  set -f
  eval "set -- $(cat args)"
  exec run_in_container "$@" >"$stdout" 2>"$stderr"
)

But in my case args is user-generated. One can imagine:

  • args contains echo "hello, 'world'! $(rm -rf /)" (not cool, but harmless here: the commands run in e.g. a Docker container)
  • args contains bash -c "$java_home/<...> > <...> && <...>" (harmful: $java_home is intended to be the container's value of the environment variable java_home, but it is substituted earlier, when the command is eval'ed in the wrapper script's subshell)
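Both failure modes can be reproduced without touching the real wrapper. A minimal sketch (my own illustration, not from the original post — it assumes a POSIX sh on PATH, uses printf in place of run_in_container, and invents the value /host/jdk for JAVA_HOME):

```python
import os
import subprocess

# Simulate the wrapper's `eval "set -- ..."` step with a hostile args string:
# the $(...) command substitution runs and the variable expands on the host,
# before the wrapped program ever sees its arguments.
args = '"$JAVA_HOME/bin/java" $(echo injected)'
out = subprocess.run(
    ["sh", "-c", 'eval "set -- $1"; printf "%s\\n" "$@"', "_", args],
    capture_output=True, text=True,
    env={**os.environ, "JAVA_HOME": "/host/jdk"},
)
print(out.stdout)
# /host/jdk/bin/java
# injected
```

Here `echo injected` stands in for an arbitrary attacker-chosen command, and $JAVA_HOME is expanded to the host's value rather than being passed through literally.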

I tried Python, and it works:

#!/usr/bin/env python
import shlex, subprocess, sys

with open('args', 'r') as argsfile:
    args = argsfile.read()

with open(sys.argv[1], 'w') as outfile, open(sys.argv[2], 'w') as errfile:
    exit(subprocess.call(["run_in_container"] + shlex.split(args),
                         stdout=outfile, stderr=errfile))
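To see why this is safe, note that shlex.split applies shell quoting rules but performs no expansion at all: command substitutions and variable references survive as literal text. A small demonstration using the hostile inputs from above (the `/bin/java -version` tail is my own placeholder for the elided parts of the original command):

```python
import shlex

# Quoting is honored, but $(...) is never executed:
print(shlex.split('echo "hello, \'world\'! $(rm -rf /)"'))
# ['echo', "hello, 'world'! $(rm -rf /)"]

# ...and $JAVA_HOME is never expanded on the host:
print(shlex.split('bash -c "$JAVA_HOME/bin/java -version"'))
# ['bash', '-c', '$JAVA_HOME/bin/java -version']
```

The dangerous text reaches run_in_container only as inert argument strings, so any expansion happens (if at all) inside the container, with the container's environment.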

Is there a way to do what shlex does, but in bash: tokenize a string using shell quoting rules, without substituting variables' values and without executing $(...) and the like?


