I'm debugging an old shell script; I want to check the values of all the variables used. It's a huge, ugly script that uses roughly 140 variables. Is there any way I can extract the variable names from the script and put them in a convenient pattern like:
#!/bin/sh
if [ ${BLAH} ....
.....
rm -rf ${JUNK}.....
to
echo ${BLAH}
echo ${JUNK}
...
Try running your script as follows:
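For example (the script name here is just a placeholder):

    sh -x yourscript.sh    # -x makes the shell print each command, with variables expanded, before running it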
Or enable the setting in the script:
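That is, put set -x near the top of the script; it enables the same tracing as the -x flag above:

    #!/bin/sh
    set -x    # from here on, every command is echoed before it runs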
You can dump all the variables you're interested in with one command:
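A sketch, using the variable names from the question as stand-ins:

    set | grep -E '^(BLAH|JUNK)='    # print just the listed variables with their current values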
To dump all the variables to stdout, run one of the following from inside your script:
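The usual choices (set is a shell builtin that prints every shell variable; env prints only the exported ones):

    set    # all shell variables, one name=value per line
    # or
    env    # only exported (environment) variables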
In bash, but not sh, `compgen -v` will list the names of all variables assigned (compare this to `set`, which has a great deal of output other than variable names, and thus needs to be parsed). Thus, if you change the top of the script to `#!/bin/bash`, you will be able to use `compgen -v` to generate that list.

That said, the person who advised you to use `set -x` did well. Consider this extension on that:
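(A sketch; the exact PS4 format string here is my own choice.)

    PS4='+ ${BASH_SOURCE}:${LINENO}: '   # bash expands this in front of each traced command
    set -x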
This will print the source file and line number before every command (or variable assignment) that is executed, so you will have a log not only of which variables are set, but of exactly where in the source each one was assigned. This makes tracking down where each variable is set far easier.
You can extract a (sub)list of the variables declared in your script using `grep`:
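Something along these lines (the script name is a placeholder, and \w for the name characters is my assumption):

    # -P: PCRE, -o: print only the match; grabs every name directly followed by ="
    grep -Po '\w+(?==")' yourscript.sh | sort -u    # sort -u removes duplicate names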
Disclaimer: why "sublist"? The expression given will match a string followed by an equals sign (=) and a double quote ("). So if you don't use syntax such as myvar="my-value", it won't work. But you get the idea.

grep options

-P, --perl-regexp: interpret PATTERN as a Perl regular expression (PCRE); this is marked experimental.
-o, --only-matching: print only the matched (non-empty) parts of a matching line, with each such part on a separate output line.

Pattern

I'm using a positive lookahead, (?==") , to require an equals sign followed by a double quote.
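If you then want to turn that list into the echo lines from the question, you could pipe it through awk (a sketch, under the same double-quoted-assignment assumption):

    grep -Po '\w+(?==")' yourscript.sh | sort -u |
        awk '{ printf "echo \"%s=${%s}\"\n", $1, $1 }'    # e.g. prints: echo "BLAH=${BLAH}"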