9 Tips For Writing Safer Shell Scripts

Shell scripting is a powerful tool available on all platforms, even Windows, thanks to WSL. But it can be easy to make mistakes. Here are some tips to improve our scripts and avoid many problems.
#1: Better Shell Options
All shells have configurable options that can be used to enable additional behavior. Many of these behaviors are safer than the shell defaults.
Fail on Errors
An absolute no-brainer is the option set -e.
With this option, any command returning a non-zero status will exit the whole script, without executing any further commands.
By default, a shell script is run entirely, regardless of any errors:
#!/usr/bin/env bash
# Non-existing command
non-existent
echo "Will still be echoed"
Output:
line 4: non-existent: command not found
Will still be echoed
By setting -e we can prevent the echo:
#!/usr/bin/env bash
set -e
# Non-existing command
non-existent
echo "Won't be echoed"
Output:
line 5: non-existent: command not found
There are certain exceptions that make it actually usable:

- Command lists with || or &&: only the final exit status of the complete list counts, so an intermediate failure won’t exit the script:
# The command list will evaluate to true, so
# we can use set -e, and circumvent it, if necessary
non-existent || true
- Test conditions are allowed to fail, e.g., in if, while, or until:
if non-existent; then
echo "success"
else
echo "failure"
fi
echo "after the failure"
Output:
failure
after the failure
Fail on Unset Variables
Never trip over unset variables ever again!
By using set -u, the script will exit when an unset variable is used:
#!/usr/bin/env bash
set -u
# Non-existing variable
echo "${NOT_SET}"
echo "Won't be echoed"
Output:
line 5: NOT_SET: unbound variable
Of course, there are some exceptions to the rule again:

- The special parameters @ and * are still allowed.
- Default values can be set:
ACTUAL_PATH=${MAYBE_SET_PATH:-~/home/$USER}
One caveat exists, though: in older bash versions (before 4.4), an empty array is treated as unset.
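Both exceptions can be sketched in a few lines; note that the empty-array behavior differs between bash versions, so the default-value syntax keeps it portable:

```shell
#!/usr/bin/env bash
set -u

# A default value prevents the "unbound variable" error
echo "${NOT_SET:-fallback}"

# Before bash 4.4, expanding an empty array under set -u
# raised an error; a default value works on old versions, too
EMPTY=()
echo "${EMPTY[@]:-}"
```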
Safer Pipelines
By default, the exit code of a pipeline is determined by the last command regardless of any previous non-zero status codes:
#!/usr/bin/env bash
non-existent | echo -e "last command"
echo $?
Output:
last command
bash: non-existent: command not found
0
With set -o pipefail, a pipeline only returns a zero status if ALL of its parts exited successfully.
This doesn’t change that all parts of the pipeline are still executed. But now we can use set -e with pipelines.
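A minimal example of the changed pipeline status:

```shell
#!/usr/bin/env bash
set -o pipefail

# 'false' fails, 'echo' succeeds: without pipefail the
# pipeline's status would be 0, with it the failure shows
false | echo "last command"
echo $?
```

Output:
last command
1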
Better Empty Globbing
Filename expansion, also known as globbing, can be the root of many bugs.
One particularly questionable default is the handling of empty expansions.
If no files match the glob, it is, by default, passed on “as-is”, instead of expanding to nothing:
#!/usr/bin/env bash
for f in *.log; do
echo "$f"
done
Output, if no log files are found:
*.log
This behavior can be disabled with shopt -s nullglob, which is already the default in some non-bash shells:
#!/usr/bin/env bash
shopt -s nullglob
for f in *.log; do
echo "$f"
done
NO OUTPUT if no log files are found.
Disable Globbing
The option set -f disables filename expansion altogether.
It’s a good idea if your script doesn’t need globbing, as it prevents any accidental expansion.
Debug Output
The option set -x enables trace mode, which prints each command to stderr before actually executing it.
Remember that all options can be set from the outside, too:
sh -x my-script.sh
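A short example of what trace mode looks like; each command is echoed with a + prefix before it runs:

```shell
#!/usr/bin/env bash
set -x

# Both the assignment and the echo are traced to stderr
NAME="world"
echo "Hello, $NAME"
```

Output:
+ NAME=world
+ echo 'Hello, world'
Hello, world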
#2: It’s a TRAP
Our scripts are often not stateless, creating the need for some kind of cleanup. Especially in case of a premature exit on errors, we need a way to get notified.
By using the trap builtin, we can register a command to be executed for specific signals:
# trap <command> <signals>;
We can use functions or commands directly:
function cleanup() {
# ...
}
trap cleanup EXIT
trap 'rm command.lock' ERR
Error-handling can be improved by using $LINENO to actually know where it all went wrong.
Resources
- The Bash Trap Command (Linux Journal)
#3: Check Requirements Early
If our script relies on external programs not usually found in a default installation, we should check for their existence first:
# from one of my internal scripts
REQUIREMENTS=(jq ssh sed nc column)
for APP in "${REQUIREMENTS[@]}"; do
command -v "$APP" > /dev/null 2>&1
if [[ $? -ne 0 ]]; then
>&2 echo "Required '$APP' is not installed"
exit 1
fi
done
This little snippet ensures that all requirements are available, exiting on the first unavailable command.
#4: Temporary Files & Directories
There are many reasons for needing temporary files: downloads, atomic operations, etc.
A random name for a temporary file or directory is mandatory, or we might overwrite something by accident.
The shell helps us with mktemp, which creates files and directories in /tmp:
# Random filename
mktemp
# Custom filename (X = random char)
mktemp -t foo.XXXXXX
# Random directory
mktemp -d
# Custom directory (X = random char)
mktemp -d -t foo.XXXXXX
The command actually creates the file or directory, rather than just returning a random name/path.
It has a --dry-run / -u option, but its use is considered unsafe.
There are more options to customize the generated filename/path, check out its manpage.
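mktemp pairs nicely with the trap cleanup from tip #2. A sketch that guarantees the temporary directory is removed, even when the script exits early on an error:

```shell
#!/usr/bin/env bash
set -eu

# Create a private scratch directory and clean it up on ANY exit
WORKDIR=$(mktemp -d)
trap 'rm -rf "$WORKDIR"' EXIT

echo "working in $WORKDIR"
# ... downloads, intermediate files, atomic moves, etc. ...
```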
#5: Quoting (almost) everything
Always use quotes. It’s better to quote too much than not enough.
As shown before, automatic parameter expansion can be the source of many bugs. To preserve the literal meaning of a string, we need to quote it.
If a string contains whitespace or an asterisk, it’s a ticking timebomb:
FILENAME="This contains spaces"
touch $FILENAME
# Creates 3 files:
# - This
# - contains
# - spaces
By quoting variables when expanding them, we can make sure that the result will be passed as a single argument:
FILENAME="This contains spaces"
touch "$FILENAME"
# Creates 1 file:
# - This\ contains\ spaces
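The same rule applies to the positional parameters: quoted "$@" keeps each argument intact, while unquoted $@ re-splits them. A small demonstration (print_each is a hypothetical helper):

```shell
#!/usr/bin/env bash

# Print each argument on its own line, wrapped in brackets
print_each() {
    for arg in "$@"; do
        echo "[$arg]"
    done
}

print_each "one argument" two
# Output:
# [one argument]
# [two]
```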
#6: Linting with ShellCheck
Thanks to ShellCheck, it’s easy to lint our scripts against over 350 different rules!
Here is one of the examples:
#!/bin/sh
# Example: a typical script with several problems
for f in $(ls *.m3u); do
grep -qi hq.*mp3 $f \
&& echo -e 'Playlist $f contains a HQ file in mp3 format'
done
ShellCheck Output:
Line 3:
for f in $(ls *.m3u); do
^-- SC2045: Iterating over ls output is fragile. Use globs.
^-- SC2035: Use ./*glob* or -- *glob* so names with dashes won't become options.
Line 4:
grep -qi hq.*mp3 $f \
^-- SC2062: Quote the grep pattern so the shell won't interpret it.
^-- SC2086: Double quote to prevent globbing and word splitting.
Did you mean: (apply this, apply all SC2086)
grep -qi hq.*mp3 "$f" \
Line 5:
&& echo -e 'Playlist $f contains a HQ file in mp3 format'
^-- SC2039: In POSIX sh, echo flags are undefined.
^-- SC2016: Expressions don't expand in single quotes, use double quotes for that.
ShellCheck highlights the detected problems, showing us exactly what went wrong, and where, so it can easily be fixed.
It’s available in the repositories of most Linux distributions, and can also be integrated into many editors, like Visual Studio Code.
#7: Doin’ it in Style
I’ve argued before about using style guides for our preferred language, to establish a good baseline.
The most widely mentioned style guide for shell scripts is the “Shell Style Guide” by Google.
It’s an extensive read, but worth it.
As usual, don’t force what won’t fit. For example, I’m a strong proponent of using 4 spaces and a line length of ~110 characters. No style guide will make me change my mind.
But I’m willing to adapt if a project demands it.
#8: Targeting the Right Shell
There are multiple shells available to use, with different options.
Especially in the age of containers, it’s no longer a given that we encounter a full-fledged bash shell.
That’s why we should try to target the right kind of shell, and not depend too much on shell-specific behavior. At least if we can’t be absolutely sure about which environment we’re running in.
What’s the Difference Between Bash, Zsh, and Other Linux Shells? (How-To Geek)
A shebang tells the system which interpreter should run the script:
#!/bin/bash
What if bash isn’t available in /bin, but in /usr/bin instead? We can utilize env to get the correct location, as it returns the first occurrence found in $PATH:
#!/usr/bin/env bash
#9: Don’t Use Shell Script
An essential aspect of being a developer is knowing the limitations of languages and tools. Not everything should be a shell script. Many higher-level languages provide a safer and more concise environment to begin with.
And thanks to shebangs, we can use many different languages in scripts, given a suitable interpreter is available.
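For example, a script can generate a tiny Python program and run it like any other executable, purely via its shebang. A sketch, assuming python3 is installed:

```shell
#!/usr/bin/env bash
set -eu

# Write a minimal Python script with its own shebang ...
SCRIPT=$(mktemp)
cat > "$SCRIPT" <<'EOF'
#!/usr/bin/env python3
print("Hello from Python")
EOF

# ... make it executable and run it directly
chmod +x "$SCRIPT"
"$SCRIPT"
rm -f "$SCRIPT"
```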
Or we could always use a “compiles to a single binary”-style language, like Go or Rust. They also support cross-compilation, so it’s easy to target more than just our own platform.
Conclusion
Shell scripts are great. From automating a tiny task, up to full-fledged TUI apps like bashtop, almost anything is possible.
But the syntax is a little unusual compared to a high-level language. And errors can easily be introduced, even by just a typo.
Setting the right options and linting our scripts helps avoid common sources of problems in the first place. These tools are easy to use and integrate into our workflow, and shouldn’t be skipped for convenience’s sake, especially if others have to use our scripts.
What are your favorite tips and tricks for better shell scripts?
Resources
- Bash Reference Manual (Gnu.org)
- Safe ways to do things in bash (Github)
- Shell Style Guide (Google)
- ShellCheck