foundations · level 2

The Terminal

Pipes, redirects, exit codes, and the shell pipeline.

150 XP

The shell is a program that reads lines of text, runs commands, and streams output back. Understanding the shell means understanding three things: the filesystem, pipes, and exit codes.

Analogy

The terminal is like an industrial kitchen assembly line. Each command is a single-purpose station — one that chops, one that fries, one that plates. A pipe | is the conveyor belt carrying the half-finished dish from one station to the next without ever setting it on a counter. The exit code is the station lead shouting "done" or "burnt" down the line so the next station knows whether to proceed. No station tries to do the whole dish — each does one thing, and the line composes the meal.

Navigating the filesystem

The filesystem is a tree. Every file has an absolute path — from the root / down to the file. A relative path is resolved from your current working directory (pwd prints it).

pwd              # print working directory
ls               # list files in current directory
ls -la           # long format, including hidden files
cd /var/log      # jump to an absolute path
cd ..            # move one level up
cd -             # jump back to the previous directory

Hidden files start with a dot: .gitignore, .env. ls skips them by default; ls -a shows them.

Searching files

grep "error" app.log          # lines containing "error"
grep -r "TODO" src/           # recursive search across a directory
grep -n "error" app.log       # include line numbers
grep -i "error" app.log       # case-insensitive

awk is a row-column processor. It splits each line into fields by whitespace (or any delimiter you choose):

awk '{print $1}' access.log         # first field of each line
awk -F: '{print $1}' /etc/passwd    # split on ":", print first field
awk '$3 > 404 {print $0}' log       # print rows where field 3 > 404
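These tools compose. A small sketch that filters with grep and then extracts a field with awk, using a made-up app.log written on the spot so it runs anywhere:

```shell
# Write a tiny sample log (hypothetical contents, just for the demo)
printf 'INFO boot ok\nERROR db timeout\nERROR net refused\n' > app.log

# Filter for ERROR lines, then pull out the second field (the subsystem)
grep "ERROR" app.log | awk '{print $2}'
# db
# net
```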

Pipes and redirects

The pipe | connects the stdout of one command to the stdin of the next. Nothing is written to disk — bytes flow directly from process to process.

cat app.log | grep "ERROR" | wc -l

This reads the log, filters for "ERROR" lines, and counts them. Three separate processes; one pipeline.
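The cat is optional here: grep reads files directly, and its -c flag counts matching lines itself. A sketch with a made-up app.log:

```shell
# Sample log created on the spot so the example is self-contained
printf 'ERROR one\nok line\nERROR two\n' > app.log

grep -c "ERROR" app.log   # same count, one process instead of three
# 2
```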

Redirects connect stdout, stderr, and stdin to files:

command > out.txt       # stdout → file (truncate)
command >> out.txt      # stdout → file (append)
command 2> err.txt      # stderr → file
command &> all.txt      # stdout + stderr → file (bash; POSIX: > all.txt 2>&1)
command < in.txt        # file → stdin
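These combine. The sketch below sends the two streams of one command to separate files (/no-such-dir is a deliberately missing path, chosen so ls produces both kinds of output):

```shell
# "." lists fine (stdout); the missing path produces an error (stderr)
ls . /no-such-dir > out.txt 2> err.txt || true

# out.txt now holds the listing; err.txt holds the error message
cat err.txt
```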

Exit codes

Every process exits with an integer code. 0 means success; any nonzero value (1 through 255) means failure.

ls /tmp && echo "ok"      # echo only runs if ls succeeds
ls /nope || echo "failed" # echo only runs if ls fails
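A sketch of both operators side by side (missing-dir is a hypothetical name assumed not to exist):

```shell
# && runs the right side only when the left side exits 0
mkdir -p logs && echo "logs ready"
# logs ready

# || runs the right side only when the left side fails
ls missing-dir 2>/dev/null || echo "not there"
# not there
```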

$? holds the exit code of the last command:

grep "pattern" file.txt
echo $?    # 0 if found, 1 if not found, 2 if file error
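Exit codes are what if tests in shell: grep -q prints nothing and reports only through its code. A sketch with a throwaway words.txt:

```shell
printf 'alpha\nbeta\n' > words.txt

# The if branches on grep's exit code, not on any output
if grep -q "beta" words.txt; then
  echo "found"
else
  echo "missing"
fi
# found
```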

Scripts should set set -e at the top so they stop at the first failing command. Silent failures are the hardest bugs to trace.
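A minimal script skeleton with that guard in place (file names here are made up for the demo):

```shell
#!/bin/sh
set -eu   # -e: stop at the first failing command; -u: treat unset variables as errors

echo "step 1" > step.log    # runs
grep -q "step" step.log     # succeeds, so execution continues
echo "all steps passed"     # reached only if every command above succeeded
```

In bash, set -euo pipefail additionally makes a pipeline fail when any stage fails, not just the last one.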

Combining it all

Find the five most common IP addresses in an nginx access log:

awk '{print $1}' /var/log/nginx/access.log \
  | sort \
  | uniq -c \
  | sort -rn \
  | head -5

Each stage transforms the stream: extract the first field, sort alphabetically, count duplicates, sort numerically descending, take the top five. No temporary files. No loops. The pipe is the data structure.
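The same pipeline run over a tiny sample log (made-up IPs, field 1 = client address), so each stage can be inspected:

```shell
# Six requests from three made-up IPs
printf '1.1.1.1 a\n2.2.2.2 b\n1.1.1.1 c\n1.1.1.1 d\n2.2.2.2 e\n3.3.3.3 f\n' > access.log

awk '{print $1}' access.log | sort | uniq -c | sort -rn | head -5
# 1.1.1.1 ranks first with count 3, then 2.2.2.2 (2), then 3.3.3.3 (1)
```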