Pipes and Redirection

Master Bash pipes to chain commands, redirect stdin/stdout/stderr, and use tee, xargs, and command substitution for powerful data processing.

📖 4 min read · 📅 2026-02-10

Core Concepts

Pipes (|)

Pipes connect the standard output (stdout) of one command to the standard input (stdin) of the next.

# Basic pipe
ls -la | less                    # Page through listing
cat file.txt | wc -l             # Count lines
ps aux | grep nginx              # Find nginx processes
 
# Multiple pipes
cat access.log | grep "404" | wc -l
# Read file → Filter 404 errors → Count them
 
# Real-world pipeline
ps aux | sort -nrk 3 | head -10 | awk '{print $11, $3"%"}'
# All processes → Sort by CPU desc → Top 10 → Show name and CPU%
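One subtlety worth knowing before chaining many commands: a pipeline's exit status is, by default, the status of its last command, so failures earlier in the chain go unnoticed. A minimal sketch of the two Bash features that expose them:

```shell
#!/usr/bin/env bash
# By default only the last command's status is reported
false | true
echo "default: $?"               # prints 0 — the failure of `false` is hidden

# PIPESTATUS records the exit code of every stage
false | true
echo "stages: ${PIPESTATUS[*]}"  # prints 1 0

# pipefail makes the pipeline fail if ANY stage fails
set -o pipefail
false | true
echo "pipefail: $?"              # prints 1
```

`set -o pipefail` is a common first line in production scripts for exactly this reason.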

Output Redirection

stdout Redirection

# Redirect to file (overwrite)
echo "Hello" > output.txt
ls -la > listing.txt
 
# Redirect to file (append)
echo "World" >> output.txt
date >> log.txt
 
# Redirect to /dev/null (discard)
command > /dev/null              # Discard stdout
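One safety note on `>`: with Bash's `noclobber` option set, `>` refuses to overwrite an existing file, and `>|` forces the overwrite. A quick sketch:

```shell
set -o noclobber
echo "first" > guard.txt         # creates the file
echo "second" > guard.txt        # error: cannot overwrite existing file
echo "second" >| guard.txt       # >| overrides noclobber
set +o noclobber                 # turn the option back off
```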

stderr Redirection

# stderr is file descriptor 2
command 2> errors.txt            # Redirect errors to file
command 2>> errors.txt           # Append errors
 
# Discard errors
command 2> /dev/null
 
# Redirect both stdout and stderr
command > output.txt 2>&1        # Both to same file
command &> output.txt            # Bash shorthand (not POSIX)
 
# Separate files
command > stdout.txt 2> stderr.txt
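Because redirections are processed left to right, you can also pipe *only* stderr: `2>&1` first copies stderr onto stdout's current target (the pipe), then `>/dev/null` repoints the original stdout. A sketch using a small helper function (`both` exists only for this demonstration):

```shell
# Helper that writes one line to each stream (demonstration only)
both() { echo "to stdout"; echo "to stderr" >&2; }

# Order matters: duplicate stderr onto the pipe first,
# then discard the original stdout
both 2>&1 >/dev/null | cat       # only "to stderr" reaches the pipe
```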

stdin Redirection

# Read input from file
sort < unsorted.txt
wc -l < data.txt
 
# Here document (multi-line input)
cat << EOF
Hello, this is a
multi-line input
with variable expansion: $USER
EOF
 
# Here string
grep "pattern" <<< "search in this string"
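Two here-document refinements worth knowing: quoting the delimiter (`<< 'EOF'`) disables variable expansion so the body is taken literally, and `<<-EOF` additionally strips leading tab characters so the body can be indented inside a script:

```shell
# Unquoted delimiter: $HOME expands as usual
cat << EOF
home: $HOME
EOF

# Quoted delimiter: the body is literal, nothing expands
cat << 'EOF'
home: $HOME
EOF
```

The second block prints the literal text `home: $HOME`, which is what you want when generating scripts or config files that contain `$` themselves.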

tee — Split Output

Send output to both screen and file:

# Display and save
ls -la | tee listing.txt
 
# Append mode
echo "new entry" | tee -a log.txt
 
# Multiple files
command | tee file1.txt file2.txt
 
# Chain with more pipes
ps aux | tee all_processes.txt | grep nginx | tee nginx_processes.txt | wc -l
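A classic `tee` idiom is writing to root-owned files: `sudo command > file` fails because the redirection is performed by your unprivileged shell before `sudo` runs, while piping into `sudo tee` lets tee itself (running as root) open the file. The target path here is just an example:

```shell
# Fails: the > redirection runs as your user, not as root
# sudo echo "net.ipv4.ip_forward=1" > /etc/sysctl.conf

# Works: tee runs as root and opens the file itself
echo "net.ipv4.ip_forward=1" | sudo tee -a /etc/sysctl.conf > /dev/null
```

The trailing `> /dev/null` suppresses tee's copy to the screen when you only care about the file.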

xargs — Build Commands from Input

# Basic usage
echo "file1 file2 file3" | xargs touch
# Creates file1, file2, file3
 
# Find and delete (simple names only; see -print0 below for safe handling)
find . -name "*.tmp" | xargs rm
 
# With placeholder (-I reads one item per line, not per word)
printf '%s\n' 1 2 3 | xargs -I {} echo "Number: {}"
 
# Parallel execution (4 jobs at once)
find . -name "*.jpg" -print0 | xargs -0 -P 4 -I {} convert {} -resize 50% {}.small.jpg
 
# Handle filenames with spaces
find . -name "*.txt" -print0 | xargs -0 wc -l
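xargs also controls how many items land on each generated command line: by default it packs in as many as fit, and `-n` caps the count, which is how you batch work:

```shell
# One invocation carrying all five arguments
printf '%s\n' a b c d e | xargs echo
# → a b c d e

# Three invocations, at most two arguments each
printf '%s\n' a b c d e | xargs -n 2 echo
# → a b
#   c d
#   e
```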

Command Substitution

Use the output of a command as part of another command:

# Modern syntax $(command)
echo "Today is $(date)"
echo "You are $(whoami)"
echo "Files: $(ls | wc -l)"
 
# Store in variable
current_dir=$(pwd)
file_count=$(find . -type f | wc -l)
echo "There are $file_count files in $current_dir"
 
# Nested substitution
echo "Kernel: $(uname -r) on $(uname -s)"
 
# Backtick syntax (older, avoid)
echo "Today is `date`"
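The practical reason to prefer `$(...)` over backticks: it nests without escaping, while nested backticks must be backslash-escaped at every level:

```shell
# $() nests cleanly
echo "result: $(echo "outer $(echo inner)")"
# → result: outer inner

# Backticks must be escaped to nest — hard to read and error-prone
echo "result: `echo "outer \`echo inner\`"`"
# → result: outer inner
```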

Process Substitution

Treat command output as a file:

# Compare outputs of two commands
diff <(ls dir1/) <(ls dir2/)
 
# Pass command output as file argument
wc -l <(find . -type f)
 
# Multiple process substitutions
paste <(cut -f1 file1.txt) <(cut -f2 file2.txt)
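Process substitution also runs in the other direction: `>(command)` yields a "file" that feeds the command's stdin, so combined with `tee` one stream can fan out to several consumers without temporary files. A sketch:

```shell
# Keep a compressed copy of the full stream while the pipeline
# continues with a filtered view — no temp file needed
seq 1 100 | tee >(gzip > numbers.gz) | grep -c 7
# → 19 (the numbers in 1..100 containing the digit 7)
```

Note that the `>(...)` process runs asynchronously, so `numbers.gz` may be finalized slightly after the pipeline itself returns.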

Practical Pipeline Examples

Example 1: Log Analysis

# Count unique IP addresses in access log
cat access.log | awk '{print $1}' | sort | uniq -c | sort -rn | head -20
 
# Find most requested pages
cat access.log | awk '{print $7}' | sort | uniq -c | sort -rn | head -10
 
# Count 404 errors per hour
grep "404" access.log | awk '{print $4}' | cut -d: -f1,2 | sort | uniq -c

Example 2: System Monitoring

# Top memory consumers
ps aux --sort=-%mem | awk 'NR > 1 {printf "%-20s %s%%\n", $11, $4}' | head -10
 
# Disk usage summary
du -sh /* 2>/dev/null | sort -rh | head -10
 
# Network connections by state
ss -ant | tail -n +2 | awk '{print $1}' | sort | uniq -c | sort -rn

Example 3: Data Processing

# CSV processing: get unique values from column 3
cut -d, -f3 data.csv | sort -u
 
# Sum numbers in a file
cat numbers.txt | paste -sd+ | bc
 
# Find duplicate lines
sort file.txt | uniq -d
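A note on the sum above: `paste ... | bc` requires `bc` to be installed; awk, available essentially everywhere, does the same job in one stage:

```shell
# Sum numbers, one per line, without bc
printf '%s\n' 10 20 12 | awk '{sum += $1} END {print sum}'
# → 42
```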

Exercises

  1. Create a pipeline that finds the 5 largest files in your home directory
  2. Redirect both stdout and stderr of a find command to separate files
  3. Use tee to log a command's output while also filtering it
  4. Use command substitution to create a timestamped backup filename
  5. Build a pipeline to analyze a log file and count errors by type

Next: Variables and Environment — learn to store and use data!