
shell-scripting

Write robust, portable shell scripts with proper error handling, argument parsing, and testing. Use when automating system tasks, building CI/CD scripts, or creating container entrypoints.

Packaged view

This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.

Stars: 318.

Hot score: 99.

Updated: March 20, 2026.

Overall rating: C (4.1).

Composite score: 4.1.

Best-practice grade: B (79.6).

Install command

npx @skill-hub/cli install ancoleman-ai-design-components-shell-scripting

Repository

ancoleman/ai-design-components

Skill path: skills/shell-scripting



Best for

Primary workflow: Run DevOps.

Technical facets: Full Stack, DevOps, Testing.

Target audience: everyone.

License: Unknown.

Original source

Catalog source: SkillHub Club.

Repository owner: ancoleman.

This is a mirrored public skill entry. Review the repository before installing it into production workflows.

What it helps with

  • Install shell-scripting into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
  • Review https://github.com/ancoleman/ai-design-components before adding shell-scripting to shared team environments
  • Use shell-scripting for development workflows

Works across

Claude Code, Codex CLI, Gemini CLI, OpenCode

Favorites: 0.

Sub-skills: 0.

Aggregator: No.

Original source / Raw SKILL.md

---
name: shell-scripting
description: Write robust, portable shell scripts with proper error handling, argument parsing, and testing. Use when automating system tasks, building CI/CD scripts, or creating container entrypoints.
---

# Shell Scripting

## Purpose

Provides patterns and best practices for writing maintainable shell scripts with error handling, argument parsing, and portability considerations. Covers POSIX sh vs Bash decision-making, parameter expansion, integration with common utilities (jq, yq, awk), and testing with ShellCheck and Bats.

## When to Use This Skill

Use shell scripting when:
- Orchestrating existing command-line tools and system utilities
- Writing CI/CD pipeline scripts (GitHub Actions, GitLab CI)
- Creating container entrypoints and initialization scripts
- Automating system administration tasks (backups, log rotation)
- Building development tooling (build scripts, test runners)

Consider Python/Go instead when:
- Complex business logic or data structures required
- Cross-platform GUI needed
- Heavy API integration (REST, gRPC)
- Script exceeds 200 lines with significant logic complexity

## POSIX sh vs Bash

**Use POSIX sh (#!/bin/sh) when:**
- Maximum portability required (Linux, macOS, BSD, Alpine)
- Minimal container images needed
- Embedded systems or unknown target environments

**Use Bash (#!/bin/bash) when:**
- Controlled environment (specific OS, container)
- Arrays or associative arrays needed
- Advanced parameter expansion beneficial
- Process substitution `<(cmd)` useful

For detailed comparison and testing strategies, see `references/portability-guide.md`.

## Essential Error Handling

### Fail-Fast Pattern

```bash
#!/bin/bash
set -euo pipefail

# -e: Exit on error
# -u: Exit on undefined variable
# -o pipefail: Pipeline fails if any command fails
```

Use for production automation, CI/CD scripts, and critical operations.

### Explicit Exit Code Checking

```bash
#!/bin/bash

if ! command_that_might_fail; then
    echo "Error: Command failed" >&2
    exit 1
fi
```

Use for custom error messages and interactive scripts.

### Trap Handlers for Cleanup

```bash
#!/bin/bash
set -euo pipefail

TEMP_FILE=$(mktemp)

cleanup() {
    rm -f "$TEMP_FILE"
}

trap cleanup EXIT
```

Use for guaranteed cleanup of temporary files, locks, and resources.

For comprehensive error patterns, see `references/error-handling.md`.

## Argument Parsing

### Short Options with getopts (POSIX)

```bash
#!/bin/bash

while getopts "hvf:o:" opt; do
    case "$opt" in
        h) usage ;;
        v) VERBOSE=true ;;
        f) INPUT_FILE="$OPTARG" ;;
        o) OUTPUT_FILE="$OPTARG" ;;
        *) usage ;;
    esac
done

shift $((OPTIND - 1))
```

### Long Options (Manual Parsing)

```bash
#!/bin/bash

while [[ $# -gt 0 ]]; do
    case "$1" in
        --help) usage ;;
        --verbose) VERBOSE=true; shift ;;
        --file) INPUT_FILE="$2"; shift 2 ;;
        --file=*) INPUT_FILE="${1#*=}"; shift ;;
        *) break ;;
    esac
done
```
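Both snippets above call a `usage` helper that is not shown. A minimal self-contained sketch, with a hypothetical script name and option set, might look like:

```bash
#!/bin/bash
set -euo pipefail

# Hypothetical helper: print usage and exit with the given status
usage() {
    echo "Usage: example.sh [-h] [-v] [-f FILE]" >&2
    exit "${1:-0}"
}

VERBOSE=false
INPUT_FILE=""

while getopts "hvf:" opt; do
    case "$opt" in
        h) usage 0 ;;
        v) VERBOSE=true ;;
        f) INPUT_FILE="$OPTARG" ;;
        *) usage 1 ;;
    esac
done
shift $((OPTIND - 1))

echo "verbose=$VERBOSE file=$INPUT_FILE args=$*"
```

Invoked as `example.sh -v -f data.txt`, this sets `VERBOSE=true` and `INPUT_FILE=data.txt`; any remaining words after `shift` are positional arguments.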

For hybrid approaches and validation patterns, see `references/argument-parsing.md`.

## Parameter Expansion Quick Reference

```bash
# Default values
${var:-default}              # Use default if unset
${var:=default}              # Assign default if unset
: "${API_KEY:?Error: required}"  # Error if unset

# String manipulation
${#var}                      # String length
${var:offset:length}         # Substring
${var%.txt}                  # Remove suffix
${var##*/}                   # Basename
${var/old/new}               # Replace first
${var//old/new}              # Replace all

# Case conversion (Bash 4+)
${var^^}                     # Uppercase
${var,,}                     # Lowercase
```
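As a concrete illustration (sample path; the case conversion assumes Bash 4+), applying several of these expansions:

```bash
#!/bin/bash
path="/var/log/app/server.log"

file=${path##*/}      # basename: server.log
dir=${path%/*}        # dirname: /var/log/app
name=${file%.log}     # remove suffix: server
upper=${name^^}       # uppercase: SERVER

echo "$file $dir $name $upper"
```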

For complete expansion patterns and array handling, see `references/parameter-expansion.md`.

## Common Utilities Integration

### JSON with jq

```bash
# Extract field
name=$(curl -sSL https://api.example.com/user | jq -r '.name')

# Filter array
active=$(jq '.users[] | select(.active) | .name' data.json)

# Check existence
if ! echo "$json" | jq -e '.field' >/dev/null; then
    echo "Error: Field missing" >&2
fi
```

### YAML with yq

```bash
# Read value (yq v4)
host=$(yq eval '.database.host' config.yaml)

# Update in-place
yq eval '.port = 5432' -i config.yaml

# Convert to JSON
yq eval -o=json config.yaml
```

### Text Processing

```bash
# awk: Extract columns
awk -F',' '{print $1, $3}' data.csv

# sed: Replace text
sed 's/old/new/g' file.txt

# grep: Pattern match
grep -E "ERROR|WARN" logfile.txt
```

For detailed examples and best practices, see `references/common-utilities.md`.

## Testing and Validation

### ShellCheck: Static Analysis

```bash
# Check script
shellcheck script.sh

# POSIX compliance
shellcheck --shell=sh script.sh

# Exclude warnings
shellcheck --exclude=SC2086 script.sh
```

### Bats: Automated Testing

```bash
#!/usr/bin/env bats

@test "script runs successfully" {
    run ./script.sh --help
    [ "$status" -eq 0 ]
    [ "${lines[0]}" = "Usage: script.sh [OPTIONS]" ]
}

@test "handles missing argument" {
    run ./script.sh
    [ "$status" -eq 1 ]
    [[ "$output" =~ "Error" ]]
}
```

Run tests:
```bash
bats test/
```
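For reference, a minimal implementation that would satisfy the two tests above (illustrative only; wrapped in a function so it can be sourced as well as executed):

```bash
#!/bin/bash
# Illustrative script matching the Bats tests above

script_main() {
    if [ "${1:-}" = "--help" ]; then
        echo "Usage: script.sh [OPTIONS]"
        return 0
    fi
    if [ "$#" -eq 0 ]; then
        echo "Error: missing argument" >&2
        return 1
    fi
    echo "Processing: $1"
}

script_main --help
```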

For CI/CD integration and debugging techniques, see `references/testing-guide.md`.

## Defensive Programming Checklist

```bash
#!/bin/bash
set -euo pipefail

# Check required commands
command -v jq >/dev/null 2>&1 || {
    echo "Error: jq required" >&2
    exit 1
}

# Check environment variables
: "${API_KEY:?Error: API_KEY required}"

# Check files
[ -f "$CONFIG_FILE" ] || {
    echo "Error: Config not found: $CONFIG_FILE" >&2
    exit 1
}

# Quote all variables
echo Processing: $file          # ❌ Unquoted
echo "Processing: $file"        # ✅ Quoted
```

## Platform Considerations

### macOS vs Linux Differences

```bash
# sed in-place
sed -i '' 's/old/new/g' file.txt    # macOS
sed -i 's/old/new/g' file.txt       # Linux

# Portable: Use temp file
sed 's/old/new/g' file.txt > file.txt.tmp
mv file.txt.tmp file.txt

# readlink
readlink -f /path                    # Linux only
cd "$(dirname "$0")" && pwd         # Portable
```

For complete platform differences, see `references/portability-guide.md`.

## Script Categories

**System Administration:** Cron jobs, log rotation, backup automation
**Build/Deployment:** CI/CD pipelines, Docker builds, deployments
**Development Tooling:** Project setup, test runners, code generators
**Container Entrypoints:** Initialization, signal handling, configuration

## Production Script Template

```bash
#!/bin/bash
set -euo pipefail

readonly SCRIPT_NAME="$(basename "$0")"
readonly SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"

TEMP_DIR=""

cleanup() {
    local exit_code=$?
    rm -rf "$TEMP_DIR"
    exit "$exit_code"
}

trap cleanup EXIT

log() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] $*" >&2
}

main() {
    # Check dependencies
    command -v jq >/dev/null 2>&1 || exit 1

    # Parse arguments
    # Validate input
    # Process
    # Report results

    log "Completed successfully"
}

main "$@"
```

For complete production template, see `examples/production-template.sh`.

## Tool Recommendations

**Core Tools:**
- **jq**: JSON parsing and transformation
- **yq**: YAML parsing (v4 recommended)
- **ShellCheck**: Static analysis and linting
- **Bats**: Automated testing framework

**Installation:**
```bash
# macOS
brew install jq yq shellcheck bats-core

# Ubuntu/Debian
apt-get install jq shellcheck
# yq v4 and bats-core may need to be installed from their release pages
```

## Related Skills

- **linux-administration**: System commands and administration
- **building-ci-pipelines**: Using scripts in CI/CD
- **infrastructure-as-code**: Terraform/Pulumi wrappers
- **kubernetes-operations**: kubectl scripts, Helm hooks
- **writing-dockerfiles**: Container entrypoints

## Additional Resources

**Reference Files:**
- `references/error-handling.md` - Comprehensive error patterns
- `references/argument-parsing.md` - Advanced parsing techniques
- `references/parameter-expansion.md` - Complete expansion reference
- `references/portability-guide.md` - POSIX vs Bash differences
- `references/testing-guide.md` - ShellCheck and Bats guide
- `references/common-utilities.md` - jq, yq, awk, sed usage

**Example Scripts:**
- `examples/production-template.sh` - Production-ready template
- `examples/getopts-basic.sh` - Simple getopts usage
- `examples/getopts-advanced.sh` - Complex option handling
- `examples/long-options.sh` - Manual long option parsing
- `examples/error-handling.sh` - Error handling patterns
- `examples/json-yaml-processing.sh` - jq/yq examples

**Utility Scripts:**
- `scripts/lint-script.sh` - ShellCheck wrapper for CI
- `scripts/test-script.sh` - Bats wrapper for CI


---

## Referenced Files

> The following files are referenced in this skill and included for context.

### references/portability-guide.md

```markdown
# Shell Portability Guide: POSIX sh vs Bash

Comprehensive guide to writing portable shell scripts and understanding differences between POSIX sh and Bash.

## Table of Contents

- [Decision Framework](#decision-framework)
- [POSIX sh Features](#posix-sh-features)
- [Bash-Specific Features](#bash-specific-features)
- [Common Bashisms and Alternatives](#common-bashisms-and-alternatives)
- [Platform Differences](#platform-differences)
- [Testing for Portability](#testing-for-portability)
- [Best Practices](#best-practices)

---

## Decision Framework

### Use POSIX sh (#!/bin/sh) When:

✅ **Maximum portability required:**
- Script runs on Linux, macOS, BSD, Solaris, AIX
- Minimal container images (Alpine, Distroless)
- Embedded systems
- Unknown target environments

✅ **Guaranteed availability:**
- POSIX sh is guaranteed on all Unix-like systems
- No need to check if Bash is installed

✅ **Minimal dependencies:**
- Alpine Linux base images (`/bin/sh` is dash or busybox)
- Distroless containers
- Restricted environments

### Use Bash (#!/bin/bash) When:

✅ **Controlled environment:**
- Known OS/distribution (CentOS, Ubuntu, Debian)
- Specific container base image with Bash
- Development environment (macOS, Linux workstations)

✅ **Complexity justifies Bash features:**
- Arrays or associative arrays needed
- Advanced parameter expansion simplifies logic
- Process substitution `<(cmd)` beneficial
- `[[]]` for safer conditionals

✅ **Performance matters:**
- Bash builtins are faster than external commands
- String manipulation without spawning processes

---

## POSIX sh Features

### Guaranteed Features in POSIX sh

```sh
#!/bin/sh

# Variable assignment
name="Alice"

# Command substitution (backticks or $(...))
current_date=$(date +%Y-%m-%d)
current_date=`date +%Y-%m-%d`  # Old style, still POSIX

# Conditionals with [ or test
if [ -f "$file" ]; then
    echo "File exists"
fi

# Case statements
case "$var" in
    pattern1) echo "Match 1" ;;
    pattern2) echo "Match 2" ;;
    *) echo "Default" ;;
esac

# Loops
for item in item1 item2 item3; do
    echo "$item"
done

while [ "$count" -lt 10 ]; do
    echo "$count"
    count=$((count + 1))
done

# Functions
my_function() {
    echo "Arguments: $*"
    return 0
}

# Arithmetic with $((...))
result=$((5 + 3))

# Here documents
cat <<EOF
This is a here document
With multiple lines
EOF

# Redirection
echo "Output" > file.txt
echo "Append" >> file.txt
command 2>&1  # Redirect stderr to stdout
```

### POSIX String Operations

```sh
#!/bin/sh

# Parameter expansion
echo "${var:-default}"     # Default value
echo "${var:=default}"     # Assign default
echo "${var:?error}"       # Error if unset

# Remove shortest prefix/suffix
echo "${var#prefix}"       # Remove shortest prefix
echo "${var##prefix}"      # Remove longest prefix
echo "${var%suffix}"       # Remove shortest suffix
echo "${var%%suffix}"      # Remove longest suffix

# String length (${#var} is POSIX)
var="hello"
length=${#var}
```

---

## Bash-Specific Features

### Arrays (NOT in POSIX sh)

```bash
#!/bin/bash

# Indexed arrays
files=("file1.txt" "file2.txt" "file3.txt")
echo "${files[0]}"        # First element
echo "${files[@]}"        # All elements
echo "${#files[@]}"       # Array length

# Associative arrays (Bash 4+)
declare -A config
config[host]="localhost"
config[port]="8080"
echo "${config[host]}"
```

**POSIX Alternative:**
```sh
#!/bin/sh

# Space-separated list
files="file1.txt file2.txt file3.txt"
for file in $files; do
    echo "$file"
done

# Positional parameters as array
set -- "item1" "item2" "item3"
echo "$1"  # First item
echo "$#"  # Count
```

### [[ ]] Conditionals (NOT in POSIX sh)

```bash
#!/bin/bash

# [[ ]] is Bash-specific
if [[ -f "$file" && -r "$file" ]]; then
    echo "File exists and is readable"
fi

# Pattern matching
if [[ "$var" == pattern* ]]; then
    echo "Matches pattern"
fi

# Regular expressions
if [[ "$email" =~ ^[a-z]+@[a-z]+\.[a-z]+$ ]]; then
    echo "Valid email"
fi
```

**POSIX Alternative:**
```sh
#!/bin/sh

# Use [ ] with separate conditions
if [ -f "$file" ] && [ -r "$file" ]; then
    echo "File exists and is readable"
fi

# Pattern matching with case
case "$var" in
    pattern*) echo "Matches pattern" ;;
esac

# Regular expressions with grep
if echo "$email" | grep -qE '^[a-z]+@[a-z]+\.[a-z]+$'; then
    echo "Valid email"
fi
```

### Process Substitution (NOT in POSIX sh)

```bash
#!/bin/bash

# Process substitution <(...)
diff <(sort file1.txt) <(sort file2.txt)

# Named pipe alternative
while read -r line; do
    echo "Line: $line"
done < <(command)
```

**POSIX Alternative:**
```sh
#!/bin/sh

# Use temporary files
sort file1.txt > /tmp/sorted1.txt
sort file2.txt > /tmp/sorted2.txt
diff /tmp/sorted1.txt /tmp/sorted2.txt
rm -f /tmp/sorted1.txt /tmp/sorted2.txt
```

### Advanced Parameter Expansion

```bash
#!/bin/bash

# Pattern replacement
echo "${var/old/new}"      # Replace first
echo "${var//old/new}"     # Replace all

# Case modification (Bash 4+)
echo "${var^^}"            # Uppercase
echo "${var,,}"            # Lowercase

# Substring extraction
echo "${var:0:5}"          # First 5 characters
```

**POSIX Alternative:**
```sh
#!/bin/sh

# Pattern replacement with sed
new_var=$(echo "$var" | sed 's/old/new/')      # Replace first
new_var=$(echo "$var" | sed 's/old/new/g')     # Replace all

# Case modification with tr
upper=$(echo "$var" | tr '[:lower:]' '[:upper:]')
lower=$(echo "$var" | tr '[:upper:]' '[:lower:]')

# Substring extraction with cut/sed
first_five=$(echo "$var" | cut -c1-5)
```

### Local Variables (NOT in POSIX sh)

```bash
#!/bin/bash

my_function() {
    local var="local scope"
    echo "$var"
}

var="global scope"
my_function
echo "$var"  # Still "global scope"
```

**POSIX Alternative:**
```sh
#!/bin/sh

my_function() {
    # Use unique variable names
    _my_function_var="function scope"
    echo "$_my_function_var"
}

# Or save/restore
my_function() {
    _saved_var="$var"
    var="function scope"
    echo "$var"
    var="$_saved_var"
}
```
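A quick check that the save/restore pattern above actually preserves the caller's value (variable names are illustrative):

```sh
#!/bin/sh
var="global"

my_function() {
    _saved_var=$var
    var="function scope"
    seen_inside=$var        # capture the value while shadowed
    var=$_saved_var         # restore the caller's value
}

my_function
echo "inside=$seen_inside after=$var"
```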

---

## Common Bashisms and Alternatives

### String Comparison

```bash
# ❌ Bashism: [[ ]] with ==
if [[ "$var" == "value" ]]; then

# ✅ POSIX: [ ] with =
if [ "$var" = "value" ]; then
```

### Arithmetic

```bash
# ❌ Bashism: ((  ))
((count++))
((count = count + 1))

# ✅ POSIX: $(( ))
count=$((count + 1))
```

### Function Declaration

```bash
# ❌ Bashism: function keyword
function my_func {
    echo "Hello"
}

# ✅ POSIX: No function keyword
my_func() {
    echo "Hello"
}
```

### echo vs printf

```bash
# ⚠️ Inconsistent: echo (varies by platform)
echo -n "No newline"    # Works on Linux, not portable
echo -e "Line\nBreak"   # Works on Linux, not portable

# ✅ POSIX: printf
printf "No newline"
printf "Line\nBreak\n"
```

### Source vs Dot

```bash
# ❌ Bashism: source
source script.sh

# ✅ POSIX: . (dot)
. script.sh
```

### Brace Expansion

```bash
# ❌ Bashism: Brace expansion
echo {1..10}
mkdir -p dir/{sub1,sub2,sub3}

# ✅ POSIX: Explicit or loop
echo 1 2 3 4 5 6 7 8 9 10
mkdir -p dir/sub1 dir/sub2 dir/sub3

# Or use seq
for i in $(seq 1 10); do
    echo "$i"
done
```

---

## Platform Differences

### Linux vs macOS

#### sed -i (In-Place Editing)

```bash
# macOS requires empty string
sed -i '' 's/old/new/g' file.txt

# Linux doesn't
sed -i 's/old/new/g' file.txt

# Portable solution: Use temporary file
sed 's/old/new/g' file.txt > file.txt.tmp
mv file.txt.tmp file.txt
```

#### readlink (Resolve Symlinks)

```bash
# Linux (GNU coreutils)
realpath=$(readlink -f /path/to/symlink)

# macOS (BSD)
# readlink -f not supported

# Portable solution
script_dir="$(cd "$(dirname "$0")" && pwd)"

# Or install GNU coreutils on macOS
# brew install coreutils
realpath=$(greadlink -f /path/to/symlink)
```

#### date (Date Arithmetic)

```bash
# Linux (GNU date)
yesterday=$(date -d "yesterday" +%Y-%m-%d)
next_week=$(date -d "+7 days" +%Y-%m-%d)

# macOS (BSD date)
yesterday=$(date -v-1d +%Y-%m-%d)
next_week=$(date -v+7d +%Y-%m-%d)

# Portable solution: Use date -u and timestamps
timestamp=$(date +%s)
yesterday_ts=$((timestamp - 86400))
yesterday=$(date -u -r "$yesterday_ts" +%Y-%m-%d)  # macOS
yesterday=$(date -u -d "@$yesterday_ts" +%Y-%m-%d)  # Linux
```

#### find (Extended Options)

```bash
# GNU find (Linux)
find . -type f -newer reference_file

# BSD find (macOS)
# Some options differ

# Portable: Stick to POSIX find options
find . -type f -name "*.txt"
```

### Alpine Linux (Busybox)

Alpine uses Busybox, which provides limited versions of common utilities:

```sh
#!/bin/sh

# Busybox sh is POSIX-compliant but minimal
# No Bash features
# Limited options for common commands

# Check if /bin/sh is Busybox (readlink returns a path such as /bin/busybox)
case "$(readlink /bin/sh 2>/dev/null)" in
    *busybox*) echo "Running in Busybox" ;;
esac

# Use POSIX-compliant patterns
# Avoid GNU-specific options (--long-options)
```

---

## Testing for Portability

### Test with Different Shells

```bash
# Test with sh (POSIX)
sh script.sh

# Test with dash (strict POSIX)
dash script.sh

# Test with bash
bash script.sh

# Test with busybox
busybox sh script.sh
```

### ShellCheck for Portability

```bash
# Check for POSIX compliance
shellcheck --shell=sh script.sh

# Check for Bash-specific issues
shellcheck --shell=bash script.sh

# Check for specific issues
shellcheck --exclude=SC2086 script.sh
```

**Common ShellCheck Warnings:**
- SC2006: Use $(...) instead of backticks
- SC2039: POSIX sh doesn't support arrays, [[ ]], etc.
- SC2086: Quote variables to prevent word splitting
- SC2046: Quote command substitution

### Docker Testing

```bash
# Test in Alpine (Busybox)
docker run --rm -v "$PWD:/scripts" alpine:latest sh /scripts/script.sh

# Test in Debian (dash for /bin/sh)
docker run --rm -v "$PWD:/scripts" debian:latest sh /scripts/script.sh

# Test in Ubuntu (dash for /bin/sh)
docker run --rm -v "$PWD:/scripts" ubuntu:latest sh /scripts/script.sh
```

### Platform Detection

```bash
#!/bin/sh

# Detect OS
case "$(uname -s)" in
    Linux*)
        OS="Linux"
        ;;
    Darwin*)
        OS="macOS"
        ;;
    FreeBSD*|OpenBSD*|NetBSD*)
        OS="BSD"
        ;;
    SunOS*)
        OS="Solaris"
        ;;
    *)
        OS="Unknown"
        ;;
esac

echo "Detected OS: $OS"

# Platform-specific behavior
if [ "$OS" = "macOS" ]; then
    sed -i '' 's/old/new/g' file.txt
else
    sed -i 's/old/new/g' file.txt
fi
```

---

## Best Practices

### 1. Choose Shebang Carefully

```bash
#!/bin/sh        # POSIX sh (portable)
#!/bin/bash      # Bash-specific
#!/usr/bin/env bash  # Find Bash in PATH (more portable)
```

### 2. Test on Target Platforms

- Develop on one platform, test on all targets
- Use Docker for cross-platform testing
- Test with dash (strict POSIX compliance)

### 3. Use ShellCheck

```bash
# Install ShellCheck
# macOS: brew install shellcheck
# Ubuntu: apt-get install shellcheck

# Check script
shellcheck --shell=sh script.sh
```

### 4. Quote Variables

```bash
# ❌ Unquoted (word splitting, glob expansion)
echo $var
cp $source $dest

# ✅ Quoted
echo "$var"
cp "$source" "$dest"
```

### 5. Avoid Bashisms in sh Scripts

```bash
# ❌ Using [[ ]] in #!/bin/sh
#!/bin/sh
if [[ "$var" == "value" ]]; then

# ✅ Use [ ] in #!/bin/sh
#!/bin/sh
if [ "$var" = "value" ]; then
```

### 6. Use POSIX Utilities

```bash
# Avoid GNU-specific long options
grep --color=auto  # GNU
grep -E            # POSIX

# Avoid GNU-specific features
date -d "tomorrow"  # GNU
date -v+1d          # BSD
```

### 7. Provide Fallbacks

```bash
#!/bin/sh

# Check for command availability
if command -v jq >/dev/null 2>&1; then
    # Use jq
    result=$(echo "$json" | jq -r '.field')
else
    # Fallback to sed/awk
    result=$(echo "$json" | sed -n 's/.*"field": "\([^"]*\)".*/\1/p')
fi
```

### 8. Document Requirements

```bash
#!/bin/sh

# Requirements:
# - POSIX sh
# - jq (for JSON parsing)
# - curl (for HTTP requests)

# Check dependencies
for cmd in jq curl; do
    command -v "$cmd" >/dev/null 2>&1 || {
        echo "Error: $cmd is required but not installed" >&2
        exit 1
    }
done
```

---

## Summary

**POSIX sh:**
- Maximum portability
- Guaranteed on all Unix-like systems
- Smaller feature set, more verbose
- Use for production scripts in unknown environments

**Bash:**
- Rich feature set (arrays, [[ ]], process substitution)
- More concise for complex tasks
- Requires Bash installation
- Use in controlled environments

**Testing:**
- ShellCheck for static analysis
- dash for strict POSIX compliance
- Docker for cross-platform testing

**Portability Checklist:**
- [ ] Use `#!/bin/sh` for portable scripts
- [ ] Avoid Bashisms (arrays, [[ ]], function keyword)
- [ ] Test with ShellCheck --shell=sh
- [ ] Test on target platforms
- [ ] Quote all variables
- [ ] Use POSIX utilities and options
- [ ] Document dependencies

```

### references/error-handling.md

```markdown
# Error Handling Patterns for Shell Scripts

Comprehensive guide to error handling patterns in shell scripts, covering fail-fast behaviors, explicit checking, cleanup handlers, and defensive programming.

## Table of Contents

- [Set Options for Fail-Fast Behavior](#set-options-for-fail-fast-behavior)
- [Explicit Exit Code Checking](#explicit-exit-code-checking)
- [Trap Handlers for Cleanup](#trap-handlers-for-cleanup)
- [Defensive Programming Patterns](#defensive-programming-patterns)
- [Error Reporting Best Practices](#error-reporting-best-practices)
- [Common Error Scenarios](#common-error-scenarios)

---

## Set Options for Fail-Fast Behavior

### The Standard Pattern: set -euo pipefail

```bash
#!/bin/bash
set -euo pipefail

# -e: Exit immediately if any command exits with non-zero status
# -u: Treat unset variables as an error
# -o pipefail: Return exit code of rightmost failed command in pipeline
```

### Individual Option Behaviors

#### -e (errexit)

Exits immediately if any command returns non-zero:

```bash
#!/bin/bash
set -e

false  # Script exits here with exit code 1
echo "This will never execute"
```

**Exceptions where -e does NOT exit:**
- Commands in conditional tests: `if command; then`
- Commands with `||` or `&&`: `command || true`
- Commands in pipelines (without -o pipefail)
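These exceptions can be observed directly in a short script (illustrative):

```bash
#!/bin/bash
set -e

if false; then :; fi        # conditional test context: no exit
false || echo "handled"     # || guard: no exit
echo "still running"        # reached despite both failures above
```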

#### -u (nounset)

Exits if an undefined variable is referenced:

```bash
#!/bin/bash
set -u

echo "$UNDEFINED_VAR"  # Script exits with error
```

**Common issue:**
```bash
# ❌ Fails with set -u if variable is empty
if [ -z "$OPTIONAL_VAR" ]; then
    echo "Variable is empty"
fi

# ✅ Use parameter expansion
if [ -z "${OPTIONAL_VAR:-}" ]; then
    echo "Variable is empty"
fi
```

#### -o pipefail

Returns exit code of rightmost failed command in pipeline:

```bash
#!/bin/bash
set -o pipefail

# Without pipefail: exit code is from 'wc' (0), even if grep fails
grep "pattern" file.txt | wc -l

# With pipefail: exit code is from 'grep' if it fails
```
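The difference is easy to verify with a pipeline whose first command fails (illustrative):

```bash
#!/bin/bash

false | true
no_pipefail_status=$?       # 0: status of the last command (true)

set -o pipefail
false | true
pipefail_status=$?          # 1: status of the failed command (false)

echo "without=$no_pipefail_status with=$pipefail_status"
```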

### When to Use set -euo pipefail

**✅ Use for:**
- Production automation scripts
- CI/CD build scripts
- Deployment scripts
- Scripts where partial execution is dangerous
- Cron jobs and scheduled tasks

**❌ Do NOT use for:**
- Interactive scripts where user can recover
- Scripts that intentionally handle failures
- Scripts using commands that may fail normally (e.g., grep returning no matches)

### Alternative: set -Eeuo pipefail

```bash
#!/bin/bash
set -Eeuo pipefail

# -E: ERR trap is inherited by functions and command substitutions
```

Enables more comprehensive error handling with trap:

```bash
#!/bin/bash
set -Eeuo pipefail

error_handler() {
    local line=$1
    echo "Error on line $line" >&2
    exit 1
}

trap 'error_handler $LINENO' ERR

# Error will trigger trap with line number
false
```

---

## Explicit Exit Code Checking

### Basic Pattern

```bash
#!/bin/bash

if ! command_that_might_fail; then
    echo "Error: Command failed" >&2
    exit 1
fi
```

### Capturing and Checking Exit Codes

```bash
#!/bin/bash

command_that_might_fail
exit_code=$?

if [ "$exit_code" -ne 0 ]; then
    echo "Error: Command failed with exit code $exit_code" >&2
    exit 1
fi
```

### Different Handling per Exit Code

```bash
#!/bin/bash

curl -sSL https://example.com/api
exit_code=$?

case "$exit_code" in
    0)
        echo "Success"
        ;;
    6)
        echo "Error: Could not resolve host" >&2
        exit 1
        ;;
    7)
        echo "Error: Failed to connect" >&2
        exit 1
        ;;
    22)
        echo "Error: HTTP error (4xx/5xx)" >&2
        exit 1
        ;;
    *)
        echo "Error: curl failed with exit code $exit_code" >&2
        exit 1
        ;;
esac
```

### Conditional Execution with Exit Codes

```bash
#!/bin/bash

# Execute command2 only if command1 succeeds
command1 && command2

# Execute command2 only if command1 fails
command1 || command2

# Chain multiple commands (all must succeed)
command1 && command2 && command3

# Fallback pattern
command_that_might_fail || {
    echo "Error: Command failed, attempting recovery" >&2
    recovery_command
}
```

### Ignoring Expected Failures

```bash
#!/bin/bash
set -e

# Explicitly ignore failure (when failure is acceptable)
grep "pattern" file.txt || true

# Or disable -e temporarily
set +e
command_that_may_fail
exit_code=$?
set -e

if [ "$exit_code" -ne 0 ]; then
    # Handle failure
    echo "Command failed (expected)" >&2
fi
```

---

## Trap Handlers for Cleanup

### Basic Cleanup Pattern

```bash
#!/bin/bash
set -euo pipefail

# Create temporary file
TEMP_FILE=$(mktemp)

# Cleanup function
cleanup() {
    echo "Cleaning up..." >&2
    rm -f "$TEMP_FILE"
}

# Register cleanup on EXIT
trap cleanup EXIT

# Script logic
echo "data" > "$TEMP_FILE"
# Cleanup runs automatically on exit (success or failure)
```

### Preserving Exit Code in Cleanup

```bash
#!/bin/bash
set -euo pipefail

cleanup() {
    local exit_code=$?
    echo "Cleaning up..." >&2
    rm -f "$TEMP_FILE"
    # Exit with original exit code
    exit "$exit_code"
}

trap cleanup EXIT
```

### Multiple Trap Signals

```bash
#!/bin/bash

cleanup() {
    echo "Cleanup triggered by: $1" >&2
    rm -f "$LOCK_FILE"
}

# Trap multiple signals
trap 'cleanup EXIT' EXIT
trap 'cleanup SIGINT' INT
trap 'cleanup SIGTERM' TERM
```

### Advanced: Error Handler with Line Number

```bash
#!/bin/bash
set -Eeuo pipefail

error_handler() {
    local line=$1
    local command=$2
    echo "Error on line $line: $command" >&2
    cleanup
    exit 1
}

cleanup() {
    echo "Performing cleanup..." >&2
    rm -f "$TEMP_FILE"
}

trap 'error_handler $LINENO "$BASH_COMMAND"' ERR
trap cleanup EXIT

# Script logic
TEMP_FILE=$(mktemp)
false  # Triggers error handler with line number
```

### Nested Trap Handlers

```bash
#!/bin/bash
set -euo pipefail

# Global cleanup
global_cleanup() {
    echo "Global cleanup" >&2
    rm -f /tmp/global_resource
}

trap global_cleanup EXIT

# Function-specific cleanup
process_file() {
    local temp_file=$(mktemp)

    # Local cleanup
    local_cleanup() {
        echo "Local cleanup" >&2
        rm -f "$temp_file"
    }

    trap local_cleanup RETURN

    # Process file
    echo "Processing..." > "$temp_file"
}

process_file
# Local cleanup runs when function returns
# Global cleanup runs when script exits
```

---

## Defensive Programming Patterns

### Check Required Commands

```bash
#!/bin/bash

# Method 1: Using command -v
command -v jq >/dev/null 2>&1 || {
    echo "Error: jq is required but not installed" >&2
    exit 1
}

# Method 2: Using type
type jq >/dev/null 2>&1 || {
    echo "Error: jq is required but not installed" >&2
    exit 1
}

# Method 3: Using which (less portable)
which jq >/dev/null 2>&1 || {
    echo "Error: jq is required but not installed" >&2
    exit 1
}
```

**Preferred:** `command -v` (most portable, POSIX-compliant)

### Check Required Environment Variables

```bash
#!/bin/bash

# Method 1: Parameter expansion with error
: "${API_KEY:?Error: API_KEY environment variable is required}"

# Method 2: Explicit check
if [ -z "${API_KEY:-}" ]; then
    echo "Error: API_KEY environment variable is required" >&2
    exit 1
fi

# Method 3: Check multiple variables
for var in API_KEY API_SECRET DATABASE_URL; do
    if [ -z "${!var:-}" ]; then
        echo "Error: $var environment variable is required" >&2
        exit 1
    fi
done
```

### Check File Existence and Permissions

```bash
#!/bin/bash

CONFIG_FILE="config.yaml"

# Check file exists
if [ ! -f "$CONFIG_FILE" ]; then
    echo "Error: Config file not found: $CONFIG_FILE" >&2
    exit 1
fi

# Check file is readable
if [ ! -r "$CONFIG_FILE" ]; then
    echo "Error: Cannot read config file: $CONFIG_FILE" >&2
    exit 1
fi

# Check file is writable
if [ ! -w "$CONFIG_FILE" ]; then
    echo "Error: Cannot write to config file: $CONFIG_FILE" >&2
    exit 1
fi

# Check directory exists
if [ ! -d "$DATA_DIR" ]; then
    echo "Error: Data directory not found: $DATA_DIR" >&2
    exit 1
fi
```

### Validate Input Arguments

```bash
#!/bin/bash

validate_input() {
    local input=$1

    # Check not empty
    if [ -z "$input" ]; then
        echo "Error: Input cannot be empty" >&2
        return 1
    fi

    # Check format (e.g., email)
    if ! echo "$input" | grep -qE '^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$'; then
        echo "Error: Invalid email format: $input" >&2
        return 1
    fi

    return 0
}

# Usage
if ! validate_input "$USER_EMAIL"; then
    exit 1
fi
```

### Defensive Variable Quoting

```bash
#!/bin/bash

# ❌ BAD: Unquoted variables (word splitting, glob expansion)
echo $file
cp $source $dest
for item in $list; do

# ✅ GOOD: Quoted variables
echo "$file"
cp "$source" "$dest"
for item in "$list"; do   # note: iterates exactly once; use an array for multiple items

# ❌ BAD: Unquoted command substitution
files=$(ls *.txt)
for file in $files; do

# ✅ GOOD: Use array (Bash)
files=(*.txt)
for file in "${files[@]}"; do
```
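
To see the failure mode concretely, the following sketch creates one file whose name contains a space and counts how many "items" each loop style sees:

```bash
#!/bin/bash
set -euo pipefail

# One file whose name contains a space
dir=$(mktemp -d)
touch "$dir/my file.txt"

# Unquoted command substitution word-splits the name into two items
unquoted=0
for f in $(ls "$dir"); do
    unquoted=$((unquoted + 1))
done

# Glob expansion keeps the name intact as one item
globbed=0
for f in "$dir"/*; do
    globbed=$((globbed + 1))
done

echo "unquoted saw $unquoted items, glob saw $globbed item"
rm -rf "$dir"
```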

---

## Error Reporting Best Practices

### Structured Logging

```bash
#!/bin/bash

# Logging functions
log_info() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] INFO: $*" >&2
}

log_error() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] ERROR: $*" >&2
}

log_warning() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] WARNING: $*" >&2
}

# Usage
log_info "Starting script"
log_error "Failed to connect to database"
log_warning "Retrying in 5 seconds"
```

### Error Messages to stderr

```bash
#!/bin/bash

# ❌ BAD: Error to stdout
echo "Error: Something went wrong"

# ✅ GOOD: Error to stderr
echo "Error: Something went wrong" >&2

# ✅ GOOD: Using printf
printf "Error: %s\n" "Something went wrong" >&2
```

### Contextual Error Messages

```bash
#!/bin/bash

# ❌ BAD: Generic error
echo "Error: Command failed" >&2

# ✅ GOOD: Contextual error
echo "Error: Failed to download file from https://example.com/data.json" >&2
echo "Error: Database connection failed (host: $DB_HOST, port: $DB_PORT)" >&2
echo "Error: Invalid configuration in $CONFIG_FILE at line $line_number" >&2
```

### Notification on Critical Errors

```bash
#!/bin/bash

notify_error() {
    local message=$1

    # Log to syslog
    logger -t "$SCRIPT_NAME" -p user.err "$message"

    # Send email (if configured)
    if [ -n "${ADMIN_EMAIL:-}" ]; then
        echo "$message" | mail -s "[$HOSTNAME] Script Error" "$ADMIN_EMAIL"
    fi

    # Send to Slack (if webhook configured)
    # Note: quotes in $message break this hand-built JSON; build the payload with jq -n for robustness
    if [ -n "${SLACK_WEBHOOK:-}" ]; then
        curl -X POST "$SLACK_WEBHOOK" \
            -H 'Content-Type: application/json' \
            -d "{\"text\": \"$message\"}"
    fi
}

# Usage
critical_operation || {
    notify_error "Critical operation failed on $HOSTNAME"
    exit 1
}
```

---

## Common Error Scenarios

### Network Requests

```bash
#!/bin/bash
set -euo pipefail

# Download with retry
download_with_retry() {
    local url=$1
    local output=$2
    local max_attempts=3
    local attempt=1

    while [ "$attempt" -le "$max_attempts" ]; do
        echo "Attempt $attempt of $max_attempts..." >&2

        if curl -sSL -o "$output" "$url"; then
            echo "Download successful" >&2
            return 0
        fi

        echo "Download failed, retrying in 5 seconds..." >&2
        sleep 5
        attempt=$((attempt + 1))
    done

    echo "Error: Failed to download after $max_attempts attempts" >&2
    return 1
}

# Usage
download_with_retry "https://example.com/data.json" "data.json" || exit 1
```

### File Operations

```bash
#!/bin/bash
set -euo pipefail

# Safe file operations with error handling
safe_copy() {
    local source=$1
    local dest=$2

    # Check source exists
    if [ ! -f "$source" ]; then
        echo "Error: Source file not found: $source" >&2
        return 1
    fi

    # Check destination directory exists
    local dest_dir
    dest_dir=$(dirname "$dest")
    if [ ! -d "$dest_dir" ]; then
        echo "Error: Destination directory not found: $dest_dir" >&2
        return 1
    fi

    # Perform copy with error handling
    if ! cp "$source" "$dest"; then
        echo "Error: Failed to copy $source to $dest" >&2
        return 1
    fi

    echo "Successfully copied $source to $dest" >&2
    return 0
}

# Usage
safe_copy "source.txt" "destination.txt" || exit 1
```

### Command Execution

```bash
#!/bin/bash
set -euo pipefail

# Execute command with timeout
execute_with_timeout() {
    local timeout=$1
    shift
    local command=("$@")

    # Run the command and capture its real exit code.
    # (Inside `if ! timeout ...`, $? would reflect the negated condition, not the command.)
    local exit_code=0
    timeout "$timeout" "${command[@]}" || exit_code=$?

    if [ "$exit_code" -ne 0 ]; then
        if [ "$exit_code" -eq 124 ]; then
            echo "Error: Command timed out after ${timeout}s: ${command[*]}" >&2
        else
            echo "Error: Command failed with exit code $exit_code: ${command[*]}" >&2
        fi

        return 1
    fi

    return 0
}

# Usage
execute_with_timeout 30 curl -sSL https://example.com/api || exit 1
```

### Process Management

```bash
#!/bin/bash
set -euo pipefail

# Start background process with PID tracking
start_background_process() {
    local command=$1
    local pid_file=$2

    # Start process in background.
    # ($command is intentionally unquoted so it word-splits into command + args;
    #  commands whose arguments contain spaces need an array instead)
    # shellcheck disable=SC2086
    $command &
    local pid=$!

    # Save PID
    echo "$pid" > "$pid_file"

    # Wait briefly and check if process is running
    sleep 1
    if ! kill -0 "$pid" 2>/dev/null; then
        echo "Error: Background process failed to start" >&2
        rm -f "$pid_file"
        return 1
    fi

    echo "Started background process with PID $pid" >&2
    return 0
}

# Stop background process
stop_background_process() {
    local pid_file=$1

    if [ ! -f "$pid_file" ]; then
        echo "Error: PID file not found: $pid_file" >&2
        return 1
    fi

    local pid
    pid=$(cat "$pid_file")

    if kill -0 "$pid" 2>/dev/null; then
        echo "Stopping process $pid..." >&2
        kill "$pid"

        # Wait for graceful shutdown
        local timeout=10
        while [ "$timeout" -gt 0 ] && kill -0 "$pid" 2>/dev/null; do
            sleep 1
            timeout=$((timeout - 1))
        done

        # Force kill if still running
        if kill -0 "$pid" 2>/dev/null; then
            echo "Force killing process $pid..." >&2
            kill -9 "$pid"
        fi
    fi

    rm -f "$pid_file"
    return 0
}
```

---

## Summary

**Key Principles:**
1. Use `set -euo pipefail` for production scripts
2. Always check exit codes of critical commands explicitly
3. Use trap handlers for guaranteed cleanup
4. Perform defensive checks early (commands, variables, files)
5. Report errors to stderr with context
6. Quote all variables and command substitutions

**Anti-Patterns to Avoid:**
- Ignoring exit codes
- Using unquoted variables
- Missing cleanup handlers
- Generic error messages
- Errors to stdout instead of stderr
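
Putting the principles together, a minimal production skeleton might look like this (the dependency check, variable, and work step are illustrative):

```bash
#!/bin/bash
set -euo pipefail

# Guaranteed cleanup via trap
TMP_FILE=""
cleanup() {
    if [ -n "$TMP_FILE" ]; then
        rm -f "$TMP_FILE"
    fi
}
trap cleanup EXIT

# Defensive checks early
command -v awk >/dev/null 2>&1 || {
    echo "Error: awk is required but not installed" >&2
    exit 1
}
: "${GREETING:=hello}"   # default for an optional variable

# Main work; quoted variables, errors would go to stderr
TMP_FILE=$(mktemp)
printf '%s world\n' "$GREETING" > "$TMP_FILE"
awk '{ print toupper($0) }' "$TMP_FILE"
```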

```

### references/argument-parsing.md

```markdown
# Argument Parsing in Shell Scripts

Comprehensive guide to parsing command-line arguments in shell scripts, covering getopts, manual parsing, and hybrid approaches.

## Table of Contents

- [getopts for Short Options](#getopts-for-short-options)
- [Manual Parsing for Long Options](#manual-parsing-for-long-options)
- [Hybrid Approach (Short + Long)](#hybrid-approach-short--long)
- [Positional Arguments](#positional-arguments)
- [Required vs Optional Arguments](#required-vs-optional-arguments)
- [Validation Patterns](#validation-patterns)
- [Usage Documentation](#usage-documentation)

---

## getopts for Short Options

### Basic Pattern

```bash
#!/bin/bash

usage() {
    cat <<EOF
Usage: $0 [-h] [-v] [-f FILE]

Options:
    -h          Show this help message
    -v          Enable verbose mode
    -f FILE     Input file (required)
EOF
    exit 1
}

# Default values
VERBOSE=false
INPUT_FILE=""

# Parse options
while getopts "hvf:" opt; do
    case "$opt" in
        h) usage ;;
        v) VERBOSE=true ;;
        f) INPUT_FILE="$OPTARG" ;;
        *) usage ;;
    esac
done

# Shift past parsed options
shift $((OPTIND - 1))

# Check required arguments
if [ -z "$INPUT_FILE" ]; then
    echo "Error: -f FILE is required" >&2
    usage
fi

# Remaining positional arguments in "$@"
echo "Input file: $INPUT_FILE"
echo "Remaining args: $*"
```

### Option Types

```bash
#!/bin/bash

# Option string: "ab:"
# a   - Boolean flag (no argument)
# b:  - Requires an argument
#
# Note: optional option-arguments (GNU getopt's "c::") are NOT supported
# by the POSIX getopts builtin.

while getopts "ab:" opt; do
    case "$opt" in
        a)
            FLAG_A=true
            ;;
        b)
            VALUE_B="$OPTARG"
            ;;
        *)
            usage
            ;;
    esac
done
```

### Option Bundling

`getopts` supports option bundling (`-abc` is equivalent to `-a -b -c`):

```bash
#!/bin/bash

# Parse bundled options
while getopts "vdf:" opt; do
    case "$opt" in
        v) VERBOSE=true ;;
        d) DEBUG=true ;;
        f) FILE="$OPTARG" ;;
        *) usage ;;
    esac
done

# All of these work:
# ./script.sh -v -d -f file.txt
# ./script.sh -vd -f file.txt
# ./script.sh -vdf file.txt
```

### Multiple Values for Same Option

```bash
#!/bin/bash

# Collect multiple values in array
FILES=()

while getopts "f:" opt; do
    case "$opt" in
        f)
            FILES+=("$OPTARG")
            ;;
        *)
            usage
            ;;
    esac
done

# Usage: ./script.sh -f file1.txt -f file2.txt -f file3.txt
echo "Files: ${FILES[*]}"
```

### Error Handling in getopts

```bash
#!/bin/bash

# Silent error reporting (handle errors manually)
while getopts ":hvf:" opt; do
    case "$opt" in
        h) usage ;;
        v) VERBOSE=true ;;
        f) INPUT_FILE="$OPTARG" ;;
        \?)
            echo "Error: Invalid option -$OPTARG" >&2
            usage
            ;;
        :)
            echo "Error: Option -$OPTARG requires an argument" >&2
            usage
            ;;
    esac
done

# Note: Leading ':' in ":hvf:" enables silent error reporting
```

---

## Manual Parsing for Long Options

### Basic Pattern

```bash
#!/bin/bash

usage() {
    cat <<EOF
Usage: $0 [OPTIONS]

Options:
    --help              Show this help
    --verbose           Enable verbose mode
    --file FILE         Input file (required)
    --output OUTPUT     Output file (optional)
    --dry-run           Show what would be done
EOF
    exit 1
}

# Defaults
VERBOSE=false
INPUT_FILE=""
OUTPUT_FILE=""
DRY_RUN=false

# Parse long options
while [[ $# -gt 0 ]]; do
    case "$1" in
        --help)
            usage
            ;;
        --verbose)
            VERBOSE=true
            shift
            ;;
        --file)
            INPUT_FILE="$2"
            shift 2
            ;;
        --output)
            OUTPUT_FILE="$2"
            shift 2
            ;;
        --dry-run)
            DRY_RUN=true
            shift
            ;;
        -*)
            echo "Error: Unknown option: $1" >&2
            usage
            ;;
        *)
            # Positional argument
            break
            ;;
    esac
done

# Check required
if [ -z "$INPUT_FILE" ]; then
    echo "Error: --file is required" >&2
    usage
fi
```

### With Equals Sign (--file=value)

```bash
#!/bin/bash

while [[ $# -gt 0 ]]; do
    case "$1" in
        --file=*)
            INPUT_FILE="${1#*=}"
            shift
            ;;
        --file)
            INPUT_FILE="$2"
            shift 2
            ;;
        --output=*)
            OUTPUT_FILE="${1#*=}"
            shift
            ;;
        --output)
            OUTPUT_FILE="$2"
            shift 2
            ;;
        *)
            break
            ;;
    esac
done

# Supports both:
# ./script.sh --file=input.txt --output=output.txt
# ./script.sh --file input.txt --output output.txt
```

### Error Handling for Long Options

```bash
#!/bin/bash

parse_arguments() {
    while [[ $# -gt 0 ]]; do
        case "$1" in
            --file)
                if [ -z "${2:-}" ]; then
                    echo "Error: --file requires an argument" >&2
                    return 1
                fi
                INPUT_FILE="$2"
                shift 2
                ;;
            --verbose)
                VERBOSE=true
                shift
                ;;
            -*)
                echo "Error: Unknown option: $1" >&2
                return 1
                ;;
            *)
                # Positional argument
                POSITIONAL_ARGS+=("$1")
                shift
                ;;
        esac
    done
    return 0
}

# Usage
if ! parse_arguments "$@"; then
    usage
fi
```

---

## Hybrid Approach (Short + Long)

### Supporting Both Short and Long Options

```bash
#!/bin/bash

usage() {
    cat <<EOF
Usage: $0 [OPTIONS]

Options:
    -h, --help              Show this help
    -v, --verbose           Enable verbose mode
    -f, --file FILE         Input file (required)
    -o, --output OUTPUT     Output file
    -d, --dry-run           Show what would be done
EOF
    exit 1
}

# Defaults
VERBOSE=false
INPUT_FILE=""
OUTPUT_FILE=""
DRY_RUN=false

# Parse options
while [[ $# -gt 0 ]]; do
    case "$1" in
        -h|--help)
            usage
            ;;
        -v|--verbose)
            VERBOSE=true
            shift
            ;;
        -f|--file)
            INPUT_FILE="$2"
            shift 2
            ;;
        --file=*)
            INPUT_FILE="${1#*=}"
            shift
            ;;
        -o|--output)
            OUTPUT_FILE="$2"
            shift 2
            ;;
        --output=*)
            OUTPUT_FILE="${1#*=}"
            shift
            ;;
        -d|--dry-run)
            DRY_RUN=true
            shift
            ;;
        -*)
            echo "Error: Unknown option: $1" >&2
            usage
            ;;
        *)
            # Positional argument
            break
            ;;
    esac
done

# Check required
if [ -z "$INPUT_FILE" ]; then
    echo "Error: --file is required" >&2
    usage
fi
```

### Using getopt (GNU Enhanced Version)

**Note:** GNU getopt is not POSIX, not available on all systems (especially macOS).

```bash
#!/bin/bash

# Parse options with GNU getopt (check its exit status before eval)
PARSED=$(getopt -o hvf:o: --long help,verbose,file:,output: -n "$0" -- "$@") || {
    echo "Error: Invalid arguments" >&2
    exit 2
}
eval set -- "$PARSED"

# Defaults
VERBOSE=false
INPUT_FILE=""
OUTPUT_FILE=""

# Process options
while true; do
    case "$1" in
        -h|--help)
            usage
            ;;
        -v|--verbose)
            VERBOSE=true
            shift
            ;;
        -f|--file)
            INPUT_FILE="$2"
            shift 2
            ;;
        -o|--output)
            OUTPUT_FILE="$2"
            shift 2
            ;;
        --)
            shift
            break
            ;;
        *)
            echo "Error: Unknown option: $1" >&2
            exit 1
            ;;
    esac
done

# Remaining arguments in "$@"
```

**Portability Warning:** GNU getopt is not available on BSD/macOS by default. Use manual parsing for portability.
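
When a script wants to use GNU getopt opportunistically, the util-linux version can be detected at runtime: its `--test` mode exits with status 4. A sketch (the function name is illustrative):

```bash
#!/bin/bash

# Detect GNU enhanced getopt (util-linux); `getopt --test` exits 4 there
has_gnu_getopt() {
    getopt --test >/dev/null 2>&1
    [ $? -eq 4 ]
}

if has_gnu_getopt; then
    echo "GNU getopt available"
else
    echo "Falling back to manual parsing"
fi
```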

---

## Positional Arguments

### Basic Positional Arguments

```bash
#!/bin/bash

# Access positional arguments
FIRST_ARG="$1"
SECOND_ARG="$2"
THIRD_ARG="$3"

# All arguments as array
ALL_ARGS=("$@")

# Number of arguments
ARG_COUNT=$#

echo "First: $FIRST_ARG"
echo "Second: $SECOND_ARG"
echo "Count: $ARG_COUNT"
```

### After Option Parsing

```bash
#!/bin/bash

# Parse options first
while getopts "v" opt; do
    case "$opt" in
        v) VERBOSE=true ;;
        *) usage ;;
    esac
done

# Shift past options
shift $((OPTIND - 1))

# Now "$@" contains only positional arguments
if [ $# -lt 1 ]; then
    echo "Error: At least one positional argument required" >&2
    usage
fi

COMMAND="$1"
shift

# Remaining arguments
ARGS=("$@")

echo "Command: $COMMAND"
echo "Args: ${ARGS[*]}"
```

### Variable Number of Positional Arguments

```bash
#!/bin/bash

# Accept variable number of files
FILES=("$@")

if [ ${#FILES[@]} -eq 0 ]; then
    echo "Error: At least one file required" >&2
    exit 1
fi

for file in "${FILES[@]}"; do
    echo "Processing: $file"
done
```

### Subcommand Pattern

```bash
#!/bin/bash

usage() {
    cat <<EOF
Usage: $0 <command> [options]

Commands:
    start       Start the service
    stop        Stop the service
    restart     Restart the service
    status      Show service status
EOF
    exit 1
}

# Check for command
if [ $# -lt 1 ]; then
    echo "Error: Command required" >&2
    usage
fi

COMMAND="$1"
shift

# Process subcommand
case "$COMMAND" in
    start)
        # Parse start-specific options
        while getopts "p:" opt; do
            case "$opt" in
                p) PORT="$OPTARG" ;;
                *) usage ;;
            esac
        done
        echo "Starting service on port ${PORT:-8080}"
        ;;
    stop)
        echo "Stopping service"
        ;;
    restart)
        echo "Restarting service"
        ;;
    status)
        echo "Service status"
        ;;
    *)
        echo "Error: Unknown command: $COMMAND" >&2
        usage
        ;;
esac
```

---

## Required vs Optional Arguments

### Validation Pattern

```bash
#!/bin/bash

# Parse options
while getopts "f:o:v" opt; do
    case "$opt" in
        f) INPUT_FILE="$OPTARG" ;;
        o) OUTPUT_FILE="$OPTARG" ;;
        v) VERBOSE=true ;;
        *) usage ;;
    esac
done

shift $((OPTIND - 1))

# Check required options
MISSING=()

if [ -z "${INPUT_FILE:-}" ]; then
    MISSING+=("-f FILE")
fi

if [ -z "${OUTPUT_FILE:-}" ]; then
    MISSING+=("-o OUTPUT")
fi

if [ ${#MISSING[@]} -gt 0 ]; then
    echo "Error: Missing required options: ${MISSING[*]}" >&2
    usage
fi
```

### Default Values for Optional Arguments

```bash
#!/bin/bash

# Parse with defaults
while getopts "f:o:p:" opt; do
    case "$opt" in
        f) INPUT_FILE="$OPTARG" ;;
        o) OUTPUT_FILE="$OPTARG" ;;
        p) PORT="$OPTARG" ;;
        *) usage ;;
    esac
done

# Apply defaults for optional arguments
: "${OUTPUT_FILE:=output.txt}"
: "${PORT:=8080}"

# Check required (no default)
if [ -z "${INPUT_FILE:-}" ]; then
    echo "Error: -f FILE is required" >&2
    usage
fi

echo "Input: $INPUT_FILE"
echo "Output: $OUTPUT_FILE (default: output.txt)"
echo "Port: $PORT (default: 8080)"
```

### Environment Variables as Defaults

```bash
#!/bin/bash

# Parse options
while getopts "k:h:" opt; do
    case "$opt" in
        k) API_KEY="$OPTARG" ;;
        h) API_HOST="$OPTARG" ;;
        *) usage ;;
    esac
done

# Use environment variables as defaults (":-" guards against unset vars under set -u)
: "${API_KEY:=${API_KEY_ENV:-}}"
: "${API_HOST:=${API_HOST_ENV:-https://api.example.com}}"

# Check required
if [ -z "${API_KEY:-}" ]; then
    echo "Error: API key required (use -k or set API_KEY_ENV)" >&2
    usage
fi

echo "Using API host: $API_HOST"
```

---

## Validation Patterns

### Type Validation

```bash
#!/bin/bash

# Validate integer
validate_integer() {
    local value=$1
    local name=$2

    if ! [[ "$value" =~ ^[0-9]+$ ]]; then
        echo "Error: $name must be an integer: $value" >&2
        return 1
    fi
    return 0
}

# Validate range
validate_range() {
    local value=$1
    local min=$2
    local max=$3
    local name=$4

    if [ "$value" -lt "$min" ] || [ "$value" -gt "$max" ]; then
        echo "Error: $name must be between $min and $max: $value" >&2
        return 1
    fi
    return 0
}

# Parse and validate
while getopts "p:" opt; do
    case "$opt" in
        p)
            PORT="$OPTARG"
            validate_integer "$PORT" "port" || exit 1
            validate_range "$PORT" 1 65535 "port" || exit 1
            ;;
        *) usage ;;
    esac
done
```

### File Validation

```bash
#!/bin/bash

# Validate file exists and is readable
validate_input_file() {
    local file=$1

    if [ ! -f "$file" ]; then
        echo "Error: File not found: $file" >&2
        return 1
    fi

    if [ ! -r "$file" ]; then
        echo "Error: File not readable: $file" >&2
        return 1
    fi

    return 0
}

# Validate output path
validate_output_path() {
    local file=$1
    local dir
    dir=$(dirname "$file")

    if [ ! -d "$dir" ]; then
        echo "Error: Output directory does not exist: $dir" >&2
        return 1
    fi

    if [ ! -w "$dir" ]; then
        echo "Error: Output directory not writable: $dir" >&2
        return 1
    fi

    return 0
}

# Parse and validate
while getopts "f:o:" opt; do
    case "$opt" in
        f)
            INPUT_FILE="$OPTARG"
            validate_input_file "$INPUT_FILE" || exit 1
            ;;
        o)
            OUTPUT_FILE="$OPTARG"
            validate_output_path "$OUTPUT_FILE" || exit 1
            ;;
        *) usage ;;
    esac
done
```

### Format Validation

```bash
#!/bin/bash

# Validate email format
validate_email() {
    local email=$1

    if ! echo "$email" | grep -qE '^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$'; then
        echo "Error: Invalid email format: $email" >&2
        return 1
    fi
    return 0
}

# Validate URL format
validate_url() {
    local url=$1

    if ! echo "$url" | grep -qE '^https?://[A-Za-z0-9.-]+'; then
        echo "Error: Invalid URL format: $url" >&2
        return 1
    fi
    return 0
}

# Parse and validate
while getopts "e:u:" opt; do
    case "$opt" in
        e)
            EMAIL="$OPTARG"
            validate_email "$EMAIL" || exit 1
            ;;
        u)
            URL="$OPTARG"
            validate_url "$URL" || exit 1
            ;;
        *) usage ;;
    esac
done
```

---

## Usage Documentation

### Comprehensive Help Message

```bash
#!/bin/bash

usage() {
    cat <<EOF
$(basename "$0") - Process data files

Usage:
    $(basename "$0") [OPTIONS] [FILES...]

Description:
    Process one or more data files with optional transformations.

Options:
    -h, --help              Show this help message
    -v, --verbose           Enable verbose output
    -f, --file FILE         Input file (required)
    -o, --output FILE       Output file (default: stdout)
    -t, --type TYPE         Processing type: json|yaml|csv (default: json)
    -d, --dry-run           Show what would be done without executing

Examples:
    # Process single file
    $(basename "$0") --file input.json --output output.json

    # Process with type specification
    $(basename "$0") -f data.csv -t csv -o results.csv

    # Dry run to preview changes
    $(basename "$0") --file input.yaml --dry-run

    # Verbose mode for debugging
    $(basename "$0") -v -f input.json

Environment Variables:
    API_KEY                 API key for authentication (optional)
    LOG_LEVEL               Logging level: debug|info|warn|error (default: info)

Exit Codes:
    0                      Success
    1                      General error
    2                      Invalid arguments
    3                      File not found

For more information, see: https://example.com/docs
EOF
    exit "${1:-0}"
}
```

### Auto-Generated Help from Comments

```bash
#!/bin/bash

# USAGE: script.sh [OPTIONS]
# DESCRIPTION: Process data files with transformations
# OPTIONS:
#   -h, --help      Show help
#   -f, --file      Input file (required)
#   -o, --output    Output file (optional)

usage() {
    # Extract usage from comments
    grep "^# " "$0" | sed 's/^# //'
    exit 0
}

# Parse options
while [[ $# -gt 0 ]]; do
    case "$1" in
        -h|--help) usage ;;
        # ... other options
    esac
done
```

### Version Information

```bash
#!/bin/bash

VERSION="1.2.3"

show_version() {
    cat <<EOF
$(basename "$0") version $VERSION
Copyright (C) 2025 Your Organization
License: MIT
EOF
    exit 0
}

usage() {
    cat <<EOF
Usage: $(basename "$0") [OPTIONS]

Options:
    -h, --help      Show help
    -V, --version   Show version
    # ... other options
EOF
    exit 0
}

# Parse options
while [[ $# -gt 0 ]]; do
    case "$1" in
        -h|--help) usage ;;
        -V|--version) show_version ;;
        # ... other options
    esac
done
```

---

## Summary

**Best Practices:**
1. Use getopts for POSIX-compliant short options
2. Use manual parsing for long options (requires Bash)
3. Support both short and long options for user convenience
4. Always provide --help with examples
5. Validate arguments early before processing
6. Provide clear error messages for invalid input
7. Use defaults for optional arguments
8. Document environment variable alternatives

**Anti-Patterns to Avoid:**
- Relying on GNU getopt (not portable)
- Missing validation for required arguments
- Poor error messages ("Invalid argument")
- No help/usage documentation
- Modifying "$@" before option parsing

```

### references/parameter-expansion.md

```markdown
# Parameter Expansion and String Manipulation

Complete reference for parameter expansion, string manipulation, and array handling in shell scripts.

## Table of Contents

- [Basic Parameter Expansion](#basic-parameter-expansion)
- [Default Values and Assignments](#default-values-and-assignments)
- [String Length](#string-length)
- [Substring Extraction](#substring-extraction)
- [Pattern Removal (Prefix/Suffix)](#pattern-removal-prefixsuffix)
- [Pattern Replacement](#pattern-replacement)
- [Case Modification](#case-modification)
- [Array Handling (Bash)](#array-handling-bash)
- [Associative Arrays (Bash)](#associative-arrays-bash)
- [POSIX-Compliant Alternatives](#posix-compliant-alternatives)

---

## Basic Parameter Expansion

### Simple Expansion

```bash
#!/bin/bash

# Basic variable expansion
name="Alice"
echo "$name"          # Output: Alice
echo "${name}"        # Output: Alice (same, explicit braces)

# Concatenation
first="John"
last="Doe"
full="$first $last"   # Output: John Doe
echo "${first}_${last}"  # Output: John_Doe
```

### Brace Expansion vs Variable Expansion

```bash
#!/bin/bash

# Variable expansion
files="file1 file2 file3"
echo $files           # Output: file1 file2 file3

# Brace expansion (generates sequences)
echo file{1,2,3}      # Output: file1 file2 file3
echo {a..z}           # Output: a b c ... z
echo {1..10}          # Output: 1 2 3 ... 10
```

---

## Default Values and Assignments

### Use Default if Unset

```bash
#!/bin/bash

# ${var:-default} - Use default if var is unset or null
echo "${UNDEFINED:-default}"     # Output: default
echo "${EMPTY:-default}"         # Output: default (if EMPTY="")

# Practical example
PORT="${PORT:-8080}"
echo "Server port: $PORT"

# Multiple levels
CONFIG_FILE="${CONFIG_FILE:-${HOME}/.config/app.conf}"
```

### Assign Default if Unset

```bash
#!/bin/bash

# ${var:=default} - Assign default if var is unset or null
: "${PORT:=8080}"
echo "Port: $PORT"  # PORT is now set to 8080

# Common pattern for configuration
: "${LOG_LEVEL:=info}"
: "${DATABASE_HOST:=localhost}"
: "${DATABASE_PORT:=5432}"

echo "Log level: $LOG_LEVEL"
echo "Database: $DATABASE_HOST:$DATABASE_PORT"
```

### Error if Unset

```bash
#!/bin/bash

# ${var:?message} - Exit with error if var is unset or null
: "${API_KEY:?Error: API_KEY environment variable is required}"

# Custom error message
: "${DATABASE_URL:?Error: DATABASE_URL must be set}"

# Practical example
validate_required() {
    : "${INPUT_FILE:?Error: INPUT_FILE is required}"
    : "${OUTPUT_DIR:?Error: OUTPUT_DIR is required}"
}

validate_required
```

### Use Alternative if Set

```bash
#!/bin/bash

# ${var:+alternative} - Use alternative if var is set and not null
DEBUG="${DEBUG:+--verbose}"
echo "Running with $DEBUG"

# Practical example: build arguments in an array so values with spaces survive
DOCKER_ARGS=()
[ -n "${PRIVILEGED:-}" ] && DOCKER_ARGS+=(--privileged)
[ -n "${NETWORK:-}" ] && DOCKER_ARGS+=(--network="$NETWORK")
[ -n "${VOLUME:-}" ] && DOCKER_ARGS+=(-v "$VOLUME")

docker run "${DOCKER_ARGS[@]}" image_name
```

---

## String Length

```bash
#!/bin/bash

# ${#var} - Length of variable
name="Alice"
echo "${#name}"       # Output: 5

path="/usr/local/bin"
echo "${#path}"       # Output: 14

# Check if empty
if [ "${#INPUT_FILE}" -eq 0 ]; then
    echo "INPUT_FILE is empty"
fi

# Validate length
PASSWORD="secret123"
if [ "${#PASSWORD}" -lt 8 ]; then
    echo "Error: Password must be at least 8 characters"
fi
```

---

## Substring Extraction

### Basic Extraction

```bash
#!/bin/bash

# ${var:offset} - Extract from offset to end
text="Hello, World!"
echo "${text:7}"      # Output: World!

# ${var:offset:length} - Extract substring
echo "${text:0:5}"    # Output: Hello
echo "${text:7:5}"    # Output: World
```

### Negative Offsets (Bash 4.2+)

```bash
#!/bin/bash

# Extract from end
text="Hello, World!"
echo "${text: -6}"    # Output: World! (note space after colon)
echo "${text: -6:5}"  # Output: World

# Without the space: compute the offset explicitly (still Bash, not POSIX)
echo "${text:${#text}-6}"  # Output: World!
```

### Practical Examples

```bash
#!/bin/bash

# Extract file extension
filename="document.tar.gz"
extension="${filename:${#filename}-3}"  # Output: .gz

# Extract directory from path
path="/usr/local/bin/myapp"
dirname="${path:0:${#path}-6}"  # Output: /usr/local/bin

# Extract first N characters
uuid="550e8400-e29b-41d4-a716-446655440000"
short_id="${uuid:0:8}"  # Output: 550e8400
```

---

## Pattern Removal (Prefix/Suffix)

### Remove Shortest Match

```bash
#!/bin/bash

# ${var#pattern} - Remove shortest prefix match
path="/usr/local/bin/myapp"
echo "${path#*/}"     # Output: usr/local/bin/myapp (remove first /)

# ${var%pattern} - Remove shortest suffix match
filename="report.csv.gz"
echo "${filename%.gz}"     # Output: report.csv
echo "${filename%.*}"      # Output: report.csv (remove .gz)
```

### Remove Longest Match

```bash
#!/bin/bash

# ${var##pattern} - Remove longest prefix match
path="/usr/local/bin/myapp"
echo "${path##*/}"    # Output: myapp (basename)

email="user@example.com"
echo "${email##*@}"   # Output: example.com (domain)

# ${var%%pattern} - Remove longest suffix match
filename="report.csv.gz"
echo "${filename%%.*}"     # Output: report (remove all extensions)

path="/usr/local/bin/myapp"
echo "${path%%/*}"         # Output: (empty, removes everything after first /)
```

### Practical Examples

```bash
#!/bin/bash

# Get filename without extension
file="document.tar.gz"
name="${file%%.*}"         # Output: document (all extensions)
name="${file%.*}"          # Output: document.tar (last extension only)

# Get basename (filename without path)
path="/usr/local/bin/myapp"
basename="${path##*/}"     # Output: myapp

# Get dirname (path without filename)
dirname="${path%/*}"       # Output: /usr/local/bin

# Extract domain from email
email="user@example.com"
domain="${email##*@}"      # Output: example.com

# Extract username from email
username="${email%%@*}"    # Output: user

# Remove protocol from URL
url="https://example.com/path"
no_protocol="${url#*://}"  # Output: example.com/path
```

---

## Pattern Replacement

### Replace First Match

```bash
#!/bin/bash

# ${var/pattern/replacement} - Replace first match
text="hello world world"
echo "${text/world/universe}"  # Output: hello universe world
```

### Replace All Matches

```bash
#!/bin/bash

# ${var//pattern/replacement} - Replace all matches
text="hello world world"
echo "${text//world/universe}"  # Output: hello universe universe

# Remove all occurrences (empty replacement)
path="/usr//local//bin"
echo "${path//\/\//\/}"         # Output: /usr/local/bin (normalize path)

# Replace spaces with underscores
filename="My Document.txt"
echo "${filename// /_}"         # Output: My_Document.txt
```

### Replace Prefix/Suffix Only

```bash
#!/bin/bash

# ${var/#pattern/replacement} - Replace if matches prefix
url="http://example.com"
echo "${url/#http:/https:}"    # Output: https://example.com

# ${var/%pattern/replacement} - Replace if matches suffix
file="document.txt"
echo "${file/%.txt/.md}"       # Output: document.md
```

### Practical Examples

```bash
#!/bin/bash

# Sanitize filename
filename="My Document (Copy).txt"
safe="${filename// /_}"          # Replace spaces
safe="${safe//[()]/}"            # Remove parentheses
safe="${safe//_-_/-}"            # Normalize separators
echo "$safe"  # Output: My_Document_Copy.txt

# Convert path separators (Windows to Unix)
winpath="C:\\Users\\Alice\\Documents"
unixpath="${winpath//\\//}"
echo "$unixpath"  # Output: C:/Users/Alice/Documents

# URL encoding (simple example)
text="hello world"
encoded="${text// /%20}"
echo "$encoded"  # Output: hello%20world
```

---

## Case Modification

### Uppercase/Lowercase (Bash 4+)

```bash
#!/bin/bash

# ${var^^} - Convert to uppercase
name="alice"
echo "${name^^}"      # Output: ALICE

# ${var,,} - Convert to lowercase
name="ALICE"
echo "${name,,}"      # Output: alice

# ${var^} - Uppercase first character only
name="alice"
echo "${name^}"       # Output: Alice

# ${var,} - Lowercase first character only
name="ALICE"
echo "${name,}"       # Output: aLICE
```

### Pattern-Based Case Modification

```bash
#!/bin/bash

# ${var^^pattern} - Uppercase matching pattern
text="hello world"
echo "${text^^[hw]}"  # Output: Hello World

# ${var,,pattern} - Lowercase matching pattern
text="HELLO WORLD"
echo "${text,,[HW]}"  # Output: hELLO wORLD
```

### POSIX-Compliant Alternatives

```bash
#!/bin/sh

# Uppercase using tr
name="alice"
upper=$(echo "$name" | tr '[:lower:]' '[:upper:]')
echo "$upper"  # Output: ALICE

# Lowercase using tr
name="ALICE"
lower=$(echo "$name" | tr '[:upper:]' '[:lower:]')
echo "$lower"  # Output: alice

# Capitalize first letter (POSIX)
name="alice"
first=$(echo "$name" | cut -c1 | tr '[:lower:]' '[:upper:]')
rest=$(echo "$name" | cut -c2-)
capitalized="${first}${rest}"
echo "$capitalized"  # Output: Alice
```

---

## Array Handling (Bash)

### Array Basics

```bash
#!/bin/bash

# Declare array
files=("file1.txt" "file2.txt" "file3.txt")

# Access elements
echo "${files[0]}"      # Output: file1.txt (first element)
echo "${files[1]}"      # Output: file2.txt
echo "${files[-1]}"     # Output: file3.txt (last element, Bash 4.3+)

# All elements
echo "${files[@]}"      # Output: file1.txt file2.txt file3.txt
echo "${files[*]}"      # Output: file1.txt file2.txt file3.txt (when quoted, joins with first IFS char)

# Array length
echo "${#files[@]}"     # Output: 3

# Element length
echo "${#files[0]}"     # Output: 9 (length of "file1.txt")
```
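
The practical difference between `[@]` and `[*]` only appears when the expansion is quoted; a minimal sketch:

```bash
#!/bin/bash

# Quoted "${files[@]}" expands to one word per element; quoted "${files[*]}"
# joins all elements into a single word using the first character of IFS.
files=("a b" "c")

count=$(printf '%s\n' "${files[@]}" | wc -l)
count=$((count))    # strip any padding wc adds
echo "elements as lines: $count"   # 2

IFS=','
echo "joined: ${files[*]}"         # a b,c
```

This is why iteration should always use `"${files[@]}"`: each element stays a separate word even when it contains spaces.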

### Array Manipulation

```bash
#!/bin/bash

# Append to array
files=("file1.txt")
files+=("file2.txt")
files+=("file3.txt")

# Prepend to array
files=("file0.txt" "${files[@]}")

# Insert at specific position (replace element)
files[1]="new_file.txt"

# Remove element (unset)
unset 'files[1]'

# Reassign indices (compact array after unset)
files=("${files[@]}")
```

### Iterating Arrays

```bash
#!/bin/bash

files=("file1.txt" "file2.txt" "file3.txt")

# Iterate over elements
for file in "${files[@]}"; do
    echo "Processing: $file"
done

# Iterate with indices
for i in "${!files[@]}"; do
    echo "Index $i: ${files[$i]}"
done

# C-style loop
for ((i=0; i<${#files[@]}; i++)); do
    echo "File $i: ${files[$i]}"
done
```

### Array Slicing

```bash
#!/bin/bash

files=("file1" "file2" "file3" "file4" "file5")

# Extract slice: ${array[@]:offset:length}
echo "${files[@]:1:3}"   # Output: file2 file3 file4
echo "${files[@]:2}"     # Output: file3 file4 file5 (from index 2 to end)

# Copy array
files_copy=("${files[@]}")

# Filter elements matching a pattern ("${files[@]/pattern/}" only rewrites
# elements, leaving empty strings behind, so filter with a loop)
matching=()
for f in "${files[@]}"; do
    [[ $f == *2* ]] && matching+=("$f")
done
```

### Practical Examples

```bash
#!/bin/bash

# Collect files matching pattern (nullglob makes an unmatched glob expand
# to nothing instead of the literal string "*.png")
shopt -s nullglob
png_files=(*.png)
if [ ${#png_files[@]} -eq 0 ]; then
    echo "No PNG files found"
fi

# Process multiple inputs
process_files() {
    local files=("$@")

    for file in "${files[@]}"; do
        echo "Processing: $file"
        # Process file
    done
}

process_files "${png_files[@]}"

# Split string into array
IFS=',' read -ra items <<< "item1,item2,item3"
for item in "${items[@]}"; do
    echo "Item: $item"
done
```

---

## Associative Arrays (Bash)

### Declare and Use

```bash
#!/bin/bash

# Declare associative array
declare -A config

# Assign values
config[host]="localhost"
config[port]="8080"
config[database]="mydb"

# Access values
echo "${config[host]}"      # Output: localhost
echo "${config[port]}"      # Output: 8080

# Check if key exists
if [ -n "${config[host]:-}" ]; then
    echo "Host is configured"
fi

# Delete key
unset 'config[port]'
```

### Iterate Associative Arrays

```bash
#!/bin/bash

declare -A config=(
    [host]="localhost"
    [port]="8080"
    [database]="mydb"
)

# Iterate over keys
for key in "${!config[@]}"; do
    echo "$key = ${config[$key]}"
done

# Iterate over values
for value in "${config[@]}"; do
    echo "Value: $value"
done
```

### Practical Examples

```bash
#!/bin/bash

# Configuration map
declare -A settings=(
    [log_level]="info"
    [max_connections]="100"
    [timeout]="30"
)

# Function using associative array
get_setting() {
    local key=$1
    local default=$2
    echo "${settings[$key]:-$default}"
}

log_level=$(get_setting "log_level" "warn")
echo "Log level: $log_level"

# Count occurrences
declare -A counts
while read -r line; do
    # Note: ((counts[$line]++)) returns status 1 on the first increment,
    # which would abort a script running under set -e
    counts[$line]=$(( ${counts[$line]:-0} + 1 ))
done < data.txt

for key in "${!counts[@]}"; do
    echo "$key: ${counts[$key]}"
done
```

---

## POSIX-Compliant Alternatives

### Simulating Arrays in POSIX sh

```sh
#!/bin/sh

# Space-separated list (simple cases)
files="file1.txt file2.txt file3.txt"

for file in $files; do
    echo "Processing: $file"
done

# Using positional parameters as array
set -- "item1" "item2" "item3"
echo "$1"  # Output: item1
echo "$2"  # Output: item2
echo "$#"  # Output: 3 (number of items)

# Iterate
for item in "$@"; do
    echo "Item: $item"
done
```
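
Appending and prepending to this emulated array works by rebuilding the list with `set --`; a minimal sketch:

```sh
#!/bin/sh

# Rebuild the positional-parameter list to emulate append/prepend.
set -- "a" "b"
set -- "$@" "c"     # append
n="$#"
set -- "z" "$@"     # prepend
first="$1"
echo "$n $first"    # 3 z
```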

### String Operations Without Parameter Expansion

```sh
#!/bin/sh

# Remove prefix using sed
path="/usr/local/bin/myapp"
basename=$(echo "$path" | sed 's|.*/||')
echo "$basename"  # Output: myapp

# Remove suffix using sed
filename="document.txt"
name=$(echo "$filename" | sed 's/\.[^.]*$//')
echo "$name"  # Output: document

# Replace pattern using sed
text="hello world"
replaced=$(echo "$text" | sed 's/world/universe/')
echo "$replaced"  # Output: hello universe

# Uppercase using tr
name="alice"
upper=$(echo "$name" | tr '[:lower:]' '[:upper:]')
echo "$upper"  # Output: ALICE
```

---

## Summary

**Bash Features:**
- Parameter expansion provides powerful string manipulation
- Arrays and associative arrays for structured data
- Case modification (Bash 4+)
- Substring extraction and pattern replacement

**POSIX Alternatives:**
- Use external tools: sed, awk, tr, cut
- Positional parameters as simple arrays
- More verbose but portable

**Best Practices:**
- Use parameter expansion for simple operations
- Use external tools for complex transformations
- Quote all expansions: `"${var}"` not `$var`
- Use arrays (Bash) for multiple values, not space-separated strings
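
The last two points combine in a common failure mode: a space-separated string silently word-splits filenames containing spaces, while an array keeps each one intact. A minimal illustration:

```bash
#!/bin/bash

# "My Document.txt" stays one element in the array; in a flat
# space-separated string it would split into "My" and "Document.txt".
files=("My Document.txt" "notes.txt")

count=0
for f in "${files[@]}"; do
    count=$((count + 1))
done
echo "$count files"   # 2 files
```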

```

### references/common-utilities.md

```markdown
# Common Utilities Integration

Guide to integrating common command-line utilities in shell scripts: jq, yq, awk, sed, and more.

## Table of Contents

- [jq: JSON Processing](#jq-json-processing)
- [yq: YAML Processing](#yq-yaml-processing)
- [awk: Text Processing](#awk-text-processing)
- [sed: Stream Editing](#sed-stream-editing)
- [grep: Pattern Matching](#grep-pattern-matching)
- [Other Useful Utilities](#other-useful-utilities)

---

## jq: JSON Processing

### Installation

```bash
# macOS
brew install jq

# Ubuntu/Debian
apt-get install jq

# Fedora/RHEL
dnf install jq

# Docker
docker run --rm -i ghcr.io/jqlang/jq
```

### Basic Usage

```bash
# Pretty-print JSON
echo '{"name":"Alice","age":30}' | jq '.'

# Extract field
echo '{"name":"Alice","age":30}' | jq '.name'
# Output: "Alice"

# Extract field without quotes (-r for raw output)
echo '{"name":"Alice","age":30}' | jq -r '.name'
# Output: Alice

# Extract nested field
echo '{"user":{"name":"Alice"}}' | jq '.user.name'
# Output: "Alice"
```

### Working with Arrays

```bash
# Extract array element
echo '["a","b","c"]' | jq '.[0]'
# Output: "a"

# Extract all array elements
echo '["a","b","c"]' | jq '.[]'
# Output: "a" "b" "c" (one per line)

# Array length
echo '["a","b","c"]' | jq 'length'
# Output: 3

# Extract field from each object in array
echo '[{"name":"Alice"},{"name":"Bob"}]' | jq '.[].name'
# Output: "Alice" "Bob"
```

### Filtering and Selecting

```bash
# Filter array by condition
echo '[{"name":"Alice","age":30},{"name":"Bob","age":25}]' | \
    jq '.[] | select(.age > 26)'
# Output: {"name":"Alice","age":30}

# Filter and extract field
echo '[{"name":"Alice","active":true},{"name":"Bob","active":false}]' | \
    jq '.[] | select(.active == true) | .name'
# Output: "Alice"

# Map array
echo '[1,2,3]' | jq 'map(. * 2)'
# Output: [2,4,6]
```

### Constructing JSON

```bash
# Create object
jq -n '{"name": "Alice", "age": 30}'

# Create array
jq -n '["a", "b", "c"]'

# Construct from variables
name="Alice"
age=30
jq -n --arg name "$name" --argjson age "$age" '{"name": $name, "age": $age}'
```

### Practical Examples

```bash
#!/bin/bash

# Parse API response
response=$(curl -sSL https://api.example.com/users)

# Extract specific field
user_id=$(echo "$response" | jq -r '.data.id')

# Check for error
if echo "$response" | jq -e '.error' >/dev/null; then
    error_msg=$(echo "$response" | jq -r '.error.message')
    echo "Error: $error_msg" >&2
    exit 1
fi

# Extract array of values
ids=$(echo "$response" | jq -r '.users[].id')

# Transform JSON
echo "$response" | jq '{
    id: .data.id,
    name: .data.name,
    email: .data.contact.email
}'

# Merge JSON files
jq -s '.[0] * .[1]' file1.json file2.json > merged.json
```

### Error Handling

```bash
#!/bin/bash

# Check if jq is installed
command -v jq >/dev/null 2>&1 || {
    echo "Error: jq is required but not installed" >&2
    exit 1
}

# Validate JSON
if ! echo "$json" | jq empty 2>/dev/null; then
    echo "Error: Invalid JSON" >&2
    exit 1
fi

# Check if field exists
if ! echo "$json" | jq -e '.field' >/dev/null 2>&1; then
    echo "Error: Required field missing" >&2
    exit 1
fi
```

---

## yq: YAML Processing

### Installation

```bash
# macOS (yq v4, jq-compatible syntax)
brew install yq

# Ubuntu/Debian
wget https://github.com/mikefarah/yq/releases/latest/download/yq_linux_amd64
chmod +x yq_linux_amd64
mv yq_linux_amd64 /usr/local/bin/yq

# Docker
docker run --rm -v "$PWD":/workdir mikefarah/yq
```

**Note:** yq v4 uses jq-compatible syntax. yq v3 has different syntax.
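
Because the two major versions are incompatible, scripts that depend on v4 syntax may want to check the version string up front. The exact format of `yq --version` output varies by build, so treat the patterns below as an assumption to adjust locally:

```bash
#!/bin/bash

# Classify a yq version string as v4+ (jq-style syntax) or not.
# Pattern assumption: mikefarah yq prints "... version v4.x.y" for v4
# and "yq version 3.x.y" for v3; adjust for your installed build.
is_yq_v4() {
    case "$1" in
        *" version v4."*|*" version 4."*) return 0 ;;
        *) return 1 ;;
    esac
}

if is_yq_v4 "yq (https://github.com/mikefarah/yq/) version v4.40.5"; then
    echo "v4 syntax available"
fi
```

In a real script, feed it `"$(yq --version 2>&1)"` and exit with an error when the check fails.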

### Basic Usage (yq v4)

```bash
# Read YAML
yq eval '.key' config.yaml

# Read nested value
yq eval '.database.host' config.yaml

# Read array element
yq eval '.servers[0]' config.yaml

# Read all array elements
yq eval '.servers[]' config.yaml
```

### Modifying YAML

```bash
# Update value
yq eval '.database.port = 5432' config.yaml

# Update in-place
yq eval '.database.port = 5432' -i config.yaml

# Add new key
yq eval '.newkey = "value"' config.yaml

# Delete key
yq eval 'del(.oldkey)' config.yaml
```

### Converting Formats

```bash
# YAML to JSON
yq eval -o=json config.yaml > config.json

# JSON to YAML
yq eval -P config.json > config.yaml

# YAML to XML
yq eval -o=xml config.yaml

# YAML to properties
yq eval -o=props config.yaml
```

### Practical Examples

```bash
#!/bin/bash

# Read configuration
database_host=$(yq eval '.database.host' config.yaml)
database_port=$(yq eval '.database.port' config.yaml)

# Update configuration
yq eval ".environment = \"$ENV\"" -i config.yaml
yq eval ".version = \"$VERSION\"" -i config.yaml

# Merge YAML files
yq eval-all '. as $item ireduce ({}; . * $item)' file1.yaml file2.yaml

# Extract subset
yq eval '.services.web' docker-compose.yaml > web-service.yaml

# Validate YAML
if ! yq eval '.' config.yaml >/dev/null 2>&1; then
    echo "Error: Invalid YAML" >&2
    exit 1
fi
```

---

## awk: Text Processing

### Basic Usage

```bash
# Print entire line
awk '{print}' file.txt
# Same as: cat file.txt

# Print specific field (space-separated)
awk '{print $1}' file.txt  # First field
awk '{print $2}' file.txt  # Second field
awk '{print $NF}' file.txt # Last field

# Print multiple fields
awk '{print $1, $3}' file.txt

# Custom field separator
awk -F',' '{print $1}' file.csv  # CSV
awk -F':' '{print $1}' /etc/passwd  # Colon-separated
```

### Pattern Matching

```bash
# Print lines matching pattern
awk '/pattern/' file.txt

# Print lines NOT matching pattern
awk '!/pattern/' file.txt

# Print lines where field matches
awk '$3 == "value"' file.txt
awk '$2 > 100' file.txt
awk '$1 ~ /^prefix/' file.txt  # Regex match
```

### Arithmetic Operations

```bash
# Sum column
awk '{sum += $1} END {print sum}' numbers.txt

# Average
awk '{sum += $1; count++} END {print sum/count}' numbers.txt

# Min/Max
awk 'NR==1 {max=$1; min=$1} {if($1>max) max=$1; if($1<min) min=$1} END {print min, max}' numbers.txt

# Count occurrences
awk '{count[$1]++} END {for (word in count) print word, count[word]}' file.txt
```

### Practical Examples

```bash
#!/bin/bash

# Extract specific columns from CSV
awk -F',' '{print $2, $5}' data.csv

# Filter and process
awk -F',' '$3 > 100 {print $1, $2}' data.csv

# Calculate totals
total=$(awk '{sum += $1} END {print sum}' sales.txt)

# Format output
awk '{printf "Name: %-20s Age: %3d\n", $1, $2}' people.txt

# Process log file
awk '/ERROR/ {print $1, $2, $NF}' application.log

# Group and sum
awk '{sum[$1] += $2} END {for (key in sum) print key, sum[key]}' data.txt
```

---

## sed: Stream Editing

### Basic Usage

```bash
# Substitute pattern
sed 's/old/new/' file.txt  # First occurrence per line
sed 's/old/new/g' file.txt  # All occurrences

# In-place editing
sed -i 's/old/new/g' file.txt       # Linux
sed -i '' 's/old/new/g' file.txt    # macOS
```

### Pattern Matching

```bash
# Delete lines matching pattern
sed '/pattern/d' file.txt

# Delete empty lines
sed '/^$/d' file.txt

# Print only lines matching pattern
sed -n '/pattern/p' file.txt

# Print line range
sed -n '10,20p' file.txt  # Lines 10-20
sed -n '10p' file.txt     # Line 10 only
```

### Advanced Substitution

```bash
# Case-insensitive substitution
sed 's/pattern/replacement/gi' file.txt

# Substitute with backreferences
sed 's/\(.*\)@\(.*\)/\2: \1/' emails.txt
# Input: user@example.com
# Output: example.com: user

# Multiple substitutions
sed -e 's/old1/new1/g' -e 's/old2/new2/g' file.txt

# Substitute on specific line
sed '5s/old/new/' file.txt  # Only line 5
```

### Insertion and Deletion

```bash
# Insert line before pattern
sed '/pattern/i\New line' file.txt

# Insert line after pattern
sed '/pattern/a\New line' file.txt

# Delete lines
sed '5d' file.txt          # Delete line 5
sed '5,10d' file.txt       # Delete lines 5-10
sed '/pattern/d' file.txt  # Delete matching lines
```

### Practical Examples

```bash
#!/bin/bash

# Replace config values
sed -i "s/^PORT=.*/PORT=$NEW_PORT/" config.env

# Remove comments
sed 's/#.*//' file.txt

# Extract email addresses (-E for extended regex; the BRE \+ escape is GNU-only)
sed -nE 's/.*([a-z0-9._%+-]+@[a-z0-9.-]+\.[a-z]{2,}).*/\1/p' file.txt

# Remove trailing whitespace
sed 's/[[:space:]]*$//' file.txt

# Double-space file
sed G file.txt

# Portable in-place editing
sed 's/old/new/g' file.txt > file.txt.tmp && mv file.txt.tmp file.txt
```

---

## grep: Pattern Matching

### Basic Usage

```bash
# Search for pattern
grep "pattern" file.txt

# Case-insensitive search
grep -i "pattern" file.txt

# Invert match (lines NOT matching)
grep -v "pattern" file.txt

# Count matches
grep -c "pattern" file.txt

# Show line numbers
grep -n "pattern" file.txt
```

### Regular Expressions

```bash
# Extended regex (-E)
grep -E "pattern1|pattern2" file.txt

# Beginning of line
grep "^pattern" file.txt

# End of line
grep "pattern$" file.txt

# Word boundaries
grep "\bword\b" file.txt

# Character classes
grep "[0-9]" file.txt       # Any digit
grep "[a-z]" file.txt       # Any lowercase letter
grep "[^0-9]" file.txt      # Any non-digit
```

### Context and Output

```bash
# Show context (lines before/after)
grep -A 3 "pattern" file.txt  # 3 lines after
grep -B 3 "pattern" file.txt  # 3 lines before
grep -C 3 "pattern" file.txt  # 3 lines before and after

# Show only matched text
grep -o "pattern" file.txt

# Quiet mode (exit code only)
grep -q "pattern" file.txt
if [ $? -eq 0 ]; then
    echo "Pattern found"
fi
```

### Practical Examples

```bash
#!/bin/bash

# Check if file contains pattern
if grep -q "ERROR" logfile.txt; then
    echo "Errors found in log"
fi

# Extract lines with email addresses
grep -Eo "\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}\b" file.txt

# Find files containing pattern
grep -r "pattern" directory/

# Exclude files/directories
grep -r "pattern" --exclude="*.log" directory/
grep -r "pattern" --exclude-dir="node_modules" directory/

# Count occurrences
count=$(grep -c "ERROR" logfile.txt)
echo "Found $count errors"
```

---

## Other Useful Utilities

### cut: Extract Fields

```bash
# Extract specific field
cut -d',' -f1 file.csv     # First field (comma-separated)
cut -d':' -f1,3 /etc/passwd # Fields 1 and 3

# Extract character range
cut -c1-10 file.txt        # Characters 1-10
```

### sort: Sort Lines

```bash
# Sort alphabetically
sort file.txt

# Sort numerically
sort -n numbers.txt

# Reverse sort
sort -r file.txt

# Sort by specific field
sort -t',' -k2 file.csv    # Sort by 2nd field (CSV)

# Unique sort
sort -u file.txt           # Same as: sort file.txt | uniq
```

### uniq: Remove Duplicates

```bash
# Remove adjacent duplicates
uniq file.txt

# Count occurrences
uniq -c file.txt

# Show only duplicates
uniq -d file.txt

# Show only unique lines
uniq -u file.txt
```

### tr: Translate Characters

```bash
# Uppercase
echo "hello" | tr '[:lower:]' '[:upper:]'
# Output: HELLO

# Delete characters
echo "hello123" | tr -d '[:digit:]'
# Output: hello

# Squeeze repeats
echo "hello   world" | tr -s ' '
# Output: hello world

# Replace characters
echo "hello" | tr 'el' 'ip'
# Output: hippo
```

### find: Find Files

```bash
# Find by name
find . -name "*.txt"

# Find by type
find . -type f             # Files
find . -type d             # Directories

# Find and execute
find . -name "*.log" -delete
find . -name "*.sh" -exec chmod +x {} \;

# Find modified recently
find . -mtime -7           # Modified in last 7 days
```

### xargs: Build Command Lines

```bash
# Process find results
find . -name "*.txt" | xargs grep "pattern"

# Parallel execution (-P is a GNU/BSD extension; -print0/-0 handles spaces)
find . -name "*.jpg" -print0 | xargs -0 -P 4 -I {} convert {} {}.png

# Handle spaces in filenames
find . -name "*.txt" -print0 | xargs -0 grep "pattern"
```

---

## Summary

**JSON Processing:**
- jq for parsing and transforming JSON
- Use -r for raw output
- Use -e to check existence

**YAML Processing:**
- yq v4 for YAML (jq-compatible syntax)
- Convert between YAML, JSON, XML
- Modify YAML in-place

**Text Processing:**
- awk for field extraction and calculations
- sed for pattern replacement and editing
- grep for pattern matching

**Utilities:**
- cut: Extract fields
- sort: Sort lines
- uniq: Remove duplicates
- tr: Character translation
- find: Find files
- xargs: Build commands

**Best Practices:**
- Check command availability before use
- Handle errors gracefully
- Quote variables
- Use appropriate tool for task
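
The first practice ("check command availability before use") is worth wrapping in a small helper so every script in a repo fails the same way. A sketch, with `require` as an illustrative (not standard) name:

```bash
#!/bin/bash

# Fail fast with a clear message when a needed command is missing.
require() {
    for cmd in "$@"; do
        command -v "$cmd" >/dev/null 2>&1 || {
            echo "Error: required command not found: $cmd" >&2
            return 1
        }
    done
}

# sh exists on any POSIX system, so this prints "dependencies ok"
require sh && echo "dependencies ok"
```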

```

### references/testing-guide.md

```markdown
# Testing and Linting Shell Scripts

Comprehensive guide to testing shell scripts with Bats and linting with ShellCheck.

## Table of Contents

- [ShellCheck: Static Analysis](#shellcheck-static-analysis)
- [Bats: Automated Testing](#bats-automated-testing)
- [Integration Testing](#integration-testing)
- [CI/CD Integration](#cicd-integration)
- [Debugging Techniques](#debugging-techniques)

---

## ShellCheck: Static Analysis

### Installation

```bash
# macOS
brew install shellcheck

# Ubuntu/Debian
apt-get install shellcheck

# Fedora/RHEL
dnf install ShellCheck

# Docker
docker pull koalaman/shellcheck:stable
```

### Basic Usage

```bash
# Check single script
shellcheck script.sh

# Check multiple scripts
shellcheck *.sh

# Check with specific shell
shellcheck --shell=sh script.sh     # POSIX sh
shellcheck --shell=bash script.sh   # Bash

# Output formats
shellcheck --format=gcc script.sh   # GCC-style
shellcheck --format=json script.sh  # JSON
shellcheck --format=tty script.sh   # Terminal (default)
```

### Common Warnings

#### SC2086: Quote variables to prevent word splitting

```bash
# ❌ Problematic
echo $var
cp $source $dest

# ✅ Fixed
echo "$var"
cp "$source" "$dest"
```

#### SC2006: Use $(...) instead of backticks

```bash
# ❌ Problematic
result=`command`

# ✅ Fixed
result=$(command)
```

#### SC2046: Quote command substitution

```bash
# ❌ Problematic
for file in $(ls *.txt); do

# ✅ Fixed
for file in *.txt; do
```

#### SC2039: POSIX sh doesn't support arrays

```bash
# ❌ Problematic (in #!/bin/sh)
#!/bin/sh
files=("file1" "file2")

# ✅ Fixed (use Bash)
#!/bin/bash
files=("file1" "file2")

# Or use POSIX alternative
#!/bin/sh
files="file1 file2"
```

#### SC2155: Declare and assign separately

```bash
# ❌ Problematic
local result=$(command)

# ✅ Fixed
local result
result=$(command)
```

### Excluding Warnings

```bash
# Exclude specific warning
shellcheck --exclude=SC2086 script.sh

# Exclude multiple warnings
shellcheck --exclude=SC2086,SC2046 script.sh

# In-file exclusion
#!/bin/bash
# shellcheck disable=SC2086
echo $var

# Note: a directive applies to the line that follows it; placed before the
# first command, it applies to the whole file
```

### Severity Levels

```bash
# Show only errors
shellcheck --severity=error script.sh

# Show errors and warnings
shellcheck --severity=warning script.sh

# Show errors, warnings, and info
shellcheck --severity=info script.sh

# Show everything (including style)
shellcheck --severity=style script.sh
```

### CI Integration

```bash
# Exit with non-zero on any issue
if ! shellcheck script.sh; then
    echo "ShellCheck failed"
    exit 1
fi

# Or use in CI pipeline
shellcheck *.sh || exit 1
```

### Docker Usage

```bash
# Run ShellCheck in container
docker run --rm -v "$PWD:/mnt" koalaman/shellcheck:stable script.sh

# Check all scripts
docker run --rm -v "$PWD:/mnt" koalaman/shellcheck:stable /mnt/*.sh

# With specific options
docker run --rm -v "$PWD:/mnt" koalaman/shellcheck:stable \
    --shell=sh --severity=error /mnt/script.sh
```

---

## Bats: Automated Testing

### Installation

```bash
# macOS
brew install bats-core

# Git submodule
git submodule add https://github.com/bats-core/bats-core.git test/bats
git submodule add https://github.com/bats-core/bats-support.git test/test_helper/bats-support
git submodule add https://github.com/bats-core/bats-assert.git test/test_helper/bats-assert

# npm
npm install -g bats
```

### Basic Test Structure

```bash
#!/usr/bin/env bats

# test/example.bats

@test "addition using bc" {
    result="$(echo 2+2 | bc)"
    [ "$result" -eq 4 ]
}

@test "script exists and is executable" {
    [ -f "script.sh" ]
    [ -x "script.sh" ]
}
```

### Running Tests

```bash
# Run all tests in directory
bats test/

# Run specific test file
bats test/example.bats

# TAP (Test Anything Protocol) output
bats --tap test/

# Count only
bats --count test/

# Pretty output (default)
bats --pretty test/
```

### Test Assertions

```bash
#!/usr/bin/env bats

@test "string comparison" {
    result="hello"
    [ "$result" = "hello" ]
}

@test "numeric comparison" {
    result=42
    [ "$result" -eq 42 ]
}

@test "file exists" {
    [ -f "README.md" ]
}

@test "directory exists" {
    [ -d "src/" ]
}

@test "command succeeds" {
    true
}

@test "command fails" {
    run false
    [ "$status" -ne 0 ]
}
```

### Using run Command

```bash
#!/usr/bin/env bats

@test "script runs successfully" {
    run ./script.sh --help

    # Check exit code
    [ "$status" -eq 0 ]

    # Check output (entire output)
    [ "$output" = "Usage: script.sh [OPTIONS]" ]

    # Check specific line (note: the $lines array drops blank lines)
    [ "${lines[0]}" = "Usage: script.sh [OPTIONS]" ]
}

@test "script handles error" {
    run ./script.sh --invalid-option

    # Should fail
    [ "$status" -eq 1 ]

    # Should print error
    [[ "$output" =~ "Error" ]]
}
```

### Setup and Teardown

```bash
#!/usr/bin/env bats

setup() {
    # Run before each test
    export TEST_VAR="test_value"
    mkdir -p tmp/
}

teardown() {
    # Run after each test
    rm -rf tmp/
}

@test "uses setup variables" {
    [ "$TEST_VAR" = "test_value" ]
}

@test "uses setup directory" {
    [ -d "tmp/" ]
}
```

### setup_file and teardown_file

```bash
#!/usr/bin/env bats

setup_file() {
    # Run once before all tests in file
    export GLOBAL_VAR="global"
}

teardown_file() {
    # Run once after all tests in file
    unset GLOBAL_VAR
}

setup() {
    # Run before each test
    export TEST_VAR="test"
}

teardown() {
    # Run after each test
    unset TEST_VAR
}
```

### Skipping Tests

```bash
#!/usr/bin/env bats

@test "this test is skipped" {
    skip "Not implemented yet"
    # Test code here won't run
}

@test "conditional skip" {
    if [ ! -f "required_file.txt" ]; then
        skip "required_file.txt not found"
    fi

    # Test code
}
```

### Using bats-support and bats-assert

```bash
#!/usr/bin/env bats

load 'test_helper/bats-support/load'
load 'test_helper/bats-assert/load'

@test "assert_success" {
    run echo "hello"
    assert_success
}

@test "assert_failure" {
    run false
    assert_failure
}

@test "assert_output" {
    run echo "hello world"
    assert_output "hello world"
}

@test "assert_output with pattern" {
    run echo "hello world"
    assert_output --partial "world"
}

@test "assert_line" {
    run echo -e "line1\nline2\nline3"
    assert_line --index 0 "line1"
    assert_line --index 2 "line3"
}

@test "refute_output" {
    run echo ""
    refute_output
}
```

### Testing Functions

```bash
#!/usr/bin/env bats

# Source the script being tested (bats' `load` appends a .bash extension,
# so use source with an explicit path)
source "$BATS_TEST_DIRNAME/../script.sh"

@test "function returns correct value" {
    run my_function "arg1" "arg2"

    assert_success
    assert_output "expected_result"
}

@test "function handles error" {
    run my_function "invalid_arg"

    assert_failure
    assert_output --partial "Error"
}
```

### Testing with Fixtures

```bash
#!/usr/bin/env bats

setup() {
    # Create fixture files
    mkdir -p test/fixtures
    echo "test data" > test/fixtures/input.txt
}

teardown() {
    # Clean up fixtures
    rm -rf test/fixtures
}

@test "processes fixture file" {
    run ./script.sh test/fixtures/input.txt

    assert_success
    assert_output "test data processed"
}
```

---

## Integration Testing

### Testing Complete Workflows

```bash
#!/usr/bin/env bats

@test "end-to-end workflow" {
    # Setup
    mkdir -p tmp/input tmp/output

    # Create input
    echo "data" > tmp/input/file.txt

    # Run script
    run ./script.sh --input tmp/input --output tmp/output

    # Verify exit code
    assert_success

    # Verify output file created
    [ -f tmp/output/file.txt ]

    # Verify output content
    result=$(cat tmp/output/file.txt)
    [ "$result" = "processed data" ]

    # Cleanup
    rm -rf tmp/
}
```

### Testing External Dependencies

```bash
#!/usr/bin/env bats

@test "requires jq" {
    # Skip if jq not installed
    if ! command -v jq >/dev/null 2>&1; then
        skip "jq not installed"
    fi

    run ./script.sh --format json

    assert_success
}

@test "handles missing dependency gracefully" {
    # Restrict PATH to an empty directory so the dependency cannot be found
    mkdir -p /tmp/empty
    PATH="/tmp/empty" run ./script.sh

    assert_failure
    assert_output --partial "Error: jq required"
}
```

### Testing with Mock Data

```bash
#!/usr/bin/env bats

setup() {
    # Create mock API response
    export MOCK_API_RESPONSE='{"status": "success", "data": []}'

    # Mock curl command
    curl() {
        echo "$MOCK_API_RESPONSE"
    }
    export -f curl
}

@test "handles API response" {
    run ./script.sh --api-endpoint https://example.com/api

    assert_success
    assert_output --partial "success"
}
```

---

## CI/CD Integration

### GitHub Actions

```yaml
name: Shell Script Tests

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v3

      - name: Install dependencies
        run: |
          sudo apt-get update
          sudo apt-get install -y shellcheck
          npm install -g bats

      - name: Run ShellCheck
        run: shellcheck *.sh

      - name: Run Bats tests
        run: bats test/
```

### GitLab CI

```yaml
shellcheck:
  image: koalaman/shellcheck-alpine:stable
  script:
    - shellcheck *.sh

bats:
  image: bats/bats:latest
  script:
    - bats test/
```

### Pre-commit Hook

```bash
#!/bin/bash
# .git/hooks/pre-commit

echo "Running ShellCheck..."
shellcheck *.sh || {
    echo "ShellCheck failed. Commit aborted."
    exit 1
}

echo "Running Bats tests..."
bats test/ || {
    echo "Tests failed. Commit aborted."
    exit 1
}

echo "All checks passed."
```

### Makefile for Testing

```makefile
.PHONY: test lint check

# Run all tests
test:
	bats test/

# Run ShellCheck
lint:
	shellcheck *.sh

# Run both
check: lint test

# Watch mode (requires entr)
watch:
	find . -name "*.sh" | entr make check
```

---

## Debugging Techniques

### Verbose Execution

```bash
#!/bin/bash

# Print each command before execution
set -x

# Your script logic
echo "Hello"

# Disable verbose mode
set +x
```

### Trace Execution

```bash
# Run script with trace
bash -x script.sh

# Or use set -x in script
#!/bin/bash
set -x  # Enable trace
# ... script logic
```

### Debug Output

```bash
#!/bin/bash

DEBUG=${DEBUG:-false}

debug() {
    if [ "$DEBUG" = "true" ]; then
        echo "[DEBUG] $*" >&2
    fi
}

# Usage
debug "Variable value: $var"
debug "About to execute command"

# Run with: DEBUG=true ./script.sh
```

### Dry Run Mode

```bash
#!/bin/bash

DRY_RUN=${DRY_RUN:-false}

execute() {
    if [ "$DRY_RUN" = "true" ]; then
        echo "[DRY RUN] Would execute: $*" >&2
    else
        "$@"
    fi
}

# Usage
execute rm -f /tmp/file.txt

# Run with: DRY_RUN=true ./script.sh
```

### Logging

```bash
#!/bin/bash

LOG_FILE=${LOG_FILE:-/tmp/script.log}

log() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] $*" | tee -a "$LOG_FILE" >&2
}

log "Script started"
log "Processing file: $file"
log "Script completed"
```

### Interactive Debugging

```bash
#!/bin/bash

# Break into debugger
read -p "Press Enter to continue..." _

# Inspect variables
read -p "DEBUG: var=$var. Press Enter..." _
```

---

## Summary

**ShellCheck:**
- Static analysis for shell scripts
- Catches common bugs and portability issues
- Integrates easily into CI/CD
- Use `--shell=sh` for POSIX compliance

**Bats:**
- Automated testing framework for shell scripts
- Simple test syntax
- Good for integration tests
- Use with bats-support and bats-assert for better assertions

**CI/CD:**
- Run ShellCheck and Bats in pipelines
- Use Docker images for consistent environment
- Pre-commit hooks prevent broken code

**Debugging:**
- `set -x` for trace execution
- Debug functions for conditional output
- Dry run mode for safety
- Logging for production scripts

**Best Practices:**
- Write tests first (TDD)
- Test on target platforms
- Use ShellCheck in editor
- Run tests before commit
- Maintain test coverage

```

### examples/production-template.sh

```bash
#!/bin/bash
#
# Production Shell Script Template
#
# This template demonstrates best practices for production-ready shell scripts.
# Includes error handling, argument parsing, logging, and cleanup.

set -euo pipefail

#============================================================================
# Script Metadata
#============================================================================

# Declare and assign separately to avoid masking command failures (SC2155)
SCRIPT_NAME="$(basename "$0")"
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
readonly SCRIPT_NAME SCRIPT_DIR
readonly VERSION="1.0.0"

#============================================================================
# Configuration
#============================================================================

# Defaults (can be overridden by command-line arguments or environment)
: "${LOG_LEVEL:=info}"
: "${DRY_RUN:=false}"
: "${VERBOSE:=false}"

#============================================================================
# Global Variables
#============================================================================

TEMP_DIR=""
EXIT_CODE=0

#============================================================================
# Cleanup Handler
#============================================================================

cleanup() {
    local exit_code=$?

    # Perform cleanup operations
    if [ -n "$TEMP_DIR" ] && [ -d "$TEMP_DIR" ]; then
        log_debug "Removing temporary directory: $TEMP_DIR"
        rm -rf "$TEMP_DIR"
    fi

    # Additional cleanup operations
    # - Remove lock files
    # - Close connections
    # - Save state

    log_info "Cleanup completed"

    # Exit with original exit code
    exit "$exit_code"
}

# Register cleanup handler; the EXIT trap also fires after INT/TERM, so the
# signal traps only translate the signal into its conventional exit code
trap cleanup EXIT
trap 'exit 130' INT
trap 'exit 143' TERM

#============================================================================
# Logging Functions
#============================================================================

log_debug() {
    if [ "$LOG_LEVEL" = "debug" ] || [ "$VERBOSE" = "true" ]; then
        echo "[$(date +'%Y-%m-%d %H:%M:%S')] [DEBUG] $*" >&2
    fi
}

log_info() {
    if [ "$LOG_LEVEL" = "debug" ] || [ "$LOG_LEVEL" = "info" ] || [ "$VERBOSE" = "true" ]; then
        echo "[$(date +'%Y-%m-%d %H:%M:%S')] [INFO] $*" >&2
    fi
}

log_warning() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] [WARNING] $*" >&2
}

log_error() {
    echo "[$(date +'%Y-%m-%d %H:%M:%S')] [ERROR] $*" >&2
}

#============================================================================
# Usage Documentation
#============================================================================

usage() {
    cat <<EOF
$SCRIPT_NAME - Production script template

Usage:
    $SCRIPT_NAME [OPTIONS] [ARGS]

Description:
    This script demonstrates production-ready patterns for shell scripting.

Options:
    -h, --help              Show this help message
    -v, --verbose           Enable verbose output
    -V, --version           Show version information
    -f, --file FILE         Input file (required)
    -o, --output FILE       Output file (default: stdout)
    -d, --dry-run           Show what would be done without executing
    --log-level LEVEL       Set log level: debug|info|warning|error (default: info)

Examples:
    # Basic usage
    $SCRIPT_NAME --file input.txt

    # With output file
    $SCRIPT_NAME -f input.txt -o output.txt

    # Dry run mode
    $SCRIPT_NAME --file input.txt --dry-run

    # Verbose mode with debug logging
    $SCRIPT_NAME -v --log-level debug --file input.txt

Environment Variables:
    LOG_LEVEL              Set default log level (default: info)
    DRY_RUN               Enable dry-run mode (default: false)
    VERBOSE               Enable verbose output (default: false)

Exit Codes:
    0                     Success
    1                     General error
    2                     Invalid arguments
    3                     Missing dependencies
    4                     File not found

EOF
    exit "${1:-0}"
}

show_version() {
    echo "$SCRIPT_NAME version $VERSION"
    exit 0
}

#============================================================================
# Dependency Checking
#============================================================================

check_dependencies() {
    local missing_deps=()

    # List required commands
    local required_commands=(
        # Add your required commands here
        # "jq"
        # "curl"
        # "git"
    )

    for cmd in "${required_commands[@]}"; do
        if ! command -v "$cmd" >/dev/null 2>&1; then
            missing_deps+=("$cmd")
        fi
    done

    if [ ${#missing_deps[@]} -gt 0 ]; then
        log_error "Missing required dependencies: ${missing_deps[*]}"
        log_error "Please install missing dependencies and try again"
        exit 3
    fi

    log_debug "All dependencies satisfied"
}

#============================================================================
# Argument Parsing
#============================================================================

parse_arguments() {
    # Initialize variables
    INPUT_FILE=""
    OUTPUT_FILE=""

    # Parse options
    while [[ $# -gt 0 ]]; do
        case "$1" in
            -h|--help)
                usage 0
                ;;
            -v|--verbose)
                VERBOSE=true
                shift
                ;;
            -V|--version)
                show_version
                ;;
            -f|--file)
                # "${2:?...}" aborts with a clear message when the
                # value is missing (relevant under set -u)
                INPUT_FILE="${2:?Error: --file requires an argument}"
                shift 2
                ;;
            --file=*)
                INPUT_FILE="${1#*=}"
                shift
                ;;
            -o|--output)
                OUTPUT_FILE="${2:?Error: --output requires an argument}"
                shift 2
                ;;
            --output=*)
                OUTPUT_FILE="${1#*=}"
                shift
                ;;
            -d|--dry-run)
                DRY_RUN=true
                shift
                ;;
            --log-level)
                LOG_LEVEL="${2:?Error: --log-level requires an argument}"
                shift 2
                ;;
            --log-level=*)
                LOG_LEVEL="${1#*=}"
                shift
                ;;
            -*)
                log_error "Unknown option: $1"
                usage 2
                ;;
            *)
                # Positional argument
                break
                ;;
        esac
    done

    # Validate required arguments
    if [ -z "$INPUT_FILE" ]; then
        log_error "Missing required argument: --file"
        usage 2
    fi

    # Validate log level
    case "$LOG_LEVEL" in
        debug|info|warning|error)
            ;;
        *)
            log_error "Invalid log level: $LOG_LEVEL"
            usage 2
            ;;
    esac
}

#============================================================================
# Validation Functions
#============================================================================

validate_input() {
    log_debug "Validating input parameters"

    # Check input file exists
    if [ ! -f "$INPUT_FILE" ]; then
        log_error "Input file not found: $INPUT_FILE"
        exit 4
    fi

    # Check input file is readable
    if [ ! -r "$INPUT_FILE" ]; then
        log_error "Input file not readable: $INPUT_FILE"
        exit 4
    fi

    # Validate output path
    if [ -n "$OUTPUT_FILE" ]; then
        local output_dir
        output_dir="$(dirname "$OUTPUT_FILE")"

        if [ ! -d "$output_dir" ]; then
            log_error "Output directory does not exist: $output_dir"
            exit 4
        fi

        if [ ! -w "$output_dir" ]; then
            log_error "Output directory not writable: $output_dir"
            exit 4
        fi
    fi

    log_debug "Input validation passed"
}

#============================================================================
# Core Functions
#============================================================================

execute_command() {
    local command=("$@")

    if [ "$DRY_RUN" = "true" ]; then
        log_info "[DRY RUN] Would execute: ${command[*]}"
        return 0
    fi

    log_debug "Executing: ${command[*]}"

    if "${command[@]}"; then
        log_debug "Command succeeded: ${command[*]}"
        return 0
    else
        local exit_code=$?
        log_error "Command failed (exit code $exit_code): ${command[*]}"
        return "$exit_code"
    fi
}

process_file() {
    local input_file=$1
    local output_file=${2:-}

    log_info "Processing file: $input_file"

    # Create temporary directory if needed
    if [ -z "$TEMP_DIR" ]; then
        TEMP_DIR=$(mktemp -d)
        log_debug "Created temporary directory: $TEMP_DIR"
    fi

    # Your processing logic here
    # Example: Read input, transform, write output

    if [ -n "$output_file" ]; then
        log_info "Writing output to: $output_file"
        # Write to output file
    else
        log_info "Writing output to stdout"
        # Write to stdout
    fi

    log_info "Processing completed successfully"
}

#============================================================================
# Main Function
#============================================================================

main() {
    log_info "Starting $SCRIPT_NAME v$VERSION"

    # Parse and validate arguments
    parse_arguments "$@"

    # Check dependencies
    check_dependencies

    # Validate input
    validate_input

    # Main processing logic
    process_file "$INPUT_FILE" "$OUTPUT_FILE"

    log_info "$SCRIPT_NAME completed successfully"
    exit 0
}

#============================================================================
# Script Entry Point
#============================================================================

main "$@"

```
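
One refinement often paired with a template like this (an assumption, not part of the original): guard the entry point so the file can be `source`d by a test harness such as Bats without running `main`.

```bash
#!/bin/bash
# Run main only when the file is executed directly; when sourced
# (e.g. by a Bats test that wants to call functions individually),
# just define the functions and return.
set -euo pipefail

main() {
    echo "running main"
}

if [[ "${BASH_SOURCE[0]}" == "$0" ]]; then
    main "$@"
fi
```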

### examples/getopts-basic.sh

```bash
#!/bin/bash
#
# Basic getopts Example
#
# Demonstrates simple argument parsing using getopts (POSIX-compliant).
# Supports short options: -h, -v, -f FILE, -o OUTPUT

set -euo pipefail

#============================================================================
# Usage
#============================================================================

usage() {
    cat <<EOF
Usage: $(basename "$0") [-h] [-v] [-f FILE] [-o OUTPUT]

Basic getopts example for parsing command-line arguments.

Options:
    -h          Show this help message
    -v          Enable verbose mode
    -f FILE     Input file (required)
    -o OUTPUT   Output file (optional, default: stdout)

Examples:
    # Basic usage
    $(basename "$0") -f input.txt

    # With output file
    $(basename "$0") -f input.txt -o output.txt

    # Verbose mode
    $(basename "$0") -v -f input.txt

    # Option bundling (combine -v and -f)
    $(basename "$0") -vf input.txt
EOF
    exit "${1:-0}"
}

#============================================================================
# Main Script
#============================================================================

# Default values
VERBOSE=false
INPUT_FILE=""
OUTPUT_FILE=""

# Parse options (the leading ':' enables getopts' silent mode; without
# it the ':' case below is never reached and getopts prints its own
# diagnostics instead of setting OPTARG for the \? case)
while getopts ":hvf:o:" opt; do
    case "$opt" in
        h)
            usage 0
            ;;
        v)
            VERBOSE=true
            ;;
        f)
            INPUT_FILE="$OPTARG"
            ;;
        o)
            OUTPUT_FILE="$OPTARG"
            ;;
        \?)
            echo "Error: Invalid option -$OPTARG" >&2
            usage 1
            ;;
        :)
            echo "Error: Option -$OPTARG requires an argument" >&2
            usage 1
            ;;
    esac
done

# Shift past parsed options
shift $((OPTIND - 1))

# Validate required arguments
if [ -z "$INPUT_FILE" ]; then
    echo "Error: -f FILE is required" >&2
    usage 1
fi

# Check input file exists
if [ ! -f "$INPUT_FILE" ]; then
    echo "Error: Input file not found: $INPUT_FILE" >&2
    exit 1
fi

# Display configuration
if [ "$VERBOSE" = "true" ]; then
    echo "Configuration:"
    echo "  Input file: $INPUT_FILE"
    echo "  Output file: ${OUTPUT_FILE:-stdout}"
    echo "  Verbose: $VERBOSE"
    echo
fi

# Process file
echo "Processing file: $INPUT_FILE"

if [ -n "$OUTPUT_FILE" ]; then
    # Write to output file
    cat "$INPUT_FILE" > "$OUTPUT_FILE"
    echo "Output written to: $OUTPUT_FILE"
else
    # Write to stdout
    cat "$INPUT_FILE"
fi

echo "Processing complete"

```
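
getopts has two error-reporting modes: by default it prints its own diagnostics, while a leading `:` in the optstring (silent mode) suppresses them and routes errors to the `\?` and `:` cases with the option letter in OPTARG. A small demonstration (function and option names are illustrative):

```bash
#!/bin/bash
# Silent-mode getopts: a leading ':' in the optstring routes a missing
# option argument to the ':' case and an unknown option to '\?', with
# the offending letter in OPTARG in both cases.
set -euo pipefail

parse() {
    local opt OPTIND=1    # local OPTIND makes parse restartable
    while getopts ":f:" opt; do
        case "$opt" in
            f)  echo "file=$OPTARG" ;;
            :)  echo "missing argument for -$OPTARG" ;;
            \?) echo "unknown option -$OPTARG" ;;
        esac
    done
}

parse -f input.txt   # file=input.txt
parse -f             # missing argument for -f
parse -x             # unknown option -x
```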

### examples/getopts-advanced.sh

```bash
#!/bin/bash
#
# Advanced getopts Example
#
# Demonstrates advanced argument parsing with validation, multiple files,
# and comprehensive error handling.

set -euo pipefail

#============================================================================
# Logging
#============================================================================

log_info() {
    echo "[INFO] $*" >&2
}

log_error() {
    echo "[ERROR] $*" >&2
}

log_debug() {
    if [ "$VERBOSE" = "true" ]; then
        echo "[DEBUG] $*" >&2
    fi
}

#============================================================================
# Usage
#============================================================================

usage() {
    cat <<EOF
Usage: $(basename "$0") [-h] [-v] [-d] [-f FILE]... [-o OUTPUT] [-t TYPE] [FILES...]

Advanced getopts example with multiple options and validation.

Options:
    -h          Show this help message
    -v          Enable verbose mode
    -d          Enable dry-run mode
    -f FILE     Input file (can be specified multiple times)
    -o OUTPUT   Output directory (default: current directory)
    -t TYPE     Processing type: text|json|yaml (default: text)

Positional Arguments:
    FILES       Additional input files

Examples:
    # Single file
    $(basename "$0") -f input.txt

    # Multiple files using -f
    $(basename "$0") -f file1.txt -f file2.txt -f file3.txt

    # Multiple files using positional arguments
    $(basename "$0") file1.txt file2.txt file3.txt

    # Mixed -f options and positional arguments
    $(basename "$0") -f file1.txt file2.txt file3.txt

    # With output directory and type
    $(basename "$0") -f input.json -o output/ -t json

    # Dry run mode
    $(basename "$0") -d -f input.txt
EOF
    exit "${1:-0}"
}

#============================================================================
# Validation
#============================================================================

validate_type() {
    local type=$1

    case "$type" in
        text|json|yaml)
            return 0
            ;;
        *)
            log_error "Invalid type: $type (must be text, json, or yaml)"
            return 1
            ;;
    esac
}

validate_file() {
    local file=$1

    if [ ! -f "$file" ]; then
        log_error "File not found: $file"
        return 1
    fi

    if [ ! -r "$file" ]; then
        log_error "File not readable: $file"
        return 1
    fi

    return 0
}

validate_output_dir() {
    local dir=$1

    if [ ! -d "$dir" ]; then
        log_error "Output directory does not exist: $dir"
        return 1
    fi

    if [ ! -w "$dir" ]; then
        log_error "Output directory not writable: $dir"
        return 1
    fi

    return 0
}

#============================================================================
# Processing
#============================================================================

process_file() {
    local file=$1
    local type=$2
    local output_dir=$3

    log_info "Processing $file (type: $type)"

    if [ "$DRY_RUN" = "true" ]; then
        log_info "[DRY RUN] Would process: $file"
        return 0
    fi

    # Actual processing logic here
    local output_file="$output_dir/$(basename "$file")"

    log_debug "Input: $file"
    log_debug "Output: $output_file"

    # Example: Copy file to output directory
    cp "$file" "$output_file"

    log_info "Processed: $file -> $output_file"
}

#============================================================================
# Main Script
#============================================================================

# Default values
VERBOSE=false
DRY_RUN=false
OUTPUT_DIR="."
PROCESS_TYPE="text"
INPUT_FILES=()

# Parse options (the leading ':' enables getopts' silent mode, so the
# \? and : cases receive the offending option letter in OPTARG)
while getopts ":hvdf:o:t:" opt; do
    case "$opt" in
        h)
            usage 0
            ;;
        v)
            VERBOSE=true
            ;;
        d)
            DRY_RUN=true
            ;;
        f)
            INPUT_FILES+=("$OPTARG")
            ;;
        o)
            OUTPUT_DIR="$OPTARG"
            ;;
        t)
            PROCESS_TYPE="$OPTARG"
            ;;
        \?)
            log_error "Invalid option -$OPTARG"
            usage 1
            ;;
        :)
            log_error "Option -$OPTARG requires an argument"
            usage 1
            ;;
    esac
done

# Shift past parsed options
shift $((OPTIND - 1))

# Add positional arguments to input files
for arg in "$@"; do
    INPUT_FILES+=("$arg")
done

# Validate configuration
if ! validate_type "$PROCESS_TYPE"; then
    usage 1
fi

if ! validate_output_dir "$OUTPUT_DIR"; then
    exit 1
fi

# Check we have at least one input file
if [ ${#INPUT_FILES[@]} -eq 0 ]; then
    log_error "No input files specified"
    usage 1
fi

# Display configuration
log_debug "Configuration:"
log_debug "  Input files: ${INPUT_FILES[*]}"
log_debug "  Output directory: $OUTPUT_DIR"
log_debug "  Processing type: $PROCESS_TYPE"
log_debug "  Verbose: $VERBOSE"
log_debug "  Dry run: $DRY_RUN"
log_debug ""

# Validate all input files
for file in "${INPUT_FILES[@]}"; do
    if ! validate_file "$file"; then
        exit 1
    fi
done

# Process each file
processed_count=0
failed_count=0

for file in "${INPUT_FILES[@]}"; do
    if process_file "$file" "$PROCESS_TYPE" "$OUTPUT_DIR"; then
        # Avoid ((processed_count++)) here: under `set -e` the
        # post-increment of a zero value returns status 1 and aborts.
        processed_count=$((processed_count + 1))
    else
        failed_count=$((failed_count + 1))
        log_error "Failed to process: $file"
    fi
done

# Summary
log_info "Processing complete:"
log_info "  Processed: $processed_count"
log_info "  Failed: $failed_count"

if [ "$failed_count" -gt 0 ]; then
    exit 1
fi

exit 0

```
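
One pitfall relevant to counter loops like the summary above: under `set -e`, the arithmetic command `((n++))` evaluates to the *old* value, so the first increment from 0 has a non-zero status and aborts the script. A safe form:

```bash
#!/bin/bash
# Post-increment returns the old value, so ((n++)) with n=0 has exit
# status 1 and would kill a `set -e` script; a plain assignment with
# $(( )) always succeeds.
set -euo pipefail

n=0
# ((n++))          # would abort here: expression result is 0
n=$((n + 1))       # safe: assignments always return status 0
echo "n=$n"
```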

### examples/long-options.sh

```bash
#!/bin/bash
#
# Long Options Example
#
# Demonstrates manual parsing of long options (--help, --file, etc.)
# Supports both --option value and --option=value formats

set -euo pipefail

#============================================================================
# Usage
#============================================================================

usage() {
    cat <<EOF
Usage: $(basename "$0") [OPTIONS]

Long options example demonstrating manual argument parsing.

Options:
    --help              Show this help message
    --verbose           Enable verbose mode
    --file FILE         Input file (required)
    --output FILE       Output file (optional)
    --format FORMAT     Output format: text|json|yaml (default: text)
    --dry-run           Show what would be done without executing

Both formats are supported:
    --file input.txt    (space-separated)
    --file=input.txt    (equals sign)

Examples:
    # Basic usage
    $(basename "$0") --file input.txt

    # With equals sign
    $(basename "$0") --file=input.txt --output=output.txt

    # With format specification
    $(basename "$0") --file input.json --format json

    # Dry run mode
    $(basename "$0") --file input.txt --dry-run
EOF
    exit "${1:-0}"
}

#============================================================================
# Main Script
#============================================================================

# Default values
VERBOSE=false
DRY_RUN=false
INPUT_FILE=""
OUTPUT_FILE=""
FORMAT="text"

# Parse long options
while [[ $# -gt 0 ]]; do
    case "$1" in
        --help)
            usage 0
            ;;
        --verbose)
            VERBOSE=true
            shift
            ;;
        --dry-run)
            DRY_RUN=true
            shift
            ;;
        --file)
            if [ -z "${2:-}" ]; then
                echo "Error: --file requires an argument" >&2
                usage 1
            fi
            INPUT_FILE="$2"
            shift 2
            ;;
        --file=*)
            INPUT_FILE="${1#*=}"
            shift
            ;;
        --output)
            if [ -z "${2:-}" ]; then
                echo "Error: --output requires an argument" >&2
                usage 1
            fi
            OUTPUT_FILE="$2"
            shift 2
            ;;
        --output=*)
            OUTPUT_FILE="${1#*=}"
            shift
            ;;
        --format)
            if [ -z "${2:-}" ]; then
                echo "Error: --format requires an argument" >&2
                usage 1
            fi
            FORMAT="$2"
            shift 2
            ;;
        --format=*)
            FORMAT="${1#*=}"
            shift
            ;;
        -*)
            echo "Error: Unknown option: $1" >&2
            usage 1
            ;;
        *)
            # Positional argument
            echo "Error: Unexpected positional argument: $1" >&2
            usage 1
            ;;
    esac
done

# Validate required arguments
if [ -z "$INPUT_FILE" ]; then
    echo "Error: --file is required" >&2
    usage 1
fi

# Validate format
case "$FORMAT" in
    text|json|yaml)
        ;;
    *)
        echo "Error: Invalid format: $FORMAT (must be text, json, or yaml)" >&2
        usage 1
        ;;
esac

# Check input file exists
if [ ! -f "$INPUT_FILE" ]; then
    echo "Error: Input file not found: $INPUT_FILE" >&2
    exit 1
fi

# Display configuration
if [ "$VERBOSE" = "true" ]; then
    echo "Configuration:"
    echo "  Input file: $INPUT_FILE"
    echo "  Output file: ${OUTPUT_FILE:-stdout}"
    echo "  Format: $FORMAT"
    echo "  Dry run: $DRY_RUN"
    echo
fi

# Process file
echo "Processing file: $INPUT_FILE"

if [ "$DRY_RUN" = "true" ]; then
    echo "[DRY RUN] Would process $INPUT_FILE with format $FORMAT"
    if [ -n "$OUTPUT_FILE" ]; then
        echo "[DRY RUN] Would write output to: $OUTPUT_FILE"
    fi
else
    # Actual processing
    if [ -n "$OUTPUT_FILE" ]; then
        cat "$INPUT_FILE" > "$OUTPUT_FILE"
        echo "Output written to: $OUTPUT_FILE"
    else
        cat "$INPUT_FILE"
    fi
fi

echo "Processing complete"

```
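
One common extension to a manual long-option loop (an addition, not part of the original script): the conventional `--` marker that ends option parsing, so later arguments starting with `-` pass through as positionals. A sketch:

```bash
#!/bin/bash
# Treat `--` as the end-of-options marker: everything after it is kept
# as a positional argument even if it starts with a dash.
set -euo pipefail

parse() {
    local -a positional=()
    while [[ $# -gt 0 ]]; do
        case "$1" in
            --) shift; positional+=("$@"); break ;;
            --verbose) echo "verbose on"; shift ;;
            -*) echo "unknown option: $1" >&2; return 2 ;;
            *) positional+=("$1"); shift ;;
        esac
    done
    echo "positionals: ${positional[*]}"
}

parse --verbose -- --not-an-option file.txt
```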

### examples/error-handling.sh

```bash
#!/bin/bash
#
# Error Handling Examples
#
# Demonstrates various error handling patterns in shell scripts.

#============================================================================
# Example 1: Fail-Fast with set -euo pipefail
#============================================================================

example_fail_fast() {
    echo "=== Example 1: Fail-Fast Pattern ==="

    (
        set -euo pipefail

        echo "Step 1: Success"
        true

        echo "Step 2: Success"
        true

        echo "Step 3: This will fail"
        false

        echo "This line will never execute"
    )

    echo "Script continued (fail-fast in subshell)"
    echo
}

#============================================================================
# Example 2: Explicit Exit Code Checking
#============================================================================

example_explicit_checking() {
    echo "=== Example 2: Explicit Exit Code Checking ==="

    # Method 1: if ! command
    if ! grep -q "pattern" /dev/null; then
        echo "Pattern not found (expected)"
    fi

    # Method 2: Capture and check exit code
    grep -q "pattern" /dev/null
    exit_code=$?
    if [ "$exit_code" -ne 0 ]; then
        echo "Exit code: $exit_code (expected)"
    fi

    # Method 3: Different handling per exit code
    # (port 9 is normally closed, so this typically fails to connect
    # and exits with code 7; an out-of-range port like 99999 would
    # make curl reject the URL outright instead)
    curl -sSL --max-time 1 http://localhost:9/ 2>/dev/null
    exit_code=$?

    case "$exit_code" in
        0)
            echo "Success"
            ;;
        6)
            echo "Could not resolve host (expected for this example)"
            ;;
        7)
            echo "Failed to connect (expected for this example)"
            ;;
        28)
            echo "Operation timeout"
            ;;
        *)
            echo "Curl failed with exit code: $exit_code"
            ;;
    esac

    echo
}

#============================================================================
# Example 3: Trap Handlers
#============================================================================

example_trap_handlers() {
    echo "=== Example 3: Trap Handlers ==="

    (
        # Create temporary file
        TEMP_FILE=$(mktemp)
        echo "Created temp file: $TEMP_FILE"

        # Cleanup function
        cleanup() {
            echo "Cleanup: Removing $TEMP_FILE"
            rm -f "$TEMP_FILE"
        }

        # Register cleanup on EXIT
        trap cleanup EXIT

        # Do some work
        echo "data" > "$TEMP_FILE"
        echo "Working with temp file..."

        # Cleanup runs automatically when subshell exits
    )

    echo "Temp file cleaned up automatically"
    echo
}

#============================================================================
# Example 4: Error Handler with Line Number
#============================================================================

example_error_handler() {
    echo "=== Example 4: Error Handler with Line Number ==="

    (
        set -Eeuo pipefail

        error_handler() {
            local line=$1
            local command=$2
            echo "Error on line $line: $command" >&2
        }

        trap 'error_handler $LINENO "$BASH_COMMAND"' ERR

        echo "Step 1: Success"
        true

        echo "Step 2: About to fail"
        false  # This triggers error handler

        echo "This won't execute"
    )

    echo "Error handler demonstrated"
    echo
}

#============================================================================
# Example 5: Defensive Programming
#============================================================================

example_defensive_programming() {
    echo "=== Example 5: Defensive Programming ==="

    # Check required commands
    check_command() {
        local cmd=$1
        if command -v "$cmd" >/dev/null 2>&1; then
            echo "✓ $cmd is available"
        else
            echo "✗ $cmd is not available"
        fi
    }

    check_command "bash"
    check_command "grep"
    check_command "nonexistent_command"

    # Check required environment variables
    export REQUIRED_VAR="value"
    : "${REQUIRED_VAR:?Error: REQUIRED_VAR must be set}"
    echo "✓ REQUIRED_VAR is set: $REQUIRED_VAR"

    # Check optional environment variable with default
    echo "Optional var: ${OPTIONAL_VAR:-default_value}"

    # Check file exists
    if [ -f "/etc/hosts" ]; then
        echo "✓ /etc/hosts exists"
    fi

    if [ ! -f "/nonexistent/file" ]; then
        echo "✓ Correctly detected missing file"
    fi

    echo
}

#============================================================================
# Example 6: Retry Logic
#============================================================================

example_retry_logic() {
    echo "=== Example 6: Retry Logic ==="

    retry_command() {
        local max_attempts=3
        local attempt=1
        local sleep_time=1

        while [ "$attempt" -le "$max_attempts" ]; do
            echo "Attempt $attempt of $max_attempts"

            # Simulate command that fails first 2 times
            if [ "$attempt" -eq 3 ]; then
                echo "Success on attempt $attempt"
                return 0
            else
                echo "Failed, retrying in ${sleep_time}s..."
                sleep "$sleep_time"
                attempt=$((attempt + 1))
            fi
        done

        echo "Failed after $max_attempts attempts"
        return 1
    }

    if retry_command; then
        echo "Command succeeded with retry logic"
    else
        echo "Command failed after retries"
    fi

    echo
}

#============================================================================
# Example 7: Timeout Handling
#============================================================================

example_timeout() {
    echo "=== Example 7: Timeout Handling ==="

    # Run command with timeout
    if timeout 2s sleep 1; then
        echo "✓ Command completed within timeout"
    else
        echo "✗ Command timed out or failed"
    fi

    if timeout 1s sleep 2; then
        echo "✓ Command completed within timeout"
    else
        local exit_code=$?
        if [ "$exit_code" -eq 124 ]; then
            echo "✗ Command timed out (expected)"
        else
            echo "✗ Command failed with exit code: $exit_code"
        fi
    fi

    echo
}

#============================================================================
# Example 8: Pipeline Error Handling
#============================================================================

example_pipeline_errors() {
    echo "=== Example 8: Pipeline Error Handling ==="

    # Without pipefail: exit code is from last command
    echo "Without pipefail:"
    false | true
    echo "Exit code: $? (from last command: true)"

    # With pipefail: exit code is from failed command
    echo "With pipefail:"
    (
        set -o pipefail
        false | true
        echo "Exit code: $? (from failed command: false)"
        exit 1  # propagate the failure so the || branch below fires
    ) || echo "Pipeline failed (expected)"

    echo
}

#============================================================================
# Main
#============================================================================

main() {
    echo "Shell Error Handling Examples"
    echo "=============================="
    echo

    example_fail_fast
    example_explicit_checking
    example_trap_handlers
    example_error_handler
    example_defensive_programming
    example_retry_logic
    example_timeout
    example_pipeline_errors

    echo "All examples completed"
}

main "$@"

```
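
The fixed-delay retry in Example 6 generalizes to exponential backoff. A sketch (the function name, attempt count, and base delay are illustrative choices, not part of the original examples):

```bash
#!/bin/bash
# retry_backoff CMD [ARGS...] — rerun a failing command with delays
# that double between attempts (1s, 2s, 4s with these defaults).
set -euo pipefail

retry_backoff() {
    local max_attempts=4 delay=1 attempt=1
    while true; do
        "$@" && return 0
        if [ "$attempt" -ge "$max_attempts" ]; then
            echo "failed after $attempt attempts" >&2
            return 1
        fi
        echo "attempt $attempt failed; retrying in ${delay}s" >&2
        sleep "$delay"
        delay=$((delay * 2))
        attempt=$((attempt + 1))
    done
}

retry_backoff true && echo "succeeded immediately"
```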

### examples/json-yaml-processing.sh

```bash
#!/bin/bash
#
# JSON and YAML Processing Examples
#
# Demonstrates using jq and yq for processing structured data.

set -euo pipefail

#============================================================================
# Example 1: Basic JSON Parsing with jq
#============================================================================

example_json_basic() {
    echo "=== Example 1: Basic JSON Parsing ==="

    # Sample JSON
    json='{"name":"Alice","age":30,"city":"New York"}'

    echo "Original JSON:"
    echo "$json" | jq '.'

    # Extract fields
    echo "Name: $(echo "$json" | jq -r '.name')"
    echo "Age: $(echo "$json" | jq -r '.age')"
    echo "City: $(echo "$json" | jq -r '.city')"

    echo
}

#============================================================================
# Example 2: JSON Arrays
#============================================================================

example_json_arrays() {
    echo "=== Example 2: JSON Arrays ==="

    # Sample JSON array
    json='[
        {"name":"Alice","age":30},
        {"name":"Bob","age":25},
        {"name":"Charlie","age":35}
    ]'

    echo "All users:"
    echo "$json" | jq '.[] | .name'

    echo "Users over 26:"
    echo "$json" | jq '.[] | select(.age > 26) | .name'

    echo "Average age:"
    echo "$json" | jq '[.[].age] | add / length'

    echo
}

#============================================================================
# Example 3: Nested JSON
#============================================================================

example_json_nested() {
    echo "=== Example 3: Nested JSON ==="

    # Sample nested JSON
    json='{
        "user": {
            "name": "Alice",
            "contact": {
                "email": "[email protected]",
                "phone": "555-1234"
            },
            "addresses": [
                {"type":"home","city":"New York"},
                {"type":"work","city":"Boston"}
            ]
        }
    }'

    echo "Email: $(echo "$json" | jq -r '.user.contact.email')"
    echo "Phone: $(echo "$json" | jq -r '.user.contact.phone')"

    echo "Cities:"
    echo "$json" | jq -r '.user.addresses[].city'

    echo "Work city:"
    echo "$json" | jq -r '.user.addresses[] | select(.type=="work") | .city'

    echo
}

#============================================================================
# Example 4: Constructing JSON
#============================================================================

example_json_construct() {
    echo "=== Example 4: Constructing JSON ==="

    # From variables
    name="Alice"
    age=30
    city="New York"

    jq -n \
        --arg name "$name" \
        --argjson age "$age" \
        --arg city "$city" \
        '{name: $name, age: $age, city: $city}'

    # Construct array
    jq -n --arg name1 "Alice" --arg name2 "Bob" \
        '[{name: $name1}, {name: $name2}]'

    echo
}

#============================================================================
# Example 5: JSON Transformation
#============================================================================

example_json_transform() {
    echo "=== Example 5: JSON Transformation ==="

    # Sample input
    input='[
        {"firstName":"Alice","lastName":"Smith"},
        {"firstName":"Bob","lastName":"Jones"}
    ]'

    # Transform structure
    echo "Transformed:"
    echo "$input" | jq '[.[] | {fullName: "\(.firstName) \(.lastName)"}]'

    # Add field
    echo "With ID:"
    echo "$input" | jq '[.[] | . + {id: (.firstName | ascii_downcase)}]'

    echo
}

#============================================================================
# Example 6: YAML Parsing with yq
#============================================================================

example_yaml_basic() {
    echo "=== Example 6: Basic YAML Parsing ==="

    # Create sample YAML file
    cat > /tmp/config.yaml <<EOF
database:
  host: localhost
  port: 5432
  name: mydb
servers:
  - name: web1
    ip: 192.168.1.10
  - name: web2
    ip: 192.168.1.11
EOF

    echo "Database host:"
    yq eval '.database.host' /tmp/config.yaml

    echo "Database port:"
    yq eval '.database.port' /tmp/config.yaml

    echo "Server names:"
    yq eval '.servers[].name' /tmp/config.yaml

    # Cleanup
    rm -f /tmp/config.yaml

    echo
}

#============================================================================
# Example 7: YAML Modification
#============================================================================

example_yaml_modify() {
    echo "=== Example 7: YAML Modification ==="

    # Create sample YAML file
    cat > /tmp/config.yaml <<EOF
version: "1.0"
debug: false
port: 8080
EOF

    echo "Original:"
    cat /tmp/config.yaml

    # Update values
    yq eval '.version = "2.0"' -i /tmp/config.yaml
    yq eval '.debug = true' -i /tmp/config.yaml
    yq eval '.port = 9000' -i /tmp/config.yaml

    echo "Modified:"
    cat /tmp/config.yaml

    # Cleanup
    rm -f /tmp/config.yaml

    echo
}

#============================================================================
# Example 8: YAML to JSON Conversion
#============================================================================

example_yaml_to_json() {
    echo "=== Example 8: YAML to JSON Conversion ==="

    # Create YAML
    cat > /tmp/data.yaml <<EOF
name: Alice
age: 30
hobbies:
  - reading
  - coding
  - hiking
EOF

    echo "YAML:"
    cat /tmp/data.yaml

    echo "JSON:"
    yq eval -o=json /tmp/data.yaml

    # Cleanup
    rm -f /tmp/data.yaml

    echo
}

#============================================================================
# Example 9: Real-World API Processing
#============================================================================

example_api_processing() {
    echo "=== Example 9: Real-World API Processing ==="

    # Simulate API response
    api_response='{
        "status": "success",
        "data": {
            "users": [
                {"id":1,"name":"Alice","active":true},
                {"id":2,"name":"Bob","active":false},
                {"id":3,"name":"Charlie","active":true}
            ],
            "total": 3
        }
    }'

    # Check status
    status=$(echo "$api_response" | jq -r '.status')
    echo "Status: $status"

    if [ "$status" != "success" ]; then
        echo "API request failed"
        return 1
    fi

    # Extract active users
    echo "Active users:"
    echo "$api_response" | jq -r '.data.users[] | select(.active) | .name'

    # Count active users
    active_count=$(echo "$api_response" | jq '[.data.users[] | select(.active)] | length')
    echo "Active user count: $active_count"

    echo
}

#============================================================================
# Example 10: Error Handling
#============================================================================

example_error_handling() {
    echo "=== Example 10: Error Handling ==="

    # Check if jq is installed
    if ! command -v jq >/dev/null 2>&1; then
        echo "Error: jq is required but not installed"
        return 1
    fi

    # Validate JSON
    invalid_json='{"name": "Alice", invalid}'

    echo "Validating invalid JSON:"
    if echo "$invalid_json" | jq empty 2>/dev/null; then
        echo "JSON is valid"
    else
        echo "JSON is invalid (expected)"
    fi

    # Check if field exists
    json='{"name":"Alice"}'

    echo "Checking for email field:"
    if echo "$json" | jq -e '.email' >/dev/null 2>&1; then
        echo "Email field exists"
    else
        echo "Email field missing (expected)"
    fi

    echo
}

#============================================================================
# Example 11: Complex jq Queries
#============================================================================

example_complex_queries() {
    echo "=== Example 11: Complex jq Queries ==="

    # Sample data
    json='[
        {"name":"Alice","dept":"Engineering","salary":100000},
        {"name":"Bob","dept":"Sales","salary":80000},
        {"name":"Charlie","dept":"Engineering","salary":90000},
        {"name":"David","dept":"Sales","salary":85000}
    ]'

    # Group by department
    echo "Average salary by department:"
    echo "$json" | jq -r '
        group_by(.dept) |
        map({
            dept: .[0].dept,
            avg_salary: (map(.salary) | add / length)
        }) |
        .[] |
        "\(.dept): \(.avg_salary)"
    '

    # Top earners
    echo "Top 2 earners:"
    echo "$json" | jq -r 'sort_by(.salary) | reverse | .[0:2] | .[] | "\(.name): \(.salary)"'

    echo
}

#============================================================================
# Main
#============================================================================

main() {
    echo "JSON and YAML Processing Examples"
    echo "=================================="
    echo

    # Check dependencies
    if ! command -v jq >/dev/null 2>&1; then
        echo "Warning: jq not installed, skipping jq examples"
        echo "Install with: brew install jq (macOS) or apt-get install jq (Linux)"
    else
        example_json_basic
        example_json_arrays
        example_json_nested
        example_json_construct
        example_json_transform
        example_api_processing
        example_complex_queries
    fi

    if ! command -v yq >/dev/null 2>&1; then
        echo "Warning: yq not installed, skipping yq examples"
        echo "Install with: brew install yq (macOS)"
    else
        example_yaml_basic
        example_yaml_modify
        example_yaml_to_json
    fi

    example_error_handling

    echo "All examples completed"
}

main "$@"

```
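The examples above write to fixed paths like `/tmp/config.yaml`, which is fine for a demo but racy (and a symlink hazard) in real scripts. A minimal sketch of the safer `mktemp`-plus-`trap` pattern, using only POSIX tools:

```bash
#!/bin/sh
# Sketch: write scratch data to a private temp file instead of a fixed
# /tmp path, and remove it automatically on exit via an EXIT trap.
tmpfile=$(mktemp) || exit 1
trap 'rm -f "$tmpfile"' EXIT

cat > "$tmpfile" <<EOF
database:
  host: localhost
EOF

# The trap removes the file when the script exits, even on error.
echo "wrote $(wc -l < "$tmpfile" | tr -d ' ') lines to a temp file"
```

The same pattern drops into any of the yq examples by replacing the hard-coded path with `"$tmpfile"` and deleting the manual `rm -f` cleanup lines.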

### scripts/lint-script.sh

```bash
#!/bin/bash
#
# ShellCheck Linting Wrapper for CI/CD
#
# Runs ShellCheck on shell scripts with configurable options.
# Designed for CI/CD pipelines.

set -euo pipefail

#============================================================================
# Configuration
#============================================================================

# Assign first, then mark readonly, so the command substitution's exit
# status is not masked (ShellCheck SC2155)
SCRIPT_NAME="$(basename "$0")"
readonly SCRIPT_NAME

# Default severity level
SEVERITY="warning"

# Default shell type
SHELL_TYPE="bash"

# Exclude warnings (comma-separated)
EXCLUDE=""

# Directories/files to check
CHECK_PATHS=()

# Exit with error on warnings
STRICT_MODE=false

# Output format
FORMAT="tty"

#============================================================================
# Usage
#============================================================================

usage() {
    cat <<EOF
$SCRIPT_NAME - ShellCheck wrapper for CI/CD

Usage:
    $SCRIPT_NAME [OPTIONS] [PATHS...]

Options:
    -h, --help              Show this help message
    -s, --severity LEVEL    Set severity: error|warning|info|style (default: warning)
    --shell TYPE            Set shell type: sh|bash|dash|ksh (default: bash)
    --exclude CODES         Exclude warning codes (comma-separated)
    --strict                Exit with error on any warnings
    --format FORMAT         Output format: tty|gcc|json|checkstyle (default: tty)

Arguments:
    PATHS                   Files or directories to check (default: current directory)

Examples:
    # Check all scripts in current directory
    $SCRIPT_NAME

    # Check specific file
    $SCRIPT_NAME script.sh

    # Check with strict mode (fail on warnings)
    $SCRIPT_NAME --strict *.sh

    # Check POSIX sh compliance
    $SCRIPT_NAME --shell sh script.sh

    # Exclude specific warnings
    $SCRIPT_NAME --exclude SC2086,SC2046 script.sh

    # JSON output for CI
    $SCRIPT_NAME --format json script.sh
EOF
    exit "${1:-0}"
}

#============================================================================
# Logging
#============================================================================

log_info() {
    echo "[INFO] $*" >&2
}

log_error() {
    echo "[ERROR] $*" >&2
}

#============================================================================
# Argument Parsing
#============================================================================

parse_arguments() {
    while [[ $# -gt 0 ]]; do
        case "$1" in
            -h|--help)
                usage 0
                ;;
            -s|--severity)
                SEVERITY="$2"
                shift 2
                ;;
            --shell)
                SHELL_TYPE="$2"
                shift 2
                ;;
            --exclude)
                EXCLUDE="$2"
                shift 2
                ;;
            --strict)
                STRICT_MODE=true
                shift
                ;;
            --format)
                FORMAT="$2"
                shift 2
                ;;
            -*)
                log_error "Unknown option: $1"
                usage 1
                ;;
            *)
                CHECK_PATHS+=("$1")
                shift
                ;;
        esac
    done

    # Default to current directory if no paths specified
    if [ ${#CHECK_PATHS[@]} -eq 0 ]; then
        CHECK_PATHS=(".")
    fi
}

#============================================================================
# Validation
#============================================================================

check_dependencies() {
    if ! command -v shellcheck >/dev/null 2>&1; then
        log_error "shellcheck is required but not installed"
        log_error "Install with: brew install shellcheck (macOS) or apt-get install shellcheck (Linux)"
        exit 1
    fi
}

validate_severity() {
    case "$SEVERITY" in
        error|warning|info|style)
            return 0
            ;;
        *)
            log_error "Invalid severity: $SEVERITY (must be error, warning, info, or style)"
            return 1
            ;;
    esac
}

validate_shell_type() {
    case "$SHELL_TYPE" in
        sh|bash|dash|ksh)
            return 0
            ;;
        *)
            log_error "Invalid shell type: $SHELL_TYPE (must be sh, bash, dash, or ksh)"
            return 1
            ;;
    esac
}

validate_format() {
    case "$FORMAT" in
        tty|gcc|json|checkstyle)
            return 0
            ;;
        *)
            log_error "Invalid format: $FORMAT (must be tty, gcc, json, or checkstyle)"
            return 1
            ;;
    esac
}

#============================================================================
# File Discovery
#============================================================================

find_shell_scripts() {
    local path=$1
    local scripts=()

    if [ -f "$path" ]; then
        # Single file
        scripts+=("$path")
    elif [ -d "$path" ]; then
        # Directory: find owner-executable files with a shell shebang.
        # (-perm -u+x works on both GNU and BSD/macOS find; -executable is GNU-only)
        while IFS= read -r -d '' file; do
            # Check if file has shell shebang
            if head -n 1 "$file" 2>/dev/null | grep -qE '^#!.*(sh|bash|dash|ksh)'; then
                scripts+=("$file")
            fi
        done < <(find "$path" -type f -perm -u+x -print0)

        # Also include *.sh files
        while IFS= read -r -d '' file; do
            scripts+=("$file")
        done < <(find "$path" -type f -name "*.sh" -print0)
    else
        log_error "Path not found: $path"
        return 1
    fi

    # Remove duplicates and print; skip printf entirely when nothing
    # matched, since printf '%s\n' with no operands emits a blank line
    # (which would later be passed to shellcheck as an empty filename)
    if [ ${#scripts[@]} -gt 0 ]; then
        printf '%s\n' "${scripts[@]}" | sort -u
    fi
}

#============================================================================
# ShellCheck Execution
#============================================================================

run_shellcheck() {
    local file=$1
    local args=()

    # Build arguments
    args+=("--severity=$SEVERITY")
    args+=("--shell=$SHELL_TYPE")
    args+=("--format=$FORMAT")

    if [ -n "$EXCLUDE" ]; then
        args+=("--exclude=$EXCLUDE")
    fi

    # Run ShellCheck
    shellcheck "${args[@]}" "$file"
    return $?
}

#============================================================================
# Main
#============================================================================

main() {
    parse_arguments "$@"

    # Validate configuration
    check_dependencies
    validate_severity || exit 1
    validate_shell_type || exit 1
    validate_format || exit 1

    # Collect all files to check
    local all_files=()
    for path in "${CHECK_PATHS[@]}"; do
        while IFS= read -r file; do
            all_files+=("$file")
        done < <(find_shell_scripts "$path")
    done

    # Check if we found any files
    if [ ${#all_files[@]} -eq 0 ]; then
        log_error "No shell scripts found"
        exit 1
    fi

    log_info "Checking ${#all_files[@]} file(s) with ShellCheck"
    log_info "Severity: $SEVERITY, Shell: $SHELL_TYPE"

    # Run ShellCheck on each file
    local failed_count=0
    local warning_count=0

    for file in "${all_files[@]}"; do
        if run_shellcheck "$file"; then
            continue
        else
            local exit_code=$?

            # Use the assignment form for counters: ((var++)) returns
            # status 1 when the pre-increment value is 0, which would
            # abort the script under set -e
            if [ "$exit_code" -eq 1 ]; then
                # Errors found
                failed_count=$((failed_count + 1))
            elif [ "$STRICT_MODE" = "true" ]; then
                # Warnings in strict mode
                warning_count=$((warning_count + 1))
            fi
        fi
    done

    # Summary
    if [ "$FORMAT" = "tty" ]; then
        echo
        log_info "ShellCheck complete: ${#all_files[@]} file(s) checked"

        if [ "$failed_count" -gt 0 ]; then
            log_error "Files with errors: $failed_count"
        fi

        if [ "$warning_count" -gt 0 ]; then
            log_error "Files with warnings (strict mode): $warning_count"
        fi
    fi

    # Exit code
    if [ "$failed_count" -gt 0 ]; then
        exit 1
    fi

    if [ "$STRICT_MODE" = "true" ] && [ "$warning_count" -gt 0 ]; then
        exit 1
    fi

    exit 0
}

main "$@"

```
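A pitfall worth calling out for any script that combines `set -euo pipefail` with counters: the arithmetic command `((count++))` returns a non-zero status whenever the expression evaluates to 0, so the very first post-increment from zero silently aborts the script. A minimal sketch of the failure-proof assignment form:

```bash
#!/bin/bash
set -euo pipefail

count=0
# Under set -e, ((count++)) would abort right here: the expression
# evaluates to the pre-increment value 0, which bash treats as failure.
# The plain assignment below always returns status 0.
count=$((count + 1))
count=$((count + 1))
echo "$count"
```

Alternatives that also survive `set -e` are `((count += 1)) || true` or pre-increment once the value is known to be non-zero, but the assignment form is the least surprising.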

### scripts/test-script.sh

```bash
#!/bin/bash
#
# Bats Testing Wrapper for CI/CD
#
# Runs Bats tests with configurable options.
# Designed for CI/CD pipelines.

set -euo pipefail

#============================================================================
# Configuration
#============================================================================

# Assign first, then mark readonly, to avoid masking the exit status (SC2155)
SCRIPT_NAME="$(basename "$0")"
readonly SCRIPT_NAME

# Default test directory
TEST_DIR="test"

# Output format
FORMAT="pretty"

# Verbose mode
VERBOSE=false

# Count only
COUNT_ONLY=false

# Filter tests
FILTER=""

# Number of parallel jobs
JOBS=1

#============================================================================
# Usage
#============================================================================

usage() {
    cat <<EOF
$SCRIPT_NAME - Bats testing wrapper for CI/CD

Usage:
    $SCRIPT_NAME [OPTIONS] [TEST_FILES...]

Options:
    -h, --help              Show this help message
    -d, --dir DIR           Test directory (default: test/)
    -f, --format FORMAT     Output format: pretty|tap|junit (default: pretty)
    -v, --verbose           Verbose output
    -c, --count             Count tests only
    --filter PATTERN        Run only tests matching pattern
    -j, --jobs N            Number of parallel jobs (default: 1)

Arguments:
    TEST_FILES              Specific test files to run (default: all in test dir)

Examples:
    # Run all tests
    $SCRIPT_NAME

    # Run specific test file
    $SCRIPT_NAME test/example.bats

    # Run with TAP output
    $SCRIPT_NAME --format tap

    # Count tests
    $SCRIPT_NAME --count

    # Run tests matching pattern
    $SCRIPT_NAME --filter "argument parsing"

    # Run tests in parallel
    $SCRIPT_NAME --jobs 4
EOF
    exit "${1:-0}"
}

#============================================================================
# Logging
#============================================================================

log_info() {
    echo "[INFO] $*" >&2
}

log_error() {
    echo "[ERROR] $*" >&2
}

#============================================================================
# Argument Parsing
#============================================================================

parse_arguments() {
    TEST_FILES=()

    while [[ $# -gt 0 ]]; do
        case "$1" in
            -h|--help)
                usage 0
                ;;
            -d|--dir)
                TEST_DIR="$2"
                shift 2
                ;;
            -f|--format)
                FORMAT="$2"
                shift 2
                ;;
            -v|--verbose)
                VERBOSE=true
                shift
                ;;
            -c|--count)
                COUNT_ONLY=true
                shift
                ;;
            --filter)
                FILTER="$2"
                shift 2
                ;;
            -j|--jobs)
                JOBS="$2"
                shift 2
                ;;
            -*)
                log_error "Unknown option: $1"
                usage 1
                ;;
            *)
                TEST_FILES+=("$1")
                shift
                ;;
        esac
    done
}

#============================================================================
# Validation
#============================================================================

check_dependencies() {
    if ! command -v bats >/dev/null 2>&1; then
        log_error "bats is required but not installed"
        log_error "Install with: brew install bats-core (macOS) or npm install -g bats"
        exit 1
    fi
}

validate_format() {
    case "$FORMAT" in
        pretty|tap|junit)
            return 0
            ;;
        *)
            log_error "Invalid format: $FORMAT (must be pretty, tap, or junit)"
            return 1
            ;;
    esac
}

validate_test_dir() {
    if [ ! -d "$TEST_DIR" ]; then
        log_error "Test directory not found: $TEST_DIR"
        return 1
    fi
}

#============================================================================
# Test Discovery
#============================================================================

find_test_files() {
    if [ ${#TEST_FILES[@]} -gt 0 ]; then
        # Use specified files
        printf '%s\n' "${TEST_FILES[@]}"
    else
        # Find all .bats files in test directory
        find "$TEST_DIR" -type f -name "*.bats" | sort
    fi
}

#============================================================================
# Bats Execution
#============================================================================

run_bats() {
    local args=()

    # Build arguments
    if [ "$FORMAT" = "tap" ]; then
        args+=("--tap")
    elif [ "$FORMAT" = "junit" ]; then
        args+=("--formatter" "junit")
    else
        args+=("--pretty")
    fi

    if [ "$COUNT_ONLY" = "true" ]; then
        args+=("--count")
    fi

    if [ -n "$FILTER" ]; then
        args+=("--filter" "$FILTER")
    fi

    if [ "$JOBS" -gt 1 ]; then
        args+=("--jobs" "$JOBS")
    fi

    # Add test files
    while IFS= read -r test_file; do
        args+=("$test_file")
    done < <(find_test_files)

    # Check if we have test files
    if [ "${#args[@]}" -eq 0 ] || ! printf '%s\n' "${args[@]}" | grep -q '\.bats$'; then
        log_error "No test files found"
        exit 1
    fi

    # Run Bats
    log_info "Running Bats tests..."

    if [ "$VERBOSE" = "true" ]; then
        log_info "Command: bats ${args[*]}"
    fi

    if bats "${args[@]}"; then
        log_info "All tests passed"
        return 0
    else
        local exit_code=$?
        log_error "Tests failed (exit code: $exit_code)"
        return "$exit_code"
    fi
}

#============================================================================
# Main
#============================================================================

main() {
    parse_arguments "$@"

    # Validate configuration
    check_dependencies
    validate_format || exit 1

    if [ ${#TEST_FILES[@]} -eq 0 ]; then
        validate_test_dir || exit 1
    fi

    # Run tests
    if run_bats; then
        exit 0
    else
        exit 1
    fi
}

main "$@"

```
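Both wrappers collect command output into arrays with a `while read` loop fed by process substitution (`done < <(...)`). Piping into the loop instead (`... | while read`) would run the loop body in a subshell, and the array would be empty afterwards. A minimal sketch of the pattern, with `printf` standing in for the `find` calls used above:

```bash
#!/bin/bash
set -euo pipefail

# Collect lines from a command into an array. Process substitution keeps
# the loop in the current shell, so appends to the array persist after
# the loop ends (a pipeline would discard them in a subshell).
files=()
while IFS= read -r line; do
    files+=("$line")
done < <(printf '%s\n' "a.bats" "b.bats")

echo "collected ${#files[@]} entries"
```

`IFS= read -r` preserves leading whitespace and backslashes in each line; for filenames that may contain newlines, the NUL-delimited variant (`read -r -d ''` with `-print0`, as in `find_shell_scripts` above) is the robust choice.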
