
monorepo-navigator

Monorepo Navigator

Packaged view

This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.

Stars: 5,836
Hot score: 99
Updated: March 20, 2026
Overall rating: C (4.0)
Composite score: 4.0
Best-practice grade: C (58.6)

Install command

npx @skill-hub/cli install alirezarezvani-claude-skills-monorepo-navigator

Repository

alirezarezvani/claude-skills

Skill path: engineering/monorepo-navigator

Monorepo Navigator

Open repository

Best for

Primary workflow: Ship Full Stack.

Technical facets: Full Stack.

Target audience: everyone.

License: Unknown.

Original source

Catalog source: SkillHub Club.

Repository owner: alirezarezvani.

This is a mirrored public skill entry; review the repository before installing it into production workflows.

What it helps with

  • Install monorepo-navigator into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
  • Review https://github.com/alirezarezvani/claude-skills before adding monorepo-navigator to shared team environments
  • Use monorepo-navigator for development workflows

Works across

Claude Code, Codex CLI, Gemini CLI, OpenCode

Favorites: 0.

Sub-skills: 0.

Aggregator: No.

Original source / Raw SKILL.md

---
name: "monorepo-navigator"
description: "Monorepo Navigator"
---

# Monorepo Navigator

**Tier:** POWERFUL  
**Category:** Engineering  
**Domain:** Monorepo Architecture / Build Systems  

---

## Overview

Navigate, manage, and optimize monorepos. Covers Turborepo, Nx, pnpm workspaces, and Lerna. Enables cross-package impact analysis, selective builds/tests on affected packages only, remote caching, dependency graph visualization, and structured migrations from multi-repo to monorepo. Includes Claude Code configuration for workspace-aware development.

---

## Core Capabilities

- **Cross-package impact analysis** — determine which apps break when a shared package changes
- **Selective commands** — run tests/builds only for affected packages (not everything)
- **Dependency graph** — visualize package relationships as Mermaid diagrams
- **Build optimization** — remote caching, incremental builds, parallel execution
- **Migration** — step-by-step multi-repo → monorepo with zero history loss
- **Publishing** — changesets for versioning, pre-release channels, npm publish workflows
- **Claude Code config** — workspace-aware CLAUDE.md with per-package instructions
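
The first capability above can be sketched in a few lines of stdlib Python: given the internal dependency graph (package → internal deps, as the bundled analyzer emits), invert it and walk the reverse edges to find everything a change can break. Package names here are hypothetical.

```python
from collections import deque

def transitive_dependents(graph, changed):
    """graph maps package -> list of internal deps; return every package
    that directly or transitively depends on `changed`."""
    # Invert the graph: dep -> set of packages that depend on it.
    reverse = {}
    for pkg, deps in graph.items():
        for dep in deps:
            reverse.setdefault(dep, set()).add(pkg)
    seen, queue = set(), deque([changed])
    while queue:
        for dependent in reverse.get(queue.popleft(), ()):
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return sorted(seen)

# Hypothetical workspace: web and admin use ui; ui uses utils.
graph = {"web": ["ui"], "admin": ["ui"], "ui": ["utils"], "utils": []}
print(transitive_dependents(graph, "utils"))  # ['admin', 'ui', 'web']
```

Changing `utils` here affects `ui` and, through it, both apps — the "blast radius" the Best Practices section asks you to communicate before merging.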

---

## When to Use

Use when:
- Multiple packages/apps share code (UI components, utils, types, API clients)
- Build times are slow because everything rebuilds when anything changes
- Migrating from multiple repos to a single repo
- Need to publish packages to npm with coordinated versioning
- Teams work across multiple packages and need unified tooling

Skip when:
- Single-app project with no shared packages
- Team/project boundaries are completely isolated (polyrepo is fine)
- Shared code is minimal and copy-paste overhead is acceptable

---

## Tool Selection

| Tool | Best For | Key Feature |
|---|---|---|
| **Turborepo** | JS/TS monorepos, simple pipeline config | Best-in-class remote caching, minimal config |
| **Nx** | Large enterprises, plugin ecosystem | Project graph, code generation, affected commands |
| **pnpm workspaces** | Workspace protocol, disk efficiency | `workspace:*` for local package refs |
| **Lerna** | npm publishing, versioning | Batch publishing, conventional commits |
| **Changesets** | Modern versioning (preferred over Lerna) | Changelog generation, pre-release channels |

Most modern setups: **pnpm workspaces + Turborepo + Changesets**

---

## Turborepo
→ See references/monorepo-tooling-reference.md for details

## Workspace Analyzer

```bash
python3 scripts/monorepo_analyzer.py /path/to/monorepo
python3 scripts/monorepo_analyzer.py /path/to/monorepo --json
```

Also see `references/monorepo-patterns.md` for common architecture and CI patterns.
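
One way to build on the analyzer's `--json` report (keys taken from the script's `report` dict) is to post-process its `dependency_graph` — for example, to list the packages nothing else depends on internally, which are typically the deployable apps. The inline `report` stands in for real analyzer output.

```python
def top_level_packages(report):
    """Packages no other workspace package depends on (usually the apps)."""
    graph = report["dependency_graph"]
    depended_on = {dep for deps in graph.values() for dep in deps}
    return sorted(set(graph) - depended_on)

# Mimics `python3 scripts/monorepo_analyzer.py . --json` output.
report = {"dependency_graph": {"web": ["ui"], "admin": ["ui"], "ui": []}}
print(top_level_packages(report))  # ['admin', 'web']
```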

## Common Pitfalls

| Pitfall | Fix |
|---|---|
| Running `turbo run build` without `--filter` on every PR | Always use `--filter=...[origin/main]` in CI |
| `workspace:*` refs cause publish failures | Use `pnpm changeset publish` — it replaces `workspace:*` with real versions automatically |
| All packages rebuild when unrelated file changes | Tune `inputs` in turbo.json to exclude docs, config files from cache keys |
| Shared tsconfig causes one package to break all type-checks | Use `extends` properly — each package extends root but overrides `rootDir` / `outDir` |
| git history lost during migration | Use `git filter-repo --to-subdirectory-filter` before merging — never move files manually |
| Remote cache not working in CI | Check TURBO_TOKEN and TURBO_TEAM env vars; verify with `turbo run build --summarize` |
| CLAUDE.md too generic — Claude modifies wrong package | Add explicit "When working on X, only touch files in apps/X" rules per package CLAUDE.md |

---

## Best Practices

1. **Root CLAUDE.md defines the map** — document every package, its purpose, and dependency rules
2. **Per-package CLAUDE.md defines the rules** — what's allowed, what's forbidden, testing commands
3. **Always scope commands with --filter** — running everything on every change defeats the purpose
4. **Remote cache is not optional** — without it, monorepo CI is slower than multi-repo CI
5. **Changesets over manual versioning** — never hand-edit package.json versions in a monorepo
6. **Shared configs in root, extended in packages** — tsconfig.base.json, .eslintrc.base.js, jest.base.config.js
7. **Impact analysis before merging shared package changes** — run affected check, communicate blast radius
8. **Keep packages/types as pure TypeScript** — no runtime code, no dependencies, fast to build and type-check


---

## Referenced Files

> The following files are referenced in this skill and included for context.

### scripts/monorepo_analyzer.py

```python
#!/usr/bin/env python3
"""Detect monorepo tooling, workspaces, and internal dependency graph."""

from __future__ import annotations

import argparse
import glob
import json
from pathlib import Path
from typing import Dict, List, Set


def load_json(path: Path) -> Dict:
    try:
        return json.loads(path.read_text(encoding="utf-8"))
    except Exception:
        return {}


def detect_repo_type(root: Path) -> List[str]:
    detected: List[str] = []
    if (root / "turbo.json").exists():
        detected.append("Turborepo")
    if (root / "nx.json").exists():
        detected.append("Nx")
    if (root / "pnpm-workspace.yaml").exists():
        detected.append("pnpm-workspaces")
    if (root / "lerna.json").exists():
        detected.append("Lerna")

    pkg = load_json(root / "package.json")
    if "workspaces" in pkg and "npm-workspaces" not in detected:
        detected.append("npm-workspaces")
    return detected


def parse_pnpm_workspace(root: Path) -> List[str]:
    workspace_file = root / "pnpm-workspace.yaml"
    if not workspace_file.exists():
        return []

    patterns: List[str] = []
    in_packages = False
    for line in workspace_file.read_text(encoding="utf-8", errors="ignore").splitlines():
        stripped = line.strip()
        if stripped.startswith("packages:"):
            in_packages = True
            continue
        if in_packages and stripped.startswith("-"):
            item = stripped[1:].strip().strip('"').strip("'")
            if item:
                patterns.append(item)
        elif in_packages and stripped and not stripped.startswith("#") and not stripped.startswith("-"):
            in_packages = False
    return patterns


def parse_package_workspaces(root: Path) -> List[str]:
    pkg = load_json(root / "package.json")
    workspaces = pkg.get("workspaces")
    if isinstance(workspaces, list):
        return [str(item) for item in workspaces]
    if isinstance(workspaces, dict) and isinstance(workspaces.get("packages"), list):
        return [str(item) for item in workspaces["packages"]]
    return []


def expand_workspace_patterns(root: Path, patterns: List[str]) -> List[Path]:
    paths: Set[Path] = set()
    for pattern in patterns:
        for match in glob.glob(str(root / pattern)):
            p = Path(match)
            if p.is_dir() and (p / "package.json").exists():
                paths.add(p.resolve())
    return sorted(paths)


def load_workspace_packages(workspaces: List[Path]) -> Dict[str, Dict]:
    packages: Dict[str, Dict] = {}
    for ws in workspaces:
        data = load_json(ws / "package.json")
        name = data.get("name") or ws.name
        packages[name] = {
            "path": str(ws),
            "dependencies": data.get("dependencies", {}),
            "devDependencies": data.get("devDependencies", {}),
            "peerDependencies": data.get("peerDependencies", {}),
        }
    return packages


def build_dependency_graph(packages: Dict[str, Dict]) -> Dict[str, List[str]]:
    package_names = set(packages.keys())
    graph: Dict[str, List[str]] = {}
    for name, meta in packages.items():
        deps: Set[str] = set()
        for section in ("dependencies", "devDependencies", "peerDependencies"):
            dep_map = meta.get(section, {})
            if isinstance(dep_map, dict):
                for dep_name in dep_map.keys():
                    if dep_name in package_names:
                        deps.add(dep_name)
        graph[name] = sorted(deps)
    return graph


def format_tree_paths(root: Path, workspaces: List[Path]) -> List[str]:
    out: List[str] = []
    for ws in workspaces:
        out.append(str(ws.relative_to(root)))
    return out


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Analyze monorepo type, workspaces, and internal dependency graph.")
    parser.add_argument("path", help="Monorepo root path")
    parser.add_argument("--json", action="store_true", help="Output JSON")
    return parser.parse_args()


def main() -> int:
    args = parse_args()
    root = Path(args.path).expanduser().resolve()
    if not root.exists() or not root.is_dir():
        raise SystemExit(f"Path is not a directory: {root}")

    types = detect_repo_type(root)
    patterns = parse_pnpm_workspace(root)
    if not patterns:
        patterns = parse_package_workspaces(root)

    workspaces = expand_workspace_patterns(root, patterns)
    packages = load_workspace_packages(workspaces)
    graph = build_dependency_graph(packages)

    report = {
        "root": str(root),
        "detected_types": types,
        "workspace_patterns": patterns,
        "workspace_paths": format_tree_paths(root, workspaces),
        "package_count": len(packages),
        "dependency_graph": graph,
    }

    if args.json:
        print(json.dumps(report, indent=2))
    else:
        print("Monorepo Analysis")
        print(f"Root: {report['root']}")
        print(f"Detected: {', '.join(types) if types else 'none'}")
        print(f"Workspace patterns: {', '.join(patterns) if patterns else 'none'}")
        print("")
        print("Workspaces")
        for ws in report["workspace_paths"]:
            print(f"- {ws}")
        if not report["workspace_paths"]:
            print("- none detected")
        print("")
        print("Internal dependency graph")
        for pkg, deps in graph.items():
            print(f"- {pkg} -> {', '.join(deps) if deps else '(no internal deps)'}")

    return 0


if __name__ == "__main__":
    raise SystemExit(main())

```

### references/monorepo-patterns.md

```markdown
# Monorepo Patterns

## Common Layouts

### apps + packages

- `apps/*`: deployable applications
- `packages/*`: shared libraries, UI kits, utilities
- `tooling/*`: lint/build config packages

### domains + shared

- `domains/*`: bounded-context product areas
- `shared/*`: cross-domain code with strict API contracts

### service monorepo

- `services/*`: backend services
- `libs/*`: shared service contracts and SDKs

## Dependency Rules

- Prefer one-way dependencies from apps/services to packages/libs.
- Keep cross-app imports disallowed unless explicitly approved.
- Keep `types` packages runtime-free to avoid unexpected coupling.

## Build/CI Patterns

- Use affected-only CI (`--filter` or equivalent).
- Enable remote cache for build and test tasks.
- Split lint/typecheck/test tasks to isolate failures quickly.

## Release Patterns

- Use Changesets or equivalent for versioning.
- Keep package publishing automated and reproducible.
- Use prerelease channels for unstable shared package changes.

```



---

## Skill Companion Files

> Additional files collected from the skill directory layout.

### references/monorepo-tooling-reference.md

```markdown
# monorepo-navigator reference

## Turborepo

### turbo.json pipeline config

```json
{
  "$schema": "https://turbo.build/schema.json",
  "globalEnv": ["NODE_ENV", "DATABASE_URL"],
  "pipeline": {
    "build": {
      "dependsOn": ["^build"],    // build deps first (topological order)
      "outputs": [".next/**", "dist/**", "build/**"],
      "env": ["NEXT_PUBLIC_API_URL"]
    },
    "test": {
      "dependsOn": ["^build"],    // need built deps to test
      "outputs": ["coverage/**"],
      "cache": true
    },
    "lint": {
      "outputs": [],
      "cache": true
    },
    "dev": {
      "cache": false,             // never cache dev servers
      "persistent": true          // long-running process
    },
    "type-check": {
      "dependsOn": ["^build"],
      "outputs": []
    }
  }
}
```
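
`"dependsOn": ["^build"]` means each package builds only after its internal dependencies have built — i.e. builds run in topological order of the package graph. A minimal sketch of that ordering (hypothetical package names, acyclic graph assumed):

```python
def build_order(graph: dict[str, list[str]]) -> list[str]:
    """Topologically order packages so dependencies build first."""
    order, seen = [], set()

    def visit(pkg):
        if pkg in seen:
            return
        seen.add(pkg)
        for dep in graph.get(pkg, []):
            visit(dep)
        order.append(pkg)  # emitted only after all of its deps

    for pkg in sorted(graph):
        visit(pkg)
    return order

print(build_order({"web": ["ui"], "ui": ["utils"], "utils": []}))
# ['utils', 'ui', 'web']
```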

### Key commands

```bash
# Build everything (respects dependency order)
turbo run build

# Build only affected packages (requires --filter)
turbo run build --filter=...[HEAD^1]   # changed since last commit
turbo run build --filter=...[main]     # changed vs main branch

# Test only affected
turbo run test --filter=...[HEAD^1]

# Run for a specific app and all its dependencies
turbo run build --filter=@myorg/web...

# Run for a specific package only (no dependencies)
turbo run build --filter=@myorg/ui

# Dry-run — see what would run without executing
turbo run build --dry-run

# Enable remote caching (Vercel Remote Cache)
turbo login
turbo link
```

### Remote caching setup

```bash
# .turbo/config.json (auto-created by turbo link)
{
  "teamid": "team_xxxx",
  "apiurl": "https://vercel.com"
}

# Self-hosted cache server (open-source alternative)
# Run ducktape/turborepo-remote-cache or Turborepo's official server
TURBO_API=http://your-cache-server.internal \
TURBO_TOKEN=your-token \
TURBO_TEAM=your-team \
turbo run build
```

---

## Nx

### Project graph and affected commands

```bash
# Install
npx create-nx-workspace@latest my-monorepo

# Visualize the project graph (opens browser)
nx graph

# Show affected packages for the current branch
nx affected:graph

# Run only affected tests
nx affected --target=test

# Run only affected builds
nx affected --target=build

# Run affected with base/head (for CI)
nx affected --target=test --base=main --head=HEAD
```

### nx.json configuration

```json
{
  "$schema": "./node_modules/nx/schemas/nx-schema.json",
  "targetDefaults": {
    "build": {
      "dependsOn": ["^build"],
      "cache": true
    },
    "test": {
      "cache": true,
      "inputs": ["default", "^production"]
    }
  },
  "namedInputs": {
    "default":    ["{projectRoot}/**/*", "sharedGlobals"],
    "production": ["default", "!{projectRoot}/**/*.spec.ts", "!{projectRoot}/jest.config.*"],
    "sharedGlobals": []
  },
  "parallel": 4,
  "cacheDirectory": "/tmp/nx-cache"
}
```

---

## pnpm Workspaces

### pnpm-workspace.yaml

```yaml
packages:
  - 'apps/*'
  - 'packages/*'
  - 'tools/*'
```

### workspace:* protocol for local packages

```json
// apps/web/package.json
{
  "name": "@myorg/web",
  "dependencies": {
    "@myorg/ui":     "workspace:*",   // always use local version
    "@myorg/utils":  "workspace:^",   // local, but respect semver on publish
    "@myorg/types":  "workspace:~"
  }
}
```
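
A sketch of the substitution pnpm performs at pack/publish time: the `workspace:` protocol is replaced with a concrete range derived from the local package's version. This is a simplified model for intuition, not pnpm's actual implementation.

```python
def resolve_workspace_spec(spec: str, local_version: str) -> str:
    """Model how a workspace: spec becomes a publishable version range."""
    if not spec.startswith("workspace:"):
        return spec                  # regular registry spec, untouched
    tag = spec[len("workspace:"):]
    if tag in ("", "*"):
        return local_version         # workspace:* -> exact version
    if tag in ("^", "~"):
        return tag + local_version   # workspace:^ -> ^<version>
    return tag                       # workspace:^1.2.0 -> ^1.2.0

print(resolve_workspace_spec("workspace:*", "1.2.3"))  # 1.2.3
print(resolve_workspace_spec("workspace:^", "1.2.3"))  # ^1.2.3
```

This is also why publishing raw `workspace:*` refs fails (see Common Pitfalls): the substitution happens in `pnpm publish` / `pnpm changeset publish`, not in plain `npm publish`.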

### Useful pnpm workspace commands

```bash
# Install all packages across workspace
pnpm install

# Run script in a specific package
pnpm --filter @myorg/web dev

# Run script in all packages
pnpm --filter "*" build

# Run script in a package and all its dependencies
pnpm --filter @myorg/web... build

# Add a dependency to a specific package
pnpm --filter @myorg/web add react

# Add a shared dev dependency to root
pnpm add -D typescript -w

# List workspace packages
pnpm ls --depth -1 -r
```

---

## Cross-Package Impact Analysis

When a shared package changes, determine what's affected before you ship.

```bash
# Using Turborepo — show affected packages
turbo run build --filter=...[HEAD^1] --dry-run 2>&1 | grep "Tasks to run"

# Using Nx
nx affected:apps --base=main --head=HEAD    # which apps are affected
nx affected:libs --base=main --head=HEAD    # which libs are affected

# Manual analysis with pnpm
# Find all packages that depend on @myorg/utils:
grep -r '"@myorg/utils"' packages/*/package.json apps/*/package.json

# Using jq for structured output
for pkg in packages/*/package.json apps/*/package.json; do
  name=$(jq -r '.name' "$pkg")
  if jq -e '.dependencies["@myorg/utils"] // .devDependencies["@myorg/utils"]' "$pkg" > /dev/null 2>&1; then
    echo "$name depends on @myorg/utils"
  fi
done
```
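
The same direct-dependent scan can be done portably in stdlib Python (the `apps/*` + `packages/*` layout and `@myorg` scope are assumptions matching the examples above):

```python
import json
from pathlib import Path

def direct_dependents(root: str, target: str) -> list[str]:
    """Workspace packages whose manifest declares `target` as a dependency."""
    hits = []
    for manifest in Path(root).glob("*/*/package.json"):  # apps/*, packages/*
        data = json.loads(manifest.read_text(encoding="utf-8"))
        declared = {**data.get("dependencies", {}),
                    **data.get("devDependencies", {})}
        if target in declared:
            hits.append(data.get("name", manifest.parent.name))
    return sorted(hits)
```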

---

## Dependency Graph Visualization

Generate a Mermaid diagram from your workspace:

```bash
# Generate dependency graph as Mermaid
cat > scripts/gen-dep-graph.js << 'EOF'
const { execSync } = require('child_process');
const fs = require('fs');

// Parse pnpm workspace packages
const packages = JSON.parse(
  execSync('pnpm ls --depth -1 -r --json').toString()
);

let mermaid = 'graph TD\n';
packages.forEach(pkg => {
  const deps = Object.keys(pkg.dependencies || {})
    .filter(d => d.startsWith('@myorg/'));
  deps.forEach(dep => {
    const from = pkg.name.replace('@myorg/', '');
    const to = dep.replace('@myorg/', '');
    mermaid += `  ${from} --> ${to}\n`;
  });
});

fs.writeFileSync('docs/dep-graph.md', '```mermaid\n' + mermaid + '```\n');
console.log('Written to docs/dep-graph.md');
EOF
node scripts/gen-dep-graph.js
```

**Example output:**

```mermaid
graph TD
  web --> ui
  web --> utils
  web --> types
  mobile --> ui
  mobile --> utils
  mobile --> types
  admin --> ui
  admin --> utils
  api --> types
  ui --> utils
```
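
The JS generator above shells out to pnpm; a dependency-free variant of the same idea renders an already-built graph dict (the `@myorg` scope is an assumption):

```python
def to_mermaid(graph: dict[str, list[str]], scope: str = "@myorg/") -> str:
    """Render internal package edges as a Mermaid graph definition."""
    lines = ["graph TD"]
    for pkg, deps in sorted(graph.items()):
        for dep in sorted(deps):
            lines.append(f"  {pkg.removeprefix(scope)} --> {dep.removeprefix(scope)}")
    return "\n".join(lines)

print(to_mermaid({"@myorg/web": ["@myorg/ui"], "@myorg/ui": []}))
```

Feed it the `dependency_graph` from `scripts/monorepo_analyzer.py --json` to get a diagram without touching Node at all.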

---

## Claude Code Configuration (Workspace-Aware CLAUDE.md)

Place a root CLAUDE.md + per-package CLAUDE.md files:

```markdown
# /CLAUDE.md — Root (applies to all packages)

## Monorepo Structure
- apps/web       — Next.js customer-facing app
- apps/admin     — Next.js internal admin
- apps/api       — Express REST API
- packages/ui    — Shared React component library
- packages/utils — Shared utilities (pure functions only)
- packages/types — Shared TypeScript types (no runtime code)

## Build System
- pnpm workspaces + Turborepo
- Always use `pnpm --filter <package>` to scope commands
- Never run `npm install` or `yarn` — pnpm only
- Run `turbo run build --filter=...[HEAD^1]` before committing

## Task Scoping Rules
- When modifying packages/ui: also run tests for apps/web and apps/admin (they depend on it)
- When modifying packages/types: run type-check across ALL packages
- When modifying apps/api: only need to test apps/api

## Package Manager
pnpm — version pinned in packageManager field of root package.json
```

```markdown
# /packages/ui/CLAUDE.md — Package-specific

## This Package
Shared React component library. Zero business logic. Pure UI only.

## Rules
- All components must be exported from src/index.ts
- No direct API calls in components — accept data via props
- Every component needs a Storybook story in src/stories/
- Use Tailwind for styling — no CSS modules or styled-components

## Testing
- Component tests: `pnpm --filter @myorg/ui test`
- Visual regression: `pnpm --filter @myorg/ui test:storybook`

## Publishing
- Version bumps via changesets only — never edit package.json version manually
- Run `pnpm changeset` from repo root after changes
```

---

## Migration: Multi-Repo → Monorepo

```bash
# Step 1: Create monorepo scaffold
mkdir my-monorepo && cd my-monorepo
pnpm init
printf "packages:\n  - 'apps/*'\n  - 'packages/*'\n" > pnpm-workspace.yaml

# Step 2: Move repos with git history preserved
mkdir -p apps packages

# For each existing repo:
git clone https://github.com/myorg/web-app
cd web-app
git filter-repo --to-subdirectory-filter apps/web  # rewrites history into subdir
cd ..
git remote add web-app ./web-app
git fetch web-app --tags
git merge web-app/main --allow-unrelated-histories

# Step 3: Update package names to scoped
# In each package.json, change "name": "web" to "name": "@myorg/web"

# Step 4: Replace cross-repo npm deps with workspace:*
# apps/web/package.json: "@myorg/ui": "1.2.3" → "@myorg/ui": "workspace:*"

# Step 5: Add shared configs to root
cp apps/web/.eslintrc.js .eslintrc.base.js
# Update each package's config to extend root:
# { "extends": ["../../.eslintrc.base.js"] }

# Step 6: Add Turborepo
pnpm add -D turbo -w
# Create turbo.json (see above)

# Step 7: Unified CI (see CI section below)
# Step 8: Test everything
turbo run build test lint
```
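
After Step 4, it is worth verifying that no internal dependency still pins a registry version. A quick stdlib check (scope and `apps/*` + `packages/*` layout assumed to match the example):

```python
import json
from pathlib import Path

def non_workspace_internal_deps(root: str, scope: str = "@myorg/"):
    """Internal deps still using a registry range instead of workspace:."""
    offenders = []
    for manifest in Path(root).glob("*/*/package.json"):
        data = json.loads(manifest.read_text(encoding="utf-8"))
        for section in ("dependencies", "devDependencies"):
            for dep, spec in data.get(section, {}).items():
                if dep.startswith(scope) and not spec.startswith("workspace:"):
                    offenders.append(
                        (data.get("name", manifest.parent.name), dep, spec))
    return offenders
```

An empty result means Step 4 is complete; each tuple otherwise names the package, the offending dep, and the range to replace.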

---

## CI Patterns

### GitHub Actions — Affected Only

```yaml
# .github/workflows/ci.yml
name: "ci"

on:
  push:
    branches: [main]
  pull_request:

jobs:
  affected:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0          # full history needed for affected detection

      - uses: pnpm/action-setup@v3
        with:
          version: 9

      - uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: pnpm

      - run: pnpm install --frozen-lockfile

      # Turborepo remote cache
      - uses: actions/cache@v4
        with:
          path: .turbo
          key: ${{ runner.os }}-turbo-${{ github.sha }}
          restore-keys: ${{ runner.os }}-turbo-

      # Only test/build affected packages
      - name: "build-affected"
        run: turbo run build --filter=...[origin/main]
        env:
          TURBO_TOKEN: ${{ secrets.TURBO_TOKEN }}
          TURBO_TEAM: ${{ vars.TURBO_TEAM }}

      - name: "test-affected"
        run: turbo run test --filter=...[origin/main]

      - name: "lint-affected"
        run: turbo run lint --filter=...[origin/main]
```

### GitLab CI — Parallel Stages

```yaml
# .gitlab-ci.yml
stages: [install, build, test, publish]

variables:
  PNPM_CACHE_FOLDER: .pnpm-store

cache:
  key: pnpm-$CI_COMMIT_REF_SLUG
  paths: [.pnpm-store/, .turbo/]

install:
  stage: install
  script:
    - pnpm install --frozen-lockfile
  artifacts:
    paths: [node_modules/, packages/*/node_modules/, apps/*/node_modules/]
    expire_in: 1h

build:affected:
  stage: build
  needs: [install]
  script:
    - turbo run build --filter=...[origin/main]
  artifacts:
    paths: [apps/*/dist/, apps/*/.next/, packages/*/dist/]

test:affected:
  stage: test
  needs: [build:affected]
  script:
    - turbo run test --filter=...[origin/main]
  coverage: '/Statements\s*:\s*(\d+\.?\d*)%/'
  artifacts:
    reports:
      coverage_report:
        coverage_format: cobertura
        path: "**/coverage/cobertura-coverage.xml"
```

---

## Publishing with Changesets

```bash
# Install changesets
pnpm add -D @changesets/cli -w
pnpm changeset init

# After making changes, create a changeset
pnpm changeset
# Interactive: select packages, choose semver bump, write changelog entry

# In CI — version packages + update changelogs
pnpm changeset version

# Publish all changed packages
pnpm changeset publish

# Pre-release channel (for alpha/beta)
pnpm changeset pre enter beta
pnpm changeset
pnpm changeset version   # produces 1.2.0-beta.0
pnpm changeset publish --tag beta
pnpm changeset pre exit  # back to stable releases
```

### Automated publish workflow (GitHub Actions)

```yaml
# .github/workflows/release.yml
name: "release"

on:
  push:
    branches: [main]

jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: pnpm/action-setup@v3
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          registry-url: https://registry.npmjs.org

      - run: pnpm install --frozen-lockfile

      - name: "create-release-pr-or-publish"
        uses: changesets/action@v1
        with:
          publish: pnpm changeset publish
          version: pnpm changeset version
          commit: "chore: release packages"
          title: "chore: release packages"
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
```

---

```
