
rlm

Processes files too large to fit in context (>100KB or >2000 lines). Uses a Python REPL for structural analysis, LLM queries for semantic reasoning, and subagents for final synthesis. Triggers: large file, big document, massive log, full codebase, entire repo, long transcript, context window exceeded.
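The "Python REPL for structural analysis" step can be illustrated with a minimal sketch. This is not the skill's actual code; the function name and fields are assumptions. The idea is to gather cheap structural facts (size, line count, a head sample, pattern hits) before spending any LLM calls on semantic reasoning:

```python
# Illustrative sketch only — not code from the rlm skill.
# Shows the kind of structural pass a Python REPL can run on a file
# that is too large to load into an LLM's context window.
from pathlib import Path


def structural_summary(path, head_lines=5, pattern="ERROR"):
    """Return cheap structural stats for a large file.

    `pattern` is a hypothetical parameter: a substring to count,
    e.g. for triaging a massive log before semantic analysis.
    """
    p = Path(path)
    lines = p.read_text(errors="replace").splitlines()
    return {
        "bytes": p.stat().st_size,            # raw size on disk
        "lines": len(lines),                  # decide chunking strategy
        "head": lines[:head_lines],           # small sample for the LLM
        "pattern_hits": sum(1 for ln in lines if pattern in ln),
    }
```

A summary like this is small enough to hand to an LLM query or a subagent, which can then decide which regions of the file deserve a closer semantic pass.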

Packaged view

This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.

Stars: 13
Hot score: 85
Updated: March 20, 2026
Overall rating: C
Composite score: 1.5
Best-practice grade: C (62.8)

Install command

npx @skill-hub/cli install whamp-pi-rlm-rlm

Repository

Whamp/pi-rlm

Skill path: skills/rlm


Open repository

Best for

Primary workflow: Ship Full Stack.

Technical facets: Full Stack.

Target audience: development teams looking for install-ready agent workflows.

License: Unknown.

Original source

Catalog source: SkillHub Club.

Repository owner: Whamp.

This is a mirrored public skill entry. Review the repository before installing it into production workflows.

What it helps with

  • Install rlm into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
  • Review https://github.com/Whamp/pi-rlm before adding rlm to shared team environments
  • Use rlm for development workflows

Works across

Claude Code, Codex CLI, Gemini CLI, OpenCode

Favorites: 0.

Sub-skills: 0.

Aggregator: No.
