rlm
Process files too large to fit in context (>100KB, >2000 lines). Uses Python REPL for structural analysis, LLM queries for semantic reasoning, and subagents for final synthesis. Triggers: large file, big document, massive log, full codebase, entire repo, long transcript, context window exceeded.
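A minimal sketch of the structural-analysis step the description mentions: chunking an oversized file in a Python REPL so each chunk can later be summarized by a separate LLM query. The function name, parameters, and outline format here are illustrative assumptions, not part of the skill's actual API.

```python
def outline_large_file(stream, chunk_lines=500):
    """Split a file-like object into fixed-size line chunks and record a
    lightweight outline (line ranges plus each chunk's first line), so the
    chunks can be dispatched to per-chunk LLM queries afterwards.

    Hypothetical helper for illustration only.
    """
    outline = []
    buf = []
    start = 1
    for i, line in enumerate(stream, start=1):
        buf.append(line)
        if len(buf) == chunk_lines:
            # Close the current chunk and note where the next one begins.
            outline.append({"start": start, "end": i, "head": buf[0].strip()})
            buf, start = [], i + 1
    if buf:
        # Final partial chunk, if the line count is not a multiple of chunk_lines.
        outline.append({"start": start,
                        "end": start + len(buf) - 1,
                        "head": buf[0].strip()})
    return outline
```

For a 2,000-line log with `chunk_lines=500`, this yields four entries such as `{"start": 1, "end": 500, "head": "..."}`; only the outline (not the full file) needs to stay in the agent's context while subagents process individual chunks.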
Packaged view
This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.
Install command
npx @skill-hub/cli install whamp-pi-rlm-rlm
Repository
Skill path: skills/rlm
Best for
Primary workflow: Ship Full Stack.
Technical facets: Full Stack.
Target audience: Development teams looking for install-ready agent workflows.
License: Unknown.
Original source
Catalog source: SkillHub Club.
Repository owner: Whamp.
This is a mirrored public skill entry. Review the repository before installing it into production workflows.
What it helps with
- Install rlm into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
- Review https://github.com/Whamp/pi-rlm before adding rlm to shared team environments
- Use rlm for development workflows
Works across
Favorites: 0.
Sub-skills: 0.
Aggregator: No.