gpu-cli
Safely run local `gpu` commands via a guarded wrapper (`runner.sh`) with preflight checks and budget/time caps.
Packaged view
This page reorganizes the original catalog entry to put fit, installability, and workflow context first. The original raw source appears below.
Install command
npx @skill-hub/cli install openclaw-skills-gpu-cli
Repository
Skill path: skills/angusbezzina/gpu-cli
Best for
Primary workflow: Ship Full Stack.
Technical facets: Full Stack.
Target audience: everyone.
License: Unknown.
Original source
Catalog source: SkillHub Club.
Repository owner: openclaw.
This is a mirrored public skill entry; review the repository before installing it into production workflows.
What it helps with
- Install gpu-cli into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
- Review https://github.com/openclaw/skills before adding gpu-cli to shared team environments
- Use gpu-cli for development workflows
Works across
Favorites: 0.
Sub-skills: 0.
Aggregator: No.
Original source / Raw SKILL.md
---
name: gpu-cli
description: Safely run local `gpu` commands via a guarded wrapper (`runner.sh`) with preflight checks and budget/time caps.
argument-hint: runner.sh gpu [subcommand] [flags]
allowed-tools: Bash(runner.sh*), Read
---
# GPU CLI Skill (Stable)
Use this skill to run the local `gpu` binary from your agent. It only allows invoking the bundled `runner.sh` (which internally calls `gpu`) and read-only file access.
## What it does
- Runs `gpu` commands you specify (e.g., `runner.sh gpu status --json`, `runner.sh gpu run python train.py`).
- Recommends a preflight: `gpu doctor --json` then `gpu status --json`.
- Streams results back to chat; use `--json` for structured outputs.
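The recommended preflight can be sketched as a short script. This is a hypothetical illustration, not the skill's actual code: `run_wrapper` and `preflight` are made-up names, and `run_wrapper` is a stub standing in for `runner.sh` so the sketch is self-contained; the `doctor` and `status` subcommands are taken from the docs above.

```shell
#!/usr/bin/env bash
# Hypothetical preflight sketch: check health and pod state before billable work.

run_wrapper() {
  # Stub: in real use this would be `runner.sh "$@"`, which invokes the gpu binary.
  echo "would run: $*"
}

preflight() {
  # 1. Toolchain/auth health check; --json gives structured output.
  run_wrapper gpu doctor --json || { echo "doctor failed" >&2; return 1; }
  # 2. Current pod state, so nothing billable is started blindly.
  run_wrapper gpu status --json || { echo "status failed" >&2; return 1; }
  echo "preflight ok"
}

preflight
```

Running doctor before status mirrors the order recommended above: fix environment problems first, then look at what is already running.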
## Safety & scope
- Allowed tools: `Bash(runner.sh*)`, `Read`. No network access requested by the skill; `gpu` handles its own networking.
- Avoid chaining or redirection; provide a single `runner.sh gpu …` command.
- You pay your provider directly; this may start paid pods.
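A wrapper enforcing the "single command, no chaining or redirection" rule above could look roughly like this. This is a hypothetical sketch, not the published `runner.sh`: the whitelist contents and the `guard`, `is_allowed`, and `is_clean` names are assumptions.

```shell
#!/usr/bin/env sh
# Hypothetical sketch of runner.sh-style guardrails: a subcommand whitelist
# plus a rejection of shell metacharacters (a basic injection check).

ALLOWED_SUBCOMMANDS="status doctor inventory run stop"

is_allowed() {
  # $1: the gpu subcommand requested by the agent
  for cmd in $ALLOWED_SUBCOMMANDS; do
    [ "$1" = "$cmd" ] && return 0
  done
  return 1
}

is_clean() {
  # Reject chaining/redirection metacharacters in the full argument string.
  case "$1" in
    *';'*|*'|'*|*'&'*|*'>'*|*'<'*|*'`'*|*'$('*) return 1 ;;
    *) return 0 ;;
  esac
}

guard() {
  # guard <subcommand> [args...] -> prints a verdict, returns 0 only if safe
  sub="$1"; shift
  full="$sub $*"
  if ! is_allowed "$sub"; then
    echo "denied: '$sub' is not whitelisted"; return 1
  fi
  if ! is_clean "$full"; then
    echo "denied: metacharacters in arguments"; return 1
  fi
  echo "allowed: gpu $full"
  # A real wrapper would now exec the command, e.g. under a time cap.
}

guard run "python train.py; rm checkpoints" || true  # prints: denied: metacharacters in arguments
guard status --json                                  # prints: allowed: gpu status --json
```

Denying metacharacters outright is cruder than proper argument parsing, but it matches the guidance above: supply exactly one plain `runner.sh gpu …` command per invocation.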
## Quick prompts
- "Run `runner.sh gpu status --json` and summarize pod state".
- "Run `runner.sh gpu doctor --json` and summarize failures".
- "Run `runner.sh gpu inventory --json --available` and recommend a GPU under $0.50/hr".
- "Run `runner.sh gpu run echo hello` then post the output".
## Notes
- For image/video/LLM work, ask the agent to include appropriate flags (e.g., `--gpu-type "RTX 4090"`, `-p 8000:8000`, or `--rebuild`).
---
## Skill Companion Files
> Additional files collected from the skill directory layout.
### README.md
```markdown
# GPU CLI ClawHub Skill (Stable)
Purpose: Safely run local `gpu` commands from OpenClaw/ClawHub agents with guardrails (dry‑run preview, command whitelist, budget/time caps, and clear remediation), without modifying GPU CLI itself.
## What this skill does
- Executes only `gpu …` commands via a wrapper (`runner.sh`) with a strict whitelist and injection checks.
- Performs preflight checks (`gpu --version`, `gpu doctor --json`).
- Defaults to dry‑run previews; can enforce confirm + caps.
- Provides optional cost/time caps before execution; attempts cleanup (`gpu stop -y`) on timeout/cancel.
- Maps common exit codes to helpful guidance (auth, daemon restart, transient retry).
## Non-goals
- No telemetry, no secret handling; uses the installed GPU CLI and your provider keys in OS keychain.
- Does not change GPU CLI behavior; all safeguards are in this skill only.
## Quick usage (from an agent)
- Trigger: "/gpu" or phrases like "Use GPU CLI to …" (as configured on ClawHub)
- Example: "Use GPU CLI to run gpu status --json"
- Example: "Use GPU CLI to run gpu run python train.py on an RTX 4090"
## Files
- `manifest.yaml` — ClawHub skill metadata, permissions, triggers, settings.
- `runner.sh` — Execution wrapper with guardrails.
- `selftest.sh` — Local checks for preflight and injection denial.
- `templates/prompts.md` — Curated prompts for common tasks.
## Notes for publishers
- Mark channel as Stable, add logo/screenshots/demo, and link docs at `apps/portal/content/docs/ai-agent-skill.mdx`.
- Keep permissions minimal: Bash + Read, workspace‑scoped; network off for the skill itself.
```
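The README's time caps, cleanup-on-timeout, and exit-code mapping can be sketched with coreutils `timeout`, which exits with status 124 when the cap is hit. This is a hypothetical illustration, not the published `runner.sh`: `run_with_cap`, `TIME_CAP`, and the cap value are made up, and the cleanup call is shown as a comment-style echo rather than actually invoking `gpu stop -y`.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of time-cap enforcement with cleanup and exit-code mapping.

TIME_CAP="${TIME_CAP:-2}"  # seconds for this demo; a real cap would be larger

run_with_cap() {
  # Run the given command under coreutils `timeout`; rc 124 signals a timeout.
  timeout "$TIME_CAP" "$@"
  rc=$?
  case "$rc" in
    0)   echo "done" ;;
    124) echo "time cap hit; would attempt cleanup: runner.sh gpu stop -y" ;;
    *)   echo "exit code $rc: map to guidance (auth, daemon restart, retry)" ;;
  esac
  return "$rc"
}

run_with_cap sleep 1          # completes within the cap: prints "done"
run_with_cap sleep 5 || true  # exceeds the 2s cap: prints the cleanup line
```

Mapping exit codes to remediation hints in the wrapper, rather than in the CLI, matches the README's non-goal of leaving GPU CLI itself unmodified.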
### _meta.json
```json
{
"owner": "angusbezzina",
"slug": "gpu-cli",
"displayName": "GPU CLI: Remote GPU Compute for ML Training and Inference",
"latest": {
"version": "1.1.1",
"publishedAt": 1773339458169,
"commit": "https://github.com/openclaw/skills/commit/9dbee3f8874a549c116b78b0f9ef16b77ccff592"
},
"history": [
{
"version": "1.1.0",
"publishedAt": 1772036192971,
"commit": "https://github.com/openclaw/skills/commit/ef2ed58dadb0975d83eaabdf32d0600e77d86135"
}
]
}
```