
coala-client

How to use the coala-client CLI for chat with LLMs, MCP servers, and skills. Use when the user asks how to use coala, run coala chat, add MCP servers, import CWL toolsets, list or call MCP tools, or import or load skills.

Packaged view

This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.

Stars: 3,125
Hot score: 99
Updated: March 20, 2026
Overall rating: C (4.0)
Composite score: 4.0
Best-practice grade: S (96.0)

Install command

npx @skill-hub/cli install openclaw-skills-coala

Repository

openclaw/skills

Skill path: skills/hubentu/coala



Best for

Primary workflow: Ship Full Stack.

Technical facets: Full Stack, Integration.

Target audience: everyone.

License: Unknown.

Original source

Catalog source: SkillHub Club.

Repository owner: openclaw.

This is a mirrored public skill entry. Review the repository before installing it into production workflows.

What it helps with

  • Install coala-client into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
  • Review https://github.com/openclaw/skills before adding coala-client to shared team environments
  • Use coala-client for development workflows

Works across

Claude Code, Codex CLI, Gemini CLI, OpenCode

Favorites: 0.

Sub-skills: 0.

Aggregator: No.

Original source / Raw SKILL.md

---
name: coala-client
description: How to use the coala-client CLI for chat with LLMs, MCP servers, and skills. Use when the user asks how to use coala, run coala chat, add MCP servers, import CWL toolsets, list or call MCP tools, or import or load skills.
homepage: https://github.com/coala-info/coala_client
metadata: {"clawdbot":{"emoji":"🧬","requires":{"bins":["coala-client"]},"install":[{"id":"uv","kind":"uv","package":"coala-client","bins":["coala-client"],"label":"Install coala-client (uv)"}]}}
---

# Coala Client

Part of the coala ecosystem: a CLI for chatting with OpenAI-compatible LLMs (OpenAI, Gemini, Ollama) and with MCP (Model Context Protocol) servers. Supports importing CWL toolsets as MCP servers and importing skills.

## Config paths

- MCP config and toolsets: `~/.config/coala/mcps/`  
  - `mcp_servers.json` — server definitions  
  - `<toolset>/` — per-toolset dirs with `run_mcp.py` and CWL files  
- Skills: `~/.config/coala/skills/` (one subfolder per imported source)  
- Env: `~/.config/coala/env` (optional; key=value for providers and MCP env)
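
The paths above, drawn as one tree:

```
~/.config/coala/
├── env                     # optional key=value for providers and MCP env
├── mcps/
│   ├── mcp_servers.json    # server definitions
│   └── <toolset>/          # run_mcp.py plus CWL files, one dir per toolset
└── skills/
    └── <name>/             # one subfolder per imported source (e.g. SKILL.md)
```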

## Quick start

1. **Init (first time)**  
   `coala init` — creates `~/.config/coala/mcps/mcp_servers.json` and `env`.

2. **Set API key**  
   e.g. `export OPENAI_API_KEY=...` or `export GEMINI_API_KEY=...`. Ollama needs no key.

3. **Chat**  
   `coala` or `coala chat` — interactive chat with MCP tools.  
   `coala ask "question"` — single prompt with MCP.

4. **Options**  
   `-p, --provider` (openai|gemini|ollama|custom), `-m, --model`, `--no-mcp`.

## MCP: CWL toolsets

No API key needed for MCP import, list, or call — only for chat/ask with an LLM.

- **Import** (creates toolset under `~/.config/coala/mcps/<TOOLSET>/` and registers server):  
  `coala mcp-import <TOOLSET> <SOURCES...>` or alias `coala mcp ...`  
  SOURCES: local `.cwl` files, a `.zip`, or http(s) URLs to a .cwl or .zip.  
  Requires the `coala` package where the MCP server runs (for `run_mcp.py`).

- **List**  
  `coala mcp-list` — list server names.  
  `coala mcp-list <SERVER_NAME>` — print each tool's schema (name, description, inputSchema).

- **Call**  
  `coala mcp-call <SERVER>.<TOOL> --args '<JSON>'`  
  Example: `coala mcp-call gene-variant.ncbi_datasets_gene --args '{"data": [{"gene": "TP53", "taxon": "human"}]}'`
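
Quoting JSON inside shell single quotes gets brittle as payloads grow. One way (a convenience sketch, not a coala feature) is to build the `--args` string with a Python one-liner and pass it through a shell variable; `GENE`, `TAXON`, and `ARGS` are arbitrary names, and the values reuse the TP53 example above:

```shell
# Build the --args JSON from shell variables instead of hand-escaping quotes.
GENE="TP53"
TAXON="human"
ARGS=$(python3 -c 'import json, sys; print(json.dumps({"data": [{"gene": sys.argv[1], "taxon": sys.argv[2]}]}))' "$GENE" "$TAXON")
echo "$ARGS"
# Then: coala mcp-call gene-variant.ncbi_datasets_gene --args "$ARGS"
```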

## Skills

- **Import** (into `~/.config/coala/skills/`, one subfolder per source):  
  `coala skill <SOURCES...>`  
  SOURCES: GitHub tree URL (e.g. `https://github.com/owner/repo/tree/main/skills`), zip URL, or local zip/dir.

- **In chat**  
  `/skill` — list installed skills.  
  `/skill <name>` — load skill from `~/.config/coala/skills/<name>/` (e.g. SKILL.md) into context.


## Chat commands

- `/help`, `/exit`, `/quit`, `/clear`  
- `/tools` — list MCP tools  
- `/servers` — list connected MCP servers  
- `/skill` — list skills; `/skill <name>` — load a skill  
- `/model` — show model info  
- `/switch <provider>` — switch provider  

## MCP on/off

- **All off:** `coala --no-mcp` (or `coala ask "..." --no-mcp`).  
- **One server off:** remove its entry from `~/.config/coala/mcps/mcp_servers.json`.  
- **On:** default when `--no-mcp` is not used; add or restore servers in `mcp_servers.json`.
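
Hand-editing `mcp_servers.json` is error-prone; the sketch below removes one entry programmatically, demonstrated on a scratch file. It assumes server names are top-level keys in that JSON file (check a file written by `coala init` or `coala mcp-import` before pointing this at the real path):

```shell
# Demo on a scratch copy: drop one server entry from an mcp_servers.json-style map.
# ASSUMPTION: server names are top-level JSON keys; verify against your real file,
# then swap CFG for ~/.config/coala/mcps/mcp_servers.json (after backing it up).
CFG=$(mktemp)
printf '%s' '{"gene-variant": {}, "other-server": {}}' > "$CFG"
python3 - "$CFG" gene-variant <<'EOF'
import json, sys
path, name = sys.argv[1], sys.argv[2]
with open(path) as f:
    cfg = json.load(f)
cfg.pop(name, None)                      # remove the named server entry
with open(path, "w") as f:
    json.dump(cfg, f, indent=2)
EOF
cat "$CFG"
```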

## Providers and env

Set provider via `-p` or env `PROVIDER`. Set keys and URLs per provider (e.g. `OPENAI_API_KEY`, `GEMINI_API_KEY`, `OLLAMA_BASE_URL`). Optional: put vars in `~/.config/coala/env`.  
`coala config` — print current config paths and provider/model info.
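
For example, a minimal `~/.config/coala/env` could look like this; every variable name below is one named in this document, and all values are placeholders:

```shell
# ~/.config/coala/env : key=value lines read by coala (values are placeholders)
PROVIDER=openai
OPENAI_API_KEY=sk-your-key-here
# GEMINI_API_KEY=your-gemini-key
# OLLAMA_BASE_URL=http://localhost:11434
```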


---

## Skill Companion Files

> Additional files collected from the skill directory layout.

### _meta.json

```json
{
  "owner": "hubentu",
  "slug": "coala",
  "displayName": "Coala Client",
  "latest": {
    "version": "0.1.2",
    "publishedAt": 1771597607749,
    "commit": "https://github.com/openclaw/skills/commit/adf023ed517c0cc3cee408efcff2cc28219f1978"
  },
  "history": []
}

```
