
openclaw-memories

Agent memory with ALMA meta-learning, LLM fact extraction, and full-text search. Observer calls remote LLM APIs (OpenAI/Anthropic/Gemini). ALMA and Indexer work offline.

Packaged view

This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.

Stars: 3,111
Hot score: 99
Updated: March 20, 2026
Overall rating: 0.0 (C)
Composite score: 0.0
Best-practice grade: 56.0 (C)

Install command

npx @skill-hub/cli install openclaw-skills-openclaw-memory-2

Repository

openclaw/skills

Skill path: skills/arosstale/openclaw-memory-2

Open repository

Best for

Primary workflow: Ship Full Stack.

Technical facets: Full Stack.

Target audience: everyone.

License: Unknown.

Original source

Catalog source: SkillHub Club.

Repository owner: openclaw.

This is a mirrored public skill entry. Review the repository before installing it into production workflows.

What it helps with

  • Install openclaw-memories into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
  • Review https://github.com/openclaw/skills before adding openclaw-memories to shared team environments
  • Use openclaw-memories for development workflows

Works across

Claude Code, Codex CLI, Gemini CLI, OpenCode

Favorites: 0.

Sub-skills: 0.

Aggregator: No.

Original source / Raw SKILL.md

---
name: openclaw-memories
description: Agent memory with ALMA meta-learning, LLM fact extraction, and full-text search. Observer calls remote LLM APIs (OpenAI/Anthropic/Gemini). ALMA and Indexer work offline.
---

# OpenClaw Memory System

Three components for agent memory:

1. **ALMA** — Evolves memory designs through mutation + evaluation (offline)
2. **Observer** — Extracts structured facts from conversations via LLM API (requires API key)
3. **Indexer** — Full-text search over workspace Markdown files (offline)

## Environment Variables

Observer requires one of:
- `OPENAI_API_KEY`
- `ANTHROPIC_API_KEY`
- Or pass `apiKey` in config

ALMA and Indexer require no keys or network access.
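The key-resolution order above can be sketched as follows. This is an illustrative helper, not the package's actual internals: `resolveApiKey` and its error message are hypothetical, but the precedence (explicit `apiKey` in config wins, then the provider's environment variable) matches what the section describes.

```typescript
// Hypothetical sketch: resolve the Observer's API key from config or env.
type Provider = 'openai' | 'anthropic';

function resolveApiKey(provider: Provider, config: { apiKey?: string } = {}): string {
  // An explicit apiKey in config takes precedence.
  if (config.apiKey) return config.apiKey;
  // Otherwise fall back to the provider's environment variable.
  const envVar = provider === 'openai' ? 'OPENAI_API_KEY' : 'ANTHROPIC_API_KEY';
  const key = process.env[envVar];
  if (!key) throw new Error(`Observer needs ${envVar} or config.apiKey`);
  return key;
}
```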

## How It Works

### ALMA (Algorithm Learning via Meta-learning Agents)
Proposes memory system designs, evaluates them, and keeps the best. Uses Gaussian mutation and simulated annealing to explore the design space.

```
alma.propose() → design
alma.evaluate(design.id, metrics) → score  
alma.best() → top design
alma.top(5) → leaderboard
```
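The propose/evaluate/keep-best loop can be illustrated with a self-contained sketch. Note this is not the real `ALMAAgent` (class and field names here are made up for explanation); it only mimics the shape of the API above, with a simple Gaussian-ish mutation and hill-climbing in place of the full simulated-annealing search.

```typescript
// Illustrative toy version of the propose → evaluate → best loop.
type Design = { id: number; chunkSize: number };

class TinyAlma {
  private designs = new Map<number, { design: Design; score: number }>();
  private nextId = 0;
  private current = 500; // centre of the search, within [100, 1000]

  // Gaussian-ish mutation (sum of uniforms), clamped to the constraint range.
  propose(): Design {
    const noise = (Math.random() + Math.random() + Math.random() - 1.5) * 100;
    const chunkSize = Math.min(1000, Math.max(100, Math.round(this.current + noise)));
    const design = { id: this.nextId++, chunkSize };
    this.designs.set(design.id, { design, score: -Infinity });
    return design;
  }

  evaluate(id: number, metrics: { accuracy: number; speed: number }): number {
    const entry = this.designs.get(id);
    if (!entry) throw new Error(`unknown design ${id}`);
    entry.score = 0.7 * metrics.accuracy + 0.3 * metrics.speed; // weighted score
    // Hill-climb: adopt this design as the new centre if it is the best so far.
    if (entry.score >= this.best().score) this.current = entry.design.chunkSize;
    return entry.score;
  }

  best(): { design: Design; score: number } {
    return [...this.designs.values()].reduce((a, b) => (b.score > a.score ? b : a));
  }
}
```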

### Observer
Sends conversation history to an LLM and gets back structured facts:
- Kind: world fact / biographical / opinion / observation
- Priority: high / medium / low
- Entities: mentioned people/places
- Confidence: 0.0–1.0 for opinions

Fails gracefully — returns empty array if LLM is unavailable.
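A hypothetical TypeScript shape for these facts, plus the graceful-failure behavior, might look like the sketch below. The `Fact` interface and `extractFacts` wrapper are assumptions inferred from the field list above, not the package's actual types.

```typescript
// Hypothetical fact shape, based on the fields listed above.
type FactKind = 'world' | 'biographical' | 'opinion' | 'observation';

interface Fact {
  kind: FactKind;
  priority: 'high' | 'medium' | 'low';
  text: string;
  entities: string[];      // mentioned people/places
  confidence?: number;     // 0.0–1.0, meaningful for opinions
}

// Graceful-failure wrapper: if the LLM call throws (missing key, network
// down), return an empty array instead of propagating the error.
async function extractFacts(call: () => Promise<Fact[]>): Promise<Fact[]> {
  try {
    return await call();
  } catch {
    return []; // degrade to "no facts" rather than failing the agent loop
  }
}
```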

### Indexer
Chunks workspace Markdown files and indexes them for search:
- `MEMORY.md` — core facts
- `memory/YYYY-MM-DD.md` — daily logs
- `bank/entities/*.md` — entity summaries
- `bank/opinions.md` — beliefs with confidence

```
indexer.index() → count of chunks indexed
indexer.search('query') → ranked results
indexer.rebuild() → re-index from scratch
```
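The chunk-and-search flow above can be sketched as a minimal in-memory indexer, with the same "simplified ranking" caveat the Limitations section notes (plain term counts, no SQLite FTS5). `TinyIndexer` is illustrative only, not the real `MemoryIndexer`.

```typescript
// Minimal in-memory sketch of index() and search() as described above.
interface Chunk { file: string; text: string }

class TinyIndexer {
  private chunks: Chunk[] = [];

  // Split each file on blank lines into paragraph-sized chunks.
  index(files: Record<string, string>): number {
    this.chunks = [];
    for (const [file, body] of Object.entries(files)) {
      for (const text of body.split(/\n\s*\n/)) {
        if (text.trim()) this.chunks.push({ file, text: text.trim() });
      }
    }
    return this.chunks.length; // count of chunks indexed
  }

  // Simplified ranking: score a chunk by raw occurrence counts of query terms.
  search(query: string): Chunk[] {
    const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
    return this.chunks
      .map((c) => ({
        chunk: c,
        score: terms.reduce(
          (n, t) => n + (c.text.toLowerCase().split(t).length - 1), 0),
      }))
      .filter((r) => r.score > 0)
      .sort((a, b) => b.score - a.score)
      .map((r) => r.chunk);
  }
}
```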

## Install

```bash
npm install @artale/openclaw-memory
```

## Limitations

- Indexer uses an in-memory mock database, not real SQLite FTS5. Search works but ranking is simplified.
- Observer calls remote APIs — not offline. Only ALMA and Indexer work without network.
- No dashboard — removed in v2 for simplicity.

## Source

5 files, 578 lines, 0 runtime dependencies.

https://github.com/arosstale/openclaw-memory


---

## Skill Companion Files

> Additional files collected from the skill directory layout.

### README.md

````markdown
# openclaw-memory

Memory system for OpenClaw agents. Three components:

- **ALMA** — meta-learning optimizer that evolves memory designs
- **Observer** — extracts structured facts from conversations via LLM (OpenAI/Anthropic/Gemini)
- **Indexer** — full-text search over workspace Markdown files

## Install

```bash
npm install @artale/openclaw-memory
```

## Usage

```typescript
import { ALMAAgent, ObserverAgent, MemoryIndexer } from '@artale/openclaw-memory';

// ALMA: evolve memory designs
const alma = new ALMAAgent({
  constraints: { chunkSize: { min: 100, max: 1000, type: 'number' } }
});
const design = alma.propose();
alma.evaluate(design.id, { accuracy: 0.9, speed: 0.8 });
console.log(alma.best());

// Observer: extract facts (requires LLM API key)
const observer = new ObserverAgent({
  provider: 'anthropic',  // or 'openai' or 'gemini'
  apiKey: process.env.ANTHROPIC_API_KEY,
});
const facts = await observer.extract([
  { role: 'user', content: 'Alice prefers TypeScript over Python' }
]);

// Indexer: search workspace files
const indexer = new MemoryIndexer({ workspace: './my-workspace' });
indexer.index();
const results = indexer.search('TypeScript');
```

## Environment Variables

Observer requires an LLM API key (one of):
- `OPENAI_API_KEY` — for OpenAI provider
- `ANTHROPIC_API_KEY` — for Anthropic provider

ALMA and Indexer work fully offline with zero dependencies.

## OpenClaw Skill

Drop into your workspace to use as a skill:
```bash
cd ~/.openclaw/workspace/skills
git clone https://github.com/arosstale/openclaw-memory
```

## Architecture

- **5 source files, 578 lines, 0 runtime dependencies**
- In-memory database (no native modules, works everywhere)
- Observer calls remote LLM APIs when configured — not offline
- ALMA and Indexer are fully offline

## License

MIT
````

### _meta.json

```json
{
  "owner": "arosstale",
  "slug": "openclaw-memory-2",
  "displayName": "Openclaw Memories",
  "latest": {
    "version": "2.0.1",
    "publishedAt": 1772024821130,
    "commit": "https://github.com/openclaw/skills/commit/7b1efa117f7a994dfe88c87ccc1a057c47401984"
  },
  "history": []
}

```