
nima-core

Noosphere Integrated Memory Architecture — Complete cognitive stack for AI agents: persistent memory, emotional intelligence, dream consolidation, hive mind, precognitive recall, and lucid moments. 4 embedding providers, LadybugDB graph backend, zero-config install. nima-core.ai

Packaged view

This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.

Stars: 3,070
Hot score: 99
Updated: March 20, 2026
Overall rating: C (4.0)
Composite score: 4.0
Best-practice grade: C (60.3)

Install command

npx @skill-hub/cli install openclaw-skills-nima-core

Repository

openclaw/skills

Skill path: skills/dmdorta1111/nima-core


Open repository

Best for

Primary workflow: Analyze Data & AI.

Technical facets: Full Stack, Backend, Data / AI.

Target audience: everyone.

License: Unknown.

Original source

Catalog source: SkillHub Club.

Repository owner: openclaw.

This is a mirrored public skill entry. Review the repository before installing it into production workflows.

What it helps with

  • Install nima-core into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
  • Review https://github.com/openclaw/skills before adding nima-core to shared team environments
  • Use nima-core for development workflows

Works across

Claude Code · Codex CLI · Gemini CLI · OpenCode

Favorites: 0.

Sub-skills: 0.

Aggregator: No.

Original source / Raw SKILL.md

---
name: nima-core
description: "Noosphere Integrated Memory Architecture — Complete cognitive stack for AI agents: persistent memory, emotional intelligence, dream consolidation, hive mind, precognitive recall, and lucid moments. 4 embedding providers, LadybugDB graph backend, zero-config install. nima-core.ai"
version: 3.2.0
metadata:
  {
    "openclaw": {
      "emoji": "🧠",
      "source": "https://github.com/lilubot/nima-core",
      "homepage": "https://nima-core.ai",
      "requires": { "bins": ["python3", "node"] },
      "install": [
        {
          "id": "shell",
          "kind": "shell",
          "script": "install.sh",
          "label": "Install NIMA Core (creates ~/.nima, pip-installs dependencies, copies OpenClaw hooks)"
        }
      ],
      "permissions": {
        "reads":   ["~/.nima/", "~/.openclaw/extensions/nima-*/"],
        "writes":  ["~/.nima/", "~/.openclaw/extensions/nima-*/"],
        "network": [
          "voyage.ai (only if NIMA_EMBEDDER=voyage)",
          "openai.com (only if NIMA_EMBEDDER=openai or ANTHROPIC_API_KEY set)",
          "anthropic.com (only if ANTHROPIC_API_KEY set — memory pruner)"
        ]
      },
      "optional_env": {
        "NIMA_DATA_DIR":         "Override default ~/.nima data directory",
        "NIMA_EMBEDDER":         "voyage|openai|ollama|local (default: local — zero external calls)",
        "VOYAGE_API_KEY":        "Required when NIMA_EMBEDDER=voyage",
        "OPENAI_API_KEY":        "Required when NIMA_EMBEDDER=openai",
        "NIMA_OLLAMA_MODEL":     "Model name when NIMA_EMBEDDER=ollama",
        "NIMA_VOICE_TRANSCRIBER":"whisper|local (for voice notes)",
        "WHISPER_MODEL":         "tiny|base|small|medium|large",
        "ANTHROPIC_API_KEY":     "For memory pruner LLM distillation (opt-in only)",
        "HIVE_ENABLED":          "1 to enable multi-agent memory sharing via shared DB",
        "HIVE_REDIS_URL":        "Redis URL for real-time hive pub/sub (optional, HIVE_ENABLED=1)"
      },
      "external_calls": "All external API calls are opt-in via explicit env vars. Default mode uses local embeddings with zero network calls. install.sh does pip install nima-core and optional real-ladybug; review before running in shared/production environments."
    }
  }
---

# NIMA Core 3.2

**Noosphere Integrated Memory Architecture** — A complete cognitive stack for AI agents: persistent memory, emotional intelligence, dream consolidation, hive mind, and precognitive recall.

**Website:** https://nima-core.ai · **GitHub:** https://github.com/lilubot/nima-core

## Quick Start

```bash
pip install nima-core && nima-core
```

Your bot now has persistent memory. Zero config needed.

## What's New in v3.0

### Complete Cognitive Stack

NIMA evolved from a memory plugin into a full cognitive architecture:

| Module | What It Does | Version |
|--------|-------------|---------|
| **Memory Capture** | 3-layer capture (input/contemplation/output), 4-phase noise filtering | v2.0 |
| **Semantic Recall** | Vector + text hybrid search, ecology scoring, token-budgeted injection | v2.0 |
| **Dynamic Affect** | Panksepp 7-affect emotional state (SEEKING, RAGE, FEAR, LUST, CARE, PANIC, PLAY) | v2.1 |
| **VADER Analyzer** | Contextual sentiment — caps boost, negation, idioms, degree modifiers | v2.2 |
| **Memory Pruner** | LLM distillation of old conversations → semantic gists, 30-day suppression limbo | v2.3 |
| **Dream Consolidation** | Nightly synthesis — extracts insights and patterns from episodic memory | v2.4 |
| **Hive Mind** | Multi-agent memory sharing via shared DB + optional Redis pub/sub | v2.5 |
| **Precognition** | Temporal pattern mining → predictive memory pre-loading | v2.5 |
| **Lucid Moments** | Spontaneous surfacing of emotionally-resonant memories | v2.5 |
| **Darwinian Memory** | Clusters similar memories, ghosts duplicates via cosine + LLM verification | v3.0 |
| **Installer** | One-command setup — LadybugDB, hooks, directories, embedder config | v3.0 |

### v3.0 Highlights
- All cognitive modules unified under a single package
- Installer (`install.sh`) for zero-friction setup
- All OpenClaw hooks bundled and ready to drop in
- README rewritten, all versions aligned to `3.0.4`

## Architecture

```text
OPENCLAW HOOKS
├── nima-memory/          Capture hook (3-layer, 4-phase noise filter)
│   ├── index.js          Hook entry point
│   ├── ladybug_store.py  LadybugDB storage backend
│   ├── embeddings.py     Multi-provider embedding (Voyage/OpenAI/Ollama/local)
│   ├── backfill.py       Historical transcript import
│   └── health_check.py   DB integrity checks
├── nima-recall-live/     Recall hook (before_agent_start)
│   ├── lazy_recall.py    Current recall engine
│   └── ladybug_recall.py LadybugDB-native recall
├── nima-affect/          Affect hook (message_received)
│   ├── vader-affect.js   VADER sentiment analyzer
│   └── emotion-lexicon.js Emotion keyword lexicon
└── shared/               Resilient wrappers, error handling

PYTHON CORE (nima_core/)
├── cognition/
│   ├── dynamic_affect.py         Panksepp 7-affect system
│   ├── emotion_detection.py      Text emotion extraction
│   ├── affect_correlation.py     Cross-affect analysis
│   ├── affect_history.py         Temporal affect tracking
│   ├── affect_interactions.py    Affect coupling dynamics
│   ├── archetypes.py             Personality baselines (Guardian, Explorer, etc.)
│   ├── personality_profiles.py   JSON personality configs
│   └── response_modulator_v2.py  Affect → response modulation
├── dream_consolidation.py        Nightly memory synthesis engine
├── memory_pruner.py              Episodic distillation + suppression
├── hive_mind.py                  Multi-agent memory sharing
├── precognition.py               Temporal pattern mining
├── lucid_moments.py              Spontaneous memory surfacing
├── connection_pool.py            SQLite pool (WAL, thread-safe)
├── logging_config.py             Singleton logger
└── metrics.py                    Thread-safe counters/timings
```

## Privacy & Permissions

- ✅ All data stored locally in `~/.nima/`
- ✅ Default: local embeddings = **zero external calls**
- ✅ No NIMA-owned servers, no proprietary tracking, no analytics sent to external services
- ⚠️ Opt-in networking: HiveMind (Redis pub/sub), Precognition (LLM endpoints), LadybugDB migrations — see Optional Features below
- 🔒 Embedding API calls happen only when a cloud provider is explicitly enabled (`VOYAGE_API_KEY`, `OPENAI_API_KEY`, etc.)

### Optional Features with Network Access

| Feature | Env Var | Network Calls To | Default |
|---------|----------|------------------|---------|
| Cloud embeddings | `NIMA_EMBEDDER=voyage` | voyage.ai | Off |
| Cloud embeddings | `NIMA_EMBEDDER=openai` | openai.com | Off |
| Memory pruner | `ANTHROPIC_API_KEY` set | anthropic.com | Off |
| Ollama embeddings | `NIMA_EMBEDDER=ollama` | localhost:11434 | Off |
| HiveMind | `HIVE_ENABLED=true` | Redis pub/sub | Off |
| Precognition | Using external LLM | Configured endpoint | Off |

## Security

### What Gets Installed

| Component | Location | Purpose |
|-----------|----------|---------|
| Python core (`nima_core/`) | `~/.nima/` | Memory, affect, cognition |
| OpenClaw hooks | `~/.openclaw/extensions/nima-*/` | Capture, recall, affect |
| SQLite database | `~/.nima/memory/graph.sqlite` | Persistent storage |
| Logs | `~/.nima/logs/` | Debug logs (optional) |

### Credential Handling

| Env Var | Required? | Network Calls? | Purpose |
|---------|-----------|----------------|---------|
| `NIMA_EMBEDDER=local` | No | ❌ | Default — offline embeddings |
| `VOYAGE_API_KEY` | Only if using Voyage | ✅ voyage.ai | Cloud embeddings |
| `OPENAI_API_KEY` | Only if using OpenAI | ✅ openai.com | Cloud embeddings |
| `ANTHROPIC_API_KEY` | Only if using pruner | ✅ anthropic.com | Memory distillation |
| `NIMA_OLLAMA_MODEL` | Only if using Ollama | ❌ (localhost) | Local GPU embeddings |

**Recommendation:** Start with `NIMA_EMBEDDER=local` (default). Only enable cloud providers when you need better embedding quality.
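The gating this table describes can be sketched as a small dispatch helper; `pick_embedder` is a hypothetical name for illustration, not part of the package's API:

```python
import os

def pick_embedder(env=os.environ):
    """Illustrative dispatch mirroring the table above: local by default,
    cloud providers only when explicitly requested via NIMA_EMBEDDER."""
    provider = env.get("NIMA_EMBEDDER", "local")
    if provider == "voyage" and not env.get("VOYAGE_API_KEY"):
        raise RuntimeError("NIMA_EMBEDDER=voyage requires VOYAGE_API_KEY")
    if provider == "openai" and not env.get("OPENAI_API_KEY"):
        raise RuntimeError("NIMA_EMBEDDER=openai requires OPENAI_API_KEY")
    if provider not in {"local", "voyage", "openai", "ollama"}:
        raise ValueError(f"unknown embedder: {provider}")
    return provider

# With no env vars set, the default never touches the network:
# pick_embedder({}) -> "local"
```

The point of this shape is that forgetting a key fails loudly at startup rather than silently falling back to a networked provider.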

### Safety Features

- **Input filtering** — System messages, heartbeats, and duplicates are filtered before capture
- **FTS5 injection prevention** — Parameterized queries prevent SQL injection
- **Path traversal protection** — All file paths are sanitized
- **Temp file cleanup** — Automatic cleanup of temporary files
- **API timeouts** — Network calls have reasonable timeouts (30s Voyage, 10s local)
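As an illustration of the path-traversal guard, here is a stdlib-only sketch under our own assumptions (the `resolve_under` helper is ours, not NIMA's actual sanitizer):

```python
import os.path

def resolve_under(base, user_path):
    """Resolve user_path relative to base and refuse anything that
    escapes base (e.g. via '..' components). Illustrative only."""
    base = os.path.realpath(os.path.expanduser(base))
    target = os.path.realpath(os.path.join(base, user_path))
    if os.path.commonpath([base, target]) != base:
        raise ValueError(f"path escapes {base}: {user_path}")
    return target

safe = resolve_under("~/.nima", "memory/graph.sqlite")   # stays inside ~/.nima
# resolve_under("~/.nima", "../.ssh/id_rsa") raises ValueError
```

Resolving symlinks with `realpath` before the prefix check matters; comparing raw strings would let a symlink inside the data directory point anywhere.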

### Best Practices

1. **Review before installing** — Inspect `install.sh` and hook files before running
2. **Backup config** — Back up `~/.openclaw/openclaw.json` before adding hooks
3. **Don't run as root** — Installation writes to user home directories
4. **Use containerized envs** — Test in a VM or container first if unsure
5. **Rotate API keys** — If using cloud embeddings, rotate keys periodically
6. **Monitor logs** — Check `~/.nima/logs/` for suspicious activity

### Data Locations

```text
~/.nima/
├── memory/
│   ├── graph.sqlite        # SQLite backend (default)
│   ├── ladybug.lbug        # LadybugDB backend (optional)
│   ├── embedding_cache.db  # Cached embeddings
│   └── embedding_index.npy # Vector index
├── affect/
│   └── affect_state.json   # Current emotional state
└── logs/                   # Debug logs (if enabled)

~/.openclaw/extensions/
├── nima-memory/            # Capture hook
├── nima-recall-live/       # Recall hook
└── nima-affect/            # Affect hook
```

**Controls:**
```json
{
  "plugins": {
    "entries": {
      "nima-memory": {
        "skip_subagents": true,
        "skip_heartbeats": true,
        "noise_filtering": { "filter_system_noise": true }
      }
    }
  }
}
```

## Configuration

### Embedding Providers

| Provider | Setup | Dims | Cost |
|----------|-------|------|------|
| **Local** (default) | `NIMA_EMBEDDER=local` | 384 | Free |
| **Voyage AI** | `NIMA_EMBEDDER=voyage` + `VOYAGE_API_KEY` | 1024 | $0.12/1M tok |
| **OpenAI** | `NIMA_EMBEDDER=openai` + `OPENAI_API_KEY` | 1536 | $0.13/1M tok |
| **Ollama** | `NIMA_EMBEDDER=ollama` + `NIMA_OLLAMA_MODEL` | 768 | Free |

### Database Backend

| | SQLite (default) | LadybugDB (recommended) |
|--|-----------------|------------------------|
| Text Search | 31ms | **9ms** (3.4x faster) |
| Vector Search | External | **Native HNSW** (18ms) |
| Graph Queries | SQL JOINs | **Native Cypher** |
| DB Size | ~91 MB | **~50 MB** (44% smaller) |

Upgrade: `pip install real-ladybug && python -c "from nima_core.storage import migrate; migrate()"`

### All Environment Variables

```bash
# Embedding (default: local)
NIMA_EMBEDDER=local|voyage|openai|ollama
VOYAGE_API_KEY=pa-xxx
OPENAI_API_KEY=sk-xxx
NIMA_OLLAMA_MODEL=nomic-embed-text

# Data paths
NIMA_DATA_DIR=~/.nima
NIMA_DB_PATH=~/.nima/memory/ladybug.lbug

# Memory pruner
NIMA_DISTILL_MODEL=claude-haiku-4-5
ANTHROPIC_API_KEY=sk-ant-xxx

# Logging
NIMA_LOG_LEVEL=INFO
NIMA_DEBUG_RECALL=1
```

## Hooks

| Hook | Fires | Does |
|------|-------|------|
| `nima-memory` | After save | Captures 3 layers → filters noise → stores in graph DB |
| `nima-recall-live` | Before LLM | Searches memories → scores by ecology → injects as context (3000-token budget) |
| `nima-affect` | On message | VADER sentiment → Panksepp 7-affect state → archetype modulation |
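The 3000-token budget suggests recall does some form of budgeted packing of scored memories. As a rough illustration only: the greedy strategy, the `pack_memories` name, and the 4-characters-per-token estimate below are all our assumptions, not NIMA's recall code:

```python
def pack_memories(memories, budget_tokens=3000):
    """Greedily pack highest-scoring memories until the token budget
    is exhausted. Token cost is crudely approximated as len(text) // 4."""
    picked, used = [], 0
    for score, text in sorted(memories, reverse=True):
        cost = max(1, len(text) // 4)
        if used + cost > budget_tokens:
            continue  # too expensive; try the next-best memory
        picked.append(text)
        used += cost
    return "\n".join(picked)

ctx = pack_memories([(0.9, "user prefers dark mode"),
                     (0.4, "x" * 20000),              # blows the budget, skipped
                     (0.7, "project uses Python 3.11")])
```

A real implementation would use the model's tokenizer rather than a character heuristic, but the budget discipline is the same: the prompt never grows without bound no matter how many memories match.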

### Installation

```bash
./install.sh
openclaw gateway restart
```

Or manual:
```bash
cp -r openclaw_hooks/nima-memory ~/.openclaw/extensions/
cp -r openclaw_hooks/nima-recall-live ~/.openclaw/extensions/
cp -r openclaw_hooks/nima-affect ~/.openclaw/extensions/
```

## Advanced Features

### Dream Consolidation
Nightly synthesis extracts insights and patterns from episodic memory:
```bash
python -m nima_core.dream_consolidation
# Or schedule via OpenClaw cron at 2 AM
```

### Memory Pruner
Distills old conversations into semantic gists, suppresses raw noise:
```bash
python -m nima_core.memory_pruner --min-age 14 --live
python -m nima_core.memory_pruner --restore 12345  # undo within 30 days
```
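The 30-day limbo is essentially soft deletion with a restore window. A stdlib-only sketch of that idea (the `suppress`/`restore` helpers and registry shape are ours, not the pruner's actual schema):

```python
import time

LIMBO_DAYS = 30

def suppress(registry, memory_id, now=None):
    """Soft-delete: record the suppression time instead of deleting."""
    registry[memory_id] = now if now is not None else time.time()

def restore(registry, memory_id, now=None):
    """Undo a suppression, but only within the limbo window."""
    now = now if now is not None else time.time()
    suppressed_at = registry.get(memory_id)
    if suppressed_at is None:
        return False
    if now - suppressed_at > LIMBO_DAYS * 86400:
        return False  # limbo expired; the memory is gone for good
    del registry[memory_id]
    return True
```

Keeping the timestamp rather than a boolean is what makes `--restore` possible: expiry is computed at restore time, so no background job has to hard-delete on day 30.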

### Hive Mind
Multi-agent memory sharing:
```python
from nima_core import HiveMind
hive = HiveMind(db_path="~/.nima/memory/ladybug.lbug")
context = hive.build_agent_context("research task", max_memories=8)
hive.capture_agent_result("agent-1", "result summary", "model-name")
```

### Precognition
Temporal pattern mining → predictive memory pre-loading:
```python
from nima_core import NimaPrecognition
precog = NimaPrecognition(db_path="~/.nima/memory/ladybug.lbug")
precog.run_mining_cycle()
```

### Lucid Moments
Spontaneous surfacing of emotionally-resonant memories (with safety: trauma filtering, quiet hours, daily caps):
```python
from nima_core import LucidMoments
lucid = LucidMoments(db_path="~/.nima/memory/ladybug.lbug")
moment = lucid.surface_moment()
```

### Affect System
Panksepp 7-affect emotional intelligence with personality archetypes:
```python
from nima_core import DynamicAffectSystem
affect = DynamicAffectSystem(identity_name="my_bot", baseline="guardian")
state = affect.process_input("I'm excited about this!")
# Archetypes: guardian, explorer, trickster, empath, sage
```
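A common way to model this kind of affect state is a vector of levels that spike on input and decay back toward an archetype baseline. The sketch below is our own minimal illustration of that pattern, not NIMA's implementation:

```python
AFFECTS = ("SEEKING", "RAGE", "FEAR", "LUST", "CARE", "PANIC", "PLAY")

def decay_step(state, baseline, rate=0.1):
    """Move each affect a fraction of the way back toward its baseline."""
    return {a: state[a] + rate * (baseline[a] - state[a]) for a in AFFECTS}

baseline = {a: 0.2 for a in AFFECTS}      # e.g. a calm "guardian" profile
state = dict(baseline, SEEKING=0.9)       # excitement bumped SEEKING
state = decay_step(state, baseline)       # SEEKING: 0.9 -> 0.83
```

Exponential decay toward a per-archetype baseline is what lets different personalities (guardian vs. trickster) return to different resting moods after the same stimulus.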

## API

```python
from nima_core import (
    DynamicAffectSystem,
    get_affect_system,
    HiveMind,
    NimaPrecognition,
    LucidMoments,
)

# Affect (thread-safe singleton)
affect = get_affect_system(identity_name="lilu")
state = affect.process_input("Hello!")

# Hive Mind
hive = HiveMind()
context = hive.build_agent_context("task description")

# Precognition
precog = NimaPrecognition()
precog.run_mining_cycle()

# Lucid Moments
lucid = LucidMoments()
moment = lucid.surface_moment()
```

## Changelog

See [CHANGELOG.md](./CHANGELOG.md) for full version history.

### Recent Releases
- **v3.0.4** (Feb 23, 2026) — Darwinian memory engine, new CLIs, installer, bug fixes
- **v2.5.0** (Feb 21, 2026) — Hive Mind, Precognition, Lucid Moments
- **v2.4.0** (Feb 20, 2026) — Dream Consolidation engine
- **v2.3.0** (Feb 19, 2026) — Memory Pruner, connection pool, Ollama support
- **v2.2.0** (Feb 19, 2026) — VADER Affect, 4-phase noise remediation, ecology scoring
- **v2.0.0** (Feb 13, 2026) — LadybugDB backend, security hardening, 348 tests

## License

MIT — free for any AI agent, commercial or personal.


---

## Referenced Files

> The following files are referenced in this skill and included for context.

### CHANGELOG.md

```markdown
# Changelog

All notable changes to NIMA Core will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [3.2.0] - 2026-03-04

### Added
- **`storage/` module** — New cognitive memory primitives:
  - `storage/temporal_decay.py` — ACT-R base-level activation scorer. Tracks per-memory access history and computes decay-weighted activation scores using formula `B_i = ln(Σ t_j^(−d))`. Enables time-aware memory retrieval (recently-accessed memories rank higher).
  - `storage/hebbian_updater.py` — Hebbian edge-weight manager for the memory graph. Strengthens associations between co-activated memories ("neurons that fire together, wire together") and decays unused edges. Integrates with `lazy_recall.py` for graph-augmented re-ranking.
  - `storage/__init__.py` — Clean module exports for both classes.
- `lazy_recall.py` now fully activates ACT-R temporal decay and Hebbian boost (both previously imported but had no backing module).

### Security
- All SQL uses parameterised `?` placeholders throughout `storage/` — no user-controlled data is ever interpolated into SQL text.
- `node_id` validated as non-empty string before any DB operation in `temporal_decay.py`.
- Integer type enforcement for node IDs in `hebbian_updater.py` (graph node IDs are always SQLite row integers).

### Changed
- Default database paths now use `NIMA_HOME` env var (default `~/.nima`) for portability across bots and installations. Previously pointed to `lilu_core/storage/data/` (Lilu-specific).

## [3.1.0] - 2026-02-26

### Fixed
- **CRITICAL: LadybugDB SIGSEGV on SET/CREATE/DELETE** — Root cause identified: `LOAD VECTOR` extension must be called before any mutation on tables with `FLOAT[512]` columns (like MemoryNode). Without it, Kùzu crashes with SIGSEGV. Added `LOAD VECTOR` calls to: `memory_pruner.py`, `lucid_moments.py`, `dream_db_sync.py`, and all LadybugDB connection helpers.

### Added
- **`nima_core/dream_db_sync.py`** — New module that syncs dream consolidation outputs (insights, patterns, dream runs, narratives) from JSON files to both SQLite and LadybugDB. Called automatically after dream consolidation and pruning.
- **Ghost-marking pipeline** — Memory pruner now syncs suppression registry → LadybugDB ghost marks after each pruning run. Batched in groups of 200 IDs.
- **SQLite dual-write in `ladybug_store.py`** — Every memory stored to LadybugDB is also written to SQLite with optional Voyage embedding for semantic search (requires `VOYAGE_API_KEY` env var).
- **Dream system SQLite tables** in `scripts/init_db.py`:
  - `nima_insights` — Dream-generated insights with confidence scores
  - `nima_patterns` — Cross-domain recurring patterns
  - `nima_dream_runs` — Dream consolidation run history
  - `nima_suppressed_memories` — Pruned memory records
  - `nima_pruner_runs` — Pruner execution log
  - `nima_lucid_moments` — Surfaced memory moments

### Changed
- **`dream_consolidation.py`** — Now calls `dream_db_sync.sync_all()` after each consolidation run to persist results to both databases.
- **`memory_pruner.py`** — After pruning, syncs ghost marks to LadybugDB via `dream_db_sync.sync_pruner_to_ladybug()`.

## [3.0.8] - 2026-02-24

### Added
- **Security section in SKILL.md** — Comprehensive security documentation covering:
  - What gets installed and where
  - Credential handling table (which env vars make network calls)
  - Safety features (input filtering, injection prevention, timeouts)
  - Best practices (review before install, don't run as root, use containers)
  - Data location reference
- **`.nimaignore` file** — Specification for excluding content from memory capture
  - Supports glob patterns (like .gitignore)
  - Defines filters for system messages, heartbeats, passwords/secrets
  - Note: Pattern matching implementation in hook is pending
- **`scripts/init_db.py`** — Extracted database initialization from install.sh
  - Standalone script with argparse
  - Verbose mode for debugging
  - Proper error handling

### Changed
- **`install.sh` refactored** — Cleaner, more verbose output
  - Uses `scripts/init_db.py` instead of inline Python
  - Shows each step clearly
  - Data directory configurable via `NIMA_DATA_DIR`

### Security
- **Transparency** — All install actions logged, no hidden operations
- **Defense in depth** — Multiple layers of input filtering
- **Minimal permissions** — No root required, user home only

## [3.0.7] - 2026-02-23

## [3.0.6] - 2026-02-23

### Fixed
- **CRITICAL:** SyntaxError in lucid_moments.py line 447 — unterminated f-string with literal newline, preventing NIMA from loading

## [3.0.5] - 2026-02-23

### Changed
- SKILL.md: remove internal post-mortem language, fix NIMA_DATA_DIR example (~/.nima/memory → ~/.nima), update changelog to v3.0.4, add Darwinian Memory + Installer to module table

## [3.0.4] - 2026-02-23

### Fixed
- **Version alignment:** Synced `__init__.py`, `README.md` badge, and all three OpenClaw hook `package.json` files to match canonical `setup.py` version `3.0.4`
- **nima-affect missing package.json:** Added `package.json` to `openclaw_hooks/nima-affect/` — consistent with `nima-memory` and `nima-recall-live` hook format
- Hook versions were scattered across `2.0.2`, `2.0.3`, `2.0.11` — all unified to `3.0.4`

## [3.0.3] - 2026-02-22

### Changed
- Minor internal refinements post `3.0.2` publish
- `setup.py` bumped to `3.0.3` → `3.0.4` for subsequent release

## [3.0.2] - 2026-02-22

### Fixed
- **CRITICAL:** ClawHub package was missing entire `nima_core/cognition/` directory (10 files) due to `.clawhubignore` glob pattern bug — `*` excluded subdirectory contents even when parent was re-included
- **CRITICAL:** All OpenClaw hook files missing from package (`openclaw_hooks/nima-memory/*.py`, `openclaw_hooks/nima-recall-live/*.py`, `openclaw_hooks/nima-affect/*`) — same `.clawhubignore` root cause
- Fixed `.clawhubignore` to use `!dir/**` pattern for recursive re-inclusion

### Changed
- README.md fully rewritten — consolidated all features (v2.0–v3.0), added package contents tree, simplified configuration docs, removed outdated sections
- Version badges updated to 3.0.2

## [3.0.0] - 2026-02-22

### Changed
- Version alignment across all modules to 3.0.0
- Package audit and dependency cleanup
- SKILL.md version bump

### Known Issues
- Package published to ClawHub was incomplete (fixed in 3.0.2)

## [2.5.0] - 2026-02-21

### Added
- **Hive Mind** (`nima_core/hive_mind.py`) — Proposal #7: Memory Entanglement.
  - `HiveMind` class: inject shared memory context into sub-agent prompts + capture results back to LadybugDB.
  - `HiveBus` class: Redis pub/sub message bus for real-time agent-to-agent communication. Channels: `hive` (broadcast), `role:{role}`, `agent:{id}`, `results:{swarm_id}`.
  - Optional: requires `redis-py` (`pip install nima-core[hive]`).
- **Precognition** (`nima_core/precognition.py`) — Proposal #4: Precognitive Memory Injection.
  - `NimaPrecognition` class: mine temporal patterns from LadybugDB, generate predictions via any OpenAI-compatible LLM, inject relevant precognitions into agent prompts.
  - Configurable: `db_path`, `llm_base_url`, `llm_model`, `voyage_api_key`, `lookback_days`.
  - Full cycle: `run_mining_cycle()` → `mine_patterns()` → `generate_precognitions()` → `store_precognitions()`.
  - Semantic dedup via SHA-256 pattern hashing; optional Voyage embeddings.
- **Lucid Moments** (`nima_core/lucid_moments.py`) — Proposal #8: Spontaneous Memory Surfacing.
  - `LucidMoments` class: surface emotionally-resonant memories unbidden via any delivery callback.
  - Scoring: age window (3–30 days), layer bonus, content richness, warm keywords.
  - Safety: trauma keyword filter, quiet hours, min gap, daily cap.
  - Enrichment: LLM transforms raw memories into natural "this just came to me" messages.
  - Fully configurable: quiet hours, `min_gap_hours`, `max_per_day`, `warm_keywords`, `persona_prompt`.

### Changed
- `setup.py`: version 2.4.0 → 2.5.0, added `[hive]` extra for `redis>=4.0.0`.
- `__init__.py`: lazy-imports for all three new modules (graceful if LadybugDB/redis unavailable).

## [2.4.0] - 2026-02-20

### Added
- **Dream Consolidation** (`nima_core/dream_consolidation.py`) — nightly memory synthesis engine.
  - Extracts `Insight` and `Pattern` objects from episodic memories via LLM.
  - VSA-style `blend_dream_vector` for semantic compression.
  - `DreamConsolidator` class with configurable LLM endpoint, lookback window, temperature.
  - `nima-dream` CLI entry point for scripted/cron usage.
- **Dream session state** — `DreamSession` dataclass tracks what was consolidated.

## [2.3.0] - 2026-02-19

### Added
- **Memory Pruner** (`nima_core/memory_pruner.py`) — Episodic distillation engine. Distills old conversation turns into semantic gists via LLM, suppresses raw noise in 30-day limbo. Configurable: `NIMA_DISTILL_MODEL`, `NIMA_DB_PATH`, `NIMA_DATA_DIR`, `NIMA_CAPTURE_CLI`. Pure stdlib (no `anthropic` package needed).
- **Logging** (`nima_core/logging_config.py`) — Singleton logger with file + console handlers. `NIMA_LOG_LEVEL` env var.
- **Metrics** (`nima_core/metrics.py`) — Thread-safe counters, timings, gauges. `Timer` context manager. Tagged metric support.
- **Connection Pool** (`nima_core/connection_pool.py`) — SQLite connection pool with WAL mode, max 5 connections, thread-safe.
- **Ollama embedding support** — `NIMA_EMBEDDER=ollama` with `NIMA_OLLAMA_MODEL` configuration.

### Fixed
- `__init__.py` — `__all__` NameError (used before definition)
- Memory pruner — Cypher injection prevention via layer whitelist
- Connection pool — Thread-safe `_waiters` counter, no double-decrement
- Logging — Correct log directory path (`NIMA_DATA_DIR/logs`, not parent)
- Metrics — Tagged metrics no longer overwrite each other in `get_summary()`

### Changed
- Version bump: 2.2.0 → 2.3.0
- Python requirement: 3.8+ (was 3.11+)
- All hardcoded paths replaced with env vars for portability

## [2.2.0] - 2026-02-19

### Added
- **VADER Affect Analyzer** — Contextual sentiment replacing lexicon-based detection
- **4-Phase Noise Remediation** — Empty validation → heartbeat filter → dedup → metrics
- **Resilient hook wrappers** — Auto-retry with exponential backoff and jitter
- **Ecology scoring** — Memory strength, decay, recency, surprise, dismissal in recall
- **Suppression registry** — File-based memory suppression with 30-day limbo

### Fixed
- Null contemplation layer crash
- Duplicate VADER/emotion lexicon keys
- Negation logic (proper 2-word window)
- Hardcoded venv paths → dynamic `os.homedir()`
- `--who` CLI filter (was a no-op)
- `maxRetries` clamped ≥ 1 in resilient wrapper
- Debug logging gated behind `NIMA_DEBUG_RECALL`
- Division by zero in cleanup script
- Ruff E701 lint issues

### Changed
- Recall token budget: 500 → 3000
- Shebang: hardcoded path → `#!/usr/bin/env python3`
- Turn IDs: full millisecond timestamps

## [2.1.0] - 2026-02-17

### Added
- Pre-release of VADER and noise remediation (shipped in v2.2.0)

## [2.0.3] - 2026-02-15

### Security
- Fixed path traversal vulnerability in affect_history.py (CRITICAL)
- Fixed temp file resource leaks in 3 files (HIGH)

### Fixed
- Corrected non-existent `json.JSONEncodeError` โ†’ `TypeError`/`ValueError`
- Improved exception handling โ€” replaced 5 generic catches with specific types

### Improved
- Better error visibility and debugging throughout

## [2.0.1] - 2026-02-14

### Fixed
- Thread-safe singleton with double-checked locking

### Security
- Clarified metadata requirements (Node.js, env vars)
- Added security disclosure for API key usage

## [2.0.0] - 2026-02-13

### Added
- **LadybugDB backend** with HNSW vector search (18ms query time)
- **Native graph traversal** with Cypher queries
- **nima-query CLI** for unified database queries
- SQL/FTS5 injection prevention
- Path traversal protection
- Temp file cleanup
- API timeouts (Voyage 30s, LadybugDB 10s)
- 348 unit tests with full coverage

### Performance
- 3.4x faster text search (9ms vs 31ms)
- 44% smaller database (50MB vs 91MB)
- 6x smaller context tokens (~30 vs ~180)

### Fixed
- Thread-safe singleton initialization

## [1.2.1] - 2026-02-10

### Added
- 8 consciousness systems (Φ, Global Workspace, self-awareness)
- Sparse Block VSA memory
- ConsciousnessCore unified interface

## [1.2.0] - 2026-02-08

### Added
- 4 Layer-2 composite affect engines
- Async affective processing
- Voyage AI embedding support

## [1.1.9] - 2026-02-05

### Fixed
- nima-recall hook spawning new Python process every bootstrap
- Performance: ~50-250x faster hook recall

---

## Release Notes Format

Each release includes:
- **Added** — New features
- **Changed** — Changes to existing functionality
- **Deprecated** — Soon-to-be removed features
- **Removed** — Removed features
- **Fixed** — Bug fixes
- **Security** — Security improvements

```



---

## Skill Companion Files

> Additional files collected from the skill directory layout.

### README.md

```markdown
<p align="center">
  <img src="assets/banner.png" alt="NIMA Core" width="700" />
</p>

<h1 align="center">NIMA Core</h1>

<p align="center">
  <strong>Noosphere Integrated Memory Architecture</strong><br/>
  Persistent memory, emotional intelligence, and semantic recall for AI agents.
</p>

<p align="center">
  <a href="https://nima-core.ai"><b>🌐 nima-core.ai</b></a> · 
  <a href="https://github.com/lilubot/nima-core">GitHub</a> · 
  <a href="https://clawhub.com/skills/nima-core">ClawHub</a> · 
  <a href="./CHANGELOG.md">Changelog</a>
</p>

<p align="center">
  <img src="https://img.shields.io/badge/version-3.2.0-blue" alt="Version" />
  <img src="https://img.shields.io/badge/python-3.9%2B-green" alt="Python" />
  <img src="https://img.shields.io/badge/node-18%2B-green" alt="Node" />
  <img src="https://img.shields.io/badge/license-MIT-brightgreen" alt="License" />
</p>

---

> *"Your AI wakes up fresh every session. NIMA gives it a past."*

NIMA Core is the memory system that makes AI agents **remember**. It captures conversations, encodes them as searchable memories with emotional context, and injects relevant history before every response — so your bot sounds like it's been paying attention all along.

**Works with any OpenClaw bot. One install script. Zero config to start.**

---

## ⚡ 30-Second Install

```bash
pip install nima-core && nima-core
```

That's it. The setup wizard handles everything:
- Creates `~/.nima/` directory
- Installs OpenClaw hooks
- Configures your embedding provider
- Restarts the gateway

**Or clone and install manually:**

```bash
git clone https://github.com/lilubot/nima-core.git
cd nima-core
./install.sh
openclaw gateway restart
```

Your bot now has persistent memory. Every conversation is captured, indexed, and recalled automatically.

---

## 🆕 What's New in v3.0

### Complete Cognitive Architecture

NIMA is no longer just memory โ€” it's a **full cognitive stack** for AI agents:

| Module | What It Does | Since |
|--------|-------------|-------|
| **Memory Capture** | 3-layer capture (input/contemplation/output) with 4-phase noise filtering | v2.0 |
| **Semantic Recall** | Vector + text hybrid search, ecology scoring, token-budgeted injection | v2.0 |
| **Dynamic Affect** | Panksepp 7-affect emotional state tracking (SEEKING, RAGE, FEAR, LUST, CARE, PANIC, PLAY) | v2.1 |
| **Memory Pruner** | LLM distillation of old conversations into semantic gists, 30-day suppression limbo | v2.3 |
| **Dream Consolidation** | Nightly synthesis: extracts insights and patterns from episodic memory via LLM | v2.4 |
| **Hive Mind** | Multi-agent memory sharing via shared LadybugDB + optional Redis pub/sub | v2.5 |
| **Precognition** | Temporal pattern mining → predictive memory pre-loading | v2.5 |
| **Lucid Moments** | Spontaneous surfacing of emotionally-resonant memories | v2.5 |

### v3.0.2 Bug Fixes
- **Fixed:** ClawHub package was missing the `nima_core/cognition/` directory and all OpenClaw hook files due to a `.clawhubignore` glob-pattern bug
- **Fixed:** All subdirectories now correctly included in published package

### v3.0.0 Highlights
- Version alignment across all modules
- Full package audit and dependency cleanup

---

## 🧠 How It Works

```text
  User message arrives
         │
         ▼
  ┌──────────────┐     ┌──────────────────────────┐
  │ nima-memory  │────▶│ Capture → Filter → Store │
  │  (on save)   │     │ 4-phase noise remediation│
  └──────────────┘     └──────────────────────────┘
         │
         ▼
  ┌──────────────┐     ┌──────────────────────────┐
  │ nima-recall  │────▶│ Search → Score → Inject  │
  │ (before LLM) │     │ Text + Vector + Ecology  │
  └──────────────┘     └──────────────────────────┘
         │
         ▼
  ┌──────────────┐     ┌──────────────────────────┐
  │ nima-affect  │────▶│ VADER → Panksepp 7-Affect│
  │ (on message) │     │ Emotional state tracking │
  └──────────────┘     └──────────────────────────┘
         │
         ▼
  Agent responds with memory + emotional awareness
```

**Three hooks, fully automatic:**

| Hook | When it fires | What it does |
|------|---------------|--------------|
| `nima-memory` | After each message | Captures text → filters noise → stores in graph DB |
| `nima-recall-live` | Before agent responds | Searches relevant memories → injects as context |
| `nima-affect` | On each message | Detects emotion → updates 7-dimensional affect state |
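
The recall hook's "token-budgeted injection" can be pictured as a greedy selection over scored memories. A minimal Python sketch; the function name, the `(score, text)` input shape, and the 4-characters-per-token estimate are illustrative assumptions, not NIMA's actual API:

```python
def select_memories(scored, budget_tokens, est_tokens=lambda t: max(1, len(t) // 4)):
    """Greedily pick highest-scoring memories until the token budget is spent.

    scored: iterable of (score, text) pairs. Hypothetical helper, not NIMA's
    real ranker; real recall also mixes text, vector, and ecology scores.
    """
    selected, used = [], 0
    for score, text in sorted(scored, reverse=True):
        cost = est_tokens(text)
        if used + cost <= budget_tokens:  # skip memories that would overflow
            selected.append(text)
            used += cost
    return selected
```

Higher-scoring memories are considered first, and anything that would push the injected context past the budget is simply skipped.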

---

## 📦 Package Contents

```text
nima-core/
├── SKILL.md                          # ClawHub skill definition
├── README.md                         # This file
├── CHANGELOG.md                      # Full version history
├── install.sh                        # One-command installer
├── setup.py                          # pip install support
├── requirements.txt                  # Core dependencies
│
├── nima_core/                        # Python core library
│   ├── __init__.py                   # Lazy imports, version, public API
│   ├── connection_pool.py            # SQLite connection pool (WAL, thread-safe)
│   ├── logging_config.py             # Singleton logger
│   ├── metrics.py                    # Thread-safe counters/timings
│   ├── memory_pruner.py              # Episodic distillation engine
│   ├── dream_consolidation.py        # Nightly memory synthesis
│   ├── hive_mind.py                  # Multi-agent memory sharing
│   ├── precognition.py               # Temporal pattern mining
│   ├── lucid_moments.py              # Spontaneous memory surfacing
│   └── cognition/                    # Emotional intelligence
│       ├── dynamic_affect.py         # Panksepp 7-affect system
│       ├── emotion_detection.py      # Text emotion extraction
│       ├── affect_correlation.py     # Cross-affect analysis
│       ├── affect_history.py         # Temporal affect tracking
│       ├── affect_interactions.py    # Affect coupling dynamics
│       ├── archetypes.py             # Personality baselines
│       ├── personality_profiles.py   # JSON personality configs
│       ├── response_modulator_v2.py  # Affect → response modulation
│       └── exceptions.py             # Custom exceptions
│
├── openclaw_hooks/                   # OpenClaw plugin hooks
│   ├── nima-memory/                  # Capture hook
│   │   ├── index.js                  # Hook entry point
│   │   ├── openclaw.plugin.json      # Plugin manifest
│   │   ├── ladybug_store.py          # LadybugDB storage backend
│   │   ├── embeddings.py             # Multi-provider embedding
│   │   ├── backfill.py               # Historical transcript import
│   │   ├── health_check.py           # DB integrity checks
│   │   └── ...                       # Migration, benchmarks, docs
│   ├── nima-recall-live/             # Recall hook
│   │   ├── index.js                  # Hook entry point
│   │   ├── lazy_recall.py            # Current recall engine
│   │   ├── ladybug_recall.py         # LadybugDB-native recall
│   │   └── build_embedding_index.py  # Offline index builder
│   ├── nima-affect/                  # Affect hook
│   │   ├── index.js                  # Hook entry point
│   │   ├── vader-affect.js           # VADER sentiment analyzer
│   │   └── emotion-lexicon.js        # Emotion keyword lexicon
│   └── shared/                       # Shared utilities
│       ├── resilient.js              # Auto-retry with backoff
│       └── error-handling.js         # Graceful error wrappers
```

---

## 🔧 Configuration

### Embedding Providers

NIMA needs an embedding model to create searchable memory vectors. **Pick one:**

| Provider | Setup | Dims | Cost | Best For |
|----------|-------|------|------|----------|
| **๐Ÿ  Local** (default) | `NIMA_EMBEDDER=local` + `pip install sentence-transformers` | 384 | Free | Privacy, offline, dev |
| **๐Ÿš€ Voyage AI** | `NIMA_EMBEDDER=voyage` + `VOYAGE_API_KEY` | 1024 | $0.12/1M tok | Production (best quality/cost) |
| **๐Ÿค– OpenAI** | `NIMA_EMBEDDER=openai` + `OPENAI_API_KEY` | 1536 | $0.13/1M tok | If you already use OpenAI |
| **๐Ÿฆ™ Ollama** | `NIMA_EMBEDDER=ollama` + `NIMA_OLLAMA_MODEL` | 768 | Free | Local GPU |

> **Don't have a preference?** Leave `NIMA_EMBEDDER` unset; it defaults to `local` with `all-MiniLM-L6-v2`. Free, offline, no API keys.
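
Provider selection boils down to reading `NIMA_EMBEDDER` and checking that the matching API key is present. A hedged sketch of that lookup (the `resolve_embedder` helper and `PROVIDERS` dict are hypothetical; only the env var names and dimensions come from the table above):

```python
import os

# Illustrative mirror of the provider table above; not NIMA's actual config API.
PROVIDERS = {
    "local":  {"dims": 384,  "env_key": None},
    "voyage": {"dims": 1024, "env_key": "VOYAGE_API_KEY"},
    "openai": {"dims": 1536, "env_key": "OPENAI_API_KEY"},
    "ollama": {"dims": 768,  "env_key": None},
}

def resolve_embedder():
    """Pick a provider from NIMA_EMBEDDER, defaulting to the free local model."""
    name = os.environ.get("NIMA_EMBEDDER", "local")
    if name not in PROVIDERS:
        raise ValueError(f"unknown NIMA_EMBEDDER: {name!r}")
    cfg = PROVIDERS[name]
    # Cloud providers need a key; local/ollama run without one.
    if cfg["env_key"] and not os.environ.get(cfg["env_key"]):
        raise RuntimeError(f"{name} embedder requires {cfg['env_key']} to be set")
    return {"provider": name, **cfg}
```

With no environment configured, this resolves to the local 384-dimension model, matching the zero-config default described above.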

### Database Backend

| | SQLite (default) | LadybugDB (recommended) |
|--|-----------------|------------------------|
| **Setup** | Zero config | `pip install real-ladybug` |
| **Text Search** | 31ms | **9ms** (3.4x faster) |
| **Vector Search** | External only | **Native HNSW** (18ms) |
| **Graph Queries** | SQL JOINs | **Native Cypher** |
| **DB Size** | ~91 MB | **~50 MB** (44% smaller) |

```bash
# Upgrade to LadybugDB when ready:
pip install real-ladybug
python -c "from nima_core.storage import migrate; migrate()"
```

#### LadybugDB Schema

When using the recommended **LadybugDB** backend, a custom graph database (Kùzu-based) that uses the Cypher query language and stores data in a binary `.lbug` file, the schema is as follows.

> **v3.2.0:** Schema now includes the `DreamNode`, `InsightNode`, and `PatternNode` node types and the `derived_from` relationship. Run `python scripts/init_ladybug.py` to initialize a fresh database with the complete schema.

**Node Tables:**

| Table | Description |
|-------|-------------|
| **MemoryNode** | Primary memory storage |
| | `id INT64 PRIMARY KEY` |
| | `timestamp INT64` - Unix ms |
| | `layer STRING` - Memory type (see Layer Types below) |
| | `text STRING` - Full memory content |
| | `summary STRING` - Truncated to 200 chars |
| | `who STRING` - Person associated (David, Lilu, etc.) |
| | `affect_json STRING` - Emotion state at capture (JSON) |
| | `session_key STRING` - Source session |
| | `conversation_id STRING` - Conversation context |
| | `turn_id STRING` - Turn within conversation |
| | `fe_score DOUBLE` - Free Energy score (importance proxy) |
| | `embedding FLOAT[512]` - Voyage AI semantic embedding |
| | `strength FLOAT` - Decay strength (default 1.0) |
| | `decay_rate FLOAT` - Forgetting rate (default 0.01) |
| | `last_accessed INT64` - Unix ms of last recall (ACT-R) |
| | `is_ghost BOOL` - Soft-deleted / suppressed flag |
| | `dismissal_count INT64` - Times dismissed by pruner |
| | `memory_type STRING` - Semantic category (e.g. `fact`, `event`) |
| | `importance DOUBLE` - Importance score for ranking |
| | `emotions STRING` - JSON array of detected emotion tags |
| | `themes STRING` - JSON array of topic/theme tags |
| | `source_agent STRING` - Hive bot that stored this memory |
| | `model STRING` - LLM model used during capture |
| **Turn** | Conversation turn structure |
| | `id INT64 PRIMARY KEY` |
| | `turn_id STRING` |
| | `timestamp INT64` |
| | `affect_json STRING` |
| **DreamNode** | Nightly dream consolidation narratives *(v3.2.0+)* |
| | `id STRING PRIMARY KEY` |
| | `date STRING` - Date of dream run (ISO 8601) |
| | `narrative STRING` - Full narrative markdown |
| | `source_count INT64` - Number of memories processed |
| | `created_at STRING` |
| **InsightNode** | Extracted insights from dream consolidation *(v3.2.0+)* |
| | `id STRING PRIMARY KEY` |
| | `content STRING` |
| | `type STRING` - `insight`, `pattern`, `synthesis`, etc. |
| | `confidence FLOAT` - 0.0–1.0 |
| | `sources STRING` - JSON array of source memory IDs |
| | `domains STRING` - JSON array of topic domains |
| | `timestamp STRING` |
| | `importance FLOAT` |
| | `validated BOOL` - Human-reviewed flag |
| **PatternNode** | Recurring patterns across memories *(v3.2.0+)* |
| | `id STRING PRIMARY KEY` |
| | `name STRING` |
| | `description STRING` |
| | `occurrences INT64` |
| | `domains STRING` - JSON array |
| | `examples STRING` - JSON array of example memory IDs |
| | `first_seen STRING` |
| | `last_seen STRING` |
| | `strength DOUBLE` - 0.0–1.0 |
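
The `strength`, `decay_rate`, and `last_accessed` fields suggest an ACT-R-style forgetting curve. One plausible reading, sketched in Python (the exponential form and the per-day time scale are assumptions; NIMA's actual activation formula is not documented here):

```python
import math
import time

def memory_strength(strength, decay_rate, last_accessed_ms, now_ms=None):
    """Hypothetical exponential-decay reading of the schema fields above.

    strength / decay_rate / last_accessed_ms map to the MemoryNode columns;
    the result decays from `strength` toward 0 as days pass without recall.
    """
    if now_ms is None:
        now_ms = int(time.time() * 1000)
    days_idle = max(0.0, (now_ms - last_accessed_ms) / 86_400_000)  # ms per day
    return strength * math.exp(-decay_rate * days_idle)
```

With the defaults (`strength=1.0`, `decay_rate=0.01`), a memory untouched for a day would retain about 99% of its strength under this model; each recall would reset `last_accessed` and slow forgetting.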

**Relationship Tables:**

| Relationship | From → To | Properties |
|--------------|-----------|------------|
| `relates_to` | MemoryNode → MemoryNode | `relation STRING`, `weight DOUBLE` |
| `has_input` | Turn → MemoryNode | (none) |
| `has_contemplation` | Turn → MemoryNode | (none) |
| `has_output` | Turn → MemoryNode | (none) |
| `derived_from` | InsightNode → MemoryNode, DreamNode → MemoryNode | (none) |

##### Layer Types (valid `layer` values)

| Layer | Description |
|-------|-------------|
| `episodic` | Raw conversation turns (input/output) |
| `semantic` | Extracted facts, preferences, knowledge |
| `dream` | Consolidated insights from nightly synthesis |
| `insight` | Key realization or connection |
| `pattern` | Recurring behavioral pattern |
| `synthesis` | Cross-domain connection |
| `consolidation` | Memory pruner output (distilled) |
| `precognition` | Predicted future session |
| `lucid` | Spontaneously surfaced memory |
| `input` | User input from a conversation turn |
| `output` | Agent output from a conversation turn |
| `contemplation` | Agent's internal thought process |
| `legacy_vsa` | Older memory type from VSA-based systems |

##### Relation Types (valid `relation` values in `relates_to`)

| Relation | Meaning |
|----------|---------|
| `related_to` | General association |
| `caused_by` | Causal chain |
| `reminds_of` | Analogy or similarity |
| `contradicts` | Opposing view |
| `supports` | Reinforcing evidence |
| `elicits` | Emotion trigger |
| `refers_to` | Topic reference |
| `part_of` | Compositional hierarchy |
| `triggered` | An input that triggered a contemplation |
| `produced` | A contemplation that produced an output |
| `responded_to` | An output that responded to an input |

##### Indexes (for performance)

```cypher
// Recommended indexes
CREATE INDEX idx_memory_node_layer ON MemoryNode(layer);
CREATE INDEX idx_memory_node_who ON MemoryNode(who);
CREATE INDEX idx_memory_node_timestamp ON MemoryNode(timestamp);
CREATE INDEX idx_memory_node_session ON MemoryNode(session_key);
CREATE INDEX idx_turn_timestamp ON Turn(timestamp);
```

##### Schema Version Tracking

Schema migrations are tracked in the database:

```cypher
// Create schema-version tracking table
CREATE NODE TABLE IF NOT EXISTS _nima_schema (
    version INT64 PRIMARY KEY,
    applied_at INT64,
    description STRING
);
```

```cypher
// Check current schema version
MATCH (s:_nima_schema)
RETURN s.version, s.description ORDER BY s.version DESC LIMIT 1;
// Current version: 003
```

##### Example Cypher Queries

```cypher
// Get recent memories for a person
MATCH (t:Turn)-[:has_input|has_output]->(m:MemoryNode {who: 'David'})
WHERE t.timestamp > 1700000000000
RETURN m ORDER BY m.timestamp DESC LIMIT 10;

// Find memories related to a topic
MATCH (m:MemoryNode)
WHERE m.text CONTAINS 'consciousness' OR m.summary CONTAINS 'consciousness'
RETURN m ORDER BY m.fe_score DESC LIMIT 5;

// Get conversation thread
MATCH (t:Turn)-[:has_input|has_output]->(m:MemoryNode {conversation_id: 'abc123'})
RETURN m ORDER BY t.timestamp;

// Find emotionally significant memories
MATCH (m:MemoryNode)
WHERE m.fe_score > 0.7
RETURN m ORDER BY m.fe_score DESC LIMIT 10;

// Get all related memories (graph traversal)
MATCH (m1:MemoryNode {id: 123})-[:relates_to]->(m2:MemoryNode)
RETURN m2;

// Find dream consolidations
MATCH (m:MemoryNode {layer: 'dream'})
RETURN m ORDER BY m.timestamp DESC LIMIT 5;

// Memories by time range
MATCH (m:MemoryNode)
WHERE m.timestamp >= 1704067200000 AND m.timestamp < 1704153600000
RETURN m ORDER BY m.timestamp;

// Get memory counts by layer
MATCH (m:MemoryNode)
RETURN m.layer, count(m) AS count ORDER BY count DESC;
```

**Supporting Files** (same `~/.nima/memory/` directory):

| File | What it is |
|------|------------|
| `graph.sqlite` | 50 MB - Graphiti temporal knowledge graph (separate system) |
| `embedding_index.npy` | 478 MB - NumPy vector index for semantic search |
| `embedding_cache.db` | SQLite - cached embeddings keyed by content hash |
| `precognitions.sqlite` | SQLite - predicted future session patterns |
| `faiss.index` | 16 MB - FAISS vector index (older, may be superseded) |
| `.nimaignore` | Ignore patterns for memory capture (see project root) |
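
Keying cached embeddings by content hash, as `embedding_cache.db` does, can be sketched with stdlib `sqlite3` and `hashlib`. The `EmbeddingCache` class and its one-table schema are hypothetical, not NIMA's real on-disk layout:

```python
import hashlib
import json
import sqlite3

def cache_key(text):
    """Stable key: SHA-256 of the content, so identical text hits the cache."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

class EmbeddingCache:
    """Toy content-hash embedding cache; illustrative schema only."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS cache (k TEXT PRIMARY KEY, v TEXT)"
        )

    def get(self, text):
        row = self.db.execute(
            "SELECT v FROM cache WHERE k = ?", (cache_key(text),)
        ).fetchone()
        return json.loads(row[0]) if row else None  # None = cache miss

    def put(self, text, vector):
        self.db.execute(
            "INSERT OR REPLACE INTO cache VALUES (?, ?)",
            (cache_key(text), json.dumps(vector)),
        )
        self.db.commit()
```

The point of hashing the content rather than storing raw text as the key is that lookups stay fixed-width and re-embedding is skipped whenever the exact same text recurs.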

### Environment Variables

```bash
# Embedding (default: local - no keys needed)
NIMA_EMBEDDER=local|voyage|openai|ollama
VOYAGE_API_KEY=pa-xxx
OPENAI_API_KEY=sk-xxx
NIMA_OLLAMA_MODEL=nomic-embed-text

# Data paths
NIMA_DATA_DIR=~/.nima/memory
NIMA_DB_PATH=~/.nima/memory/ladybug.lbug

# Memory pruner (optional)
NIMA_DISTILL_MODEL=claude-haiku-4-5
ANTHROPIC_API_KEY=sk-ant-xxx

# Logging
NIMA_LOG_LEVEL=INFO
NIMA_DEBUG_RECALL=1
```

---

## 🔌 Hook Installation

### Quick Install
```bash
./install.sh
openclaw gateway restart
```

### Manual Install
```bash
# Copy hooks to extensions
cp -r openclaw_hooks/nima-memory ~/.openclaw/extensions/
cp -r openclaw_hooks/nima-recall-live ~/.openclaw/extensions/
cp -r openclaw_hooks/nima-affect ~/.openclaw/extensions/
```

Then allow the plugins in `openclaw.json`:

```json
{
  "plugins": {
    "allow": ["nima-memory", "nima-recall-live", "nima-affect"]
  }
}
```

Finally, restart the gateway:

```bash
openclaw gateway restart
```

### Verify
```bash
openclaw status          # Hooks loaded?
ls ~/.nima/memory/       # Memories captured?
cat ~/.nima/affect/affect_state.json  # Affect state?
```

---

## 🎭 Affect System

Tracks emotional state using **Panksepp's 7 primary affects**:

| Affect | Feels Like | Triggers |
|--------|-----------|----------|
| **SEEKING** | Curiosity, anticipation | Questions, new topics |
| **RAGE** | Frustration, boundaries | Conflict, demands |
| **FEAR** | Caution, vigilance | Threats, uncertainty |
| **LUST** | Desire, motivation | Goals, enthusiasm |
| **CARE** | Nurturing, empathy | Sharing, vulnerability |
| **PANIC** | Distress, sensitivity | Loss, rejection |
| **PLAY** | Joy, humor, bonding | Jokes, creativity |
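
A toy model of the tracker above: nudge one of the seven affects in response to a stimulus, relax every affect toward a baseline, and clamp to [0, 1]. The `apply_stimulus` function and its decay constant are illustrative, not NIMA's actual update rule:

```python
AFFECTS = ("SEEKING", "RAGE", "FEAR", "LUST", "CARE", "PANIC", "PLAY")

def new_state(baseline=0.3):
    """Fresh 7-dimensional affect state, all affects at baseline."""
    return {a: baseline for a in AFFECTS}

def apply_stimulus(state, affect, delta, decay=0.1, baseline=0.3):
    """Nudge one affect by `delta`, then relax all affects toward baseline."""
    if affect not in AFFECTS:
        raise ValueError(f"unknown affect: {affect}")
    out = {}
    for a, v in state.items():
        v = v + (delta if a == affect else 0.0)  # stimulus hits one affect
        v = v + decay * (baseline - v)           # homeostatic pull to baseline
        out[a] = min(1.0, max(0.0, v))           # clamp to [0, 1]
    return out
```

Under this model a joke would raise PLAY while the other six drift back toward their resting levels, which is the qualitative behavior the table describes.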

### Archetype Presets

```python
from nima_core import DynamicAffectSystem
affect = DynamicAffectSystem(identity_name="my_bot", baseline="guardian")
```

| Archetype | Vibe | High | Low |
|-----------|------|------|-----|
| **Guardian** | Protective, warm | CARE, SEEKING | PLAY |
| **Explorer** | Curious, bold | SEEKING, PLAY | FEAR |
| **Trickster** | Witty, irreverent | PLAY, SEEKING | CARE |
| **Empath** | Deeply feeling | CARE, PANIC | RAGE |
| **Sage** | Balanced, wise | SEEKING | All balanced |

---

## 🌙 Dream Consolidation

Nightly synthesis extracts insights and patterns from recent memories:

```bash
# Run manually
python -m nima_core.dream_consolidation

# Or schedule via OpenClaw cron (runs at 2 AM)
```

### How It Works
1. Pulls recent episodic memories from LadybugDB
2. LLM extracts `Insight` and `Pattern` objects
3. VSA-style vector blending compresses semantics
4. Stores consolidated dream memories back to DB
5. Prunes raw material after successful consolidation
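
Step 3's "VSA-style vector blending" can be approximated as superposition: an elementwise mean followed by L2 normalization. A sketch under that assumption (the real algorithm may differ):

```python
import math

def blend(vectors):
    """Superposition-style blend of equal-length vectors.

    Elementwise mean, then L2-normalize, so the result points in the
    'average direction' of its sources. Illustrative stand-in only.
    """
    n = len(vectors)
    mean = [sum(col) / n for col in zip(*vectors)]
    norm = math.sqrt(sum(x * x for x in mean)) or 1.0  # guard zero vector
    return [x / norm for x in mean]
```

Blending several episodic embeddings this way yields one unit vector that stays close (by cosine similarity) to each of its sources, which is the compression property consolidation relies on.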

---

## ๐Ÿ Hive Mind

Share memory across multiple agents:

```python
from nima_core import HiveMind

hive = HiveMind(db_path="~/.nima/memory/ladybug.lbug")

# Inject context into a sub-agent's prompt
context = hive.build_agent_context("research quantum computing", max_memories=8)

# Capture results back
hive.capture_agent_result("researcher-1", "Found 3 key papers...", "claude-sonnet-4-5")
```

Optional Redis pub/sub for real-time agent communication:
```bash
pip install nima-core[hive]
```

---

## 🔮 Precognition

Mine temporal patterns and pre-load relevant memories before the user asks:

```python
from nima_core import NimaPrecognition

precog = NimaPrecognition(db_path="~/.nima/memory/ladybug.lbug")
precog.run_mining_cycle()  # Extract patterns → generate predictions → store
```
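
Temporal pattern mining can be illustrated as bucketing memories by hour of day and keeping the dominant topic per bucket; a toy stand-in (`mine_hourly_topics` is not part of the NIMA API):

```python
from collections import Counter
from datetime import datetime, timezone

def mine_hourly_topics(events):
    """events: iterable of (timestamp_ms, topic) pairs.

    Returns {hour_of_day: most_common_topic}, a crude temporal pattern that
    a pre-loader could use to warm up relevant memories before the user asks.
    """
    buckets = {}
    for ts_ms, topic in events:
        hour = datetime.fromtimestamp(ts_ms / 1000, tz=timezone.utc).hour
        buckets.setdefault(hour, Counter())[topic] += 1
    return {h: c.most_common(1)[0][0] for h, c in buckets.items()}
```

If the user reliably discusses standups at 9 AM, the mined pattern lets a recall layer pre-load standup-related memories into context ahead of time.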

---

## 💡 Lucid Moments

Spontaneously surface emotionally-resonant memories:

```python
from nima_core import LucidMoments

lucid = LucidMoments(db_path="~/.nima/memory/ladybug.lbug")
moment = lucid.surface_moment()  # Returns a natural "this just came to me..." message
```

Safety guards: trauma-keyword filtering, quiet hours, daily caps, and minimum-gap enforcement.
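
Those guards compose naturally into a single gate. A hypothetical `may_surface` check combining quiet hours, the daily cap, and the minimum gap (the thresholds are invented defaults, not NIMA's):

```python
from datetime import datetime

def may_surface(now, last_surfaced, surfaced_today,
                quiet_hours=(22, 8), daily_cap=4, min_gap_hours=2):
    """Return True only if every safety guard allows surfacing a memory.

    now / last_surfaced: datetime (last_surfaced may be None).
    quiet_hours: (start_hour, end_hour) overnight window, e.g. 22:00-08:00.
    """
    start, end = quiet_hours
    in_quiet = now.hour >= start or now.hour < end  # overnight wrap-around
    if in_quiet or surfaced_today >= daily_cap:
        return False
    if last_surfaced is not None:
        if (now - last_surfaced).total_seconds() < min_gap_hours * 3600:
            return False
    return True
```

Trauma-keyword filtering would sit upstream of this gate, vetoing candidate memories before timing is even considered.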

---

## 🧹 Memory Pruner

Distill old conversations into compact semantic summaries:

```bash
# Preview
python -m nima_core.memory_pruner --min-age 14

# Live run
python -m nima_core.memory_pruner --min-age 14 --live

# Restore from suppression
python -m nima_core.memory_pruner --restore 12345
```

No database writes: suppression is file-based and fully reversible within 30 days.
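
File-based suppression with a 30-day window can be sketched as a JSON ledger mapping memory IDs to suppression timestamps. The ledger format and helper names are hypothetical; only the 30-day window comes from the text above:

```python
import json
import time
from pathlib import Path

SUPPRESSION_WINDOW_DAYS = 30  # matches the reversibility window above

def suppress(memory_id, ledger):
    """Record a suppression in a JSON ledger file (no database writes)."""
    ledger = Path(ledger)
    entries = json.loads(ledger.read_text()) if ledger.exists() else {}
    entries[str(memory_id)] = int(time.time())
    ledger.write_text(json.dumps(entries))

def is_restorable(memory_id, ledger, now=None):
    """True while the memory is still inside its suppression window."""
    ledger = Path(ledger)
    entries = json.loads(ledger.read_text()) if ledger.exists() else {}
    ts = entries.get(str(memory_id))
    if ts is None:
        return False
    now = time.time() if now is None else now
    return (now - ts) <= SUPPRESSION_WINDOW_DAYS * 86400
```

Because the ledger lives beside the database rather than in it, deleting or editing the file restores memories without touching stored rows, which is what makes the operation reversible.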

---

## โฐ Scheduling Setup (OpenClaw Cron)

NIMA's autonomous features (Lucid Moments, Dream Consolidation, Memory Pruner, and the Embedding Index rebuild) are designed to run on a schedule. Here's the recommended cron configuration for OpenClaw.

### Lucid Moments - 4× daily

Surfaces emotionally-resonant memories to your user at natural intervals.

```json
{
  "name": "lucid-memory-moments",
  "schedule": { "kind": "cron", "expr": "0 10,14,18,20 * * *", "tz": "America/New_York" },
  "sessionTarget": "isolated",
  "payload": {
    "kind": "agentTurn",
    "message": "Run the lucid moments check:\n1. Run: cd ~/.openclaw/workspace && .venv/bin/python3 lilu_core/cognition/lucid_moments.py --status\n2. If timing is good (says 'Ready'), run: .venv/bin/python3 lilu_core/cognition/lucid_moments.py\n3. If a pending file was written, read: cat ~/.openclaw/workspace/memory/pending_lucid_moment.txt\n4. Send that exact text to the user via the message tool\n5. If timing wasn't right, do nothing silently\nSend the message as a natural surfaced memory - no framing or prefix.",
    "timeoutSeconds": 120
  },
  "delivery": { "mode": "none" }
}
```

### Dream Consolidation - nightly at 2 AM

Consolidates the last 24h of memories and extracts patterns.

```json
{
  "name": "lilu_dream_consolidation",
  "schedule": { "kind": "cron", "expr": "0 2 * * *", "tz": "America/New_York" },
  "sessionTarget": "isolated",
  "payload": {
    "kind": "agentTurn",
    "message": "cd ~/.openclaw/workspace && .venv/bin/python3 lilu_core/lilu.py dream --hours 24. Report what memories were consolidated and any patterns found.",
    "timeoutSeconds": 600
  },
  "delivery": { "mode": "announce" }
}
```

### Memory Pruner - nightly at 2 AM

Distills old episodic turns into compact semantic memories.

```json
{
  "name": "nima-memory-pruner",
  "schedule": { "kind": "cron", "expr": "0 2 * * *", "tz": "America/New_York" },
  "sessionTarget": "isolated",
  "payload": {
    "kind": "agentTurn",
    "message": "cd ~/.openclaw/workspace && .venv/bin/python3 lilu_core/cognition/memory_pruner.py --min-age 7 --live --max-sessions 10. Report sessions distilled and turns suppressed.",
    "timeoutSeconds": 300
  },
  "delivery": { "mode": "announce" }
}
```

### Embedding Index Rebuild - nightly at 3 AM

Keeps vector recall indexes fresh.

```json
{
  "name": "nima-embedding-index",
  "schedule": { "kind": "cron", "expr": "0 3 * * *" },
  "sessionTarget": "main",
  "payload": {
    "kind": "systemEvent",
    "text": "Rebuild embedding index for NIMA memory recall"
  }
}
```

### Precognition

Precognition runs automatically on every incoming message via the `nima-affect` OpenClaw plugin; no separate cron is needed. The predicted session patterns it generates are injected into context before each agent response.

---

## 📊 Performance

| Operation | SQLite | LadybugDB |
|-----------|--------|-----------|
| Text search | 31ms | **9ms** |
| Vector search | n/a | **18ms** |
| Full recall cycle | ~50ms | **~30ms** |
| Context overhead | ~180 tokens | **~30 tokens** |

---

## 🔒 Privacy

- ✅ All data stored locally in `~/.nima/`
- ✅ Local embedding mode = **zero external calls**
- ✅ No NIMA-owned servers, no proprietary tracking, no analytics to external services
- ⚠️ Opt-in: HiveMind (Redis), Precognition (LLM), cloud embeddings - see SKILL.md for details
- 🔒 Embedding API calls happen only when explicitly enabled via env vars

---

## 🔄 Upgrading

### From v2.x → v3.x

```bash
git pull origin main
pip install -e .  # or: pip install nima-core --upgrade
openclaw gateway restart
```

No breaking changes: v3.0 is a package consolidation release. All v2.x configs continue to work.

### From v1.x → v2.x

```bash
cp -r ~/.nima ~/.nima.backup
rm -rf ~/.openclaw/extensions/nima-*
cp -r openclaw_hooks/* ~/.openclaw/extensions/
pip install real-ladybug  # optional
openclaw gateway restart
```

---

## ๐Ÿค Contributing

PRs welcome. Python 3.9+ compatibility, conventional commits.

```bash
git clone https://github.com/lilubot/nima-core.git
cd nima-core
pip install -e ".[vector]"
python -m pytest tests/
```

---

## License

MIT License - free for any AI agent, commercial or personal.

---

<p align="center">
  <a href="https://nima-core.ai"><b>🌐 nima-core.ai</b></a><br/>
  Built by the NIMA Core Team
</p>

```

### _meta.json

```json
{
  "owner": "dmdorta1111",
  "slug": "nima-core",
  "displayName": "Nima Core",
  "latest": {
    "version": "3.3.0",
    "publishedAt": 1772664487311,
    "commit": "https://github.com/openclaw/skills/commit/45f4a690aba8a3e916063d9b5c6de40bab1f39fc"
  },
  "history": [
    {
      "version": "3.1.4",
      "publishedAt": 1772144958808,
      "commit": "https://github.com/openclaw/skills/commit/78693a747dbf4cbdfc3be89d412a2764c98bffef"
    },
    {
      "version": "3.1.1",
      "publishedAt": 1771957651332,
      "commit": "https://github.com/openclaw/skills/commit/a7b8643b21c25d4dd916036e71cd7970e94aa356"
    },
    {
      "version": "3.0.9",
      "publishedAt": 1771944682255,
      "commit": "https://github.com/openclaw/skills/commit/f8cc7869740ca954e77479a6e5ccf81ebdce20f4"
    },
    {
      "version": "3.0.8",
      "publishedAt": 1771943101021,
      "commit": "https://github.com/openclaw/skills/commit/1cab65558240df864803a1e7be80ebfd3ca62e6a"
    },
    {
      "version": "3.0.7",
      "publishedAt": 1771890084978,
      "commit": "https://github.com/openclaw/skills/commit/b7d4334e56ce500a7067e9ccc0e982abdd464420"
    },
    {
      "version": "3.0.5",
      "publishedAt": 1771790484757,
      "commit": "https://github.com/openclaw/skills/commit/ba321856c2508265b08053dbcb36d96593830438"
    },
    {
      "version": "2.5.0",
      "publishedAt": 1771714446264,
      "commit": "https://github.com/openclaw/skills/commit/1cb3ff0381c4490f48c0f84dcb8935120d1cf8c5"
    },
    {
      "version": "2.4.0",
      "publishedAt": 1771563321891,
      "commit": "https://github.com/openclaw/skills/commit/5abaa6d6d5fbc2d789b261a5021ebcb6a1a93e00"
    },
    {
      "version": "2.0.12",
      "publishedAt": 1771359509622,
      "commit": "https://github.com/openclaw/skills/commit/7ab3078715475f79888c616fc1de339819d5d20f"
    },
    {
      "version": "2.0.11",
      "publishedAt": 1771276176203,
      "commit": "https://github.com/openclaw/skills/commit/556382037a1905731f1934c5d1b48c3238b5b48a"
    },
    {
      "version": "2.0.5",
      "publishedAt": 1771200937155,
      "commit": "https://github.com/openclaw/skills/commit/ac87a44e7cc8ec08b9bd4c1a5e05bdbf152a76b7"
    },
    {
      "version": "2.0.3",
      "publishedAt": 1771191526487,
      "commit": "https://github.com/openclaw/skills/commit/dfe10e538a9161aab465807bb7a97cbb4c1dbcea"
    },
    {
      "version": "1.2.1",
      "publishedAt": 1770921145653,
      "commit": "https://github.com/openclaw/skills/commit/8beb64fbc65575b10800539edcaac3caf75ac0fc"
    },
    {
      "version": "1.2.0",
      "publishedAt": 1770843883117,
      "commit": "https://github.com/openclaw/skills/commit/0edab9ccb7a7a97134251b34dfa2e2368c209360"
    },
    {
      "version": "1.1.8",
      "publishedAt": 1770681806068,
      "commit": "https://github.com/openclaw/skills/commit/d9a8720e6d1253976f3c6bf7f46c083f33139321"
    },
    {
      "version": "1.1.3",
      "publishedAt": 1770672761613,
      "commit": "https://github.com/openclaw/skills/commit/5a2f4192c832ab874151c848e466f9df0e469692"
    },
    {
      "version": "1.0.1",
      "publishedAt": 1770519020241,
      "commit": "https://github.com/openclaw/skills/commit/04ac6b51cc35469a308eea76136e2ea89f69f9c0"
    },
    {
      "version": "1.0.0",
      "publishedAt": 1770450849405,
      "commit": "https://github.com/openclaw/skills/commit/82820a2650a491916128db5b3cbb354cc54aacfe"
    }
  ]
}

```
