Cairn
Knowledge, grounded in your data.
Local AI knowledge retrieval. Feed it documents and codebases, ask questions, and get answers with source attribution. Fully offline RAG powered by the Flower Garden.
Pipeline
From documents to answers.
Cairn's retrieval pipeline turns unstructured documents into grounded, cited responses. Every answer traces back to specific source passages.
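The stage flow can be modeled as a list of functions, each taking and returning a context dict. A minimal sketch for illustration only: the two stage bodies are toy stubs, not Cairn's implementation.

```python
def retrieve(ctx):
    # Stand-in for hybrid search: attach chunks relevant to the question.
    ctx["chunks"] = ["chunk about " + ctx["question"]]
    return ctx

def generate(ctx):
    # Stand-in for LLM generation over the retrieved context.
    ctx["answer"] = f"Answer grounded in {len(ctx['chunks'])} chunk(s)."
    return ctx

def run_pipeline(question, stages):
    """Thread a context dict through each stage in order."""
    ctx = {"question": question}
    for stage in stages:
        ctx = stage(ctx)
    return ctx

result = run_pipeline("auth middleware", [retrieve, generate])
```

The same shape extends to the full stage library: ingest, embed, index, retrieve, rerank, generate, cite, each reading from and writing to the shared context.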
STAGE LIBRARY
Ingest (load docs, chunk) → Dahlia (structured storage)
Embed (vectorize chunks) → Camellia (vector memory)
Index (keyword index) → Aster (full-text search)
Retrieve (hybrid search) → Camellia + Aster
Rerank (score + filter)
Generate (LLM with context)
Cite (source attribution)
Capabilities
What Cairn does.
Retrieval-augmented generation without the cloud dependency.
Hybrid Search
Semantic + Keyword
Combines Camellia's vector similarity with Aster's BM25 keyword ranking. Catches what either alone would miss.
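One common way to merge a semantic ranking with a keyword ranking is reciprocal rank fusion (RRF). This is an illustrative sketch of the technique, not necessarily Cairn's actual fusion strategy.

```python
def rrf_merge(vector_hits, keyword_hits, k=60):
    """Merge two ranked lists of chunk IDs into one fused ranking.

    Each list contributes 1 / (k + rank) per chunk; chunks that rank
    well in both lists rise to the top.
    """
    scores = {}
    for hits in (vector_hits, keyword_hits):
        for rank, chunk_id in enumerate(hits):
            scores[chunk_id] = scores.get(chunk_id, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

fused = rrf_merge(
    vector_hits=["c3", "c1", "c7"],   # similarity order (Camellia side)
    keyword_hits=["c1", "c9", "c3"],  # BM25 order (Aster side)
)
# "c1" and "c3" appear in both lists, so they lead the fused ranking.
```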
Source Citation
Grounded Answers
Every response links back to specific document passages. Verify claims against source material without leaving the interface.
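For citations to work, provenance has to travel with each chunk from ingest through generation. A minimal sketch of the idea; the field names here are illustrative assumptions, not Cairn's schema.

```python
def format_citation(chunk):
    """Render a citation line from a chunk's provenance metadata."""
    return (f"Based on {chunk['path']} "
            f"(lines {chunk['line_start']}-{chunk['line_end']}):")

chunk = {
    "path": "docs/auth/middleware.md",
    "line_start": 42,
    "line_end": 67,
    "text": "The middleware validates JWT tokens against...",
}
citation = format_citation(chunk)
```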
Multi-format Ingest
Documents In
Markdown, PDF, code files, plain text. Chunked and embedded on ingest. Add entire directories or individual files.
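Chunking can be as simple as overlapping character windows. A minimal sketch under that assumption; the window sizes and the strategy itself are illustrative, not Cairn's actual chunking rules.

```python
def chunk_text(text, size=400, overlap=50):
    """Split text into overlapping windows, recording each offset."""
    chunks = []
    step = size - overlap
    for start in range(0, len(text), step):
        chunks.append({"start": start, "text": text[start:start + size]})
        if start + size >= len(text):
            break
    return chunks

# A 1000-character document yields three overlapping windows.
chunks = chunk_text("x" * 1000)
```

Recording the start offset is what later lets an answer cite the exact passage a chunk came from.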
Local Inference
Offline RAG
Runs against any Crucible or llama.cpp model. Your documents never leave your machine. No API keys needed.
Pluggable Storage
Flower Garden
Built on Dahlia for document storage, Camellia for embeddings, Aster for keyword indexing. Swap backends without changing code.
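Swappable backends usually mean the pipeline codes against an interface rather than a concrete store. A sketch of that pattern using a structural protocol; the method names are assumptions for illustration, not Camellia's real API.

```python
from typing import Protocol

class VectorStore(Protocol):
    # Any backend exposing these two methods can be swapped in.
    def add(self, chunk_id: str, embedding: list[float]) -> None: ...
    def search(self, query: list[float], top_k: int) -> list[str]: ...

class InMemoryStore:
    """Toy backend satisfying the interface above."""
    def __init__(self):
        self.vectors = {}

    def add(self, chunk_id, embedding):
        self.vectors[chunk_id] = embedding

    def search(self, query, top_k):
        # Rank stored chunks by dot product with the query vector.
        def dot(v):
            return sum(a * b for a, b in zip(query, v))
        ranked = sorted(self.vectors,
                        key=lambda cid: dot(self.vectors[cid]),
                        reverse=True)
        return ranked[:top_k]

store = InMemoryStore()
store.add("c1", [1.0, 0.0])
store.add("c2", [0.0, 1.0])
```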
Incremental Updates
Living Knowledge
Add, update, or remove documents without rebuilding the entire index. Changed files are re-chunked and re-embedded automatically.
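Content hashing is a straightforward way to decide which files need re-chunking and re-embedding. A sketch of the idea; the hashing scheme is an assumption, not necessarily how Cairn tracks changes.

```python
import hashlib

def content_hash(text):
    return hashlib.sha256(text.encode()).hexdigest()

def files_to_reindex(current, previous_hashes):
    """current: {path: text} on disk now; previous_hashes: {path: hash} from the last run."""
    changed = [path for path, text in current.items()
               if previous_hashes.get(path) != content_hash(text)]
    removed = [path for path in previous_hashes if path not in current]
    return changed, removed

previous = {"a.md": content_hash("old text"), "b.md": content_hash("unchanged")}
changed, removed = files_to_reindex(
    {"a.md": "new text", "c.md": "brand new"}, previous)
# a.md changed, c.md is new, b.md disappeared.
```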
Get started
Feed it. Ask it.
# Initialize a knowledge base
cairn init my-docs
# Ingest a directory of documents
cairn ingest ./docs --recursive
# Ask a question
cairn ask "How does the authentication middleware work?"
# Get answer with source citations
# > Based on docs/auth/middleware.md (lines 42-67):
# > The middleware validates JWT tokens against...