Documentation Index
Fetch the complete documentation index at: https://docs.wolffi.sh/llms.txt
Use this file to discover all available pages before exploring further.
Installation
Prerequisites
Node.js (Required)
Wolffish capability plugins are .mjs files that execute in a Node.js runtime. Install Node.js 20+ from nodejs.org or via a version manager like nvm or fnm.
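To confirm you have a compatible runtime before continuing, you can check the installed version from a terminal:

```shell
# Print the installed Node.js version; Wolffish plugins need 20 or newer
node --version
```

If this prints a version below v20, upgrade via nodejs.org or your version manager before proceeding.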
Ollama (Required)
Wolffish depends on Ollama for local LLM inference; it is not optional. Install it from ollama.com. After installing, pull a model. Wolffish detects Ollama automatically on launch and won't function without it.
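For example, you can pull a small model once Ollama is installed. The model name here is only an example (it matches the one suggested later on this page); any Ollama model works:

```shell
# Pull a small model for local inference (model choice is an example)
ollama pull gemma3:4b

# List installed models to confirm the pull succeeded
ollama list
```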
Cloud Providers (Optional)
For higher quality responses, configure API keys for one or both:
- Anthropic (Claude): Get an API key at console.anthropic.com
- OpenAI (GPT): Get an API key at platform.openai.com
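How Wolffish reads these keys is not specified on this page. A common convention, assumed here rather than confirmed, is the standard environment variable names used by each provider's official SDK:

```shell
# Assumed convention: the standard env var names used by each provider's SDK.
# Wolffish's actual key configuration may differ; check its settings UI.
export ANTHROPIC_API_KEY="sk-ant-..."   # placeholder, not a real key
export OPENAI_API_KEY="sk-..."          # placeholder, not a real key
```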
Download
- macOS
- Windows
- Linux
Download the latest .dmg from the releases page.
- Open the .dmg file
- Drag Wolffish to your Applications folder
- Launch Wolffish from Applications
On first launch, macOS may warn about an unidentified developer. Right-click the app and select “Open” to bypass this.
Build from Source
Verify Installation
On first launch, Wolffish creates ~/.wolffish/workspace/ with default configuration files. You should see the onboarding screen prompting you to select an Ollama model.
To verify everything is working:
- Select a model (e.g., gemma3:4b)
- Send a test message like "Hello, what can you do?"
- You should see a streaming response
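If the onboarding screen shows no models or no response streams back, you can check from the command line that Ollama is reachable. Ollama's HTTP API listens on localhost port 11434 by default, and its /api/tags endpoint returns the locally installed models:

```shell
# Query Ollama's local API; prints installed models as JSON,
# or a notice if the Ollama service is not running
curl -s http://localhost:11434/api/tags || echo "Ollama is not running"
```

An empty "models" array means Ollama is running but no model has been pulled yet.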