
Keel

An AI assistant whose memory belongs to you

2026-05-10

Product Introduction

  1. Definition: Keel is a local-first, open-source desktop application for macOS and Windows that functions as a personal AI assistant with user-owned memory. Technically, it is a context-aware interface for Large Language Models (LLMs) that reads from and writes to a local markdown file system.
  2. Core Value Proposition: Keel exists to provide a private, vendor-agnostic AI assistant experience. Its core proposition is data sovereignty and model flexibility, ensuring your conversation context and knowledge base remain in plain text files on your disk, not in a proprietary cloud, while allowing you to switch between AI providers like Claude, GPT, OpenRouter, or local Ollama models without lock-in.

Main Features

  1. Local-First Markdown Workspace: The application creates and manages a dedicated folder (e.g., ~/Keel) on your local drive. All data—including chat history, captured notes, tasks, and project wikis—is stored as plain .md markdown files. This enables editing with any text editor, backup with standard tools (like Git, Dropbox, or Time Machine), and full user ownership.
  2. Intelligent Context Engine: Keel continuously indexes your local markdown workspace. When you converse with your chosen LLM, the engine dynamically assembles the most relevant context from your files (notes, previous chats, project details) to inform the AI's responses. It also performs "auto-capture," writing valuable outputs like decisions, tasks, and summaries back into the appropriate markdown files automatically.
  3. Multi-Provider LLM Agnosticism: The application acts as a universal harness for multiple AI backends. Users can configure and seamlessly switch between API providers (Anthropic's Claude, OpenAI's GPT, OpenRouter's 300+ models) or locally run models via Ollama. The system can fall back to alternate providers if one is unavailable, ensuring uptime without compromising the local context.
  4. Integrated Knowledge & Task Management: Beyond chat, Keel includes built-in capabilities for structured knowledge and productivity. It can create queryable knowledge bases from project folders (/create-kb), manage markdown-backed tasks with due dates and reminders, transcribe audio locally using Whisper.cpp, run scheduled prompt jobs, and surface a dashboard of open tasks and daily briefs.
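Because the workspace is an ordinary folder of .md files, its shape is easy to picture. The layout below is purely illustrative (the folder names are assumptions, not Keel's documented structure):

```
~/Keel/
├── chats/          # conversation history, one .md file per session
├── notes/          # captured notes and auto-capture output
├── tasks/          # markdown task lists with due dates
└── projects/
    └── example/    # per-project notes and wiki pages
```

Any of these files can be opened, edited, versioned, or synced with standard tools, since nothing is stored in a proprietary database.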
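The context-assembly idea behind feature 2 can be sketched in a few lines. This is a minimal illustration using naive keyword overlap as the relevance score; Keel's actual engine and its function names are not documented here, so everything below is invented for explanation:

```python
import re


def score(query: str, text: str) -> int:
    """Naive relevance: how many query terms appear in the document."""
    terms = set(re.findall(r"\w+", query.lower()))
    words = set(re.findall(r"\w+", text.lower()))
    return len(terms & words)


def assemble_context(query: str, docs: list[tuple[str, str]], k: int = 2) -> str:
    """Rank (filename, markdown) pairs by relevance and keep the top k.

    The result is a single string suitable for prepending to an LLM prompt.
    """
    ranked = sorted(docs, key=lambda d: score(query, d[1]), reverse=True)
    kept = [d for d in ranked[:k] if score(query, d[1]) > 0]
    return "\n\n".join(f"## {name}\n{text}" for name, text in kept)
```

A real engine would use embeddings or a proper index rather than word overlap, but the shape is the same: select the most relevant local files, then hand that text to whichever model is configured.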
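The provider fallback described in feature 3 amounts to a loop over interchangeable backends. In this sketch the provider callables are stand-ins (a real implementation would wrap the Anthropic, OpenAI, OpenRouter, or Ollama clients), and the names are hypothetical, not Keel's real API:

```python
def ask_with_fallback(prompt: str, providers: list[tuple[str, callable]]) -> tuple[str, str]:
    """Try each (name, call) provider in order; return the first reply.

    Raises RuntimeError only if every configured provider fails.
    """
    errors = []
    for name, call in providers:
        try:
            return name, call(prompt)        # first success wins
        except Exception as exc:             # provider down, bad key, etc.
            errors.append((name, repr(exc)))
    raise RuntimeError(f"all providers failed: {errors}")
```

Because the context lives on disk rather than inside any one provider's session, falling back to a different model loses nothing: the same assembled context is simply sent to the next backend.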
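Markdown-backed tasks (feature 4) can be read with ordinary text parsing, which is what makes them portable. The `@due(...)` suffix below is an assumed syntax chosen for illustration; Keel's actual task format is not specified here:

```python
import re
from datetime import date

# Matches lines like "- [ ] write report @due(2026-05-12)" or "- [x] pay rent"
TASK = re.compile(r"- \[( |x)\] (.+?)(?:\s+@due\((\d{4}-\d{2}-\d{2})\))?$")


def parse_tasks(markdown: str) -> list[dict]:
    """Extract checkbox tasks from a markdown document."""
    tasks = []
    for line in markdown.splitlines():
        m = TASK.match(line.strip())
        if m:
            done, text, due = m.groups()
            tasks.append({
                "text": text,
                "done": done == "x",
                "due": date.fromisoformat(due) if due else None,
            })
    return tasks
```

Since the source of truth is plain text, the same file can be edited by hand, rendered by any markdown viewer, and re-read by the assistant to drive reminders or the dashboard.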

Problems Solved

  1. Pain Point: Vendor Lock-in and Data Portability. Traditional AI assistants silo your conversation history and context within their cloud platforms, making it difficult to export, back up, or switch services without losing your "brain."
  2. Pain Point: Lack of Privacy and Control. Cloud-based AI services often use conversation data for training, involve telemetry, and store sensitive information on external servers outside user control.
  3. Target Audience: Privacy-conscious professionals (developers, researchers, writers, product managers), indie hackers and knowledge workers who rely on AI for daily tasks but demand data ownership, and organizations requiring offline-capable or auditable AI tools.
  4. Use Cases: Daily Planning & Review (automated morning briefs and end-of-day summaries grounded in your notes); Project Knowledge Management (creating and maintaining a searchable wiki from markdown and PDFs for any project); Meeting Intelligence (local audio transcription and automatic extraction of decisions/action items into project notes).

Unique Advantages

  1. Differentiation: Unlike cloud-based assistants (ChatGPT, Claude.ai) or note-taking apps with AI features, Keel's architecture is fundamentally local-first and file-based. It prioritizes being a thin, intelligent layer over your existing file system rather than a walled-garden database. Competitors like Obsidian with AI plugins may offer local storage but typically lack Keel's deep, automated context assembly and multi-model agnosticism built into a unified interface.
  2. Key Innovation: The separation of context (memory) from computation (the LLM). Keel innovates by treating the LLM as a replaceable "tenant" in a user-owned "house" (the markdown workspace). This decoupling allows for unprecedented flexibility in model choice and guarantees persistence of your intellectual capital independent of any AI service provider's policies or existence.

Frequently Asked Questions (FAQ)

  1. Where does Keel store my data? Keel stores all your data locally on your computer in a folder you specify (default ~/Keel) as plain markdown (.md) and text files. No data is stored on Keel's servers by default.
  2. Can I use Keel completely offline? Yes, you can use Keel fully offline by configuring it to use a local LLM via Ollama. Features like audio transcription also use the local Whisper.cpp model, ensuring no data leaves your machine.
  3. Is Keel really free and open source? Yes, Keel is open-source software released under the permissive MIT license. You can review, modify, and distribute its code. The application itself is free to download and use, with no account or subscription required.
  4. How does Keel handle my API keys for services like Claude or GPT? Your API keys are stored locally in the application's secure settings on your machine. Keel uses these keys to make direct API calls to your chosen provider but does not transmit them to any other server.
  5. Can I sync my Keel workspace across multiple computers? Keel does not have built-in cloud sync. However, because your workspace is a standard folder of plain text files, you can easily sync it using any third-party service you trust, such as Dropbox, Google Drive, Syncthing, or Git, giving you full control over your sync method and privacy.
