Product Introduction
- Definition: Clippy is a macOS-native AI productivity assistant leveraging local large language models (LLMs) for contextual text generation. It operates as an overlay that integrates directly with active text fields across applications.
- Core Value Proposition: Eliminates manual copy-pasting by inserting AI-generated text directly into the active field via voice or text commands, with all processing kept fully local for privacy.
Main Features
- Contextual In-Place Editing:
  - How it works: Monitors the active text field (e.g., an email or document) via the macOS accessibility APIs; local LLMs (e.g., quantized Mistral, Llama 2) process the request and insert the output directly into the target field (see the accessibility-API sketch after this list).
  - Technologies: On-device model inference (via Ollama/LM Studio), system-level event monitoring.
- Voice Command Execution:
  - How it works: Converts speech to text via local Whisper.cpp models, interprets the command with the LLM, and executes the resulting action (e.g., "Create Linear ticket for bug fix"); see the Linear integration sketch after this list.
  - Technologies: Offline speech-to-text, natural language understanding (NLU), Linear/Google Calendar API integrations.
- Stealth Mode UI:
  - How it works: Non-intrusive popup triggered by a global shortcut (⌘+Shift+C); auto-hides after the task completes using macOS window management (see the overlay sketch after this list).
  - Technologies: Swift/Objective-C frameworks for low-latency overlay rendering.
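
The in-place editing flow can be pictured with a minimal Swift sketch, assuming the app has been granted Accessibility permission (AXIsProcessTrusted); the AX calls are real macOS APIs, but the helper names and simplified error handling are illustrative, not Clippy's actual implementation.

```swift
// Minimal sketch: read the focused text field and write model output back.
// Requires the Accessibility permission in System Settings > Privacy & Security.
import ApplicationServices

/// Returns the currently focused UI element and its current text, if any.
func focusedFieldText() -> (element: AXUIElement, text: String)? {
    let systemWide = AXUIElementCreateSystemWide()

    var focusedRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(systemWide,
                                        kAXFocusedUIElementAttribute as CFString,
                                        &focusedRef) == .success,
          let focused = focusedRef,
          CFGetTypeID(focused) == AXUIElementGetTypeID() else { return nil }
    let element = focused as! AXUIElement

    var valueRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(element,
                                        kAXValueAttribute as CFString,
                                        &valueRef) == .success,
          let text = valueRef as? String else { return nil }
    return (element, text)
}

/// Replaces the focused field's contents with generated text.
func insertGenerated(_ output: String, into element: AXUIElement) {
    _ = AXUIElementSetAttributeValue(element,
                                     kAXValueAttribute as CFString,
                                     output as CFTypeRef)
}
```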
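
For the voice-command path, here is a hedged sketch of what happens after Whisper.cpp produces a transcript and the LLM extracts an action: the assistant calls Linear's GraphQL API to create an issue. `LinearClient`, the personal API key, and the team ID are assumptions for illustration; the exact mutation fields should be checked against Linear's current documentation.

```swift
// Hedged sketch: create a Linear issue from an LLM-extracted action.
import Foundation

struct LinearClient {
    let apiKey: String   // personal API key (assumed available)
    let teamId: String   // target team ID (assumed known)

    func createIssue(title: String) async throws {
        var request = URLRequest(url: URL(string: "https://api.linear.app/graphql")!)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.setValue(apiKey, forHTTPHeaderField: "Authorization")

        let mutation = """
        mutation IssueCreate($title: String!, $teamId: String!) {
          issueCreate(input: { title: $title, teamId: $teamId }) { success }
        }
        """
        let body: [String: Any] = [
            "query": mutation,
            "variables": ["title": title, "teamId": teamId]
        ]
        request.httpBody = try JSONSerialization.data(withJSONObject: body)

        let (data, _) = try await URLSession.shared.data(for: request)
        print(String(data: data, encoding: .utf8) ?? "")
    }
}

// Example: the LLM has turned "Create Linear ticket for bug fix" into a title.
// try await LinearClient(apiKey: "...", teamId: "...")
//     .createIssue(title: "Fix crash when pasting into Gmail compose field")
```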
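
The stealth overlay can be sketched with AppKit: a non-activating floating panel shown from a global key-down monitor (global monitors also require the Accessibility permission and do not fire for the app's own windows). `OverlayController` is a hypothetical name, not Clippy's actual class.

```swift
// Minimal sketch of a stealth overlay: floating, non-activating, auto-hidden.
import AppKit

final class OverlayController {
    private var shortcutMonitor: Any?

    private let panel: NSPanel = {
        let p = NSPanel(contentRect: NSRect(x: 0, y: 0, width: 420, height: 120),
                        styleMask: [.nonactivatingPanel, .titled, .fullSizeContentView],
                        backing: .buffered,
                        defer: true)
        p.level = .floating            // stays above normal windows
        p.hidesOnDeactivate = false
        p.titleVisibility = .hidden
        return p
    }()

    /// Installs the ⌘⇧C shortcut (keyCode 8 is the ANSI "C" key).
    func installGlobalShortcut() {
        shortcutMonitor = NSEvent.addGlobalMonitorForEvents(matching: .keyDown) { [weak self] event in
            if event.modifierFlags.contains([.command, .shift]) && event.keyCode == 8 {
                self?.show()
            }
        }
    }

    func show() { panel.orderFrontRegardless() }
    func hide() { panel.orderOut(nil) }   // called once the task completes
}
```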
Problems Solved
- Pain Point: Disrupted workflows from switching apps to use AI tools; privacy risks of cloud-based LLMs handling sensitive data.
- Target Audience:
  - Privacy-focused professionals (legal, healthcare).
  - Developers/content creators needing rapid text generation.
  - Executives managing calendars/tasks via voice.
- Use Cases:
  - Drafting client emails in Gmail without leaving the tab.
  - Creating Linear tickets during coding sessions via voice.
  - Editing Google Calendar events hands-free during meetings.
Unique Advantages
- Differentiation:
  - Works fully offline with zero data transmission, unlike cloud-dependent tools (e.g., the ChatGPT desktop app).
  - Goes beyond snippet tools (e.g., TextExpander) by applying contextual LLM intelligence instead of static expansions.
- Key Innovation:
  - First macOS assistant combining local LLMs, system-level context awareness, and voice-controlled app integrations (Linear/Calendar) in a single lightweight (<100 MB) package.
Frequently Asked Questions (FAQ)
- Does Clippy require an internet connection?
  No. Clippy processes all data locally using on-device LLMs and speech models, so it works fully offline.
- How does Clippy maintain user privacy?
  All AI inference occurs on-device and no data is sent to external servers, which supports GDPR/HIPAA compliance when handling sensitive information.
- Which macOS versions are supported?
  Clippy is compatible with macOS Sonoma (14.0+) on Apple Silicon (M1/M2/M3) or Intel x86_64 architectures.
- Can Clippy integrate with other apps besides Linear/Calendar?
  The current v0.4.0-alpha.4 release supports Linear and Google Calendar; future updates will add APIs for custom integrations.
- What local LLMs does Clippy support?
  Clippy is optimized for quantized models (e.g., Mistral 7B, Llama 3 8B) served via Ollama, with automatic prompt templating for context awareness (a hedged sketch of a local Ollama call follows below).
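
A hedged sketch of such a local call, assuming Ollama is serving its HTTP API on the default port (11434) and a quantized model such as `mistral` has already been pulled; the `generateLocally` helper and the prompt are illustrative only, not Clippy's actual templating.

```swift
// Minimal sketch: send a prompt to a locally running Ollama instance.
import Foundation

struct GenerateResponse: Decodable {
    let response: String   // the generated text returned by /api/generate
}

func generateLocally(prompt: String, model: String = "mistral") async throws -> String {
    var request = URLRequest(url: URL(string: "http://localhost:11434/api/generate")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    let payload: [String: Any] = [
        "model": model,      // any locally pulled model tag
        "prompt": prompt,
        "stream": false      // return one JSON object instead of a token stream
    ]
    request.httpBody = try JSONSerialization.data(withJSONObject: payload)

    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(GenerateResponse.self, from: data).response
}

// Example usage (e.g., from the in-place editing flow):
// let reply = try await generateLocally(
//     prompt: "Rewrite the selected paragraph as a concise client email.")
```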
