Clara Logo

Clara


Privacy-First AI Assistant & Agent Builder

Chat with AI, create intelligent agents, and turn them into fully functional apps, all powered entirely by open-source models running on your own device.

Clara

Try Clara Online | Download Desktop App

Clara - Browser-Based AI for Chat, Agents & Image Generation Locally | Product Hunt

🔒 Privacy First

  • Local Execution: Clara connects directly to Ollama and uses open-source language and image generation models, all running on your device.
  • No Third-Party Clouds: Your data never leaves your machine. Zero telemetry, zero tracking.
  • Open-Source Technology: Built to leverage community-driven innovations, giving you full control over your AI stack.

✨ Key Features

AI Assistant

Chat with any Ollama-compatible model, including multimodal models that understand images:

Clara Assistant

🎨 Image Generation

Create amazing images from text prompts using Stable Diffusion models with ComfyUI integration:

Clara Image Generation

πŸ—οΈ Intelligent Agent Builder

Design custom AI agents with a node-based editor, then convert them into standalone apps without leaving Clara:

Clara Agent Builder

πŸ–ΌοΈ Image Gallery

Browse, search, and manage all generated images in one convenient gallery:

Clara Gallery

🚀 Installation Options

1. Docker (Recommended for Windows & Linux)

# Pull the image
docker pull claraverse/clara-ollama:latest

# Run with auto-restart
docker run -d --restart always -p 8069:8069 claraverse/clara-ollama:latest

Then visit http://localhost:8069 in your browser.
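
If you prefer Docker Compose, the same container can be declared in a compose file. A minimal sketch equivalent to the docker run command above (the service name clara is arbitrary):

# docker-compose.yml
services:
  clara:
    image: claraverse/clara-ollama:latest
    restart: always
    ports:
      - "8069:8069"

Start it with docker compose up -d and visit the same URL.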

2. Native Desktop Apps

macOS (Signed)

  • Download .dmg installer
  • Universal binary (works on both Intel and Apple Silicon)
  • Fully signed and notarized for enhanced security

Linux (Signed)

Windows

  • We recommend using the Docker version for best performance and security
  • If you need the native app: Download .exe installer
  • I don't have money for signing it 😢

3. Web Version

Prerequisites

  1. Install Ollama (required for all versions except Docker). Download it from Ollama's website.
  2. Connect Clara to the default Ollama endpoint: http://localhost:11434 (a quick check is shown below).
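
Before opening Clara, you can confirm that Ollama is reachable and has at least one model installed. A minimal sketch, assuming the default endpoint; llama3 is only an example model name:

# Pull a model so Clara has something to chat with (model name is an example)
ollama pull llama3

# List installed models via the Ollama API to confirm the endpoint is up
curl http://localhost:11434/api/tags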

📱 Download Desktop App

For faster performance and offline convenience, download the native desktop version:

Mac Distribution Note

For Mac Users Installing This App

If you see a message that the app is damaged or can't be opened:

  1. Right-click (or Control+click) on the app in Finder
  2. Select "Open" from the context menu
  3. Click "Open" on the security dialog
  4. If still blocked, go to System Preferences > Security & Privacy > General and click "Open Anyway"

This happens when the build you downloaded is not notarized with Apple. The app is safe to run, but macOS requires this extra step for unsigned applications.
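
If the steps above still don't unblock the app, clearing the quarantine attribute from Terminal is another common workaround. A sketch, assuming the app was copied to /Applications and is named Clara.app:

# Remove the quarantine flag macOS attaches to downloaded applications
xattr -cr /Applications/Clara.app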

For Developers

Building for macOS:

  • Development build (no notarization): npm run electron:build-mac-dev
  • Production build (with notarization, requires Apple Developer Program):
    1. Set environment variables APPLE_ID, APPLE_ID_PASSWORD (app-specific password), and APPLE_TEAM_ID
    2. Run npm run electron:build-mac

To get an Apple Team ID, join the Apple Developer Program.
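
A minimal sketch of the notarized production build, with placeholder values for the Apple credentials:

# Apple credentials (placeholders: substitute your own values)
export APPLE_ID="you@example.com"
export APPLE_ID_PASSWORD="app-specific-password"
export APPLE_TEAM_ID="XXXXXXXXXX"

# Build and notarize the macOS desktop app
npm run electron:build-mac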

πŸ‘©β€πŸ’» Dev Zone

Development Setup

# Clone the repository
git clone https://github.com/badboysm890/ClaraVerse.git
cd ClaraVerse

# Install dependencies
npm install

# Start development server (web)
npm run dev

# Start development server (desktop)
npm run electron:dev

Remote Ollama Connection

If Ollama runs on another machine:

  1. Enable CORS in Ollama (~/.ollama/config.json):
    {
      "origins": ["*"]
    }
  2. In Clara settings, specify: http://{IP_ADDRESS}:11434
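
Depending on the Ollama version, the same effect can be achieved with environment variables instead of the config file. A hedged sketch for the machine that runs Ollama:

# Allow cross-origin requests and listen on all interfaces, then start Ollama
export OLLAMA_ORIGINS="*"
export OLLAMA_HOST="0.0.0.0:11434"
ollama serve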

Building for Production

# Build web version
npm run build

# Build desktop app
npm run electron:build

🤝 Support & Contact

Have questions or need help? Reach out via praveensm890@gmail.com.
