A chat client for the Anthropic LLM API that can use MCP tools over WebSockets.
Prerequisites:
- git
- npm
- Clone the repository:

  ```
  git clone https://github.com/nick1udwig/kibitz.git
  cd kibitz
  ```

- Install dependencies:

  ```
  npm install
  ```

- Run the development server:

  ```
  npm run dev
  ```
- Open http://localhost:3000 in your browser.
- Open the Settings panel in the UI
- Enter your Anthropic API key (get one at https://console.anthropic.com).
- Optionally set a system prompt
- Configure MCP servers by running them with ws-mcp, then connecting to them in the Settings panel
Note: configuration is PER-PROJECT. When creating a new project, it copies some, but not all, of the current project's configuration: the API key, model, and system prompt are carried over, but MCP server connections are not.
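For context, connecting to a ws-mcp endpoint means speaking MCP's JSON-RPC framing over a WebSocket. A minimal sketch of a client's opening `initialize` request follows; the `protocolVersion` value, the URL, and the client name are illustrative assumptions, not kibitz's actual code:

```typescript
// Sketch: the JSON-RPC request an MCP client sends first over the socket.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

function makeInitializeRequest(id: number): JsonRpcRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "initialize",
    params: {
      protocolVersion: "2024-11-05", // assumption: a published MCP spec revision
      capabilities: {},
      clientInfo: { name: "kibitz", version: "0.0.0" }, // illustrative values
    },
  };
}

// Connecting from the browser (URL is illustrative; use your ws-mcp address):
// const ws = new WebSocket("ws://localhost:10125");
// ws.onopen = () => ws.send(JSON.stringify(makeInitializeRequest(1)));
```

After the server replies to `initialize`, the client can list and call the tools the wrapped MCP servers expose.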
Build-time options:
- Add a base path to the endpoint by building with `NEXT_PUBLIC_BASE_PATH` set (MUST start with a `/`).
- Change the default ws-mcp server URI by specifying `NEXT_PUBLIC_DEFAULT_WS_URI` (MUST start with a `/`).

For example:

```
NEXT_PUBLIC_BASE_PATH=/kibitz:kibitz:nick.kino NEXT_PUBLIC_DEFAULT_WS_URI=/fwd-ws:kibitz:nick.kino npm run build
```

Then copy the contents of `out/` into the package's `pkg/ui/` directory.
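As a sketch of how a base path variable is typically consumed in a static-export Next.js config (option names mirror standard Next.js options; the repo's actual `next.config` may differ):

```typescript
// next.config.ts -- sketch, assuming a standard static-export Next.js setup.
import type { NextConfig } from "next";

const basePath = process.env.NEXT_PUBLIC_BASE_PATH ?? "";

const nextConfig: NextConfig = {
  output: "export",      // writes the static site to out/
  basePath,              // e.g. /kibitz:kibitz:nick.kino
  assetPrefix: basePath, // serve assets under the same prefix
};

export default nextConfig;
```

With `output: "export"`, `npm run build` emits the static files that get copied into `pkg/ui/`.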