- Background
- Install Mattermost + `mattermost-plugin-ai`
- Usage
- Supported Backends
- Community Resources
- Contributing
- License
Join the "AI Exchange" community server channel where Mattermost's open source community is sharing the latest AI resources and innovations!
The Mattermost AI plugin lets you use a wide variety of LLMs within Mattermost, from open source self-hosted models like GPT4All to vendor-hosted services like OpenAI.
This plugin is currently experimental. Contributions and suggestions are welcome; see below!
The Mattermost AI Plugin is used as part of the Mattermost OpenOps framework for responsible development of AI-enhanced workflows, with the ability to maintain full data control and data portability across different AI backends.
- Download the latest release from https://github.com/mattermost/mattermost-plugin-ai/releases
- Upload it to your server via System Console > Plugin Management.
- Enable the plugin and configure the settings as desired.
- Clone and enter this repository: `git clone https://github.com/mattermost/mattermost-plugin-ai && cd mattermost-plugin-ai`
- Install `mattermost-plugin-ai` on Mattermost: `MM_SERVICESETTINGS_SITEURL=http://localhost:8065 MM_ADMIN_USERNAME=<YOUR_USERNAME> MM_ADMIN_PASSWORD=<YOUR_PASSWORD> make deploy`
- Access Mattermost and configure the plugin:
- Open Mattermost at http://localhost:8065
- Select View in Browser
- In the top left Mattermost menu, click System Console ➡️ Mattermost AI Plugin
- Enable the plugin and configure plugin settings as desired. See Supported Backends.
See our OpenOps demo setup for an easy way to try things out.
Chat with an LLM right inside the Mattermost interface. Answers are streamed, so you don't have to wait:
Use the post menu or the `/summarize` command to get a summary of the thread in a Direct Message from the AI bot:
Respond to the bot post to ask follow-up questions:
Just mention @ai anywhere in Mattermost to ask it to respond. It will be given the context of the thread you are participating in:
Create meeting summaries! Designed to work with the Calls plugin's recording feature.
Context such as the current channel and user is supplied to the LLM when you make requests, allowing responses to be customized.
The LLM can look up other users on the system if you ask about them.
OpenAI exclusive for now since it requires the function API.
You can ask about other channels, and the LLM can ingest posts from that channel; for example, you can ask it to summarize the last few posts in a channel. Note that, depending on whether you have collapsed reply threads (CRT) enabled, this may not behave as you expect.
OpenAI exclusive for now since it requires the function API.
The LLM can attempt to look up specific GitHub issues. For example, you can paste a GitHub link into the chat and ask questions about it. Only the title and description are used for now.
OpenAI exclusive for now since it requires the function API.
Just for fun! Use the post menu to ask the bot to react to the post. It will try to pick an appropriate reaction.
Bot posts have 👍 👎 icons that collect user feedback. The idea is to use this as input for RLHF fine-tuning or prompt development.
All backends are configured in the system console settings page for the plugin.
Make sure to select your preferred backend under AI Large Language Model service on the System Console page after configuring.
To set this up, get an OpenAI API key. You will need to sign up for an account if you don't have one already; you can create a new key at https://platform.openai.com/account/api-keys.
Configure the key in the System Console and add a model like `gpt-4` (better) or `gpt-3.5-turbo` (faster and cheaper).
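Before wiring the key into the plugin, you can sanity-check it against the OpenAI chat completions API directly. A minimal sketch; the model name and prompt are placeholders, and the actual network call is shown commented out:

```shell
# Request body in the shape the OpenAI chat completions API expects.
# "gpt-3.5-turbo" here is just an example model name.
BODY='{"model":"gpt-3.5-turbo","messages":[{"role":"user","content":"Hello"}]}'
echo "$BODY"

# With a real key exported as OPENAI_API_KEY, the check would be:
# curl https://api.openai.com/v1/chat/completions \
#   -H "Authorization: Bearer $OPENAI_API_KEY" \
#   -H "Content-Type: application/json" \
#   -d "$BODY"
```

A `200` response with a `choices` array confirms the key and model name are valid before you enter them in the System Console.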
You will need an invite to the Anthropic API. If you have one, you can create an API key here: https://console.anthropic.com/account/keys. Configure the API key in the System Console and configure a default model like `claude-v1`.
You will need to ask Azure to enable OpenAI on your Azure account before you can use this API.
This API requires function support, which is currently available only on model version `0613` with API version `2023-07-01-preview`. These are available in a limited set of datacenters right now; at the time of writing, the available regions for gpt-35-turbo v0613 are Canada East, East US, France Central, Japan East, North Central US, and UK South. More info in the Azure docs.
Once you have been approved, you can create a new OpenAI resource. With the resource created, you get access to the API key and the endpoint URL by clicking the Keys and Endpoint option in the menu.
Finally, you have to deploy the model that you are going to use, normally gpt-35-turbo, by clicking Model deployments and managing the models from there. (Tip: don't select auto-update on your deployed model; it will auto-downgrade it to 0301 within about 5-10 minutes.)
Configure the API key and the endpoint URL for OpenAI Compatible in the System Console, and configure a default model like `gpt-35-turbo`.
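The Azure endpoint URL has a different shape from OpenAI's: the resource name, deployment name, and API version are all part of the URL. A sketch with placeholder names, so you can check what you are pasting into the System Console:

```shell
RESOURCE="my-resource"        # placeholder: your Azure OpenAI resource name
DEPLOYMENT="gpt-35-turbo"     # placeholder: the deployment you created above
API_VERSION="2023-07-01-preview"
URL="https://${RESOURCE}.openai.azure.com/openai/deployments/${DEPLOYMENT}/chat/completions?api-version=${API_VERSION}"
echo "$URL"

# With a real key, a request would look like (note the "api-key" header,
# not "Authorization" as on openai.com):
# curl "$URL" -H "api-key: $AZURE_OPENAI_KEY" -H "Content-Type: application/json" \
#   -d '{"messages":[{"role":"user","content":"Hello"}]}'
```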
Any backend that is OpenAI-compatible can be used, such as LocalAI, which we use in the OpenOps demo.
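"OpenAI-compatible" here means any server exposing the same `/v1/chat/completions` route; the plugin only needs its base URL. A sketch assuming a LocalAI instance on its default port 8080 (the model name is just an example):

```shell
# Placeholder: wherever your OpenAI-compatible server is listening.
BASE_URL="http://localhost:8080"
ENDPOINT="${BASE_URL}/v1/chat/completions"
echo "$ENDPOINT"

# A request against it would look the same as one to OpenAI, minus the key:
# curl "$ENDPOINT" -H "Content-Type: application/json" \
#   -d '{"model":"ggml-gpt4all-j","messages":[{"role":"user","content":"Hello"}]}'
```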
Ask Sage is currently supported as an experimental-stage feature. Token-based security is not yet available via the Ask Sage API, and server configuration would require securing the Mattermost server configuration data store, which will contain the username and password in plaintext.
To configure, you need to purchase a commercial account from https://asksage.ai, enter the account's username and password on the System Console page, and set a default model such as `gpt-4` or `gpt-3.5-turbo`.
The Ask Sage API doesn't yet support streaming, so there is less feedback to Mattermost users on intermediate information.
- "AI Exchange" channel on Mattermost Community server (for Mattermost community interested in AI)
- OpenOps General Discussion on Mattermost Forum
- OpenOps Troubleshooting Discussion on Mattermost Forum
- OpenOps Q&A on Mattermost Forum
- Mattermost Troubleshooting Discussion on Mattermost Forum
- Mattermost "Peer-to-peer Help" channel on Mattermost Community server
Thank you for your interest in contributing to our open source project! ❤️ To get started, please read the contributor guidelines for this repository and check out the Help Wanted ticket list.
This repository is licensed under the Apache License 2.0.