All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
- Updated settings to allow for multiple providers in preparation for integrations with other LLM providers
- Added "settingsVersion" to detect when breaking changes are made to settings
- Added support for using local models with Ollama
- Added option to clear API key in settings
- Added text in settings to indicate when API key is set
- Fixed issue with recent prompts not being saved for "Generate at cursor" command
- Removed setting commands for API key and LLM model
- Changed the API key setting to not
- Changed the minAppVersion to 1.5.12
- Various small changes to comply with Obsidian plugin guidelines
- Command for generating content from a YouTube video URL
- Added linting and pre-commit hooks
- Centered loading spinner in prompt modal again
- Removed fixed width for recent prompts dropdown to make it look nicer on mobile
- Redesign of UI for prompt modal
- Recent prompts are now shown in a dropdown
- Added subtitles
- Optimized code for adding generation commands
- Moved some settings around in the settings panel
- Streamlined types for generation in preparation for more API integrations (avoiding OpenAI-specific types)
- Added command for toggling streaming responses
- Added command for switching between LLMs
- Added command and setting for toggling recent prompts
- Refactor: Moved OpenAI generation to a separate class to prepare for more API integrations
- Changed some text
- Added command for setting API key
- Fixed issue where the recent prompts list was reduced to a single prompt when the limit was reached
- This changelog
- Added support for streaming responses from LLM
- Added "Ctrl + Enter" command to submit while in the prompt modal
- Fixed typo in settings variable name
- Initial release