What's Changed:
- Modified the `context.length` extraction method to support a more diverse range of LLMs.
- Implemented context window functionality over the dialogue process, allowing reference to earlier turns in an interactive LLM session.
- Developed functional capabilities on the context window:
  - Controlled the number of stored dialogue rounds via the `MAX_CONVERSATIONS` parameter in `app.js`.
  - Added a slot for real-time monitoring and dynamic control of short-term memory capacity based on dialogue history.
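Different LLM providers report their context window under different metadata fields, so a tolerant lookup is one way the `context.length` extraction could be generalized. The sketch below is illustrative only; the field names, function name, and fallback value are assumptions, not the actual implementation in this repository.

```javascript
// Sketch of a tolerant context-length lookup across diverse LLM APIs.
// All field names and the 4096 fallback are illustrative assumptions.
function getContextLength(modelInfo) {
  const candidates = [
    modelInfo.context_length,     // e.g. some OpenAI-compatible servers
    modelInfo.max_context_length, // e.g. some local inference backends
    modelInfo.context_window,     // e.g. some hosted APIs
  ];
  const found = candidates.find((v) => typeof v === "number" && v > 0);
  return found ?? 4096; // conservative default when nothing is reported
}
```

For example, `getContextLength({ context_window: 8192 })` yields `8192`, while an empty object falls back to the default.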
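Capping stored dialogue rounds via a `MAX_CONVERSATIONS`-style parameter can be sketched as a sliding window over the message history. This is a minimal illustration assuming each round is one user/assistant message pair; the helper name and default value are hypothetical, not taken from `app.js`.

```javascript
// Sliding-window trim of dialogue history, keeping only the most recent
// MAX_CONVERSATIONS rounds. A "round" here is assumed to be a pair of
// messages (user + assistant); names and default are illustrative.
const MAX_CONVERSATIONS = 5; // hypothetical default round limit

function trimHistory(history, maxRounds = MAX_CONVERSATIONS) {
  const keep = maxRounds * 2; // two messages per round
  return history.length > keep ? history.slice(-keep) : history;
}
```

With 8 stored rounds and a limit of 5, the three oldest rounds are dropped and the remaining 10 messages start at the fourth round.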
Release Notes:
- Improved compatibility with various large language models.
- Enhanced context window functionality for interactive processes.
- Optimized LLM performance through dynamic memory management.
Full Changelog: v1.0.2...v1.0.3