
v1.0.3 : Enhance Context Functionality and LLM Adaptability

@ml2068 released this 07 Sep 06:53 · 18 commits to main since this release · b30ad3e

What's Changed:

  • Modified the context.length extraction method so the context length can be read from a wider range of LLMs (a sketch follows this list).
  • Implemented a context window over the dialogue process, so that interactive sessions with the LLM can reference earlier turns.
  • Added controls on top of the context window (see the second sketch after this list):
    1. The number of stored dialogue rounds is controlled via the MAX_CONVERSATIONS parameter in app.js.
    2. A slot for real-time monitoring and dynamic control of short-term memory capacity, based on the dialogue history.
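
A minimal sketch of what a more tolerant context-length extraction could look like. It assumes the model metadata is a flat object whose keys may carry a model-family prefix (for example something like llama.context_length); the function name extractContextLength and the fallback value are illustrative, not taken from the project's code.

```js
// Hypothetical helper, not the project's actual implementation:
// different LLMs report their context window under differently
// prefixed keys, so scan for any key ending in ".context_length"
// instead of hard-coding a single field name.
function extractContextLength(modelInfo, fallback = 2048) {
  for (const [key, value] of Object.entries(modelInfo ?? {})) {
    if (key.endsWith('.context_length') && Number.isFinite(value)) {
      return value;
    }
  }
  return fallback; // assumed default when no field is found
}
```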

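A minimal sketch of the dialogue-round window described above, assuming rounds are kept in a plain in-memory array. Only MAX_CONVERSATIONS comes from app.js; ConversationWindow, push, usage, resize, and toMessages are illustrative names.

```js
const MAX_CONVERSATIONS = 10; // assumed default cap on stored dialogue rounds

class ConversationWindow {
  constructor(maxRounds = MAX_CONVERSATIONS) {
    this.maxRounds = maxRounds;
    this.rounds = []; // each round: { user, assistant }
  }

  // Store one completed dialogue round, dropping the oldest rounds
  // once the window exceeds its capacity.
  push(user, assistant) {
    this.rounds.push({ user, assistant });
    this.trim();
  }

  // Real-time view of short-term memory usage (the "slot" above).
  usage() {
    return { stored: this.rounds.length, capacity: this.maxRounds };
  }

  // Dynamically shrink or grow the capacity based on dialogue history.
  resize(maxRounds) {
    this.maxRounds = maxRounds;
    this.trim();
  }

  trim() {
    if (this.rounds.length > this.maxRounds) {
      this.rounds.splice(0, this.rounds.length - this.maxRounds);
    }
  }

  // Flatten the stored rounds into the message list sent to the LLM.
  toMessages() {
    return this.rounds.flatMap(({ user, assistant }) => [
      { role: 'user', content: user },
      { role: 'assistant', content: assistant },
    ]);
  }
}
```
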
Release Notes:

  • Improved compatibility with various large language models.
  • Enhanced context window functionality for interactive processes.
  • Optimized LLM performance through dynamic memory management.

Full Changelog: v1.0.2...v1.0.3