diff --git a/.gitignore b/.gitignore index 975950e5..6bfb6984 100644 --- a/.gitignore +++ b/.gitignore @@ -144,8 +144,11 @@ certificates/* # Ignore the contents of the "src/client/assets" directory src/client/assets/* -# Ignore the contents of the "src/client/bundles" directory +# Ignore the contents of the "src/client/bundles" directory except for +# specific files. src/client/bundles/* +!src/client/bundles/terminal-chat-gpt/ +!src/client/bundles/terminal-chat-gpt/terminal_chat_gpt.py # Ignore the contents of the "src/client/config" directory src/client/config/* @@ -158,10 +161,16 @@ src/client/log/* # Ignore the contents of the "src/client/serial_communication" directory src/client/serial_communication/* -# Ignore the contents of the "src/server/skills_db" directory except for the file -# named "open_the_web_browser.json". +# Ignore the contents of the "src/server/skills_db" directory except for +# specific files. src/server/skills_db/* +!src/server/skills_db/bash/ +!src/server/skills_db/bash/f/ +!src/server/skills_db/bash/f/firefox/ !src/server/skills_db/bash/f/firefox/open_the_web_browser.json +!src/server/skills_db/bash/o/ +!src/server/skills_db/bash/o/openai/ +!src/server/skills_db/bash/o/openai/open_terminal_chat_gpt.json # Ignore the contents of the "src/tests" directory src/tests/* diff --git a/INSTALL b/INSTALL index 653fbd97..aef5d0df 100644 --- a/INSTALL +++ b/INSTALL @@ -497,13 +497,13 @@ $ pip3.12 install pytest Install Python in an Anaconda Python environment ------------------------------------------------- -Alternatively, you can replace Python 3.12 with Anaconda Python. Anaconda Python includes some well-optimized Python packages, so the speech recognition AI is supposed to work faster with Anaconda Python than with the official running Python, but everything depends on many factors, such as package versions and system resource availability. +With this alternative, you can skip using your current system Python (for example, Python 3.11) by installing 3.12 in an Anaconda Python environment for Commbase. Anaconda Python includes some well-optimized Python packages, so the speech recognition AI is supposed to work faster with Anaconda Python than with the official running Python. However, everything depends on many factors, such as package versions and system resource availability. More information on Anaconda: https://www.anaconda.com/ After the installation of Anaconda Python, you should see the string "(base)" in the foobar. -In case your system has a single Python installation using Anaconda Python if it is set up for running only in the Bash shell, you must enter the Bash shell before starting Commbase, starting anaconda-navigator, or starting any other Python script/program incapable of starting with the current Python's runtime that is preinstalled within your Operating System distribution: +If your system has a single Python installation using Anaconda Python, and it is set up to run only in the Bash shell, you must enter the Bash shell before starting Commbase, anaconda-navigator, or any other Python script/program that cannot start with the current Python runtime preinstalled within your Operating System distribution. 
Example of foobar before entering Bash: @@ -511,7 +511,7 @@ USER-NAME@HOSTNAME:~$ bash Example of foobar with Anaconda Python, identified by the string "(base)", after entering Bash: -(base) USER-NAME@shark:~$ +(base) USER-NAME@HOSTNAME:~$ To install PyPi-like packages in the (base) in Anaconda Python, use the command "conda": @@ -549,8 +549,6 @@ To verify the package was installed, in your terminal window or an Anaconda Pyth (commbase_env) $ conda list | grep ipython -Note: This documentation and Commbase code would be updated for future releases of Python to make everything Commbase compatible. - To deactivate the current environment and go back to (base) use: (commbase_env) $ conda deactivate @@ -605,7 +603,7 @@ Verify that Whisper is installed properly. $ conda list | grep openai-whisper -Install the rest pip packages, individually, from the commmbase_env environment. +Install the rest pip packages, individually, within the commmbase_env environment. $ pip install daal @@ -725,7 +723,7 @@ It contains the sh command. - espeak: -It is an alternative to festival. +Espeak is an alternative to Festival and is required by pyttsx3 on Linux. - festival. @@ -1160,4 +1158,4 @@ The output should be the Commbase introduction. From here, you can program new custom skills to customize your voice assistant or to create J.A.R.V.I.S. from Iron Man. -Check out the document User's Guide in the directory docs/ for a complete list of options. +Check out the document User's Guide in the directory assets/docs/ for a complete list of options. diff --git a/README.md b/README.md index af98369f..f8b270e5 100644 --- a/README.md +++ b/README.md @@ -16,7 +16,13 @@
-Commbase +Commbase + +
+ +
+ +[Watch the video](https://youtu.be/YMljUy37aSc?si=TiScnkugTpUjtXMn)
@@ -24,25 +30,181 @@
-Commbase, short for Communicative Commands Base, is a comprehensive platform designed to empower computers and computer-based devices with cutting-edge conversational AI capabilities. While it includes a flexible and extensible โœจ development framework โœจ at its core, Commbase goes beyond mere development tools. It serves as both a module and library bundler, incorporating third-party Large Language Models (LLMs) to enable advanced features such as "distributed AIs" or "multi-agent systems" (MAS). Commbase adapts across a diverse range of devices, from computers to smart appliances, robots, vehicles, Iron Man-like suits, starships, and beyond, ensuring universal accessibility and functionality. Operating effortlessly on either a single machine or a pair of machines in a client-server architecture, Commbase offers unparalleled adaptability and scalability. +๐Ÿค–โœจ Dialog-first platform for developing interactive AI-Powered assistant-like applications on Linux
+## Introduction + +**Commbase**, short for Communicative Commands Base, is a comprehensive platform designed to empower computers and computer-based devices with cutting-edge conversational AI capabilities. While it includes a flexible and extensible โœจ development framework โœจ at its core, Commbase goes beyond mere development tools. It serves as both a module and library bundler, incorporating third-party Language Models (LLMs, etc.) to enable advanced features such as "distributed AIs" or "multi-agent systems" (MAS). Commbase adapts across a diverse range of devices, from computers to smart appliances, robots, vehicles, Iron Man-like suits, starships, and beyond, ensuring universal accessibility and functionality. Operating effortlessly on either a single machine or a pair of machines in a client-server architecture, Commbase offers unparalleled adaptability and scalability. + +## Links and Resources +- GitHub Repository: https://github.com/mydroidandi/commbase +- Documentation: https://github.com/estebanways/commbase/tree/main/assets/docs +- Website: https://commbase-website.vercel.app/ + ## Features -| Topic | Description | +### Feature Insights + +| Topic | Features | |-------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------| | โš™๏ธ Control & Accessibility | Offers multiple methods to start and stop Commbase, including init systems, cron jobs, terminal commands, or voice commands. Provides a terminal-based interface and option for custom UIs. | | ๐Ÿ–ฅ๏ธ Multi-pane Support | Facilitates multitasking and organization within Commbase by providing a versatile workspace with options for navigating between different functionalities. | | ๐Ÿ“ Logging & Recording | Enables transparent monitoring of system activity, troubleshooting, and security maintenance through the recording of terminal and voice commands, along with secure access control. | | ๐Ÿ”ง Configuration | Simplifies customization of Commbase to meet specific user needs and preferences by offering easy-to-adjust centralized configuration files for built-in options. | -| ๐Ÿค– Advanced Capabilities | Demonstrates sophistication and cutting-edge AI functionalities by integrating Large Language Models (LLMs) and advanced assistant capabilities such as language understanding and vision. | +| ๐Ÿค– Advanced Capabilities | Demonstrates sophistication and cutting-edge AI functionalities by integrating Language Models and advanced assistant capabilities such as language understanding and vision. | + +### Detailed Feature Overview + +| Topic | Features | +|-------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| ๐Ÿ› ๏ธ Installation & Setup | * Detailed installation steps, structure, and contribution guidelines are available on the GitHub repository, ensuring secure and transparent setup processes. | +| ๐Ÿ’ป Commands & Interfaces | * Help option to see available built-in commands for terminal use or development.
* Built-in commands support both voice and terminal command calls. | +| โš™๏ธ Control & Accessibility | * Easy to start and stop using init systems, cron jobs, terminal commands, or voice commands.
* Terminal-based interface, with the option to create custom UIs. | +| ๐Ÿ”€ Navigation | * Seamlessly switch among windows using keybindings to access dashboards, server connections, etc. | +| ๐Ÿ–ฅ๏ธ Multi-pane Support | * Multiple panes are provided, covering functionalities like the chat room, STT engine (Speech-to-Text), server pane, client pane, and additional options. | +| ๐Ÿ“‚ Custom Applications | * Load custom terminal applications and seamlessly switch back to the chat room in the same pane. | +| ๐ŸŽจ Personalization | * Customizable application, user, and assistant names. | +| ๐Ÿ“ Logging & Recording | * Terminal and voice command recorder-transmitter that is easy to install even on other hosts for remote control through networks, with secure access control and logging enabled. | +| ๐Ÿ”ง Configuration | * Effortlessly adjust and customize your settings with our platform's centralized configuration files. No code is required to configure built-in options. | +| ๐Ÿ—ฃ๏ธ STT & TTS Engines | * The STT Engine operates in Whisper reactive mode by default, with the ability to switch to Whisper proactive mode at any time.
* TTS Engines are dedicated to text-to-speech functionality, so projects can effectively speak in various languages, enhancing accessibility and global reach. | +| ๐ŸŒ Internationalization | * Support for global audiences with seamless internationalization.
* Tailor your user experience by modifying i18n files to customize voice commands. | +| ๐Ÿ“œ Custom Scripting | * Effortlessly create custom server-side command scripts with a single-file catalog database (scalable to any RDBMS), our up-to-date, well-documented API, and JSON simplicity.
* Write command scripts in any programming language. | +| โ™ป๏ธ Reusable Functions | * Access to a comprehensive library of reusable functions to enhance custom command scripts with streamlined efficiency and tailored functionalities. | +| ๐Ÿค– Advanced Capabilities | * Integration of LLMs (Large Language Models), SLMs (Small Language Models), and other language models.
* Enhanced assistant capabilities with advanced language understanding, including additional features like vision. | +| ๐Ÿ“Š Activity Management | * Efficiently manage your system's activity and user interactions by seamlessly integrating Commbase logs with custom app logs.
* Logs include customizable backups of past assistant-user conversations. | +| ๐Ÿ›ก๏ธ Platform Stability | * The use of Linux enhances the stability, reliability, and security of our software, ensuring robust and flexible performance in various environments. | + +## Bundles + +Bundles in Commbase refer to packaged collections of software components, applications, dependencies, or data that are bundled together for ease of distribution, installation, or deployment in various computing environments. + +In the context of Commbase, users are allowed to modify existing bundles or create custom ones. This flexibility allows users to tailor the functionality and features to their specific needs, whether by adjusting configurations, integrating custom plugins or extensions, or creating entirely new functionalities within the software framework. + + + commbase-data-exchange + + +**commbase-data-exchange** + +An application for exchanging JSON data over HTTP and WebSocket connections, allowing clients to save and retrieve JSON data in real-time. [More Info](/bundles/commbase-data-exchange/) + + + commbase-genai-llm-google-gemini-p + + +**commbase-genai-llm-google-gemini-p** + +This app allows you to chat with Gemini, a powerful large language model. Ask questions, get creative writing inspiration, or receive help with tasks. [More Info](/bundles/commbase-genai-llm-google-gemini-p/) + + + commbase-genai-llm-meta-llama-p + + +**commbase-genai-llm-meta-llama-p** + +This app allows you to interact with Llama, a large language model known for its factual language understanding and reasoning abilities. Ask questions, get help with research, or gain insights from complex information. [More Info](/bundles/commbase-genai-llm-meta-llama-p/) + + + commbase-genai-llm-openai-gpt-p + + +**commbase-genai-llm-openai-gpt-p** + +An app designed to facilitate seamless interaction with the powerful OpenAI's GPT language model. This app empowers users to engage in natural language conversations with the model. [More Info](/bundles/commbase-genai-llm-openai-gpt-p/) + + + commbase-genai-slm-ollama-phi3-mini-memory + + +**commbase-genai-slm-ollama-phi3-mini-memory** + +Interacts with the Commbase platform by generating AI-driven responses using the Ollama API with the Phi3 mini model from Microsoft. This is a midweight implementation with session or short-term memory for language generation. [More Info](/bundles/commbase-genai-slm-ollama-phi3-mini-memory/) + + + commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone + + +**commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone** + +Interacts with the Commbase platform by generating AI-driven responses using the Ollama API with the Phi3 mini model from Microsoft. This is a heavyweight implementation with memory for language generation, featuring remote RAG and Picone for persistent memory. [More Info](/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/) + + + commbase-genai-slm-ollama-phi3-mini-nomemory + + +**commbase-genai-slm-ollama-phi3-mini-nomemory** + +Interacts with the Commbase platform by generating AI-driven responses using the Ollama API with the Phi3 mini model from Microsoft. This is a lightweight implementation without memory for language generation. [More Info](/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/) + + + commbase-recorder-transmitter-b + + +**commbase-recorder-transmitter-b** + +commbase-recorder-transmitter-b is a voice recorder and recording transmitter for Commbase. 
It also supports sending the recorded audio file to a remote host using scp when the recording is stopped with another key press. [More Info](/bundles/commbase-recorder-transmitter-b/) -For a detailed feature overview, follow this [`link.`](/assets/docs/Detailed%20Feature%20Overview.md) + + commbase-recorder-transmitter-s + + +**commbase-recorder-transmitter-s** + +commbase-recorder-transmitter-s is a voice recorder and recording transmitter for Commbase. It also supports sending the recorded audio file to a remote host using scp when the recording is stopped with another key press. [More Info](/bundles/commbase-recorder-transmitter-s/) + + + commbase-stt-whisper-proactive-p + + +**commbase-stt-whisper-proactive-p** + +This is an OpenAI's Whisper-based proactive ASR (Automatic Speech Recognition) engine designed to convert spoken language into written text without requiring an internet connection, making it a reliable and secure solution for any application that needs speech-to-text functionality. [More Info](/bundles/commbase-stt-whisper-proactive-p/) + + + commbase-stt-whisper-reactive-p + + +**commbase-stt-whisper-reactive-p** + +This is an OpenAI's Whisper-based reactive ASR (Automatic Speech Recognition) engine designed to convert spoken language into written text without requiring an internet connection, making it a reliable and secure solution for any application that needs speech-to-text functionality. [More Info](/bundles/commbase-stt-whisper-reactive-p/) + + + commbase-tts-gTTS + + +**commbase-tts-gTTS** + +A Commbase-ready TTS executable using gTTS (Google Text-to-Speech), a Python library and CLI tool to interface with Google Translate text-to-speech API. [More Info](/bundles/commbase-tts-gTTS/) + + + commbase-tts-pyttsx3 + + +**commbase-tts-pyttsx3** + +commbase-tts-pyttsx3 is a Python application that utilizes text-to-speech capabilities to read out the content of a file or any input text provided. It uses the pyttsx3 library for text-to-speech functionality. This application supports multiple languages, allowing you to generate speech in various languages. [More Info](/bundles/commbase-tts-pyttsx3/) + + + libcommbase + + +**libcommbase** + +libcommbase is a core collection of libraries that provides foundational functionalities to Commbase and offers functionalities to Commbase-based applications. It centralizes common functions that can be shared across multiple conversational AI-powered assistant projects. [More Info](/bundles/libcommbase/) ## Usage -[`INSTALL`](./INSTALL) Commbase and then run the next command in a terminal to start the program: +
+ +Commbase + +
+ +
+ +[**INSTALL**](./INSTALL) Commbase and then run the next command in a terminal to start the program: `commbase start` @@ -56,13 +218,13 @@ Check out the User's Guide for a complete list of options. ## Installation -For information on how to install Commbase, follow the steps in the file [`INSTALL`](./INSTALL). +For information on how to install Commbase, follow the steps in the file [**INSTALL**](./INSTALL). ## User's Guide For information on how to set up, use, and program Commbase read the User's Guide. -* [`docs`](/assets/docs) +* [docs](/assets/docs) ## Structure @@ -81,20 +243,35 @@ This project is coded and organized like this: โ”‚ โ”œโ”€โ”€ commbase-data-exchange โ”‚ โ”‚ โ”œโ”€โ”€ client โ”‚ โ”‚ โ”‚ โ””โ”€โ”€ templates +โ”‚ โ”‚ โ”œโ”€โ”€ examples โ”‚ โ”‚ โ””โ”€โ”€ server โ”‚ โ”‚ โ”œโ”€โ”€ client_data โ”‚ โ”‚ โ””โ”€โ”€ templates -โ”‚ โ”œโ”€โ”€ commbase-llm-google-gemini-p -โ”‚ โ”œโ”€โ”€ commbase-llm-meta-llama-p -โ”‚ โ”œโ”€โ”€ commbase-llm-openai-gpt-p +โ”‚ โ”œโ”€โ”€ commbase-genai-llm-google-gemini-p +โ”‚ โ”‚ โ””โ”€โ”€ examples +โ”‚ โ”œโ”€โ”€ commbase-genai-llm-meta-llama-p +โ”‚ โ”‚ โ””โ”€โ”€ examples +โ”‚ โ”œโ”€โ”€ commbase-genai-llm-openai-gpt-p +โ”‚ โ”‚ โ””โ”€โ”€ examples +โ”‚ โ”œโ”€โ”€ commbase-genai-slm-ollama-phi3-mini-memory +โ”‚ โ”‚ โ””โ”€โ”€ examples +โ”‚ โ”œโ”€โ”€ commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone +โ”‚ โ”‚ โ”œโ”€โ”€ examples +โ”‚ โ”‚ โ””โ”€โ”€ resources +โ”‚ โ”‚ โ””โ”€โ”€ src +โ”‚ โ”œโ”€โ”€ commbase-genai-slm-ollama-phi3-mini-nomemory +โ”‚ โ”‚ โ””โ”€โ”€ examples โ”‚ โ”œโ”€โ”€ commbase-recorder-transmitter-b +โ”‚ โ”‚ โ””โ”€โ”€ examples โ”‚ โ”œโ”€โ”€ commbase-recorder-transmitter-s +โ”‚ โ”‚ โ””โ”€โ”€ examples โ”‚ โ”œโ”€โ”€ commbase-stt-whisper-proactive-p โ”‚ โ”‚ โ””โ”€โ”€ examples โ”‚ โ”œโ”€โ”€ commbase-stt-whisper-reactive-p โ”‚ โ”‚ โ”œโ”€โ”€ client_data โ”‚ โ”‚ โ””โ”€โ”€ examples โ”‚ โ”œโ”€โ”€ commbase-tts-gTTS +โ”‚ โ”‚ โ””โ”€โ”€ examples โ”‚ โ”œโ”€โ”€ commbase-tts-pyttsx3 โ”‚ โ””โ”€โ”€ libcommbase โ”‚ โ”œโ”€โ”€ examples @@ -134,7 +311,7 @@ This project is coded and organized like this: โ”‚ โ”œโ”€โ”€ back-up โ”‚ โ””โ”€โ”€ bundles โ”‚ โ””โ”€โ”€ vu-meter -โ”œโ”€โ”€ src # This directory contains user-generated source code +โ”œโ”€โ”€ src # This directory contains user/dev-generated source code โ”‚ โ”œโ”€โ”€ client โ”‚ โ”‚ โ”œโ”€โ”€ assets # Stores different types of assets, such as images, fonts, and ASCII art โ”‚ โ”‚ โ”‚ โ”œโ”€โ”€ ascii @@ -142,6 +319,7 @@ This project is coded and organized like this: โ”‚ โ”‚ โ”œโ”€โ”€ bundles โ”‚ โ”‚ โ”‚ โ”œโ”€โ”€ audiobooks โ”‚ โ”‚ โ”‚ โ”‚ โ””โ”€โ”€ Robinson_Crusoe +โ”‚ โ”‚ โ”‚ โ”œโ”€โ”€ terminal-chat-gpt โ”‚ โ”‚ โ”‚ โ””โ”€โ”€ ... โ”‚ โ”‚ โ”œโ”€โ”€ config โ”‚ โ”‚ โ”œโ”€โ”€ i18n @@ -156,6 +334,7 @@ This project is coded and organized like this: โ”‚ โ”‚ โ”œโ”€โ”€ arduino โ”‚ โ”‚ โ”‚ โ”œโ”€โ”€ arduino_0 โ”‚ โ”‚ โ”‚ โ”‚ โ”œโ”€โ”€ commbase_hardware_notifications +โ”‚ โ”‚ โ”‚ โ”‚ โ”‚ โ”œโ”€โ”€ sentiment_component โ”‚ โ”‚ โ”‚ โ”‚ โ”‚ โ”œโ”€โ”€ speech_to_text_engine_component โ”‚ โ”‚ โ”‚ โ”‚ โ”‚ โ””โ”€โ”€ ... โ”‚ โ”‚ โ”‚ โ”‚ โ””โ”€โ”€ ... @@ -165,8 +344,11 @@ This project is coded and organized like this: โ”‚ โ”‚ โ””โ”€โ”€ skills_db โ”‚ โ”‚ โ”œโ”€โ”€ arduino โ”‚ โ”‚ โ”œโ”€โ”€ bash -โ”‚ โ”‚ โ”‚ โ””โ”€โ”€ f -โ”‚ โ”‚ โ”‚ โ”œโ”€โ”€ firefox +โ”‚ โ”‚ โ”‚ โ”œโ”€โ”€ f +โ”‚ โ”‚ โ”‚ โ”‚ โ”œโ”€โ”€ firefox +โ”‚ โ”‚ โ”‚ โ”‚ โ””โ”€โ”€ ... +โ”‚ โ”‚ โ”‚ โ””โ”€โ”€ o +โ”‚ โ”‚ โ”‚ โ”œโ”€โ”€ openai โ”‚ โ”‚ โ”‚ โ””โ”€โ”€ ... 
โ”‚ โ”‚ โ”œโ”€โ”€ c++ โ”‚ โ”‚ โ”œโ”€โ”€ java @@ -177,11 +359,100 @@ This project is coded and organized like this: โ”‚ โ”‚ โ””โ”€โ”€ ... โ”‚ โ””โ”€โ”€ tests โ”œโ”€โ”€ tests # Stores the Commbase tests -โ””โ”€โ”€ user # Stores previous chat conversations and other user files +โ””โ”€โ”€ user # Stores chat conversations, assistant's memories, and other user workspace files โ”œโ”€โ”€ analysis_zone - โ””โ”€โ”€ conversation_logs + โ”œโ”€โ”€ conversation_logs + โ”œโ”€โ”€ knowledge_base + โ”œโ”€โ”€ memory_store + โ”‚ โ”œโ”€โ”€ commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone + โ”‚ โ”‚ โ”œโ”€โ”€ build + โ”‚ โ”‚ โ””โ”€โ”€ src + โ”‚ โ””โ”€โ”€ ... + โ””โ”€โ”€ ... ``` +## Tech Stack + +Core technologies directly involved in the application's functionality. + +### Required in The Default Installation + +[OpenAI Whisper](https://openai.com/research/whisper) | [Ollama](https://ollama.com) | [Microsoft Phi3](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) | [Picone](https://www.pinecone.io/) | [PyAudio](https://people.csail.mit.edu/hubert/pyaudio/) | [Awk](https://www.gnu.org/software/gawk/) | [Git](https://git-scm.com) | [Jq](https://stedolan.github.io/jq/) | [SSH](https://www.openssh.com) | [Flask](https://flask.palletsprojects.com)| [NumPy](https://numpy.org) | [Pandas](https://pandas.pydata.org) | [Pyttsx3](https://pypi.org/project/pyttsx3/) | [FFmpeg](https://ffmpeg.org) | [GNU/Linux](https://www.gnu.org/gnu/linux-and-gnu.html) | [Systemd](https://systemd.io/) | [Bash](https://www.gnu.org/software/bash/) | [Bc](https://www.gnu.org/software/bc/) | [Python](https://python.org) | [Oh My Tmux](https://github.com/gpakosz/.tmux) | [Tmux](https://github.com/tmux/tmux) + +### Optional + +[Anaconda](https://anaconda.com) | [Google Gemini](https://deepmind.google/technologies/gemini/) | [Meta Llama](https://llama.meta.com/) | [OpenAI GPT](https://openai.com/api/) | [Pgvector](https://github.com/pgvector/pgvector) | [PyTorch](https://pytorch.org) | [TensorFlow](https://tensorflow.org) | [PostgreSQL](https://postgresql.org) | [Arduino C++](https://www.arduino.cc/reference/en/) | [Nvidia](https://nvidia.com) | [Dash](http://gondor.apana.org.au/~herbert/dash/) | [Festival](http://www.cstr.ed.ac.uk/projects/festival/) + +### Development + +[GitHub](https://github.com) | [Zsh](https://zsh.sourceforge.io) | [Bach](https://bach.sh/) | [Flake8](https://flake8.pycqa.org) | [Pytest](https://pytest.org) | [ShellCheck](https://www.shellcheck.net) + +## Contests + +We are proud to participate in the following contests: + + + + + + +
+ + BackdropBuild Contest + +
+ BackdropBuild Contest +
+ + Gemini API Developer Competition + +
+ Gemini API Developer Competition +
+ +## Sponsors + +We extend our heartfelt thanks to the following sponsors for their generous support: + + + + + + + + +
+ + BackdropBuild + +
+ BackdropBuild +

Providing valuable mentorship.

+
+ + Vercel + +
+ Vercel +

Contributing deployment resources for testing.

+
+ + GitHub + +
+ GitHub +

Providing access to advanced dev tools.

+
+ + Pinecone + +
+ Pinecone +

Supplying vector database services.

+
+ +If you would like to sponsor our project, please reach out to [stv.herrera@gmail.com](mailto:stv.herrera@gmail.com). + ## Approach and Philosophy Our software embraces minimalism, the DRY (Don't Repeat Yourself) principle, and KISS (Keep It Simple, Stupid), ensuring streamlined features and eliminating unnecessary repetition for improved efficiency and simplicity, in line with the Unix philosophy. @@ -192,6 +463,14 @@ The open-source nature of our project empowers developers to extend its capabili Additionally, our modular design facilitates flexible customization and extension of functionalities according to user needs. You can also incorporate your unique serial and wireless hardware into our platform, enabling tailored communication and interaction with devices for enhanced versatility and functionality. +## Dialog-First Software Development Paradigm + +Our development philosophy is influenced by the emerging paradigm of Dialog-First, as detailed in Esteban Herrera Castro's paper "[Dialog-First](/assets/docs/Dialog-first%20software%20development%20paradigm.pdf): The Future of AI Assistants and Voice-Controlled Software and Hardware." This paradigm prioritizes voice commands and conversational interfaces over traditional GUIs and manual controls, aiming to enhance user experience, accessibility, and efficiency. + +By focusing on voice-first solutions, we address the limitations of conventional interfaces, making interactions more intuitive and reducing cognitive load. This approach not only improves usability but also supports diverse user needs, including those with disabilities that may find traditional interfaces challenging. + +Furthermore, embracing Dialog-First allows our software to seamlessly integrate with various devices, from smartphones to specialized hardware, fostering innovation in fields like smart homes, healthcare, and beyond. This adaptability ensures that our platform remains at the forefront of technological advancements in voice-controlled software and hardware. + ## Contributing To contribute to Commbase, follow these steps: @@ -216,4 +495,4 @@ If you want to contact us you can reach us at . ## License -This project uses the following license: [`License`](./COPYING). +This project uses the following license: [GPL-3.0](./COPYING). diff --git a/assets/docs/Detailed Feature Overview.md b/assets/docs/Detailed Feature Overview.md deleted file mode 100644 index 4463ff51..00000000 --- a/assets/docs/Detailed Feature Overview.md +++ /dev/null @@ -1,26 +0,0 @@ -
- -### Commbase Commbase - -
- -# Detailed Feature Overview - -| Topic | Features | -|-------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| -| ๐Ÿ› ๏ธ Installation & Setup | * Detailed installation steps, structure, and contribution guidelines are available on the GitHub repository, ensuring secure and transparent setup processes. | -| ๐Ÿ’ป Commands & Interfaces | * Help option to see available built-in commands for terminal use or development.
* Built-in commands support both voice and terminal command calls. | -| โš™๏ธ Control & Accessibility | * Easy to start and stop using init systems, cron jobs, terminal commands, or voice commands.
* Terminal-based interface, with the option to create custom UIs. | -| ๐Ÿ”€ Navigation | * Seamlessly switch among windows using keybindings to access dashboards, server connections, etc. | -| ๐Ÿ–ฅ๏ธ Multi-pane Support | * Multiple panes are provided, covering functionalities like the chat room, STT engine (Speech-to-Text), server pane, client pane, and additional options. | -| ๐Ÿ“‚ Custom Applications | * Load custom terminal applications and seamlessly switch back to the chat room in the same pane. | -| ๐ŸŽจ Personalization | * Customizable application, user, and assistant names. | -| ๐Ÿ“ Logging & Recording | * Terminal and voice command recorder-transmitter area easy to install even in other hosts for remote control through networks, with enabled secure access control and logging. | -| ๐Ÿ”ง Configuration | * Effortlessly adjust and customize your settings with our platform's centralized configuration files. No code is required to configure built-in options. | -| ๐Ÿ—ฃ๏ธ STT & TTS Engines | * The STT Engine operates in Whisper reactive mode by default, with the ability to switch to Whisper proactive mode at any time.
* TTS Engines are dedicated for text-to-speech functionalities, so projects can effectively speak in various languages, enhancing accessibility and global reach. | -| ๐ŸŒ Internationalization | * Support to global audiences with seamless internationalization.
* Tailor your user experience by modifying i18n files to customize voice commands. | -| ๐Ÿ“œ Custom Scripting | * Effortlessly create custom server-side command scripts with a single file catalog database (scalable to any RDBMS), our up-to-date well-documented API, and JSON simplicity.
* Write command scripts in any programming language. | -| โ™ป๏ธ Reusable Functions | * Access to a comprehensive library of reusable functions to enhance custom command scripts with streamlined efficiency and tailored functionalities. | -| ๐Ÿค– Advanced Capabilities | * LLMs (Large Language Models) integration.
* Enhanced assistant capabilities with advanced language understanding, including additional features like vision. | -| ๐Ÿ“Š Activity Management | * Efficiently manage your system's activity and user interactions by seamlessly integrating Commbase logs with custom app logs.
* Logs include customizable backups of past assistant-user conversations. | -| ๐Ÿ›ก๏ธ Platform Stability | * The use of Linux enhances the stability, reliability, and security of our software, ensuring robust and flexible performance in various environments. | diff --git a/assets/docs/Dialog-first software development paradigm.pdf b/assets/docs/Dialog-first software development paradigm.pdf new file mode 100644 index 00000000..ead20baf Binary files /dev/null and b/assets/docs/Dialog-first software development paradigm.pdf differ diff --git a/assets/docs/User's Guide.md b/assets/docs/User's Guide.md index 720f650f..9e69fa4d 100644 --- a/assets/docs/User's Guide.md +++ b/assets/docs/User's Guide.md @@ -10,7 +10,7 @@ Welcome to the [Commbase](https://github.com/mydroidandi/commbase) User's Guide! ## What is Commbase? -Commbase is a module and library bundler, and a flexible โœจ development framework โœจ designed to empower computers and other devices with advanced conversational AI capabilities. It incorporates third-party LLMs to enable features such as "distributed AIs" or "multi-agent systems" (MAS). Commbase extends its reach to applications and diverse devices, including droids, smart appliances, and various types of vehicles. It runs on a single machine or a set of two machines in a client-server architecture. Commbase was originally developed as a single-file script by the computer scientist Esteban Herrera in 2022. +Commbase, short for Communicative Commands Base, is a comprehensive platform designed to empower computers and computer-based devices with cutting-edge conversational AI capabilities. While it includes a flexible and extensible โœจ development framework โœจ at its core, Commbase goes beyond mere development tools. It serves as both a module and library bundler, incorporating third-party Language Models (LLMs, etc.) to enable advanced features such as "distributed AIs" or "multi-agent systems" (MAS). Commbase adapts across a diverse range of devices, from computers to smart appliances, robots, vehicles, Iron Man-like suits, starships, and beyond, ensuring universal accessibility and functionality. Operating effortlessly on either a single machine or a pair of machines in a client-server architecture, Commbase offers unparalleled adaptability and scalability. Commbase was originally developed as a single-file script by computer scientist Esteban Herrera and was committed to GitHub in 2022. ## Content @@ -908,6 +908,16 @@ The default version of the file **commbase.conf** contains the next values: - Example value: -`bundles/commbase-recorder-transmitter-x/reccomm.sh` (Default): The path to a custom recorder-transmitter executable. +- **RUN_VOICE_RECORDER_IN_PANE_PATH**: + - **Description**: This variable specifies the path to the script that runs the voice recorder within a tmux pane. The script is located within the **bundles/libcommbase/libcommbase/routines/** directory. + - **Possible values**: + - `bundles/libcommbase/libcommbase/routines/run_voice_recorder_in_pane.sh` (Default): The default and typical value, indicating the relative path to the voice recorder script. + +- **COMMBASE_RECORDER_TRANSMITTER_QUIT_CHAR**: + - **Description**: This variable specifies the character used to quit the voice recorder or transmitter process. This character is sent as an input to signal the process to terminate. This variable value is used only fo reference in scripts like **bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/commbase-genai-slm-ollama-phi3-mini-nomemory.py**. 
Change it only if you change the value of the constant `QUIT_CHAR="q"` of the commbase recorder in the files **bundles/commbase-recorder-transmitter-b/reccom.sh**, **bundles/commbase-recorder-transmitter-s/reccom.sh**, or a custom file somewhere like **bundles/commbase-recorder-transmitter-x/**. Checkout the variables `CUSTOM_RECORDER_TRANSMITTER_FILE` and `RECORDER_TRANSMITTER_FILE`. + - **Possible values**: + - `q` (Default): If set to this value, the character `q` is used to quit the voice recorder or transmitter process. + - **STT_ENGINE_MODEL_DIRECTORY**: - Description: It specifies the path to the directory of the current STT engine's model in use. In order to facilitate internationalization, the directory resides in **bundles/libcommbase/resources/i18n/** and **src/client/i18n/**. Check out the variable `STT_ENGINE_MODEL_SUBDIRECTORY`. - Possible values: @@ -936,6 +946,11 @@ The default version of the file **commbase.conf** contains the next values: - Possible values: - `/bundles/commbase-stt-whisper-reactive-p/client_data/recording.wav` (Default): The Commbase STT Whisper reactive p client temporary audio recording file. The purpose of this file is for storing audio data obtained from the execution of the bash script **commrecorder.sh** in the directory **bundles/commbase-stt-whisper-reactive-p/**. **commbase_stt_whisper_reactive_p.py** monitors the modification time of this file and, upon detecting changes, transcribes the audio content using the Whisper ASR (Automatic Speech Recognition) model. The transcribed text is then printed and appended to the chatroom pane. +- **COMMBASE_STT_WHISPER_REACTIVE_P_AUTO_OPEN_RECORDER_AFTER_COMMAND**: + - Description: It determines whether the recorder should automatically open after a command is given when using the reactive Whisper STT engine. In this case, commands refers to any user command or Commbase internal command run in the recorder-transmitter terminal when it runs in the local host. + - Possible values: True or False. + - Example value: `True` (Default): Set to true, the recorder will automatically open after a command is given. + - **STT_ENGINE_PROCESSING_TIME_VISIBLE_ON**: - Description: It specifies whether yes or not the running STT displays the speech processing time in the STT engine pane after every processing is complete. - Possible values: True or False. @@ -1182,6 +1197,18 @@ The default version of the file **commbase.conf** contains the next values: - Example value: - `30` (Default): This value indicates that conversation files will be retained or considered valid for a maximum period of 30 days. Check out the variable `MAX_CONVERSATION_FILES_COUNT`. +CHAT_MEMORY_FILE="data/.chat_memory.txt" + +- **CHAT_MEMORY_FILE**: + - Description: This variable represents the path to the file where chat memory is stored. It is used to persist the history of prompts and responses across interactions within the chat system. + - Example value: + - `"data/.chat_memory.txt"` (Default): The current session chat log memory file path. + +- **ASSISTANT_DISCOURSE_FROM_LANGUAGE_MODEL_FILE**: + - Description: This variable specifies the path to the script that generates audible discourses from text that comes from the language model. The script is located within the **bundles/libcommbase/libcommbase/routines/** directory. + - Possible values: + - `bundles/libcommbase/libcommbase/routines/assistant_discourse_from_language_model.sh` (Default): The default and typical value, indicating the relative path to the assistant discourse script. 
+ - **ANSWER_UNKNOWN_COMMANDS_USING_AI_CHATBOT_ON**: - Description: This variable is used in the function **bundles/libcommbase/libcommbase/routines/skills_else**. - Possible values: True or False. @@ -1268,65 +1295,107 @@ The default version of the file **commbase.conf** contains the next values: - Example value: - `3600` (Default): Set to 3600 seconds (equivalent to 1 hour and 0 minutes), indicates that the system checks for pending tasks approximately every 60 minutes. +- **ASSISTANT_RESPONSE_SENTIMENT_ANALYSIS_ON**: + - Description: A flag that determines whether the sentiment analysis feature for assistant responses is enabled or not. When set to `True`, the sentiment analysis is active, allowing the system to evaluate and act upon the emotional tone of the assistant's responses. Using Motors: If sentiment is positive, activate motors to simulate a hugging action. For example, robotic arms could move to embrace the user. Using Screens: Display a smiling face on a screen. For example, showing an animated smiley face on a digital display. + - Possible values: + - `True`: Enables sentiment analysis for assistant responses. + - `False`: Disables sentiment analysis for assistant responses. + - Example value: + - `True` (Default): Set to `True`, indicating that the system will perform sentiment analysis on the assistant's responses, facilitating actions based on the detected sentiment. + - **COMMBASE_HARDWARE_NOTIFICATIONS_ON**: - Description: Indicates whether Commbase hardware notifications are enabled or disabled. If set to "True", hardware notifications are enabled; otherwise, they are disabled. - Possible values: "True" or "False". Set to "True" only if you are making Commbase hardware, such as: hardware interfaces, a speech to text engine component, or any other hardware device based on Commbase (droids, smart appliances, vehicles, etc.) Check out the existent Commbase hardware variables. - Example value: - `False` (Default): Hardware notifications are disabled. +- **COMMBASE_HARDWARE_DEVICE_0**: + - Description: The path of the hardware device used for communication, for example, an Arduino or any other microcontroller. Check out the other existent Commbase hardware variables. + - Possible values: A valid path to a hardware device. + - Example value: + - `/dev/ttyACM0` (Default): Path to hardware device 0. + +- **COMMBASE_HARDWARE_SPEECH_TO_TEXT_ENGINE_COMPONENT_ON**: + - Description: A flag that indicates whether the speech-to-text engine component of the Commbase hardware is enabled (if present). Relies on the value of the variable `COMMBASE_HARDWARE_NOTIFICATIONS_ON`. + - Possible values: + - `"True"`: The speech-to-text engine is enabled and operational. + - `"False"`: The speech-to-text engine is disabled and not in use. + - Example value: + - `"True"` (Default): Indicates that the speech-to-text engine component is currently enabled and functioning. + - **COMMBASE_HARDWARE_NOTIFICATION_LISTENING_START_ON**: - - Description: Indicates whether Commbase hardware notifications for the start of the STT engines listening process are enabled or disabled. Check out the other existent Commbase hardware variables. + - Description: Indicates whether Commbase hardware notifications for the start of the STT engines listening process are enabled or disabled. Check out the other existent Commbase hardware variables. Relies on the value of the variables `COMMBASE_HARDWARE_NOTIFICATIONS_ON` and `COMMBASE_HARDWARE_SPEECH_TO_TEXT_ENGINE_COMPONENT_ON`. - Possible values: "True" or "False". 
- Example value: - `True` (Default): Notifications for the start of the listening process are enabled. - **COMMBASE_HARDWARE_NOTIFICATION_LISTENING_STOP_ON**: - - Description: Indicates whether Commbase hardware notifications for the stop of the STT engines listening process are enabled or disabled. Check out the other existent Commbase hardware variables. + - Description: Indicates whether Commbase hardware notifications for the stop of the STT engines listening process are enabled or disabled. Check out the other existent Commbase hardware variables. Relies on the value of the variables `COMMBASE_HARDWARE_NOTIFICATIONS_ON` and `COMMBASE_HARDWARE_SPEECH_TO_TEXT_ENGINE_COMPONENT_ON`. - Possible values: "True" or "False". - Example value: - `True` (Default): Notifications for the stop of the listening process are enabled. - **COMMBASE_HARDWARE_NOTIFICATION_PROCESSING_START_ON**: - - Description: Indicates whether Commbase hardware notifications for the start of the STT engines processing are enabled or disabled. Check out the other existent Commbase hardware variables. + - Description: Indicates whether Commbase hardware notifications for the start of the STT engines processing are enabled or disabled. Check out the other existent Commbase hardware variables. Relies on the value of the variables `COMMBASE_HARDWARE_NOTIFICATIONS_ON` and `COMMBASE_HARDWARE_SPEECH_TO_TEXT_ENGINE_COMPONENT_ON`. - Possible values: "True" or "False". - Example value: - `True` (Default): Notifications for the start of the processing are enabled. - **COMMBASE_HARDWARE_NOTIFICATION_PROCESSING_STOP_ON**: - - Description: Indicates whether Commbase hardware notifications for the stop of the STT engines processing are enabled or disabled. Check out the other existent Commbase hardware variables. + - Description: Indicates whether Commbase hardware notifications for the stop of the STT engines processing are enabled or disabled. Check out the other existent Commbase hardware variables. Relies on the value of the variables `COMMBASE_HARDWARE_NOTIFICATIONS_ON` and `COMMBASE_HARDWARE_SPEECH_TO_TEXT_ENGINE_COMPONENT_ON`. - Possible values: "True" or "False". - Example value: - `True` (Default): Notifications for the stop of the processing are enabled. -- **COMBASE_HARDWARE_DEVICE_0**: - - Description: The path of the hardware device used for communication, for example, an Arduino or any other microcontroller. Check out the other existent Commbase hardware variables. - - Possible values: A valid path to a hardware device. - - Example value: - - `/dev/ttyACM0`: Path to hardware device 0. - - **COMMBASE_HARDWARE_COMMAND_LISTENING_START_FILE**: - Description: The path to the file containing the command to start STT engines listening on the Commbase hardware. Check out the other existent Commbase hardware variables. - Possible values: A valid file path. - Example value: - - `/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/listening_start.dat`: Path to the file for starting the working STT engine listening process. + - `/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/listening_start.dat` (Default): Path to the file for starting the working STT engine listening process. - **COMMBASE_HARDWARE_COMMAND_LISTENING_STOP_FILE**: - Description: The path to the file containing the command to stop STT engines listening on the Commbase hardware. Check out the other existent Commbase hardware variables. 
- Possible values: A valid file path. - Example value: - - `/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/listening_stop.dat`: Path to the file for stopping the working STT engine listening process. + - `/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/listening_stop.dat` (Default): Path to the file for stopping the working STT engine listening process. - **COMMBASE_HARDWARE_COMMAND_PROCESSING_START_FILE**: - Description: The path to the file containing the command to start STT engines processing on the Commbase hardware. Check out the other existent Commbase hardware variables. - Possible values: A valid file path. - Example value: - - `/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/processing_start.dat`: Path to the file for starting the working STT engine processing. + - `/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/processing_start.dat` (Default): Path to the file for starting the working STT engine processing. - **COMMBASE_HARDWARE_COMMAND_PROCESSING_STOP_FILE**: - Description: The path to the file containing the command to stop STT engines processing on the Commbase hardware. Check out the other existent Commbase hardware variables. - Possible values: A valid file path. - Example value: - - `/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/processing_stop.dat`: Path to the file for stopping the working STT engine processing. + - `/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/processing_stop.dat` (Default): Path to the file for stopping the working STT engine processing. + +- **COMMBASE_HARDWARE_SENTIMENT_COMPONENT_ON**: + - Description: A flag indicating whether the sentiment (from sentiment analysis) component of the Commbase hardware is enabled or disabled. Relies on the value of the variable `COMMBASE_HARDWARE_NOTIFICATIONS_ON`. + - Possible values: + - `True`: The sentiment component is enabled. + - `False`: The sentiment component is disabled. + - Example value: + - `True` (Default): The sentiment component is turned on, meaning that the hardware will perform sentiment actions. For example, the assistant response will include a sentiment action based on a response sentiment analysis. + +- **COMMBASE_HARDWARE_COMMAND_RANDOM_NEGATIVE_FILE**: + - Description: The file path to the dat file used by the Commbase hardware for random negative sentiment actions. This file contains data that the hardware uses to simulate negative sentiments. Check out the value of the variables `COMMBASE_HARDWARE_NOTIFICATIONS_ON` and `COMMBASE_HARDWARE_SENTIMENT_COMPONENT_ON`. + - Possible values: A valid file path. + - Example value: + - `/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/sentiment_component/random_negative.dat` (Default): Specifies the location of the dat file for negative sentiment actions. + +- **COMMBASE_HARDWARE_COMMAND_RANDOM_NEUTRAL_FILE**: + - Description: The file path to the dat file used by the Commbase hardware for random neutral sentiment actions. This file contains data that the hardware uses to simulate neutral sentiments. 
Check out the value of the variables `COMMBASE_HARDWARE_NOTIFICATIONS_ON` and `COMMBASE_HARDWARE_SENTIMENT_COMPONENT_ON`. + - Possible values: A valid file path. + - Example value: + - `/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/sentiment_component/random_neutral.dat` (Default): Specifies the location of the dat file for neutral sentiment actions. + +- **COMMBASE_HARDWARE_COMMAND_RANDOM_POSITIVE_FILE**: + - Description: The file path to the dat file used by the Commbase hardware for random positive sentiment actions. This file contains data that the hardware uses to simulate positive sentiments. Check out the value of the variables `COMMBASE_HARDWARE_NOTIFICATIONS_ON` and `COMMBASE_HARDWARE_SENTIMENT_COMPONENT_ON`. + - Possible values: A valid file path. + - Example value: + - `/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/sentiment_component/random_positive.dat` (Default): Specifies the location of the dat file for positive sentiment actions. Please ensure that these environment variables are correctly set with the appropriate values before running the application. diff --git a/assets/docs/images/backdropbuild-v4-contest-build.png b/assets/docs/images/backdropbuild-v4-contest-build.png new file mode 100644 index 00000000..c92d4dfd Binary files /dev/null and b/assets/docs/images/backdropbuild-v4-contest-build.png differ diff --git a/assets/docs/images/backdropbuild.png b/assets/docs/images/backdropbuild.png new file mode 100644 index 00000000..90bd9061 Binary files /dev/null and b/assets/docs/images/backdropbuild.png differ diff --git a/assets/docs/images/cover.png b/assets/docs/images/cover.png new file mode 100644 index 00000000..7b48ec82 Binary files /dev/null and b/assets/docs/images/cover.png differ diff --git a/assets/docs/images/gemini-api-developer-competition.svg b/assets/docs/images/gemini-api-developer-competition.svg new file mode 100644 index 00000000..5639354f --- /dev/null +++ b/assets/docs/images/gemini-api-developer-competition.svg @@ -0,0 +1,316 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/assets/docs/images/github.png b/assets/docs/images/github.png new file mode 100644 index 00000000..50c19291 Binary files /dev/null and b/assets/docs/images/github.png differ diff --git a/assets/docs/images/pinecone.png b/assets/docs/images/pinecone.png new file mode 100644 index 00000000..2b41196f Binary files /dev/null and b/assets/docs/images/pinecone.png differ diff --git a/assets/docs/images/vercel.png b/assets/docs/images/vercel.png new file mode 100644 index 00000000..da11c291 Binary files /dev/null and b/assets/docs/images/vercel.png differ diff --git a/bundles/commbase-llm-google-gemini-p/COPYING b/bundles/commbase-genai-llm-google-gemini-p/COPYING similarity index 100% rename from bundles/commbase-llm-google-gemini-p/COPYING rename to bundles/commbase-genai-llm-google-gemini-p/COPYING diff --git 
a/bundles/commbase-llm-google-gemini-p/README.md b/bundles/commbase-genai-llm-google-gemini-p/README.md similarity index 78% rename from bundles/commbase-llm-google-gemini-p/README.md rename to bundles/commbase-genai-llm-google-gemini-p/README.md index c7922f37..5cecbf1c 100644 --- a/bundles/commbase-llm-google-gemini-p/README.md +++ b/bundles/commbase-genai-llm-google-gemini-p/README.md @@ -1,15 +1,15 @@ [![Python Package using Conda](https://github.com/mydroidandi/commbase/actions/workflows/python-package-conda.yml/badge.svg)](https://github.com/mydroidandi/commbase/actions/workflows/python-package-conda.yml) [![Python Version](https://img.shields.io/badge/Python-3.10%20%7C%203.11%20%7C%203.12-blue)](https://img.shields.io/badge/python-3.10%20%7C%203.11%20%7C%203.12-blue) -# commbase-llm-google-gemini-p +# commbase-genai-llm-google-gemini-p -commbase-llm-google-gemini-p +commbase-genai-llm-google-gemini-p A terminal version of Google's Gemini. This app allows you to chat with Gemini, a powerful large language model. Ask questions, get creative writing inspiration, or receive help with tasks - all through a user-friendly interface. Whether you're a student, writer, or simply curious, this app is your gateway to unlocking the potential of AI language models. -## Unleash the Power of Language with commbase-llm-google-gemini-p +## Unleash the Power of Language with commbase-genai-llm-google-gemini-p -commbase-llm-google-gemini-p puts the power of Gemini right at your fingertips. Explore a world of knowledge and unleash your creativity through intuitive interaction with Gemini's vast capabilities. +commbase-genai-llm-google-gemini-p puts the power of Gemini right at your fingertips. Explore a world of knowledge and unleash your creativity through intuitive interaction with Gemini's vast capabilities. **Key Features:** @@ -20,13 +20,13 @@ commbase-llm-google-gemini-p puts the power of Gemini right at your fingertips. * **Quick and insightful learner:** Gemini is constantly learning and adapting, providing you with the latest insights and perspectives. * **Open-ended exploration:** Dive deeper into your curiosity and explore different topics in an open-ended way, fostering new discoveries and connections. -**More than just a language model, commbase-llm-google-gemini-p is a gateway to:** +**More than just a language model, commbase-genai-llm-google-gemini-p is a gateway to:** * **Expand your knowledge:** Gain insights and information on various subjects, from factual topics to creative exploration. * **Spark your creativity:** Get inspired by unique text formats, brainstorm ideas, or receive assistance with creative writing projects. * **Boost your productivity:** Utilize Gemini's capabilities to summarize information, translate languages, or complete specific tasks efficiently. 
-**With commbase-llm-google-gemini-p, the potential for communication, learning, and creative exploration is limitless.** +**With commbase-genai-llm-google-gemini-p, the potential for communication, learning, and creative exploration is limitless.** ## Examples diff --git a/bundles/commbase-llm-google-gemini-p/commbase-llm-google-gemini-p.jpg b/bundles/commbase-genai-llm-google-gemini-p/commbase-genai-llm-google-gemini-p.jpg similarity index 100% rename from bundles/commbase-llm-google-gemini-p/commbase-llm-google-gemini-p.jpg rename to bundles/commbase-genai-llm-google-gemini-p/commbase-genai-llm-google-gemini-p.jpg diff --git a/bundles/commbase-llm-google-gemini-p/commbase_llm_google_gemini_p.py b/bundles/commbase-genai-llm-google-gemini-p/commbase-genai-llm-google-gemini-p.py similarity index 100% rename from bundles/commbase-llm-google-gemini-p/commbase_llm_google_gemini_p.py rename to bundles/commbase-genai-llm-google-gemini-p/commbase-genai-llm-google-gemini-p.py diff --git a/bundles/commbase-llm-google-gemini-p/examples/README.md b/bundles/commbase-genai-llm-google-gemini-p/examples/README.md similarity index 85% rename from bundles/commbase-llm-google-gemini-p/examples/README.md rename to bundles/commbase-genai-llm-google-gemini-p/examples/README.md index 8b90b7a6..dbc36602 100644 --- a/bundles/commbase-llm-google-gemini-p/examples/README.md +++ b/bundles/commbase-genai-llm-google-gemini-p/examples/README.md @@ -1,6 +1,6 @@ ## Examples -### Example 1: commbase_llm_google_gemini_p implementation +### Example 1: commbase-genai-llm-google-gemini-p implementation This example demonstrates the implementation of our software. You can find the code in the [Commbase](https://github.com/mydroidandi/commbase) repository. diff --git a/bundles/commbase-llm-meta-llama-p/COPYING b/bundles/commbase-genai-llm-meta-llama-p/COPYING similarity index 100% rename from bundles/commbase-llm-meta-llama-p/COPYING rename to bundles/commbase-genai-llm-meta-llama-p/COPYING diff --git a/bundles/commbase-llm-meta-llama-p/README.md b/bundles/commbase-genai-llm-meta-llama-p/README.md similarity index 78% rename from bundles/commbase-llm-meta-llama-p/README.md rename to bundles/commbase-genai-llm-meta-llama-p/README.md index 901bbad9..060db695 100644 --- a/bundles/commbase-llm-meta-llama-p/README.md +++ b/bundles/commbase-genai-llm-meta-llama-p/README.md @@ -1,15 +1,15 @@ [![Python Package using Conda](https://github.com/mydroidandi/commbase/actions/workflows/python-package-conda.yml/badge.svg)](https://github.com/mydroidandi/commbase/actions/workflows/python-package-conda.yml) [![Python Version](https://img.shields.io/badge/Python-3.10%20%7C%203.11%20%7C%203.12-blue)](https://img.shields.io/badge/python-3.10%20%7C%203.11%20%7C%203.12-blue) -# commbase-llm-meta-llama-p +# commbase-genai-llm-meta-llama-p -commbase-llm-meta-llama-p +commbase-genai-llm-meta-llama-p A terminal version of Meta's Llama. This app allows you to interact with Llama, a large language model known for its factual language understanding and reasoning abilities. Ask questions, get help with research, or gain insights from complex information - all through a user-friendly interface. Whether you're a student, researcher, or simply curious about the world, this app can be a valuable tool for understanding and analyzing information. 
-## Unleash the Power of Factual Exploration with commbase-llm-meta-llama-p +## Unleash the Power of Factual Exploration with commbase-genai-llm-meta-llama-p -commbase-llm-meta-llama-p empowers you to tap into the strengths of Llama, a large language model known for its exceptional understanding and reasoning abilities. Dive deeper into factual information, gain insightful perspectives, and unlock the potential for comprehensive learning through seamless interaction with Llama. +commbase-genai-llm-meta-llama-p empowers you to tap into the strengths of Llama, a large language model known for its exceptional understanding and reasoning abilities. Dive deeper into factual information, gain insightful perspectives, and unlock the potential for comprehensive learning through seamless interaction with Llama. **Key strengths:** @@ -21,13 +21,13 @@ commbase-llm-meta-llama-p empowers you to tap into the strengths of Llama, a lar * **User-friendly and accessible:** Interact with Llama in a natural and intuitive manner, making this powerful technology readily available to a wide range of users. * **Continual learning and improvement:** Experience the ongoing advancements in language processing as Llama constantly learns and adapts, offering you the most current information and capabilities. -**commbase-llm-meta-llama-p goes beyond a simple language model to become your gateway to:** +**commbase-genai-llm-meta-llama-p goes beyond a simple language model to become your gateway to:** * **Expand your factual knowledge:** Deepen your understanding of the world and various disciplines through comprehensive and reliable information. * **Gain insightful perspectives:** Analyze information from different angles, unearth hidden connections, and develop critical thinking skills with Llama's assistance. * **Boost your research and analysis skills:** Leverage Llama's capabilities to navigate complex information, synthesize findings, and gain valuable insights from your research endeavors. 
-**With commbase-llm-meta-llama-p and the power of Llama, the journey towards factual exploration and deeper understanding is limitless.** +**With commbase-genai-llm-meta-llama-p and the power of Llama, the journey towards factual exploration and deeper understanding is limitless.** ## Examples diff --git a/bundles/commbase-llm-meta-llama-p/commbase-llm-meta-llama-p.jpg b/bundles/commbase-genai-llm-meta-llama-p/commbase-genai-llm-meta-llama-p.jpg similarity index 100% rename from bundles/commbase-llm-meta-llama-p/commbase-llm-meta-llama-p.jpg rename to bundles/commbase-genai-llm-meta-llama-p/commbase-genai-llm-meta-llama-p.jpg diff --git a/bundles/commbase-llm-meta-llama-p/commbase_llm_meta_llama_p.py b/bundles/commbase-genai-llm-meta-llama-p/commbase-genai-llm-meta-llama-p.py similarity index 100% rename from bundles/commbase-llm-meta-llama-p/commbase_llm_meta_llama_p.py rename to bundles/commbase-genai-llm-meta-llama-p/commbase-genai-llm-meta-llama-p.py diff --git a/bundles/commbase-llm-meta-llama-p/examples/README.md b/bundles/commbase-genai-llm-meta-llama-p/examples/README.md similarity index 85% rename from bundles/commbase-llm-meta-llama-p/examples/README.md rename to bundles/commbase-genai-llm-meta-llama-p/examples/README.md index e5effe97..8d8f7891 100644 --- a/bundles/commbase-llm-meta-llama-p/examples/README.md +++ b/bundles/commbase-genai-llm-meta-llama-p/examples/README.md @@ -1,6 +1,6 @@ ## Examples -### Example 1: commbase-llm-meta-llama-p implementation +### Example 1: commbase-genai-llm-meta-llama-p implementation This example demonstrates the implementation of our software. You can find the code in the [Commbase](https://github.com/mydroidandi/commbase) repository. diff --git a/bundles/commbase-llm-openai-gpt-p/COPYING b/bundles/commbase-genai-llm-openai-gpt-p/COPYING similarity index 100% rename from bundles/commbase-llm-openai-gpt-p/COPYING rename to bundles/commbase-genai-llm-openai-gpt-p/COPYING diff --git a/bundles/commbase-llm-openai-gpt-p/README.md b/bundles/commbase-genai-llm-openai-gpt-p/README.md similarity index 81% rename from bundles/commbase-llm-openai-gpt-p/README.md rename to bundles/commbase-genai-llm-openai-gpt-p/README.md index 8fd3db3d..6202709b 100644 --- a/bundles/commbase-llm-openai-gpt-p/README.md +++ b/bundles/commbase-genai-llm-openai-gpt-p/README.md @@ -1,17 +1,17 @@ [![Python Package using Conda](https://github.com/mydroidandi/commbase/actions/workflows/python-package-conda.yml/badge.svg)](https://github.com/mydroidandi/commbase/actions/workflows/python-package-conda.yml) [![Python Version](https://img.shields.io/badge/Python-3.10%20%7C%203.11%20%7C%203.12-blue)](https://img.shields.io/badge/python-3.10%20%7C%203.11%20%7C%203.12-blue) -# commbase-llm-openai-gpt-p +# commbase-genai-llm-openai-gpt-p -commbase-llm-openai-gpt-p +commbase-genai-llm-openai-gpt-p A terminal version of ChatGPT designed to facilitate seamless interaction with the powerful Open AI's GPT language model. This app empowers users to engage in natural language conversations with the model, unlocking a wide range of possibilities for creative, informative, and dynamic interactions. ChatGPT is a chatbot built using the GPT (Generative Pre-trained Transformer) architecture developed by OpenAI. To use ChatGPT, simply visit the website at . This is the online platform where you can access and interact with the ChatGPT service. 
-## Unleash the Power of Creativity with commbase-llm-openai-gpt-p +## Unleash the Power of Creativity with commbase-genai-llm-openai-gpt-p -commbase-llm-openai-gpt-p harnesses the capabilities of OpenAI's GPT models. Explore the world of storytelling, code exploration, and language manipulation through intuitive interaction with GPT's advanced features. +commbase-genai-llm-openai-gpt-p harnesses the capabilities of OpenAI's GPT models. Explore the world of storytelling, code exploration, and language manipulation through intuitive interaction with GPT's advanced features. **Key strengths:** @@ -23,7 +23,7 @@ commbase-llm-openai-gpt-p harnesses the capabilities of OpenAI's GPT models. Exp * **Accessibility and user-friendliness:** Interact with GPT easily through various APIs and interfaces, making the technology readily available to a diverse audience. * **Constant evolution and improvement:** Stay at the forefront of language processing advancements as OpenAI continuously refines and improves GPT models. -**With commbase-llm-openai-gpt-p, unlock the potential of GPT and embark on a journey of creative exploration, communication, and discovery.** +**With commbase-genai-llm-openai-gpt-p, unlock the potential of GPT and embark on a journey of creative exploration, communication, and discovery.** ## Examples diff --git a/bundles/commbase-llm-openai-gpt-p/commbase-llm-openai-gpt-p.jpg b/bundles/commbase-genai-llm-openai-gpt-p/commbase-genai-llm-openai-gpt-p.jpg similarity index 100% rename from bundles/commbase-llm-openai-gpt-p/commbase-llm-openai-gpt-p.jpg rename to bundles/commbase-genai-llm-openai-gpt-p/commbase-genai-llm-openai-gpt-p.jpg diff --git a/bundles/commbase-genai-llm-openai-gpt-p/commbase-genai-llm-openai-gpt-p.py b/bundles/commbase-genai-llm-openai-gpt-p/commbase-genai-llm-openai-gpt-p.py new file mode 100644 index 00000000..3f2ad968 --- /dev/null +++ b/bundles/commbase-genai-llm-openai-gpt-p/commbase-genai-llm-openai-gpt-p.py @@ -0,0 +1,132 @@ +#!/usr/bin/env python +################################################################################ +# commbase-genai-llm-openai-gpt-p.py # +# # +# A terminal version of OpenAI's ChatGPT # +# # +# Change History # +# 02/26/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. 
# +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# commbase-genai-llm-openai-gpt-p.py +# A terminal version of ChatGPT +# ChatGPT is a chatbot built using the GPT (Generative Pre-trained Transformer) +# architecture developed by OpenAI. +# You can access ChatGPT by going to https://chat.openai.com/ + +# Imports +import openai +import os +import rich +from rich import print +from rich.table import Table +import typer + + +def get_chatgpt_api_key(callback=None): + # Specify the path of the env file containing the variable + file_path = os.environ["COMMBASE_APP_DIR"] + "/config/app.conf" + + # Open the file and read its contents + with open(file_path, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "OPENAI_API_KEY": + # Remove the quotes from the value of the variable + API_KEY = value.strip()[1:-1] + + # Call the callback function with the API key value as an + # argument + if callback is not None: + callback(API_KEY) + + return API_KEY + + # If the variable is not found, return None + return None + + +def main(): + def process_api_key(api_key): + # Do something with the API key value + # print(f"Received API key: {API_KEY}") + openai.api_key = api_key + + # Get the API key value and pass it to the callback function + get_chatgpt_api_key(callback=process_api_key) + + print("[bold green]Terminal ChatGP[/bold green]") + + table = Table("Command", "Description") + table.add_row("exit", "Exit the application") + table.add_row("new", "New conversation") + + print(table) + + # Assistant context + context = {"role": "system", "content": "You are a very helpful assistant."} + messages = [context] + + while True: + + content = __prompt() + + if content == "new": + print("ChatGPT: New conversation created") + messages = [context] + content = __prompt() + + messages.append({"role": "user", "content": content}) + + response = openai.ChatCompletion.create( + model="gpt-3.5-turbo", messages=messages + ) + + response_content = response.choices[0].message.content + + messages.append({"role": "assistant", "content": response_content}) + + print(f"[bold green]ChatGPT: [/bold green] [green]{response_content}[/green]") + + +def __prompt() -> str: + prompt = typer.prompt("\nYou") + + if prompt == "exit": + exit = typer.confirm("Are you sure?") + if exit: + print("Goodbye!") + raise typer.Abort() + + return __prompt() + + return prompt + + +if __name__ == "__main__": + typer.run(main) diff --git a/bundles/commbase-llm-openai-gpt-p/examples/README.md b/bundles/commbase-genai-llm-openai-gpt-p/examples/README.md similarity index 85% rename from bundles/commbase-llm-openai-gpt-p/examples/README.md rename to bundles/commbase-genai-llm-openai-gpt-p/examples/README.md index 10d3e940..a1c2c9f3 100644 --- a/bundles/commbase-llm-openai-gpt-p/examples/README.md +++ b/bundles/commbase-genai-llm-openai-gpt-p/examples/README.md @@ -1,6 +1,6 @@ ## Examples -### Example 1: commbase-llm-openai-gpt-p implementation +### Example 1: commbase-genai-llm-openai-gpt-p implementation This example demonstrates the implementation of our software. You can find the code in the [Commbase](https://github.com/mydroidandi/commbase) repository. 
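Note: the new commbase-genai-llm-openai-gpt-p.py script added above targets the legacy openai-python interface (`openai.api_key` plus `openai.ChatCompletion.create`), which was removed in openai 1.0. If the bundle's environment ships a newer openai package, the same conversation loop could look roughly like the sketch below. This is a minimal, hypothetical adaptation, not the bundle's actual implementation: the model name and the use of an `OPENAI_API_KEY` environment variable (instead of parsing `$COMMBASE_APP_DIR/config/app.conf` as the script does) are assumptions.

```python
#!/usr/bin/env python
# Minimal sketch of the same chat loop using the openai>=1.0 client API.
# Assumes the API key is exported as OPENAI_API_KEY rather than read from
# the Commbase app.conf file, and that gpt-3.5-turbo is the target model.
import os

from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# Same assistant context as the bundle script
messages = [{"role": "system", "content": "You are a very helpful assistant."}]

while True:
    content = input("\nYou: ")
    if content == "exit":
        print("Goodbye!")
        break
    messages.append({"role": "user", "content": content})

    # chat.completions.create replaces the removed openai.ChatCompletion.create
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=messages,
    )
    reply = response.choices[0].message.content
    messages.append({"role": "assistant", "content": reply})
    print(f"ChatGPT: {reply}")
```

As a design note, keeping the conversation as a growing `messages` list (system context, then alternating user/assistant turns) is what lets the model keep track of the ongoing dialogue between prompts.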
diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/COPYING b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/COPYING new file mode 100644 index 00000000..94a9ed02 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/COPYING @@ -0,0 +1,674 @@ + GNU GENERAL PUBLIC LICENSE + Version 3, 29 June 2007 + + Copyright (C) 2007 Free Software Foundation, Inc. + Everyone is permitted to copy and distribute verbatim copies + of this license document, but changing it is not allowed. + + Preamble + + The GNU General Public License is a free, copyleft license for +software and other kinds of works. + + The licenses for most software and other practical works are designed +to take away your freedom to share and change the works. By contrast, +the GNU General Public License is intended to guarantee your freedom to +share and change all versions of a program--to make sure it remains free +software for all its users. We, the Free Software Foundation, use the +GNU General Public License for most of our software; it applies also to +any other work released this way by its authors. You can apply it to +your programs, too. + + When we speak of free software, we are referring to freedom, not +price. Our General Public Licenses are designed to make sure that you +have the freedom to distribute copies of free software (and charge for +them if you wish), that you receive source code or can get it if you +want it, that you can change the software or use pieces of it in new +free programs, and that you know you can do these things. + + To protect your rights, we need to prevent others from denying you +these rights or asking you to surrender the rights. Therefore, you have +certain responsibilities if you distribute copies of the software, or if +you modify it: responsibilities to respect the freedom of others. + + For example, if you distribute copies of such a program, whether +gratis or for a fee, you must pass on to the recipients the same +freedoms that you received. You must make sure that they, too, receive +or can get the source code. And you must show them these terms so they +know their rights. + + Developers that use the GNU GPL protect your rights with two steps: +(1) assert copyright on the software, and (2) offer you this License +giving you legal permission to copy, distribute and/or modify it. + + For the developers' and authors' protection, the GPL clearly explains +that there is no warranty for this free software. For both users' and +authors' sake, the GPL requires that modified versions be marked as +changed, so that their problems will not be attributed erroneously to +authors of previous versions. + + Some devices are designed to deny users access to install or run +modified versions of the software inside them, although the manufacturer +can do so. This is fundamentally incompatible with the aim of +protecting users' freedom to change the software. The systematic +pattern of such abuse occurs in the area of products for individuals to +use, which is precisely where it is most unacceptable. Therefore, we +have designed this version of the GPL to prohibit the practice for those +products. If such problems arise substantially in other domains, we +stand ready to extend this provision to those domains in future versions +of the GPL, as needed to protect the freedom of users. + + Finally, every program is threatened constantly by software patents. 
+States should not allow patents to restrict development and use of +software on general-purpose computers, but in those that do, we wish to +avoid the special danger that patents applied to a free program could +make it effectively proprietary. To prevent this, the GPL assures that +patents cannot be used to render the program non-free. + + The precise terms and conditions for copying, distribution and +modification follow. + + TERMS AND CONDITIONS + + 0. Definitions. + + "This License" refers to version 3 of the GNU General Public License. + + "Copyright" also means copyright-like laws that apply to other kinds of +works, such as semiconductor masks. + + "The Program" refers to any copyrightable work licensed under this +License. Each licensee is addressed as "you". "Licensees" and +"recipients" may be individuals or organizations. + + To "modify" a work means to copy from or adapt all or part of the work +in a fashion requiring copyright permission, other than the making of an +exact copy. The resulting work is called a "modified version" of the +earlier work or a work "based on" the earlier work. + + A "covered work" means either the unmodified Program or a work based +on the Program. + + To "propagate" a work means to do anything with it that, without +permission, would make you directly or secondarily liable for +infringement under applicable copyright law, except executing it on a +computer or modifying a private copy. Propagation includes copying, +distribution (with or without modification), making available to the +public, and in some countries other activities as well. + + To "convey" a work means any kind of propagation that enables other +parties to make or receive copies. Mere interaction with a user through +a computer network, with no transfer of a copy, is not conveying. + + An interactive user interface displays "Appropriate Legal Notices" +to the extent that it includes a convenient and prominently visible +feature that (1) displays an appropriate copyright notice, and (2) +tells the user that there is no warranty for the work (except to the +extent that warranties are provided), that licensees may convey the +work under this License, and how to view a copy of this License. If +the interface presents a list of user commands or options, such as a +menu, a prominent item in the list meets this criterion. + + 1. Source Code. + + The "source code" for a work means the preferred form of the work +for making modifications to it. "Object code" means any non-source +form of a work. + + A "Standard Interface" means an interface that either is an official +standard defined by a recognized standards body, or, in the case of +interfaces specified for a particular programming language, one that +is widely used among developers working in that language. + + The "System Libraries" of an executable work include anything, other +than the work as a whole, that (a) is included in the normal form of +packaging a Major Component, but which is not part of that Major +Component, and (b) serves only to enable use of the work with that +Major Component, or to implement a Standard Interface for which an +implementation is available to the public in source code form. A +"Major Component", in this context, means a major essential component +(kernel, window system, and so on) of the specific operating system +(if any) on which the executable work runs, or a compiler used to +produce the work, or an object code interpreter used to run it. 
+ + The "Corresponding Source" for a work in object code form means all +the source code needed to generate, install, and (for an executable +work) run the object code and to modify the work, including scripts to +control those activities. However, it does not include the work's +System Libraries, or general-purpose tools or generally available free +programs which are used unmodified in performing those activities but +which are not part of the work. For example, Corresponding Source +includes interface definition files associated with source files for +the work, and the source code for shared libraries and dynamically +linked subprograms that the work is specifically designed to require, +such as by intimate data communication or control flow between those +subprograms and other parts of the work. + + The Corresponding Source need not include anything that users +can regenerate automatically from other parts of the Corresponding +Source. + + The Corresponding Source for a work in source code form is that +same work. + + 2. Basic Permissions. + + All rights granted under this License are granted for the term of +copyright on the Program, and are irrevocable provided the stated +conditions are met. This License explicitly affirms your unlimited +permission to run the unmodified Program. The output from running a +covered work is covered by this License only if the output, given its +content, constitutes a covered work. This License acknowledges your +rights of fair use or other equivalent, as provided by copyright law. + + You may make, run and propagate covered works that you do not +convey, without conditions so long as your license otherwise remains +in force. You may convey covered works to others for the sole purpose +of having them make modifications exclusively for you, or provide you +with facilities for running those works, provided that you comply with +the terms of this License in conveying all material for which you do +not control copyright. Those thus making or running the covered works +for you must do so exclusively on your behalf, under your direction +and control, on terms that prohibit them from making any copies of +your copyrighted material outside their relationship with you. + + Conveying under any other circumstances is permitted solely under +the conditions stated below. Sublicensing is not allowed; section 10 +makes it unnecessary. + + 3. Protecting Users' Legal Rights From Anti-Circumvention Law. + + No covered work shall be deemed part of an effective technological +measure under any applicable law fulfilling obligations under article +11 of the WIPO copyright treaty adopted on 20 December 1996, or +similar laws prohibiting or restricting circumvention of such +measures. + + When you convey a covered work, you waive any legal power to forbid +circumvention of technological measures to the extent such circumvention +is effected by exercising rights under this License with respect to +the covered work, and you disclaim any intention to limit operation or +modification of the work as a means of enforcing, against the work's +users, your or third parties' legal rights to forbid circumvention of +technological measures. + + 4. Conveying Verbatim Copies. 
+ + You may convey verbatim copies of the Program's source code as you +receive it, in any medium, provided that you conspicuously and +appropriately publish on each copy an appropriate copyright notice; +keep intact all notices stating that this License and any +non-permissive terms added in accord with section 7 apply to the code; +keep intact all notices of the absence of any warranty; and give all +recipients a copy of this License along with the Program. + + You may charge any price or no price for each copy that you convey, +and you may offer support or warranty protection for a fee. + + 5. Conveying Modified Source Versions. + + You may convey a work based on the Program, or the modifications to +produce it from the Program, in the form of source code under the +terms of section 4, provided that you also meet all of these conditions: + + a) The work must carry prominent notices stating that you modified + it, and giving a relevant date. + + b) The work must carry prominent notices stating that it is + released under this License and any conditions added under section + 7. This requirement modifies the requirement in section 4 to + "keep intact all notices". + + c) You must license the entire work, as a whole, under this + License to anyone who comes into possession of a copy. This + License will therefore apply, along with any applicable section 7 + additional terms, to the whole of the work, and all its parts, + regardless of how they are packaged. This License gives no + permission to license the work in any other way, but it does not + invalidate such permission if you have separately received it. + + d) If the work has interactive user interfaces, each must display + Appropriate Legal Notices; however, if the Program has interactive + interfaces that do not display Appropriate Legal Notices, your + work need not make them do so. + + A compilation of a covered work with other separate and independent +works, which are not by their nature extensions of the covered work, +and which are not combined with it such as to form a larger program, +in or on a volume of a storage or distribution medium, is called an +"aggregate" if the compilation and its resulting copyright are not +used to limit the access or legal rights of the compilation's users +beyond what the individual works permit. Inclusion of a covered work +in an aggregate does not cause this License to apply to the other +parts of the aggregate. + + 6. Conveying Non-Source Forms. + + You may convey a covered work in object code form under the terms +of sections 4 and 5, provided that you also convey the +machine-readable Corresponding Source under the terms of this License, +in one of these ways: + + a) Convey the object code in, or embodied in, a physical product + (including a physical distribution medium), accompanied by the + Corresponding Source fixed on a durable physical medium + customarily used for software interchange. 
+ + b) Convey the object code in, or embodied in, a physical product + (including a physical distribution medium), accompanied by a + written offer, valid for at least three years and valid for as + long as you offer spare parts or customer support for that product + model, to give anyone who possesses the object code either (1) a + copy of the Corresponding Source for all the software in the + product that is covered by this License, on a durable physical + medium customarily used for software interchange, for a price no + more than your reasonable cost of physically performing this + conveying of source, or (2) access to copy the + Corresponding Source from a network server at no charge. + + c) Convey individual copies of the object code with a copy of the + written offer to provide the Corresponding Source. This + alternative is allowed only occasionally and noncommercially, and + only if you received the object code with such an offer, in accord + with subsection 6b. + + d) Convey the object code by offering access from a designated + place (gratis or for a charge), and offer equivalent access to the + Corresponding Source in the same way through the same place at no + further charge. You need not require recipients to copy the + Corresponding Source along with the object code. If the place to + copy the object code is a network server, the Corresponding Source + may be on a different server (operated by you or a third party) + that supports equivalent copying facilities, provided you maintain + clear directions next to the object code saying where to find the + Corresponding Source. Regardless of what server hosts the + Corresponding Source, you remain obligated to ensure that it is + available for as long as needed to satisfy these requirements. + + e) Convey the object code using peer-to-peer transmission, provided + you inform other peers where the object code and Corresponding + Source of the work are being offered to the general public at no + charge under subsection 6d. + + A separable portion of the object code, whose source code is excluded +from the Corresponding Source as a System Library, need not be +included in conveying the object code work. + + A "User Product" is either (1) a "consumer product", which means any +tangible personal property which is normally used for personal, family, +or household purposes, or (2) anything designed or sold for incorporation +into a dwelling. In determining whether a product is a consumer product, +doubtful cases shall be resolved in favor of coverage. For a particular +product received by a particular user, "normally used" refers to a +typical or common use of that class of product, regardless of the status +of the particular user or of the way in which the particular user +actually uses, or expects or is expected to use, the product. A product +is a consumer product regardless of whether the product has substantial +commercial, industrial or non-consumer uses, unless such uses represent +the only significant mode of use of the product. + + "Installation Information" for a User Product means any methods, +procedures, authorization keys, or other information required to install +and execute modified versions of a covered work in that User Product from +a modified version of its Corresponding Source. The information must +suffice to ensure that the continued functioning of the modified object +code is in no case prevented or interfered with solely because +modification has been made. 
+ + If you convey an object code work under this section in, or with, or +specifically for use in, a User Product, and the conveying occurs as +part of a transaction in which the right of possession and use of the +User Product is transferred to the recipient in perpetuity or for a +fixed term (regardless of how the transaction is characterized), the +Corresponding Source conveyed under this section must be accompanied +by the Installation Information. But this requirement does not apply +if neither you nor any third party retains the ability to install +modified object code on the User Product (for example, the work has +been installed in ROM). + + The requirement to provide Installation Information does not include a +requirement to continue to provide support service, warranty, or updates +for a work that has been modified or installed by the recipient, or for +the User Product in which it has been modified or installed. Access to a +network may be denied when the modification itself materially and +adversely affects the operation of the network or violates the rules and +protocols for communication across the network. + + Corresponding Source conveyed, and Installation Information provided, +in accord with this section must be in a format that is publicly +documented (and with an implementation available to the public in +source code form), and must require no special password or key for +unpacking, reading or copying. + + 7. Additional Terms. + + "Additional permissions" are terms that supplement the terms of this +License by making exceptions from one or more of its conditions. +Additional permissions that are applicable to the entire Program shall +be treated as though they were included in this License, to the extent +that they are valid under applicable law. If additional permissions +apply only to part of the Program, that part may be used separately +under those permissions, but the entire Program remains governed by +this License without regard to the additional permissions. + + When you convey a copy of a covered work, you may at your option +remove any additional permissions from that copy, or from any part of +it. (Additional permissions may be written to require their own +removal in certain cases when you modify the work.) You may place +additional permissions on material, added by you to a covered work, +for which you have or can give appropriate copyright permission. 
+ + Notwithstanding any other provision of this License, for material you +add to a covered work, you may (if authorized by the copyright holders of +that material) supplement the terms of this License with terms: + + a) Disclaiming warranty or limiting liability differently from the + terms of sections 15 and 16 of this License; or + + b) Requiring preservation of specified reasonable legal notices or + author attributions in that material or in the Appropriate Legal + Notices displayed by works containing it; or + + c) Prohibiting misrepresentation of the origin of that material, or + requiring that modified versions of such material be marked in + reasonable ways as different from the original version; or + + d) Limiting the use for publicity purposes of names of licensors or + authors of the material; or + + e) Declining to grant rights under trademark law for use of some + trade names, trademarks, or service marks; or + + f) Requiring indemnification of licensors and authors of that + material by anyone who conveys the material (or modified versions of + it) with contractual assumptions of liability to the recipient, for + any liability that these contractual assumptions directly impose on + those licensors and authors. + + All other non-permissive additional terms are considered "further +restrictions" within the meaning of section 10. If the Program as you +received it, or any part of it, contains a notice stating that it is +governed by this License along with a term that is a further +restriction, you may remove that term. If a license document contains +a further restriction but permits relicensing or conveying under this +License, you may add to a covered work material governed by the terms +of that license document, provided that the further restriction does +not survive such relicensing or conveying. + + If you add terms to a covered work in accord with this section, you +must place, in the relevant source files, a statement of the +additional terms that apply to those files, or a notice indicating +where to find the applicable terms. + + Additional terms, permissive or non-permissive, may be stated in the +form of a separately written license, or stated as exceptions; +the above requirements apply either way. + + 8. Termination. + + You may not propagate or modify a covered work except as expressly +provided under this License. Any attempt otherwise to propagate or +modify it is void, and will automatically terminate your rights under +this License (including any patent licenses granted under the third +paragraph of section 11). + + However, if you cease all violation of this License, then your +license from a particular copyright holder is reinstated (a) +provisionally, unless and until the copyright holder explicitly and +finally terminates your license, and (b) permanently, if the copyright +holder fails to notify you of the violation by some reasonable means +prior to 60 days after the cessation. + + Moreover, your license from a particular copyright holder is +reinstated permanently if the copyright holder notifies you of the +violation by some reasonable means, this is the first time you have +received notice of violation of this License (for any work) from that +copyright holder, and you cure the violation prior to 30 days after +your receipt of the notice. + + Termination of your rights under this section does not terminate the +licenses of parties who have received copies or rights from you under +this License. 
If your rights have been terminated and not permanently +reinstated, you do not qualify to receive new licenses for the same +material under section 10. + + 9. Acceptance Not Required for Having Copies. + + You are not required to accept this License in order to receive or +run a copy of the Program. Ancillary propagation of a covered work +occurring solely as a consequence of using peer-to-peer transmission +to receive a copy likewise does not require acceptance. However, +nothing other than this License grants you permission to propagate or +modify any covered work. These actions infringe copyright if you do +not accept this License. Therefore, by modifying or propagating a +covered work, you indicate your acceptance of this License to do so. + + 10. Automatic Licensing of Downstream Recipients. + + Each time you convey a covered work, the recipient automatically +receives a license from the original licensors, to run, modify and +propagate that work, subject to this License. You are not responsible +for enforcing compliance by third parties with this License. + + An "entity transaction" is a transaction transferring control of an +organization, or substantially all assets of one, or subdividing an +organization, or merging organizations. If propagation of a covered +work results from an entity transaction, each party to that +transaction who receives a copy of the work also receives whatever +licenses to the work the party's predecessor in interest had or could +give under the previous paragraph, plus a right to possession of the +Corresponding Source of the work from the predecessor in interest, if +the predecessor has it or can get it with reasonable efforts. + + You may not impose any further restrictions on the exercise of the +rights granted or affirmed under this License. For example, you may +not impose a license fee, royalty, or other charge for exercise of +rights granted under this License, and you may not initiate litigation +(including a cross-claim or counterclaim in a lawsuit) alleging that +any patent claim is infringed by making, using, selling, offering for +sale, or importing the Program or any portion of it. + + 11. Patents. + + A "contributor" is a copyright holder who authorizes use under this +License of the Program or a work on which the Program is based. The +work thus licensed is called the contributor's "contributor version". + + A contributor's "essential patent claims" are all patent claims +owned or controlled by the contributor, whether already acquired or +hereafter acquired, that would be infringed by some manner, permitted +by this License, of making, using, or selling its contributor version, +but do not include claims that would be infringed only as a +consequence of further modification of the contributor version. For +purposes of this definition, "control" includes the right to grant +patent sublicenses in a manner consistent with the requirements of +this License. + + Each contributor grants you a non-exclusive, worldwide, royalty-free +patent license under the contributor's essential patent claims, to +make, use, sell, offer for sale, import and otherwise run, modify and +propagate the contents of its contributor version. + + In the following three paragraphs, a "patent license" is any express +agreement or commitment, however denominated, not to enforce a patent +(such as an express permission to practice a patent or covenant not to +sue for patent infringement). 
To "grant" such a patent license to a +party means to make such an agreement or commitment not to enforce a +patent against the party. + + If you convey a covered work, knowingly relying on a patent license, +and the Corresponding Source of the work is not available for anyone +to copy, free of charge and under the terms of this License, through a +publicly available network server or other readily accessible means, +then you must either (1) cause the Corresponding Source to be so +available, or (2) arrange to deprive yourself of the benefit of the +patent license for this particular work, or (3) arrange, in a manner +consistent with the requirements of this License, to extend the patent +license to downstream recipients. "Knowingly relying" means you have +actual knowledge that, but for the patent license, your conveying the +covered work in a country, or your recipient's use of the covered work +in a country, would infringe one or more identifiable patents in that +country that you have reason to believe are valid. + + If, pursuant to or in connection with a single transaction or +arrangement, you convey, or propagate by procuring conveyance of, a +covered work, and grant a patent license to some of the parties +receiving the covered work authorizing them to use, propagate, modify +or convey a specific copy of the covered work, then the patent license +you grant is automatically extended to all recipients of the covered +work and works based on it. + + A patent license is "discriminatory" if it does not include within +the scope of its coverage, prohibits the exercise of, or is +conditioned on the non-exercise of one or more of the rights that are +specifically granted under this License. You may not convey a covered +work if you are a party to an arrangement with a third party that is +in the business of distributing software, under which you make payment +to the third party based on the extent of your activity of conveying +the work, and under which the third party grants, to any of the +parties who would receive the covered work from you, a discriminatory +patent license (a) in connection with copies of the covered work +conveyed by you (or copies made from those copies), or (b) primarily +for and in connection with specific products or compilations that +contain the covered work, unless you entered into that arrangement, +or that patent license was granted, prior to 28 March 2007. + + Nothing in this License shall be construed as excluding or limiting +any implied license or other defenses to infringement that may +otherwise be available to you under applicable patent law. + + 12. No Surrender of Others' Freedom. + + If conditions are imposed on you (whether by court order, agreement or +otherwise) that contradict the conditions of this License, they do not +excuse you from the conditions of this License. If you cannot convey a +covered work so as to satisfy simultaneously your obligations under this +License and any other pertinent obligations, then as a consequence you may +not convey it at all. For example, if you agree to terms that obligate you +to collect a royalty for further conveying from those to whom you convey +the Program, the only way you could satisfy both those terms and this +License would be to refrain entirely from conveying the Program. + + 13. Use with the GNU Affero General Public License. 
+ + Notwithstanding any other provision of this License, you have +permission to link or combine any covered work with a work licensed +under version 3 of the GNU Affero General Public License into a single +combined work, and to convey the resulting work. The terms of this +License will continue to apply to the part which is the covered work, +but the special requirements of the GNU Affero General Public License, +section 13, concerning interaction through a network will apply to the +combination as such. + + 14. Revised Versions of this License. + + The Free Software Foundation may publish revised and/or new versions of +the GNU General Public License from time to time. Such new versions will +be similar in spirit to the present version, but may differ in detail to +address new problems or concerns. + + Each version is given a distinguishing version number. If the +Program specifies that a certain numbered version of the GNU General +Public License "or any later version" applies to it, you have the +option of following the terms and conditions either of that numbered +version or of any later version published by the Free Software +Foundation. If the Program does not specify a version number of the +GNU General Public License, you may choose any version ever published +by the Free Software Foundation. + + If the Program specifies that a proxy can decide which future +versions of the GNU General Public License can be used, that proxy's +public statement of acceptance of a version permanently authorizes you +to choose that version for the Program. + + Later license versions may give you additional or different +permissions. However, no additional obligations are imposed on any +author or copyright holder as a result of your choosing to follow a +later version. + + 15. Disclaimer of Warranty. + + THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY +APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT +HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY +OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, +THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR +PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM +IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF +ALL NECESSARY SERVICING, REPAIR OR CORRECTION. + + 16. Limitation of Liability. + + IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING +WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS +THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY +GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE +USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF +DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD +PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), +EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF +SUCH DAMAGES. + + 17. Interpretation of Sections 15 and 16. + + If the disclaimer of warranty and limitation of liability provided +above cannot be given local legal effect according to their terms, +reviewing courts shall apply local law that most closely approximates +an absolute waiver of all civil liability in connection with the +Program, unless a warranty or assumption of liability accompanies a +copy of the Program in return for a fee. 
+ + END OF TERMS AND CONDITIONS + + How to Apply These Terms to Your New Programs + + If you develop a new program, and you want it to be of the greatest +possible use to the public, the best way to achieve this is to make it +free software which everyone can redistribute and change under these terms. + + To do so, attach the following notices to the program. It is safest +to attach them to the start of each source file to most effectively +state the exclusion of warranty; and each file should have at least +the "copyright" line and a pointer to where the full notice is found. + + + Copyright (C) + + This program is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + This program is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with this program. If not, see . + +Also add information on how to contact you by electronic and paper mail. + + If the program does terminal interaction, make it output a short +notice like this when it starts in an interactive mode: + + Copyright (C) + This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'. + This is free software, and you are welcome to redistribute it + under certain conditions; type `show c' for details. + +The hypothetical commands `show w' and `show c' should show the appropriate +parts of the General Public License. Of course, your program's commands +might be different; for a GUI interface, you would use an "about box". + + You should also get your employer (if you work as a programmer) or school, +if any, to sign a "copyright disclaimer" for the program, if necessary. +For more information on this, and how to apply and follow the GNU GPL, see +. + + The GNU General Public License does not permit incorporating your program +into proprietary programs. If your program is a subroutine library, you +may consider it more useful to permit linking proprietary applications with +the library. If this is what you want to do, use the GNU Lesser General +Public License instead of this License. But first, please read +. diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/README.md b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/README.md new file mode 100644 index 00000000..6202709b --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/README.md @@ -0,0 +1,44 @@ +[![Python Package using Conda](https://github.com/mydroidandi/commbase/actions/workflows/python-package-conda.yml/badge.svg)](https://github.com/mydroidandi/commbase/actions/workflows/python-package-conda.yml) +[![Python Version](https://img.shields.io/badge/Python-3.10%20%7C%203.11%20%7C%203.12-blue)](https://img.shields.io/badge/python-3.10%20%7C%203.11%20%7C%203.12-blue) + +# commbase-genai-llm-openai-gpt-p + +commbase-genai-llm-openai-gpt-p + +A terminal version of ChatGPT designed to facilitate seamless interaction with the powerful Open AI's GPT language model. This app empowers users to engage in natural language conversations with the model, unlocking a wide range of possibilities for creative, informative, and dynamic interactions. 
+ +ChatGPT is a chatbot built using the GPT (Generative Pre-trained Transformer) architecture developed by OpenAI. To use ChatGPT, simply visit the website at . This is the online platform where you can access and interact with the ChatGPT service. + +## Unleash the Power of Creativity with commbase-genai-llm-openai-gpt-p + +commbase-genai-llm-openai-gpt-p harnesses the capabilities of OpenAI's GPT models. Explore the world of storytelling, code exploration, and language manipulation through intuitive interaction with GPT's advanced features. + +**Key strengths:** + +* **Creative text formats:** Generate poems, code, scripts, musical pieces, emails, letters, and more, finding inspiration, exploring different communication styles, or aiding in content creation. +* **Storytelling and world-building:** Craft captivating narratives and build intricate fictional worlds, empowering writers and game designers to bring their imaginations to life. +* **Humor and wit:** Engage in stimulating and enjoyable interactions as GPT incorporates humor and wit into its responses, fostering a sense of connection and amusement. +* **Multilingual capabilities:** Break down language barriers and access information across cultures with GPT's ability to understand and respond in multiple languages. +* **Code generation:** Get a head start on coding projects or explore different programming languages with GPT's basic code generation capabilities. +* **Accessibility and user-friendliness:** Interact with GPT easily through various APIs and interfaces, making the technology readily available to a diverse audience. +* **Constant evolution and improvement:** Stay at the forefront of language processing advancements as OpenAI continuously refines and improves GPT models. + +**With commbase-genai-llm-openai-gpt-p, unlock the potential of GPT and embark on a journey of creative exploration, communication, and discovery.** + +## Examples + +Detailed information about examples can be found in the corresponding [`examples`](./examples) directory. + +## Contributors + +Thanks to the following people who have contributed to this project: + +* [@estebanways](https://github.com/estebanways) + +## Contact + +If you want to contact us you can reach us at . + +## License + +This project uses the following license: [`License`](./COPYING). diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone.jpg b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone.jpg new file mode 100644 index 00000000..7ab73dd5 Binary files /dev/null and b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone.jpg differ diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/config.py b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/config.py new file mode 100644 index 00000000..9a7e1580 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/config.py @@ -0,0 +1,44 @@ +#!/usr/bin/env python +################################################################################ +# commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone # +# # +# A sophisticated AI assistant's Small Language Model (Phi3), enhanced by # +# Retrieval-Augmented Generation (RAG) for improved response accuracy, and # +# supported by a Picone semantic vector database. 
# +# # +# Change History # +# 06/25/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. # +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# config.py +# This files makes a variable available throughout the entire Python +# application, including all modules, submodules, and functions. + +# Requirements +import os + +# The path to the .env configuration file +CONFIG_FILE_DIR = os.environ["COMMBASE_APP_DIR"] +CONFIG_FILE_PATH = os.path.join(CONFIG_FILE_DIR, "config/commbase.conf") diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/create_and_configure_a_new_pinecone_index.py b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/create_and_configure_a_new_pinecone_index.py new file mode 100644 index 00000000..0a7d07f0 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/create_and_configure_a_new_pinecone_index.py @@ -0,0 +1,54 @@ +#!/usr/bin/env python +################################################################################ +# commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone # +# # +# A sophisticated AI assistant's Small Language Model (Phi3), enhanced by # +# Retrieval-Augmented Generation (RAG) for improved response accuracy, and # +# supported by a Picone semantic vector database. # +# # +# Change History # +# 06/25/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. 
# +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# create_and_configure_a_new_Pinecone_index.py +# Creates and configures a new Pinecone index named 'new-pinecone-index' + +# Imports +from pinecone import Pinecone, ServerlessSpec + +# Initialize the Pinecone client with your API key +pc = Pinecone(api_key="") + +# Create Pinecone index +pc.create_index( + name='new-pinecone-index', + dimension=384, # dimensionality of minilm + metric='dotproduct', + spec=ServerlessSpec(cloud='aws', region='us-east-1') +) + +# Connect to index and print the index statistics +index = pc.Index("new-pinecone-index") +print(index.describe_index_stats()) diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/create_and_configure_pinecone_index.py b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/create_and_configure_pinecone_index.py new file mode 100644 index 00000000..73f9df23 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/create_and_configure_pinecone_index.py @@ -0,0 +1,79 @@ +#!/usr/bin/env python +################################################################################ +# commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone # +# # +# A sophisticated AI assistant's Small Language Model (Phi3), enhanced by # +# Retrieval-Augmented Generation (RAG) for improved response accuracy, and # +# supported by a Picone semantic vector database. # +# # +# Change History # +# 06/25/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. 
# +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# 02_create_and_configure_pinecone_index.py +# Creates and configures a new Pinecone index named 'new-pinecone-index' + +# Imports +import functions +import time +from pinecone import Pinecone, ServerlessSpec + +# Call test_embedding_model() +model, xq = functions.test_embedding_model() + +# Print the model details +print("Model: ", model) + +# Print the dimensions of the embedding vector +print("Single query dimensions:", xq.shape) + +# Initialize the Pinecone client with your API key +pc = Pinecone(api_key="") + +# Define the cloud provider and region +cloud = 'aws' +region = 'us-east-1' +spec = ServerlessSpec(cloud=cloud, region=region) + +index_name = 'commbase-log-chats' + +# Check if index already exists (it shouldn't if this is first time) +if index_name not in pc.list_indexes().names(): + # If does not exist, create index + pc.create_index( + index_name, + dimension=model.get_sentence_embedding_dimension(), + metric='cosine', + spec=spec + ) + # Wait for index to be initialized + while not pc.describe_index(index_name).status['ready']: + time.sleep(1) + +# Connect to index and print the index statistics +index = pc.Index(index_name) +print("") +print("Index statistics:") +print(index.describe_index_stats()) diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/create_embeddings_and_upsert_data.py b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/create_embeddings_and_upsert_data.py new file mode 100644 index 00000000..8ed5d0a4 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/create_embeddings_and_upsert_data.py @@ -0,0 +1,158 @@ +#!/usr/bin/env python +################################################################################ +# commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone # +# # +# A sophisticated AI assistant's Small Language Model (Phi3), enhanced by # +# Retrieval-Augmented Generation (RAG) for improved response accuracy, and # +# supported by a Picone semantic vector database. # +# # +# Change History # +# 06/25/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. 
# +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# 03_create_embeddings_and_upsert_data.py +# Loads the chat log JSON dataset, creates sentence embeddings for each +# entry, and upserts the vectors with their metadata into the +# 'commbase-log-chats' Pinecone index. + +# Imports +import functions +import json +import pandas as pd +import sentence_transformers +# import time +from pinecone import Pinecone, ServerlessSpec +from tqdm.auto import tqdm + +# Load dataset + + +# df = pd.read_csv('./build/dataset.json') +df = pd.read_json('./build/dataset.json') + +# Print the top 10 rows (or all rows if there are fewer than 10) +print(df.head(10)) + +# Get the total number of rows +total_rows = len(df) + +# Print the total number of rows +print(f"Total rows in the DataFrame: {total_rows}") + +# See the raw data in the JSON file as it is, without converting it to a DataFrame + + +# Read the JSON file +with open('./build/log_chats.json', 'r') as file: + data = json.load(file) + +# Print the raw JSON data +print(json.dumps(data, indent=4)) + +# Extract and print each text content on a new line similarly to +# 'for record in dataset['object_name']:'. +# for chat in data['log_chats']: +# print(chat['text']) + +# Extract and print each text content on a new line, limit to 5 lines + +for i, chat in enumerate(data['log_chats']): + if i >= 5: + break + print(chat['text']) + +# Building Embeddings and Upsert Format + + +# Call test_embedding_model() +model, xq = functions.test_embedding_model() + +# Print the model details +print("Model: ", model) + +# Print the dimensions of the embedding vector +print("Single query dimensions:", xq.shape) + +# Now we upsert the data in batches (see batch_size below). +# Note: On Google Colab with GPU expected runtime is ~7 minutes. If using CPU this will be significantly longer. + +# Initialize the Pinecone client with your API key +pc = Pinecone(api_key="") + +index_name = 'commbase-log-chats' + +# Connect to index and print the index statistics +index = pc.Index(index_name) + + +batch_size = 10 +# batch_size = 128 +vector_limit = 50 +# vector_limit = 100000 + +# Extract the first 50 entries into a variable similarly to +# 'dataset = dataset[:vector_limit]'. +entries = data['log_chats'][:50] + +# Print the first 50 entries (optional) +for entry in entries: + print(entry) + +# Now you can use `entries` variable in your code +print("Number of entries:", len(entries)) + + +for i in tqdm(range(0, len(entries), batch_size)): + # find end of batch + i_end = min(i+batch_size, len(entries)) + # create IDs batch + ids = [str(x) for x in range(i, i_end)] + # create metadata batch + # v1 - constructing metadatas as a list of dictionaries, each containing a 'text' key whose value is a dictionary ({'text': text}), rather than just a string. + # metadatas = [{'text': text} for text in entries[i:i_end]] + # v2 - this assumes that each chat object within entries has a 'text' key, and you're extracting its value (text['text']) to be used as the metadata.
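+ # Note: the active metadata construction below copies several fields from + # each chat entry (id, timestamp, origin, severity, speaker, text), so those + # keys are assumed to exist in every record of ./build/log_chats.json; each + # record upserted to Pinecone further down is an (id, embedding, metadata) tuple.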
+ # metadatas = [{'text': text['text']} for text in entries[i:i_end]] + # Adding multiple metadata fields + metadatas = [ + { + 'id': text['id'], + 'timestamp': text['timestamp'], + 'origin': text['origin'], + 'severity': text['severity'], + 'speaker': text['speaker'], + 'text': text['text'] + } + for text in entries[i:i_end] + ] + # create embeddings + xc = model.encode(entries[i:i_end]) + # create records list for upsert + records = zip(ids, xc, metadatas) + print(records) + # upsert to Pinecone + index.upsert(vectors=records) + +# Check number of records in the index +print("") +print("Index statistics:") +print(index.describe_index_stats()) diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/examples/README.md b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/examples/README.md new file mode 100644 index 00000000..47bbe373 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/examples/README.md @@ -0,0 +1,12 @@ +## Examples + +### Example 1: commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone implementation + +This example demonstrates the implementation of our software. You can find the code in the [Commbase](https://github.com/mydroidandi/commbase) repository. + +To run the example, follow these steps: +1. Clone the repository. +2. Install the necessary dependencies. +3. Execute the program. + +Make sure to check out the repository for more details and instructions. diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/file_paths.py b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/file_paths.py new file mode 100644 index 00000000..1b7440e5 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/file_paths.py @@ -0,0 +1,190 @@ +#!/usr/bin/env python +################################################################################ +# commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone # +# # +# A sophisticated AI assistant's Small Language Model (Phi3), enhanced by # +# Retrieval-Augmented Generation (RAG) for improved response accuracy, and # +# supported by a Picone semantic vector database. # +# # +# Change History # +# 06/25/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. 
# +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# file_paths.py +# This file stores functions related to loading and managing file paths +# Requires os.path already imported + +# Requirements +from config import CONFIG_FILE_DIR, CONFIG_FILE_PATH + + +def get_assistant_discourse_from_language_model_file(): + """ + Retrieves the value of the ASSISTANT_DISCOURSE_FROM_LANGUAGE_MODEL_FILE + variable from the configuration file. + + Returns: + str or None: The value of the variable if found, or None if not found. + """ + # Initialize variable + model_discourse_file = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "ASSISTANT_DISCOURSE_FROM_LANGUAGE_MODEL_FILE": + # Remove the quotes from the value of the variable + model_discourse_file = CONFIG_FILE_DIR + "/" + value.strip()[1:-1] + + # Check if the variable was found + if model_discourse_file is not None: + return model_discourse_file + + # If the variable was not found, return None + return None + + +def get_chat_log_file(): + """ + Retrieves the value of the CHAT_LOG_FILE variable from the configuration + file. + + Returns: + str or None: The value of the variable if found, or None if not found. + """ + # Initialize variable + chat_log_file = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "CHAT_LOG_FILE": + # Remove the quotes from the value of the variable + chat_log_file = CONFIG_FILE_DIR + "/" + value.strip()[1:-1] + + # Check if the variable was found + if chat_log_file is not None: + return chat_log_file + + # If the variable was not found, return None + return None + + +def get_chat_memory_file(): + """ + Retrieves the value of the CHAT_MEMORY_FILE variable from the configuration + file. + + Returns: + str or None: The value of the variable if found, or None if not found. + """ + # Initialize variable + chat_memory_file = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "CHAT_MEMORY_FILE": + # Remove the quotes from the value of the variable + chat_memory_file = CONFIG_FILE_DIR + "/" + value.strip()[1:-1] + + # Check if the variable was found + if chat_memory_file is not None: + return chat_memory_file + + # If the variable was not found, return None + return None + + +def get_run_voice_recorder_in_pane_path(): + """ + Retrieves the value of the RUN_VOICE_RECORDER_IN_PANE_PATH variable from + the configuration file. + + Returns: + str or None: The value of the variable if found, or None if not found. 
+ """ + # Initialize variable + run_recorder_path = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "RUN_VOICE_RECORDER_IN_PANE_PATH": + # Remove the quotes from the value of the variable + run_recorder_path = CONFIG_FILE_DIR + "/" + value.strip()[1:-1] + + # Check if the variable was found + if run_recorder_path is not None: + return run_recorder_path + + # If the variable was not found, return None + return None + + +def get_stt_engine_path(): + """ + Retrieves the value of the STT_ENGINE_PATH variable from the configuration + file. + + Returns: + str or None: The value of the variable if found, or None if not found. + """ + # Initialize variable + stt_engine_path = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "STT_ENGINE_PATH": + # Remove the quotes from the value of the variable + stt_engine_path = value.strip()[1:-1] + + # Check if the variable was found + if stt_engine_path is not None: + return stt_engine_path + + # If the variable was not found, return None + return None diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/functions.py b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/functions.py new file mode 100644 index 00000000..46460391 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/functions.py @@ -0,0 +1,262 @@ +#!/usr/bin/env python +################################################################################ +# commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone # +# # +# A sophisticated AI assistant's Small Language Model (Phi3), enhanced by # +# Retrieval-Augmented Generation (RAG) for improved response accuracy, and # +# supported by a Picone semantic vector database. # +# # +# Change History # +# 06/25/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. # +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# functions.py +# This file contains a collection of utility functions for various tasks. 
It +# provides a set of commonly used functions related to data manipulation, file +# handling, and mathematical operations. + +# Imports +import torch # pip install torch +from config import CONFIG_FILE_PATH +from sentence_transformers import SentenceTransformer + + +def get_audible_assistant_logging_on(): + """ + Reads the 'AUDIBLE_ASSISTANT_LOGGING_ON' variable from the environment + configuration file. Returns the string value of the variables if found, or + None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. + """ + # Initialize variable + audible_assistant_logging_on = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "AUDIBLE_ASSISTANT_LOGGING_ON": + # Remove the quotes from the value of the variable + audible_assistant_logging_on = value.strip()[1:-1] + + # Check if the variable was found + if audible_assistant_logging_on is not None: + return audible_assistant_logging_on + + # If the variable was not found, return None + return None + + +def get_chat_participant_names(): + """ + Gets the chat participant names from the config file. + + Reads the 'ASSISTANT_NAME_IN_CHAT_PANE' and 'END_USER_NAME_IN_CHAT_PANE' + variables from the environment configuration file. Returns a tuple + containing the string values of the variables if found, or None if any of + the variables are not present. + + Returns: + tuple or None: A tuple containing the assistant, system, and end user + names in the chat pane, or None, if any of the variables are not found. + """ + # Initialize variables for the chat names + assistant_name = None + end_user_name = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "END_USER_NAME_IN_CHAT_PANE": + # Remove the quotes from the value of the variable + end_user_name = value.strip()[1:-1] + + elif variable_name == "ASSISTANT_NAME_IN_CHAT_PANE": + # Remove the quotes from the value of the variable + assistant_name = value.strip()[1:-1] + + # Check if all two variables were found + if ( + assistant_name is not None + and end_user_name is not None + ): + return end_user_name, assistant_name + + # If any of the variables are not found, return None + return None + + +def get_commbase_recorder_transmitter_quit_char(): + """ + Reads the 'COMMBASE_RECORDER_TRANSMITTER_QUIT_CHAR' variable from the + environment configuration file. Returns the string value of the variables + if found, or None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. 
+ """ + # Initialize variable + quit_char = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "COMMBASE_RECORDER_TRANSMITTER_QUIT_CHAR": + # Remove the quotes from the value of the variable + quit_char = value.strip()[1:-1] + + # Check if the variable was found + if quit_char is not None: + return quit_char + + # If the variable was not found, return None + return None + + +def get_commbase_stt_whisper_reactive_p_auto_open_recorder_after_command(): + """ + Reads the 'COMMBASE_STT_WHISPER_REACTIVE_P_AUTO_OPEN_RECORDER_AFTER_COMMAND' + variable from the environment configuration file. Returns the string value + of the variables if found, or None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. + """ + # Initialize variable + auto_open_recoder = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "COMMBASE_STT_WHISPER_REACTIVE_P_AUTO_OPEN_RECORDER_AFTER_COMMAND": + # Remove the quotes from the value of the variable + auto_open_recoder = value.strip()[1:-1] + + # Check if the variable was found + if auto_open_recoder is not None: + return auto_open_recoder + + # If the variable was not found, return None + return None + + +def get_log_severity_level_1(): + """ + Reads the 'LOG_SEVERITY_LEVEL_1' variable from the environment + configuration file. Returns the string value of the variables if found, or + None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. + """ + # Initialize variable + log_severity_level_1 = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "LOG_SEVERITY_LEVEL_1": + # Remove the quotes from the value of the variable + log_severity_level_1 = value.strip()[1:-1] + + # Check if the variable was found + if log_severity_level_1 is not None: + return log_severity_level_1 + + # If the variable was not found, return None + return None + + +def test_embedding_model(): + """ + Initialize the model. + max_seq_length is 256. That means that the maximum number of tokens (like + words) that can be encoded into a single vector embedding is 256. Anything + beyond this must be truncated. + word_embedding_dimension is 384. This number is the dimensionality of + vectors output by this model. It is important that we know this number + later when initializing our Pinecone vector index. + Normalize(). This final normalization step indicates that all vectors + produced by the model are normalized. That means that models that we would + typical measure similarity for using cosine similarity can also make use of + the dotproduct similarity metric. In fact, with normalized vectors cosine + and dotproduct are equivalent. + """ + device = 'cuda' if torch.cuda.is_available() else 'cpu' + if device != 'cuda': + print( + f"You are using {device}. 
This is much slower than using " + "a CUDA-enabled GPU. If on Colab you can change this by " + "clicking Runtime > Change runtime type > GPU." + ) + + # To create our embeddings we will us the MiniLM-L6 sentence transformer + # model. This is a very efficient semantic similarity embedding model from + # the sentence-transformers library. + model = SentenceTransformer('all-MiniLM-L6-v2', device=device) + + # Create a sentence embedding using this model. + # Encoding this single sentence leaves us with a 384 dimensional sentence + # embedding (aligned to the word_embedding_dimension above). + # To prepare this for upsert to Pinecone, all we do is this: + # + # _id = '0' + # metadata = {'text': query} + # + # vectors = [(_id, xq, metadata)] + # + # Later when we do upsert our data to Pinecone, we will be doing so in + # batches. Meaning vectors will be a list of (id, embedding, metadata) + # tuples. + query = 'which city is the most populated in the world?' + + xq = model.encode(query) + + return model, xq diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/load_a_dataset_with_pandas.py b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/load_a_dataset_with_pandas.py new file mode 100644 index 00000000..4df99ee4 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/load_a_dataset_with_pandas.py @@ -0,0 +1,50 @@ +#!/usr/bin/env python +################################################################################ +# commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone # +# # +# A sophisticated AI assistant's Small Language Model (Phi3), enhanced by # +# Retrieval-Augmented Generation (RAG) for improved response accuracy, and # +# supported by a Picone semantic vector database. # +# # +# Change History # +# 06/25/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. 
# +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# load_a_dataset_with_pandas.py +# Loads a chat log dataset (chat_log_dataset.csv) into a pandas DataFrame +# and prints basic statistics about it. + +# Imports +import pandas as pd + +df = pd.read_csv('chat_log_dataset.csv') + +# Print the top 10 rows (or all rows if there are fewer than 10) +print(df.head(10)) + +# Get the total number of rows +total_rows = len(df) + +# Print the total number of rows +print(f"Total rows in the DataFrame: {total_rows}") diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/making_queries.py b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/making_queries.py new file mode 100644 index 00000000..904803d5 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/making_queries.py @@ -0,0 +1,89 @@ +#!/usr/bin/env python +################################################################################ +# commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone # +# # +# A sophisticated AI assistant's Small Language Model (Phi3), enhanced by # +# Retrieval-Augmented Generation (RAG) for improved response accuracy, and # +# supported by a Picone semantic vector database. # +# # +# Change History # +# 06/25/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. # +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# 04_making_queries.py +# Queries the 'commbase-log-chats' Pinecone index with a natural-language +# question and prints the most relevant matches with their metadata. + +# Imports +import functions +import json +import sentence_transformers +# import time +from pinecone import Pinecone, ServerlessSpec + + +# Call test_embedding_model() +model, xq = functions.test_embedding_model() + + +# Initialize the Pinecone client with your API key +pc = Pinecone(api_key="") + +index_name = 'commbase-log-chats' + +# Connect to index and print the index statistics +index = pc.Index(index_name) + + +# query = "Who is Eva?" +query = "What happened at 20:35:08?" +# query = "What is the meaning of zero day?" + + +# ---- +# create the query vector +xq = model.encode(query).tolist() + +# now query +xc = index.query(vector=xq, top_k=5, include_metadata=True) +print(xc) + +print("") +print(query) + +# In the returned response xc we can see the most relevant entries for our particular query.
We can reformat this response to be a little easier to read +# for result in xc['matches']: +# print(f"{round(result['score'], 2)}: {result['metadata']['text']}") + +# Print the 'speaker' and 'text' along with the score +for result in xc['matches']: + score = round(result['score'], 2) + timestamp = result['metadata']['timestamp'] + speaker = result['metadata']['speaker'] + text = result['metadata']['text'] + print(f"{score}: {timestamp} {speaker}: {text}") + + +# ## Add this to the prompt: +# ## Please keep your responses to a maximum of three to four sentences. diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/resources/src/chat_log_20240609203647_4314.txt b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/resources/src/chat_log_20240609203647_4314.txt new file mode 100644 index 00000000..462adfe1 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/resources/src/chat_log_20240609203647_4314.txt @@ -0,0 +1,51 @@ +[2024-06-09 19:42:10] app: DEBUG: โœจ ASSISTANT: Mute the microphone to pause recording. +[2024-06-09 19:42:39] app: DEBUG: โœจ ASSISTANT: ๐ŸŒ™โœจ Under the moon's silver glow, fairy dust whispers secrets to those who dare to listen. ๐Ÿงšโ€โ™€. A sprinkle of pixie magic can make your dreams dance like fireflies on a midsummer night! ๐ŸŒนโœจ But remember, every spell needs a pinch of responsibility. So use this mystical knowledge wisely and keep the balance in our enchanting world. ๐ŸŒ๐Ÿ”ฎ Now go forth with wonder in your heart and let magic guide you gently through life's twinkling journey! โœจ๐Ÿš€๐Ÿ’ซ +[2024-06-09 19:42:41] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: +[2024-06-09 19:42:49] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Hey how you doing today? +[2024-06-09 19:43:07] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Tell me about yourself. +[2024-06-09 19:43:27] app: DEBUG: โœจ ASSISTANT: I am eva ai. +[2024-06-09 19:43:51] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Nice to meet you, Eva. What can you do? +[2024-06-09 19:44:15] app: DEBUG: โœจ ASSISTANT: I can help with various tasks such as answering questions, managing your schedule, setting reminders, and even controlling smart home devices. How can I assist you today? +[2024-06-09 19:44:20] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: +[2024-06-09 19:44:35] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: This is the Zero Day. +[2024-06-09 19:44:51] app: DEBUG: โœจ ASSISTANT: Could you please provide more details about what you mean by "Zero Day"? Are you referring to a specific event or topic? +[2024-06-09 19:45:12] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: It's the day we launch our new project. We've been preparing for this for months. +[2024-06-09 19:45:35] app: DEBUG: โœจ ASSISTANT: That's exciting! Do you need any assistance with the launch? Perhaps help with scheduling, reminders, or any last-minute checks? +[2024-06-09 19:45:52] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Yes, can you set a reminder for our team meeting at 10 AM? +[2024-06-09 19:46:09] app: DEBUG: โœจ ASSISTANT: Sure, I've set a reminder for your team meeting at 10 AM. Anything else I can help with for the launch day? +[2024-06-09 19:46:27] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Not right now, but I'll let you know if something comes up. Thanks, Eva. +[2024-06-09 19:46:41] app: DEBUG: โœจ ASSISTANT: You're welcome! I'm here if you need anything. Good luck with the launch! 
+ +[2024-06-09 20:02:30] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Hey Eva, can you check the weather for today? +[2024-06-09 20:02:47] app: DEBUG: โœจ ASSISTANT: The weather today is sunny with a high of 75ยฐF and a low of 55ยฐF. Perfect day for a launch! +[2024-06-09 20:03:11] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Great! Can you also remind me to call our PR team at 3 PM? +[2024-06-09 20:03:29] app: DEBUG: โœจ ASSISTANT: Reminder set to call the PR team at 3 PM. Anything else on your mind? +[2024-06-09 20:03:50] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Can you send an email to the team confirming the meeting at 10 AM? +[2024-06-09 20:04:07] app: DEBUG: โœจ ASSISTANT: Sure, I'll draft the email for you. Here it is: "Hello Team, This is a reminder that we have a meeting at 10 AM to discuss the project launch. Please be on time. Best, [Your Name]." Would you like me to send it now? +[2024-06-09 20:04:27] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Yes, please send it. +[2024-06-09 20:04:41] app: DEBUG: โœจ ASSISTANT: Email sent. Anything else I can do for you? +[2024-06-09 20:05:01] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Can you play some relaxing music? +[2024-06-09 20:05:18] app: DEBUG: โœจ ASSISTANT: Sure, playing a relaxing music playlist now. +[2024-06-09 20:06:30] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Thanks, Eva. That helps a lot. +[2024-06-09 20:06:45] app: DEBUG: โœจ ASSISTANT: You're welcome! I'm glad to help. If you need anything else, just let me know. + +[2024-06-09 20:30:50] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Eva, can you update me on my calendar for the rest of the week? +[2024-06-09 20:31:05] app: DEBUG: โœจ ASSISTANT: Here's your calendar for the rest of the week: Tomorrow you have a team debrief at 9 AM, a client call at 1 PM, and a project wrap-up meeting at 4 PM. On Thursday, you have a strategy meeting at 10 AM and a marketing review at 3 PM. Friday is mostly open except for a lunch meeting at 12 PM. Would you like to make any changes? +[2024-06-09 20:31:30] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: No changes for now. Can you set a reminder for the client call tomorrow at 12:45 PM? +[2024-06-09 20:31:48] app: DEBUG: โœจ ASSISTANT: Reminder set for the client call at 12:45 PM tomorrow. Anything else? +[2024-06-09 20:32:05] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Can you check if there are any new emails from the client? +[2024-06-09 20:32:22] app: DEBUG: โœจ ASSISTANT: Checking your emails now... You have one new email from the client confirming the details for tomorrow's call. Would you like me to read it out or do anything else with it? +[2024-06-09 20:32:42] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Just mark it as read. I'll go through it later. +[2024-06-09 20:32:57] app: DEBUG: โœจ ASSISTANT: Email marked as read. Anything else I can assist you with? +[2024-06-09 20:33:15] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Can you pull up the latest sales report? +[2024-06-09 20:33:32] app: DEBUG: โœจ ASSISTANT: Here is the latest sales report: Total sales for the month are $50,000, a 10% increase from last month. Would you like a more detailed breakdown? +[2024-06-09 20:33:50] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Yes, can you provide a breakdown by region? +[2024-06-09 20:34:05] app: DEBUG: โœจ ASSISTANT: Certainly. Sales by region are as follows: North America: $20,000, Europe: $15,000, Asia: $10,000, Rest of the World: $5,000. Anything specific you'd like to analyze? 
+[2024-06-09 20:34:30] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Can you compare these figures to the previous month? +[2024-06-09 20:34:45] app: DEBUG: โœจ ASSISTANT: Comparing to the previous month: North America had $18,000, Europe $14,000, Asia $9,000, and Rest of the World $4,000. Overall, each region saw an increase in sales. Would you like to see this in a chart? +[2024-06-09 20:35:08] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Yes, a chart would be helpful. +[2024-06-09 20:35:25] app: DEBUG: โœจ ASSISTANT: Here is the chart comparing sales by region for the past two months. [Chart displayed] Would you like to save or share this chart? +[2024-06-09 20:35:50] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Save it to the project folder and email it to the team. +[2024-06-09 20:36:07] app: DEBUG: โœจ ASSISTANT: The chart has been saved to the project folder and emailed to the team. Anything else I can do for you? +[2024-06-09 20:36:27] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: That's all for now. Thanks, Eva. +[2024-06-09 20:36:42] app: DEBUG: โœจ ASSISTANT: You're welcome! Have a great day, and good luck with the launch! diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/resources/src/chat_log_20240609220435_18072.txt b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/resources/src/chat_log_20240609220435_18072.txt new file mode 100644 index 00000000..22ca6a52 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/resources/src/chat_log_20240609220435_18072.txt @@ -0,0 +1,52 @@ +[2024-06-09 21:15:39] app: DEBUG: โœจ ASSISTANT: Mute the microphone to pause recording. +[2024-06-09 21:15:49] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Hey there, Eva. +[2024-06-09 21:16:07] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Tell me about yourself. +[2024-06-09 21:16:27] app: DEBUG: โœจ ASSISTANT: I am eva ai. +[2024-06-09 21:16:51] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Nice to meet you, Eva. What are some of your capabilities? +[2024-06-09 21:17:15] app: DEBUG: โœจ ASSISTANT: I can assist with a variety of tasks including answering questions, managing your schedule, setting reminders, sending emails, and controlling smart home devices. How can I assist you today? +[2024-06-09 21:17:35] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Today is a big day. We're launching our new app. +[2024-06-09 21:17:51] app: DEBUG: โœจ ASSISTANT: That's wonderful! Congratulations! How can I support you with the launch? +[2024-06-09 21:18:12] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Can you set a reminder for our team to check the servers at 9 AM? +[2024-06-09 21:18:29] app: DEBUG: โœจ ASSISTANT: I've set a reminder for the team to check the servers at 9 AM. Anything else I can do? +[2024-06-09 21:18:47] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Yes, can you also send a message to the marketing team to confirm the social media posts are scheduled? +[2024-06-09 21:19:05] app: DEBUG: โœจ ASSISTANT: I've drafted the message for the marketing team: "Hi Team, Please confirm that the social media posts for the app launch are scheduled. Thanks!" Would you like me to send it now? +[2024-06-09 21:19:25] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Yes, please send it. +[2024-06-09 21:19:41] app: DEBUG: โœจ ASSISTANT: The message has been sent. Anything else? +[2024-06-09 21:20:00] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Can you check my calendar and let me know if I have any meetings today? 
+[2024-06-09 21:20:15] app: DEBUG: โœจ ASSISTANT: You have two meetings scheduled today: A project kickoff meeting at 11 AM and a client feedback session at 2 PM. Would you like to make any changes or set reminders? +[2024-06-09 21:20:35] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Set a reminder for the client feedback session 15 minutes before it starts. +[2024-06-09 21:20:50] app: DEBUG: โœจ ASSISTANT: Reminder set for the client feedback session at 1:45 PM. Anything else I can do for you? +[2024-06-09 21:21:07] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Can you play some upbeat music to keep the energy high? +[2024-06-09 21:21:22] app: DEBUG: โœจ ASSISTANT: Sure, playing an upbeat music playlist now. +[2024-06-09 21:21:50] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Thanks, Eva. This is perfect. +[2024-06-09 21:22:05] app: DEBUG: โœจ ASSISTANT: You're welcome! Glad I could help. Let me know if there's anything else you need. + +[2024-06-09 21:35:30] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Hey Eva, can you provide an update on our app downloads so far? +[2024-06-09 21:35:47] app: DEBUG: โœจ ASSISTANT: Checking the latest data... As of now, there have been 2,500 downloads of your app. Would you like a detailed breakdown by region or platform? +[2024-06-09 21:36:11] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Yes, a breakdown by platform would be great. +[2024-06-09 21:36:28] app: DEBUG: โœจ ASSISTANT: Sure, here it is: iOS: 1,500 downloads, Android: 1,000 downloads. Anything else you need? +[2024-06-09 21:36:45] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Can you prepare a report summarizing this data for the team? +[2024-06-09 21:37:01] app: DEBUG: โœจ ASSISTANT: I've prepared the report summarizing the download data. Would you like me to email it to the team? +[2024-06-09 21:37:18] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Yes, email it to the team and also save a copy in the project folder. +[2024-06-09 21:37:35] app: DEBUG: โœจ ASSISTANT: The report has been emailed to the team and saved in the project folder. Anything else? +[2024-06-09 21:37:55] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Can you check if there are any new emails from our investors? +[2024-06-09 21:38:10] app: DEBUG: โœจ ASSISTANT: Checking your emails now... You have two new emails from investors. One is congratulating the team on the launch, and the other is requesting a follow-up meeting next week. Would you like to respond to them? +[2024-06-09 21:38:35] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Yes, reply to the congratulatory email with a thank you note and schedule the follow-up meeting for next Wednesday at 3 PM. +[2024-06-09 21:38:55] app: DEBUG: โœจ ASSISTANT: I've sent a thank you note and scheduled the follow-up meeting for next Wednesday at 3 PM. Anything else I can do for you? +[2024-06-09 21:39:15] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: That's all for now. Thanks, Eva. +[2024-06-09 21:39:30] app: DEBUG: โœจ ASSISTANT: You're welcome! I'm here if you need anything else. Best of luck with the rest of the launch day! + +[2024-06-09 22:00:45] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Eva, can you update me on the social media engagement for our launch posts? +[2024-06-09 22:01:00] app: DEBUG: โœจ ASSISTANT: Sure, checking the latest social media data... Your launch posts have received 1,000 likes, 200 shares, and 150 comments across all platforms. Would you like a detailed analysis? 
+[2024-06-09 22:01:27] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Yes, please provide a detailed analysis including the most engaged platform. +[2024-06-09 22:01:45] app: DEBUG: โœจ ASSISTANT: Here is the detailed analysis: Most engaged platform: Instagram with 600 likes, 120 shares, and 100 comments. Facebook follows with 300 likes, 50 shares, and 30 comments. Twitter has 100 likes, 30 shares, and 20 comments. Would you like this data in a report? +[2024-06-09 22:02:10] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Yes, create a report and email it to the marketing team. +[2024-06-09 22:02:28] app: DEBUG: โœจ ASSISTANT: The report has been created and emailed to the marketing team. Is there anything else I can do for you? +[2024-06-09 22:02:50] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Can you summarize the comments for me? +[2024-06-09 22:03:10] app: DEBUG: โœจ ASSISTANT: The comments are mostly positive, praising the app's features and user interface. Some users have requested additional features, and a few have reported minor bugs. Would you like to address these issues now? +[2024-06-09 22: + +03:35] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: Forward the bug reports to the development team and schedule a meeting to discuss new feature requests. +[2024-06-09 22:03:55] app: DEBUG: โœจ ASSISTANT: Bug reports have been forwarded to the development team, and a meeting to discuss new feature requests is scheduled for tomorrow at 10 AM. Anything else? +[2024-06-09 22:04:15] stt-whisper-proactive: DEBUG: ๐Ÿง‘ END USER: That's all for now. Thanks, Eva. +[2024-06-09 22:04:30] app: DEBUG: โœจ ASSISTANT: You're welcome! Have a great day and congratulations again on the launch! diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/save_chat_files_as_json_dataset.py b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/save_chat_files_as_json_dataset.py new file mode 100644 index 00000000..849b5692 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/save_chat_files_as_json_dataset.py @@ -0,0 +1,647 @@ +#!/usr/bin/env python +################################################################################ +# commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone # +# # +# A sophisticated AI assistant's Small Language Model (Phi3), enhanced by # +# Retrieval-Augmented Generation (RAG) for improved response accuracy, and # +# supported by a Picone semantic vector database. # +# # +# Change History # +# 06/25/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. 
# +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# 01_save_chat_files_as_json_dataset.py +# Automates the processing of raw chat log files into a structured JSON +# dataset, performing text cleaning, content chunking (optional), and dataset +# augmentation with additional fields. It's designed to handle multiple chat +# files, ensuring data integrity and format consistency throughout the +# transformation process. +# It creates the following JSON Structure: +# Root Object: +# log_chats: This is an array that contains multiple objects, each representing +# a single chat log entry. +# Chat Log Entry Objects: +# Each object within the log_chats array represents one entry in the chat log +# that contains properties. + +# Imports +import os +import json +import re # Regular expressions +from collections import OrderedDict + +# Global ID counter +id_counter = 1 + +# Set the chunking strategy on/off +chunk_by_period_on = "False" # User "True" or "False" + + +def load_chat_files(directory): + """ + Load all chat files from the specified directory. + + Args: + - directory (str): Directory path containing the chat files. + + Returns: + - list: A list of tuples, where each tuple contains the filename and the + file content. + Each tuple represents one chat file found in the directory. + + Example: + If 'directory' contains two text files, 'file1.txt' and 'file2.txt', the + function will return [('file1.txt', 'file1 content'), ('file2.txt', + 'file2 content')]. + """ + chat_files = [] + for filename in os.listdir(directory): + if filename.endswith(".txt"): + filepath = os.path.join(directory, filename) + with open(filepath, 'r', encoding='utf-8') as file: + # Return a list of tuples, each containing the filename and the file content. + chat_files.append((filename, file.read())) + # Return the file content + # chat_files.append(file.read()) + + return chat_files + + +def process_chat_content(content, id_counter, source): + """ + Process the chat content and create a structured dataset. + + Args: + - content (str): Raw content of the chat log. + - id_counter (int): Current ID counter for assigning unique IDs to entries. + - source (str): Source identifier of the chat log. + + Returns: + - tuple: A tuple containing two elements: + - list: A list of dictionaries, each representing a structured chat log + entry. + - int: Updated ID counter after processing. + + Notes: + - Uses a regular expression pattern to parse each line of the content. + - Each matched line is transformed into a dictionary entry with keys + 'timestamp', 'origin', 'severity', 'speaker', 'text', and 'source'. + - Increments the id_counter for each entry added to the dataset. + + Example: + If 'content' represents a chat log with several lines formatted as + '[timestamp] origin: severity: speaker: text', the function returns a list + of dictionaries representing each line's structured data along with the + updated id_counter. 
+ """ + dataset = [] + # Improved regex pattern to capture all components including the text + pattern = re.compile(r'\[(.*?)\] (.*?): (.*?): (.*?): (.*)') + + for line in content.split('\n'): + match = pattern.match(line) + if match: + timestamp, origin, severity, speaker, text = match.groups() + # timestamp, origin, _, speaker, text = match.groups() # Exclude severity here + entry = { + "timestamp": timestamp, + "origin": origin, + "severity": severity, + "speaker": speaker, + "text": text, + "source": source + } + dataset.append(entry) + id_counter += 1 + + return dataset, id_counter + + +def save_dataset(dataset, output_file): + """ + Save a dataset to a JSON file. + + Args: + - dataset (list): List of dictionaries representing the dataset to be + saved. + - output_file (str): File path where the JSON dataset will be saved. + + Returns: + - None + + Notes: + - Encodes the dataset to JSON format with UTF-8 encoding. + - Ensures non-ASCII characters are properly handled by setting + `ensure_ascii=False`. + - Formats the JSON output with an indent of 4 spaces for readability. + + Example: + Given 'dataset' as a list of dictionaries representing structured data + entries, the function saves this dataset to the specified 'output_file' + path in JSON format. + """ + with open(output_file, 'w', encoding='utf-8') as file: + json.dump(dataset, file, ensure_ascii=False, indent=4) # Ensure proper encoding + + +def remove_emojis(text): + """ + Remove emojis from the given text. + + Args: + - text (str): Input text containing emojis. + + Returns: + - str: Text with emojis removed. + + Notes: + - Uses a regular expression pattern to match and remove various Unicode + ranges. + representing emojis and other symbols. + - Handles Unicode characters to ensure comprehensive emoji removal. + - Returns the modified text without emojis. + + Example: + If 'text' contains emojis such as ๐Ÿ˜Š, ๐Ÿš—, or ๐Ÿ‡บ๐Ÿ‡ธ, the function removes + these emojis. + and returns the cleaned text. + """ + emoji_pattern = re.compile( + "[" + u"\U0001F600-\U0001F64F" # emoticons + u"\U0001F300-\U0001F5FF" # symbols & pictographs + u"\U0001F680-\U0001F6FF" # transport & map symbols + u"\U0001F1E0-\U0001F1FF" # flags (iOS) + u"\U00002500-\U00002BEF" # chinese char + u"\U00002702-\U000027B0" + u"\U00002702-\U000027B0" + u"\U000024C2-\U0001F251" + u"\U0001f926-\U0001f937" + u"\U00010000-\U0010ffff" + u"\u2640-\u2642" + u"\u2600-\u2B55" + u"\u200d" + u"\u23cf" + u"\u23e9" + u"\u231a" + u"\ufe0f" # dingbats + u"\u3030" + "]+", + flags=re.UNICODE + ) + return emoji_pattern.sub(r'', text) + + +def trim_speaker_whitespace(text): + """ + Trim leading and trailing whitespace from the speaker field. + + Args: + - text (str): Input text representing the speaker's name. + + Returns: + - str: Text with leading and trailing whitespace removed. + + Notes: + - Uses the `strip()` method to remove any leading and trailing whitespace + characters + from the input text. + - Ensures the speaker's name is clean and formatted without unnecessary + spaces. + + Example: + If 'text' is " John Doe ", the function trims it to "John Doe". + """ + return text.strip() + + +def remove_empty_entries(dataset): + """ + Remove entries from the dataset where the 'text' field after the speaker is + empty. + + Args: + - dataset (list): List of dictionaries representing the dataset. + + Returns: + - list: Filtered list of dictionaries with entries where 'text' is not + empty after stripping. 
+ + Notes: + - Filters out entries in the dataset where the 'text' field, after + stripping leading and trailing whitespace, evaluates to False (empty + string). + - Ensures only meaningful entries with non-empty text are retained in the + cleaned dataset. + + Example: + If 'dataset' contains multiple dictionaries representing chat log entries, + the function removes entries where the 'text' field is empty after + whitespace stripping and returns the cleaned dataset. + """ + cleaned_dataset = [entry for entry in dataset if entry['text'].strip()] + return cleaned_dataset + + +def generate_chat_dataset(): + """ + Generate a structured dataset from chat log files in a specified directory. + + This function reads chat log files from a directory, processes their + content, cleans and formats the data, removes entries with empty text + fields, and saves the resulting dataset to a JSON file. + + Args: + - None (Uses global variables and constants defined within the function). + + Returns: + - None + + Notes: + - Uses global variables: `id_counter` to assign unique IDs, + `source_directory` for input files, and `output_file` for saving the + processed dataset. + - Calls `load_chat_files()` to retrieve chat log contents. + - Calls `process_chat_content()` to parse and structure each chat log + entry. + - Calls `trim_speaker_whitespace()` and `remove_emojis()` to clean speaker + names and text content. + - Handles encoding issues by converting text to UTF-8 where necessary. + - Removes entries from the dataset where the 'text' field is empty after + trimming. + - Saves the cleaned dataset to a JSON file and prints statistics about the + dataset creation. + + Example: + When executed, the function processes chat log files from './src' + directory, cleans and formats the data, removes empty entries, and saves + the structured dataset to './build/dataset.json'. It then prints statistics + including the number of dictionaries created and the file path where the + dataset is saved. + """ + global id_counter + + source_directory = './src' + output_file = './build/dataset.json' + + chat_files = load_chat_files(source_directory) + all_data = [] + + for source, content in chat_files: + processed_data, id_counter = process_chat_content(content, id_counter, source) + # Remove emojis and special characters from processed data + for entry in processed_data: + entry['speaker'] = trim_speaker_whitespace(remove_emojis(entry['speaker'])) # Remove emojis and trim whitespace from speaker + entry['text'] = remove_emojis(entry['text']) + entry['text'] = entry['text'].encode('utf-8', 'ignore').decode('utf-8') # Handle encoding issues + # entry['content'] = remove_emojis(entry.get('content', '')) + # entry['content'] = entry['content'].encode('utf-8', 'ignore').decode('utf-8') # Handle encoding issues + all_data.extend(processed_data) + + # Remove entries with empty text + all_data = remove_empty_entries(all_data) + + save_dataset(all_data, output_file) + print("Statistics:") + print("Dataset created from chat files.") + print(f"Dataset saved to {output_file}") + print(f"Number of dictionaries created: {id_counter}") + print(f"Number of dictionaries created: {len(all_data)}") + + +def add_content(): + """ + Add a 'content' field to each entry in the dataset with concatenated values + of other fields. 
+ + This function reads a dataset from './build/dataset.json', processes each + entry, concatenates selected fields ('timestamp', 'origin', 'severity', + 'speaker', 'text') into a new 'content' field, creates a new dataset with + these additions, and saves it to './build/dataset_with_content.json'. + + Args: + - None + + Returns: + - None + + Notes: + - Loads the dataset from './build/dataset.json' using JSON format. + - Constructs the 'content' field for each entry by concatenating specific + fields. + - Uses OrderedDict to ensure 'content' is placed at the beginning of each + dictionary. + - Saves the modified dataset to './build/dataset_with_content.json' in JSON + format with indentation for readability. + - Prints messages indicating the addition of the 'content' field, the file + path where the dataset is saved, and the number of dictionaries created. + + Example: + When executed, the function processes the dataset from + './build/dataset.json', adds a 'content' field to each entry based on + specified fields, saves the updated dataset to + './build/dataset_with_content.json', and prints messages confirming the + operation and providing statistics about the dataset. + """ + input_file = './build/dataset.json' + output_file = './build/dataset_with_content.json' + + with open(input_file, 'r', encoding='utf-8') as file: + data = json.load(file) + + data_with_content = [] + for item in data: + content_text = ( + f"timestamp: {item['timestamp']} " + f"origin: {item['origin']} " + f"severity: {item['severity']} " + f"speaker: {item['speaker']}: {item['text']}" + ) + new_item = OrderedDict([("content", content_text), *item.items()]) + data_with_content.append(new_item) + + with open(output_file, 'w', encoding='utf-8') as output_file: + json.dump(data_with_content, output_file, indent=4) + + print("Content item added to the dataset.") + print(f"Dataset saved to {output_file}") + print(f"Number of dictionaries created: {len(data_with_content)}") + + +def add_id(): + """ + Add an 'id' field to each entry in the dataset. + + This function reads a dataset from either + './build/dataset_with_chunked_content.json' or + './build/dataset_with_content.json', depending on the value of the global + variable `chunk_by_period_on`. It iterates through each entry in the + dataset, assigns a unique ID starting from 1, adds the 'id' field to each + dictionary, creates a new dataset with IDs added, and saves it to + './build/dataset_with_id.json'. + + Args: + - None + + Returns: + - None + + Notes: + - Uses global variables: `chunk_by_period_on` to determine the input file + path, `output_file` for saving the dataset with IDs. + - Loads the dataset from the appropriate input file based on the condition. + - Assigns IDs starting from 1 using a local counter `id_counter`. + - Saves the modified dataset to './build/dataset_with_id.json' in JSON + format with indentation. + - Prints messages indicating the addition of the 'id' field, the file path + where the dataset is saved, and the number of dictionaries created. + + Example: + When executed, the function checks the value of `chunk_by_period_on` to + determine the input file, adds 'id' fields to each entry in the dataset, + saves the updated dataset to './build/dataset_with_id.json', and prints + messages confirming the operation and providing statistics about the + dataset. 
+ """ + if chunk_by_period_on == "True": + input_file = './build/dataset_with_chunked_content.json' + else: + input_file = './build/dataset_with_content.json' + + output_file = './build/dataset_with_id.json' + + with open(input_file, 'r', encoding='utf-8') as file: + data = json.load(file) + + data_with_id = [] + id_counter = 1 # Local counter to ensure IDs start from 1 for this specific dataset + + for item in data: + item['id'] = id_counter + data_with_id.append(item) + id_counter += 1 + + with open(output_file, 'w', encoding='utf-8') as output_file: + json.dump(data_with_id, output_file, indent=4) + + print("Id item added to the dataset.") + print(f"Dataset saved to {output_file}") + print(f"Number of dictionaries created: {len(data_with_id)}") + + +def chunk_by_period(): + """ + Chunk the 'content' and 'text' fields in the dataset based on periods. + + This function chunks the 'content' and 'text' fields of each entry in the + dataset when it encounters a period ('.'). It splits the 'content' and + 'text' fields into sentences based on periods, ensures that corresponding + sentences in both fields match in length, and creates new entries for each + sentence pair with the period added back to each sentence except the last + one. The resulting dataset is saved to + './build/dataset_with_chunked_content.json'. + + Args: + - None + + Returns: + - None + + Notes: + - Loads the dataset from './build/dataset_with_content.json'. + - Splits 'content' and 'text' fields into sentences based on periods ('.'). + - Asserts that the number of sentences in 'content' and 'text' fields + match. + - Saves the chunked dataset to './build/dataset_with_chunked_content.json' + in JSON format with indentation. + - Prints messages confirming the chunking operation and providing + statistics about the resulting dataset. + + Example: + When executed, the function processes the dataset from + './build/dataset_with_content.json', chunks the 'content' and 'text' fields + based on periods, saves the chunked dataset to + './build/dataset_with_chunked_content.json', + and prints messages indicating the success of the chunking operation and + the number of dictionaries created in the resulting dataset. + """ + if chunk_by_period_on == "True": + input_file = './build/dataset_with_content.json' + output_file = './build/dataset_with_chunked_content.json' + + with open(input_file, 'r', encoding='utf-8') as file: + data = json.load(file) + + chunked_data = [] + for item in data: + content = item['content'] + text = item['text'] + + # Find the index where the actual message starts after "speaker: " + prefix_end_index = content.index(item['speaker'] + ':') + len(item['speaker'] + ': ') + + # Split the message part of content and text by period + prefix = content[:prefix_end_index] + message = content[prefix_end_index:] + message_text = text + + message_sentences = message.split('. ') + text_sentences = message_text.split('. ') + + # Ensure the sentences arrays match in length + assert len(message_sentences) == len(text_sentences) + + # Add the period back to each sentence except the last one + message_sentences = [s + '.' if i < len(message_sentences) - 1 else s for i, s in enumerate(message_sentences)] + text_sentences = [s + '.' 
if i < len(text_sentences) - 1 else s for i, s in enumerate(text_sentences)] + + for message_sentence, text_sentence in zip(message_sentences, text_sentences): + if message_sentence.strip(): # Avoid adding empty sentences + new_item = OrderedDict(item) + new_item['content'] = prefix + message_sentence + new_item['text'] = text_sentence + chunked_data.append(new_item) + + with open(output_file, 'w', encoding='utf-8') as output_file: + json.dump(chunked_data, output_file, indent=4) + + print(f"Chunked dataset saved to {output_file}.") + print(f"Number of dictionaries created: {len(chunked_data)}") + + +def wrap_inside_object(): + """ + Wrap the JSON content inside an object called 'log_chats'. + + This function reads JSON data from './build/dataset_with_id.json', wraps it + inside a new JSON object with key 'log_chats', and saves the wrapped + dataset to './build/log_chats.json'. It also prints debugging information + about the first item in the dataset before saving, including checking for + emojis. + + Args: + - None + + Returns: + - None + + Notes: + - Reads input JSON data from './build/dataset_with_id.json'. + - Wraps the dataset inside a new JSON object with key 'log_chats'. + - Saves the wrapped dataset to './build/log_chats.json' with indentation. + - Prints debugging information about the first item in the dataset before + saving, including its contents for debugging purposes. + + Example: + When executed, the function reads the dataset from + './build/dataset_with_id.json', wraps it inside an object 'log_chats', + saves the wrapped dataset to './build/log_chats.json', and prints messages + confirming the wrapping operation and indicating where the dataset was + saved. + """ + input_file = './build/dataset_with_id.json' + output_file = './build/log_chats.json' + + with open(input_file, 'r', encoding='utf-8') as file: + data = json.load(file) + + log_chats = {'log_chats': data} + + # Debugging print statements + # print("First item in data before saving:") + # print(data[0]) # Prints the first item to check for emojis + print("Last item in data before saving:") + print(data[-1]) # Prints the last item in the list to check for emojis + + with open(output_file, 'w', encoding='utf-8') as output_file: + json.dump(log_chats, output_file, indent=4) + + print("Dataset wrapped inside json object.") + print(f"Dataset saved to {output_file}") + + +def main(): + """ + Main function to orchestrate the data processing pipeline. + + This function coordinates the execution of several sub-functions to + generate and manipulate datasets: + 1. Calls generate_chat_dataset() to create 'dataset.json' containing + processed chat data. + 2. Calls add_content() to add a 'content' field to each entry in + 'dataset.json', creating 'dataset_with_content.json'. + 3. Depending on the value of chunk_by_period_on: + - If True, calls chunk_by_period() to chunk the content and text fields + in 'dataset_with_content.json', creating + 'dataset_with_chunked_content.json'. + Then calls add_id() to add an 'id' field to each entry in + 'dataset_with_chunked_content.json', creating 'dataset_with_id.json'. + - If False, directly calls add_id() on 'dataset_with_content.json' to + create 'dataset_with_id.json'. + 4. Calls wrap_inside_object() to wrap the dataset inside a 'log_chats' + object and save it as 'log_chats.json'. + + Args: + - None + + Returns: + - None + + Notes: + - The function serves as the entry point for executing the entire data + processing pipeline. 
+ - It manages the flow of data through various processing stages, ensuring + each stage is executed in sequence. + - Depending on the value of chunk_by_period_on, different processing paths + are followed to handle dataset chunking. + + Example: + When executed, this function initiates the processing pipeline: + - It generates initial datasets, adds necessary fields, chunks data if + specified, and wraps final data in an object. + - Each step's completion is indicated by appropriate print statements + confirming dataset creation and saving. + """ + # Generates dataset.json + generate_chat_dataset() + + # Generate dataset_with_content.json + add_content() + + if chunk_by_period_on == "True": + # Generate dataset_with_chunked_content.json + chunk_by_period() + # Generates dataset_with_id.json + add_id() + else: + # Generates dataset_with_id.json + add_id() + + # Generates log_chats.json + wrap_inside_object() + + +if __name__ == "__main__": + main() diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory/COPYING b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/COPYING new file mode 100644 index 00000000..94a9ed02 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/COPYING @@ -0,0 +1,674 @@ + GNU GENERAL PUBLIC LICENSE + Version 3, 29 June 2007 + + Copyright (C) 2007 Free Software Foundation, Inc. + Everyone is permitted to copy and distribute verbatim copies + of this license document, but changing it is not allowed. + + Preamble + + The GNU General Public License is a free, copyleft license for +software and other kinds of works. + + The licenses for most software and other practical works are designed +to take away your freedom to share and change the works. By contrast, +the GNU General Public License is intended to guarantee your freedom to +share and change all versions of a program--to make sure it remains free +software for all its users. We, the Free Software Foundation, use the +GNU General Public License for most of our software; it applies also to +any other work released this way by its authors. You can apply it to +your programs, too. + + When we speak of free software, we are referring to freedom, not +price. Our General Public Licenses are designed to make sure that you +have the freedom to distribute copies of free software (and charge for +them if you wish), that you receive source code or can get it if you +want it, that you can change the software or use pieces of it in new +free programs, and that you know you can do these things. + + To protect your rights, we need to prevent others from denying you +these rights or asking you to surrender the rights. Therefore, you have +certain responsibilities if you distribute copies of the software, or if +you modify it: responsibilities to respect the freedom of others. + + For example, if you distribute copies of such a program, whether +gratis or for a fee, you must pass on to the recipients the same +freedoms that you received. You must make sure that they, too, receive +or can get the source code. And you must show them these terms so they +know their rights. + + Developers that use the GNU GPL protect your rights with two steps: +(1) assert copyright on the software, and (2) offer you this License +giving you legal permission to copy, distribute and/or modify it. + + For the developers' and authors' protection, the GPL clearly explains +that there is no warranty for this free software. 
For both users' and +authors' sake, the GPL requires that modified versions be marked as +changed, so that their problems will not be attributed erroneously to +authors of previous versions. + + Some devices are designed to deny users access to install or run +modified versions of the software inside them, although the manufacturer +can do so. This is fundamentally incompatible with the aim of +protecting users' freedom to change the software. The systematic +pattern of such abuse occurs in the area of products for individuals to +use, which is precisely where it is most unacceptable. Therefore, we +have designed this version of the GPL to prohibit the practice for those +products. If such problems arise substantially in other domains, we +stand ready to extend this provision to those domains in future versions +of the GPL, as needed to protect the freedom of users. + + Finally, every program is threatened constantly by software patents. +States should not allow patents to restrict development and use of +software on general-purpose computers, but in those that do, we wish to +avoid the special danger that patents applied to a free program could +make it effectively proprietary. To prevent this, the GPL assures that +patents cannot be used to render the program non-free. + + The precise terms and conditions for copying, distribution and +modification follow. + + TERMS AND CONDITIONS + + 0. Definitions. + + "This License" refers to version 3 of the GNU General Public License. + + "Copyright" also means copyright-like laws that apply to other kinds of +works, such as semiconductor masks. + + "The Program" refers to any copyrightable work licensed under this +License. Each licensee is addressed as "you". "Licensees" and +"recipients" may be individuals or organizations. + + To "modify" a work means to copy from or adapt all or part of the work +in a fashion requiring copyright permission, other than the making of an +exact copy. The resulting work is called a "modified version" of the +earlier work or a work "based on" the earlier work. + + A "covered work" means either the unmodified Program or a work based +on the Program. + + To "propagate" a work means to do anything with it that, without +permission, would make you directly or secondarily liable for +infringement under applicable copyright law, except executing it on a +computer or modifying a private copy. Propagation includes copying, +distribution (with or without modification), making available to the +public, and in some countries other activities as well. + + To "convey" a work means any kind of propagation that enables other +parties to make or receive copies. Mere interaction with a user through +a computer network, with no transfer of a copy, is not conveying. + + An interactive user interface displays "Appropriate Legal Notices" +to the extent that it includes a convenient and prominently visible +feature that (1) displays an appropriate copyright notice, and (2) +tells the user that there is no warranty for the work (except to the +extent that warranties are provided), that licensees may convey the +work under this License, and how to view a copy of this License. If +the interface presents a list of user commands or options, such as a +menu, a prominent item in the list meets this criterion. + + 1. Source Code. + + The "source code" for a work means the preferred form of the work +for making modifications to it. "Object code" means any non-source +form of a work. 
+ + A "Standard Interface" means an interface that either is an official +standard defined by a recognized standards body, or, in the case of +interfaces specified for a particular programming language, one that +is widely used among developers working in that language. + + The "System Libraries" of an executable work include anything, other +than the work as a whole, that (a) is included in the normal form of +packaging a Major Component, but which is not part of that Major +Component, and (b) serves only to enable use of the work with that +Major Component, or to implement a Standard Interface for which an +implementation is available to the public in source code form. A +"Major Component", in this context, means a major essential component +(kernel, window system, and so on) of the specific operating system +(if any) on which the executable work runs, or a compiler used to +produce the work, or an object code interpreter used to run it. + + The "Corresponding Source" for a work in object code form means all +the source code needed to generate, install, and (for an executable +work) run the object code and to modify the work, including scripts to +control those activities. However, it does not include the work's +System Libraries, or general-purpose tools or generally available free +programs which are used unmodified in performing those activities but +which are not part of the work. For example, Corresponding Source +includes interface definition files associated with source files for +the work, and the source code for shared libraries and dynamically +linked subprograms that the work is specifically designed to require, +such as by intimate data communication or control flow between those +subprograms and other parts of the work. + + The Corresponding Source need not include anything that users +can regenerate automatically from other parts of the Corresponding +Source. + + The Corresponding Source for a work in source code form is that +same work. + + 2. Basic Permissions. + + All rights granted under this License are granted for the term of +copyright on the Program, and are irrevocable provided the stated +conditions are met. This License explicitly affirms your unlimited +permission to run the unmodified Program. The output from running a +covered work is covered by this License only if the output, given its +content, constitutes a covered work. This License acknowledges your +rights of fair use or other equivalent, as provided by copyright law. + + You may make, run and propagate covered works that you do not +convey, without conditions so long as your license otherwise remains +in force. You may convey covered works to others for the sole purpose +of having them make modifications exclusively for you, or provide you +with facilities for running those works, provided that you comply with +the terms of this License in conveying all material for which you do +not control copyright. Those thus making or running the covered works +for you must do so exclusively on your behalf, under your direction +and control, on terms that prohibit them from making any copies of +your copyrighted material outside their relationship with you. + + Conveying under any other circumstances is permitted solely under +the conditions stated below. Sublicensing is not allowed; section 10 +makes it unnecessary. + + 3. Protecting Users' Legal Rights From Anti-Circumvention Law. 
+ + No covered work shall be deemed part of an effective technological +measure under any applicable law fulfilling obligations under article +11 of the WIPO copyright treaty adopted on 20 December 1996, or +similar laws prohibiting or restricting circumvention of such +measures. + + When you convey a covered work, you waive any legal power to forbid +circumvention of technological measures to the extent such circumvention +is effected by exercising rights under this License with respect to +the covered work, and you disclaim any intention to limit operation or +modification of the work as a means of enforcing, against the work's +users, your or third parties' legal rights to forbid circumvention of +technological measures. + + 4. Conveying Verbatim Copies. + + You may convey verbatim copies of the Program's source code as you +receive it, in any medium, provided that you conspicuously and +appropriately publish on each copy an appropriate copyright notice; +keep intact all notices stating that this License and any +non-permissive terms added in accord with section 7 apply to the code; +keep intact all notices of the absence of any warranty; and give all +recipients a copy of this License along with the Program. + + You may charge any price or no price for each copy that you convey, +and you may offer support or warranty protection for a fee. + + 5. Conveying Modified Source Versions. + + You may convey a work based on the Program, or the modifications to +produce it from the Program, in the form of source code under the +terms of section 4, provided that you also meet all of these conditions: + + a) The work must carry prominent notices stating that you modified + it, and giving a relevant date. + + b) The work must carry prominent notices stating that it is + released under this License and any conditions added under section + 7. This requirement modifies the requirement in section 4 to + "keep intact all notices". + + c) You must license the entire work, as a whole, under this + License to anyone who comes into possession of a copy. This + License will therefore apply, along with any applicable section 7 + additional terms, to the whole of the work, and all its parts, + regardless of how they are packaged. This License gives no + permission to license the work in any other way, but it does not + invalidate such permission if you have separately received it. + + d) If the work has interactive user interfaces, each must display + Appropriate Legal Notices; however, if the Program has interactive + interfaces that do not display Appropriate Legal Notices, your + work need not make them do so. + + A compilation of a covered work with other separate and independent +works, which are not by their nature extensions of the covered work, +and which are not combined with it such as to form a larger program, +in or on a volume of a storage or distribution medium, is called an +"aggregate" if the compilation and its resulting copyright are not +used to limit the access or legal rights of the compilation's users +beyond what the individual works permit. Inclusion of a covered work +in an aggregate does not cause this License to apply to the other +parts of the aggregate. + + 6. Conveying Non-Source Forms. 
+ + You may convey a covered work in object code form under the terms +of sections 4 and 5, provided that you also convey the +machine-readable Corresponding Source under the terms of this License, +in one of these ways: + + a) Convey the object code in, or embodied in, a physical product + (including a physical distribution medium), accompanied by the + Corresponding Source fixed on a durable physical medium + customarily used for software interchange. + + b) Convey the object code in, or embodied in, a physical product + (including a physical distribution medium), accompanied by a + written offer, valid for at least three years and valid for as + long as you offer spare parts or customer support for that product + model, to give anyone who possesses the object code either (1) a + copy of the Corresponding Source for all the software in the + product that is covered by this License, on a durable physical + medium customarily used for software interchange, for a price no + more than your reasonable cost of physically performing this + conveying of source, or (2) access to copy the + Corresponding Source from a network server at no charge. + + c) Convey individual copies of the object code with a copy of the + written offer to provide the Corresponding Source. This + alternative is allowed only occasionally and noncommercially, and + only if you received the object code with such an offer, in accord + with subsection 6b. + + d) Convey the object code by offering access from a designated + place (gratis or for a charge), and offer equivalent access to the + Corresponding Source in the same way through the same place at no + further charge. You need not require recipients to copy the + Corresponding Source along with the object code. If the place to + copy the object code is a network server, the Corresponding Source + may be on a different server (operated by you or a third party) + that supports equivalent copying facilities, provided you maintain + clear directions next to the object code saying where to find the + Corresponding Source. Regardless of what server hosts the + Corresponding Source, you remain obligated to ensure that it is + available for as long as needed to satisfy these requirements. + + e) Convey the object code using peer-to-peer transmission, provided + you inform other peers where the object code and Corresponding + Source of the work are being offered to the general public at no + charge under subsection 6d. + + A separable portion of the object code, whose source code is excluded +from the Corresponding Source as a System Library, need not be +included in conveying the object code work. + + A "User Product" is either (1) a "consumer product", which means any +tangible personal property which is normally used for personal, family, +or household purposes, or (2) anything designed or sold for incorporation +into a dwelling. In determining whether a product is a consumer product, +doubtful cases shall be resolved in favor of coverage. For a particular +product received by a particular user, "normally used" refers to a +typical or common use of that class of product, regardless of the status +of the particular user or of the way in which the particular user +actually uses, or expects or is expected to use, the product. A product +is a consumer product regardless of whether the product has substantial +commercial, industrial or non-consumer uses, unless such uses represent +the only significant mode of use of the product. 
+ + "Installation Information" for a User Product means any methods, +procedures, authorization keys, or other information required to install +and execute modified versions of a covered work in that User Product from +a modified version of its Corresponding Source. The information must +suffice to ensure that the continued functioning of the modified object +code is in no case prevented or interfered with solely because +modification has been made. + + If you convey an object code work under this section in, or with, or +specifically for use in, a User Product, and the conveying occurs as +part of a transaction in which the right of possession and use of the +User Product is transferred to the recipient in perpetuity or for a +fixed term (regardless of how the transaction is characterized), the +Corresponding Source conveyed under this section must be accompanied +by the Installation Information. But this requirement does not apply +if neither you nor any third party retains the ability to install +modified object code on the User Product (for example, the work has +been installed in ROM). + + The requirement to provide Installation Information does not include a +requirement to continue to provide support service, warranty, or updates +for a work that has been modified or installed by the recipient, or for +the User Product in which it has been modified or installed. Access to a +network may be denied when the modification itself materially and +adversely affects the operation of the network or violates the rules and +protocols for communication across the network. + + Corresponding Source conveyed, and Installation Information provided, +in accord with this section must be in a format that is publicly +documented (and with an implementation available to the public in +source code form), and must require no special password or key for +unpacking, reading or copying. + + 7. Additional Terms. + + "Additional permissions" are terms that supplement the terms of this +License by making exceptions from one or more of its conditions. +Additional permissions that are applicable to the entire Program shall +be treated as though they were included in this License, to the extent +that they are valid under applicable law. If additional permissions +apply only to part of the Program, that part may be used separately +under those permissions, but the entire Program remains governed by +this License without regard to the additional permissions. + + When you convey a copy of a covered work, you may at your option +remove any additional permissions from that copy, or from any part of +it. (Additional permissions may be written to require their own +removal in certain cases when you modify the work.) You may place +additional permissions on material, added by you to a covered work, +for which you have or can give appropriate copyright permission. 
+ + Notwithstanding any other provision of this License, for material you +add to a covered work, you may (if authorized by the copyright holders of +that material) supplement the terms of this License with terms: + + a) Disclaiming warranty or limiting liability differently from the + terms of sections 15 and 16 of this License; or + + b) Requiring preservation of specified reasonable legal notices or + author attributions in that material or in the Appropriate Legal + Notices displayed by works containing it; or + + c) Prohibiting misrepresentation of the origin of that material, or + requiring that modified versions of such material be marked in + reasonable ways as different from the original version; or + + d) Limiting the use for publicity purposes of names of licensors or + authors of the material; or + + e) Declining to grant rights under trademark law for use of some + trade names, trademarks, or service marks; or + + f) Requiring indemnification of licensors and authors of that + material by anyone who conveys the material (or modified versions of + it) with contractual assumptions of liability to the recipient, for + any liability that these contractual assumptions directly impose on + those licensors and authors. + + All other non-permissive additional terms are considered "further +restrictions" within the meaning of section 10. If the Program as you +received it, or any part of it, contains a notice stating that it is +governed by this License along with a term that is a further +restriction, you may remove that term. If a license document contains +a further restriction but permits relicensing or conveying under this +License, you may add to a covered work material governed by the terms +of that license document, provided that the further restriction does +not survive such relicensing or conveying. + + If you add terms to a covered work in accord with this section, you +must place, in the relevant source files, a statement of the +additional terms that apply to those files, or a notice indicating +where to find the applicable terms. + + Additional terms, permissive or non-permissive, may be stated in the +form of a separately written license, or stated as exceptions; +the above requirements apply either way. + + 8. Termination. + + You may not propagate or modify a covered work except as expressly +provided under this License. Any attempt otherwise to propagate or +modify it is void, and will automatically terminate your rights under +this License (including any patent licenses granted under the third +paragraph of section 11). + + However, if you cease all violation of this License, then your +license from a particular copyright holder is reinstated (a) +provisionally, unless and until the copyright holder explicitly and +finally terminates your license, and (b) permanently, if the copyright +holder fails to notify you of the violation by some reasonable means +prior to 60 days after the cessation. + + Moreover, your license from a particular copyright holder is +reinstated permanently if the copyright holder notifies you of the +violation by some reasonable means, this is the first time you have +received notice of violation of this License (for any work) from that +copyright holder, and you cure the violation prior to 30 days after +your receipt of the notice. + + Termination of your rights under this section does not terminate the +licenses of parties who have received copies or rights from you under +this License. 
If your rights have been terminated and not permanently +reinstated, you do not qualify to receive new licenses for the same +material under section 10. + + 9. Acceptance Not Required for Having Copies. + + You are not required to accept this License in order to receive or +run a copy of the Program. Ancillary propagation of a covered work +occurring solely as a consequence of using peer-to-peer transmission +to receive a copy likewise does not require acceptance. However, +nothing other than this License grants you permission to propagate or +modify any covered work. These actions infringe copyright if you do +not accept this License. Therefore, by modifying or propagating a +covered work, you indicate your acceptance of this License to do so. + + 10. Automatic Licensing of Downstream Recipients. + + Each time you convey a covered work, the recipient automatically +receives a license from the original licensors, to run, modify and +propagate that work, subject to this License. You are not responsible +for enforcing compliance by third parties with this License. + + An "entity transaction" is a transaction transferring control of an +organization, or substantially all assets of one, or subdividing an +organization, or merging organizations. If propagation of a covered +work results from an entity transaction, each party to that +transaction who receives a copy of the work also receives whatever +licenses to the work the party's predecessor in interest had or could +give under the previous paragraph, plus a right to possession of the +Corresponding Source of the work from the predecessor in interest, if +the predecessor has it or can get it with reasonable efforts. + + You may not impose any further restrictions on the exercise of the +rights granted or affirmed under this License. For example, you may +not impose a license fee, royalty, or other charge for exercise of +rights granted under this License, and you may not initiate litigation +(including a cross-claim or counterclaim in a lawsuit) alleging that +any patent claim is infringed by making, using, selling, offering for +sale, or importing the Program or any portion of it. + + 11. Patents. + + A "contributor" is a copyright holder who authorizes use under this +License of the Program or a work on which the Program is based. The +work thus licensed is called the contributor's "contributor version". + + A contributor's "essential patent claims" are all patent claims +owned or controlled by the contributor, whether already acquired or +hereafter acquired, that would be infringed by some manner, permitted +by this License, of making, using, or selling its contributor version, +but do not include claims that would be infringed only as a +consequence of further modification of the contributor version. For +purposes of this definition, "control" includes the right to grant +patent sublicenses in a manner consistent with the requirements of +this License. + + Each contributor grants you a non-exclusive, worldwide, royalty-free +patent license under the contributor's essential patent claims, to +make, use, sell, offer for sale, import and otherwise run, modify and +propagate the contents of its contributor version. + + In the following three paragraphs, a "patent license" is any express +agreement or commitment, however denominated, not to enforce a patent +(such as an express permission to practice a patent or covenant not to +sue for patent infringement). 
To "grant" such a patent license to a +party means to make such an agreement or commitment not to enforce a +patent against the party. + + If you convey a covered work, knowingly relying on a patent license, +and the Corresponding Source of the work is not available for anyone +to copy, free of charge and under the terms of this License, through a +publicly available network server or other readily accessible means, +then you must either (1) cause the Corresponding Source to be so +available, or (2) arrange to deprive yourself of the benefit of the +patent license for this particular work, or (3) arrange, in a manner +consistent with the requirements of this License, to extend the patent +license to downstream recipients. "Knowingly relying" means you have +actual knowledge that, but for the patent license, your conveying the +covered work in a country, or your recipient's use of the covered work +in a country, would infringe one or more identifiable patents in that +country that you have reason to believe are valid. + + If, pursuant to or in connection with a single transaction or +arrangement, you convey, or propagate by procuring conveyance of, a +covered work, and grant a patent license to some of the parties +receiving the covered work authorizing them to use, propagate, modify +or convey a specific copy of the covered work, then the patent license +you grant is automatically extended to all recipients of the covered +work and works based on it. + + A patent license is "discriminatory" if it does not include within +the scope of its coverage, prohibits the exercise of, or is +conditioned on the non-exercise of one or more of the rights that are +specifically granted under this License. You may not convey a covered +work if you are a party to an arrangement with a third party that is +in the business of distributing software, under which you make payment +to the third party based on the extent of your activity of conveying +the work, and under which the third party grants, to any of the +parties who would receive the covered work from you, a discriminatory +patent license (a) in connection with copies of the covered work +conveyed by you (or copies made from those copies), or (b) primarily +for and in connection with specific products or compilations that +contain the covered work, unless you entered into that arrangement, +or that patent license was granted, prior to 28 March 2007. + + Nothing in this License shall be construed as excluding or limiting +any implied license or other defenses to infringement that may +otherwise be available to you under applicable patent law. + + 12. No Surrender of Others' Freedom. + + If conditions are imposed on you (whether by court order, agreement or +otherwise) that contradict the conditions of this License, they do not +excuse you from the conditions of this License. If you cannot convey a +covered work so as to satisfy simultaneously your obligations under this +License and any other pertinent obligations, then as a consequence you may +not convey it at all. For example, if you agree to terms that obligate you +to collect a royalty for further conveying from those to whom you convey +the Program, the only way you could satisfy both those terms and this +License would be to refrain entirely from conveying the Program. + + 13. Use with the GNU Affero General Public License. 
+ + Notwithstanding any other provision of this License, you have +permission to link or combine any covered work with a work licensed +under version 3 of the GNU Affero General Public License into a single +combined work, and to convey the resulting work. The terms of this +License will continue to apply to the part which is the covered work, +but the special requirements of the GNU Affero General Public License, +section 13, concerning interaction through a network will apply to the +combination as such. + + 14. Revised Versions of this License. + + The Free Software Foundation may publish revised and/or new versions of +the GNU General Public License from time to time. Such new versions will +be similar in spirit to the present version, but may differ in detail to +address new problems or concerns. + + Each version is given a distinguishing version number. If the +Program specifies that a certain numbered version of the GNU General +Public License "or any later version" applies to it, you have the +option of following the terms and conditions either of that numbered +version or of any later version published by the Free Software +Foundation. If the Program does not specify a version number of the +GNU General Public License, you may choose any version ever published +by the Free Software Foundation. + + If the Program specifies that a proxy can decide which future +versions of the GNU General Public License can be used, that proxy's +public statement of acceptance of a version permanently authorizes you +to choose that version for the Program. + + Later license versions may give you additional or different +permissions. However, no additional obligations are imposed on any +author or copyright holder as a result of your choosing to follow a +later version. + + 15. Disclaimer of Warranty. + + THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY +APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT +HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY +OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, +THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR +PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM +IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF +ALL NECESSARY SERVICING, REPAIR OR CORRECTION. + + 16. Limitation of Liability. + + IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING +WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS +THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY +GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE +USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF +DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD +PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), +EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF +SUCH DAMAGES. + + 17. Interpretation of Sections 15 and 16. + + If the disclaimer of warranty and limitation of liability provided +above cannot be given local legal effect according to their terms, +reviewing courts shall apply local law that most closely approximates +an absolute waiver of all civil liability in connection with the +Program, unless a warranty or assumption of liability accompanies a +copy of the Program in return for a fee. 
+ + END OF TERMS AND CONDITIONS + + How to Apply These Terms to Your New Programs + + If you develop a new program, and you want it to be of the greatest +possible use to the public, the best way to achieve this is to make it +free software which everyone can redistribute and change under these terms. + + To do so, attach the following notices to the program. It is safest +to attach them to the start of each source file to most effectively +state the exclusion of warranty; and each file should have at least +the "copyright" line and a pointer to where the full notice is found. + + + Copyright (C) + + This program is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + This program is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with this program. If not, see . + +Also add information on how to contact you by electronic and paper mail. + + If the program does terminal interaction, make it output a short +notice like this when it starts in an interactive mode: + + Copyright (C) + This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'. + This is free software, and you are welcome to redistribute it + under certain conditions; type `show c' for details. + +The hypothetical commands `show w' and `show c' should show the appropriate +parts of the General Public License. Of course, your program's commands +might be different; for a GUI interface, you would use an "about box". + + You should also get your employer (if you work as a programmer) or school, +if any, to sign a "copyright disclaimer" for the program, if necessary. +For more information on this, and how to apply and follow the GNU GPL, see +. + + The GNU General Public License does not permit incorporating your program +into proprietary programs. If your program is a subroutine library, you +may consider it more useful to permit linking proprietary applications with +the library. If this is what you want to do, use the GNU Lesser General +Public License instead of this License. But first, please read +. diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory/README.md b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/README.md new file mode 100644 index 00000000..6202709b --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/README.md @@ -0,0 +1,44 @@ +[![Python Package using Conda](https://github.com/mydroidandi/commbase/actions/workflows/python-package-conda.yml/badge.svg)](https://github.com/mydroidandi/commbase/actions/workflows/python-package-conda.yml) +[![Python Version](https://img.shields.io/badge/Python-3.10%20%7C%203.11%20%7C%203.12-blue)](https://img.shields.io/badge/python-3.10%20%7C%203.11%20%7C%203.12-blue) + +# commbase-genai-llm-openai-gpt-p + +commbase-genai-llm-openai-gpt-p + +A terminal version of ChatGPT designed to facilitate seamless interaction with the powerful Open AI's GPT language model. This app empowers users to engage in natural language conversations with the model, unlocking a wide range of possibilities for creative, informative, and dynamic interactions. 
+ +ChatGPT is a chatbot built using the GPT (Generative Pre-trained Transformer) architecture developed by OpenAI. To use ChatGPT, simply visit the website at . This is the online platform where you can access and interact with the ChatGPT service. + +## Unleash the Power of Creativity with commbase-genai-llm-openai-gpt-p + +commbase-genai-llm-openai-gpt-p harnesses the capabilities of OpenAI's GPT models. Explore the world of storytelling, code exploration, and language manipulation through intuitive interaction with GPT's advanced features. + +**Key strengths:** + +* **Creative text formats:** Generate poems, code, scripts, musical pieces, emails, letters, and more, finding inspiration, exploring different communication styles, or aiding in content creation. +* **Storytelling and world-building:** Craft captivating narratives and build intricate fictional worlds, empowering writers and game designers to bring their imaginations to life. +* **Humor and wit:** Engage in stimulating and enjoyable interactions as GPT incorporates humor and wit into its responses, fostering a sense of connection and amusement. +* **Multilingual capabilities:** Break down language barriers and access information across cultures with GPT's ability to understand and respond in multiple languages. +* **Code generation:** Get a head start on coding projects or explore different programming languages with GPT's basic code generation capabilities. +* **Accessibility and user-friendliness:** Interact with GPT easily through various APIs and interfaces, making the technology readily available to a diverse audience. +* **Constant evolution and improvement:** Stay at the forefront of language processing advancements as OpenAI continuously refines and improves GPT models. + +**With commbase-genai-llm-openai-gpt-p, unlock the potential of GPT and embark on a journey of creative exploration, communication, and discovery.** + +## Examples + +Detailed information about examples can be found in the corresponding [`examples`](./examples) directory. + +## Contributors + +Thanks to the following people who have contributed to this project: + +* [@estebanways](https://github.com/estebanways) + +## Contact + +If you want to contact us you can reach us at . + +## License + +This project uses the following license: [`License`](./COPYING). diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory/commbase-genai-slm-ollama-phi3-mini-memory.jpg b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/commbase-genai-slm-ollama-phi3-mini-memory.jpg new file mode 100644 index 00000000..7ab73dd5 Binary files /dev/null and b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/commbase-genai-slm-ollama-phi3-mini-memory.jpg differ diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory/commbase-genai-slm-ollama-phi3-mini-memory.py b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/commbase-genai-slm-ollama-phi3-mini-memory.py new file mode 100644 index 00000000..2618d893 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/commbase-genai-slm-ollama-phi3-mini-memory.py @@ -0,0 +1,421 @@ +#!/usr/bin/env python +################################################################################ +# commbase-genai-slm-ollama-phi3-mini-memory # +# # +# A sophisticated AI assistant's Small Language Model (Phi3), enhanced by # +# Retrieval-Augmented Generation (RAG) for improved response accuracy, and # +# supported by a Picone semantic vector database. # +# # +# Change History # +# 06/26/2024 Esteban Herrera Original code. 
# +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. # +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# commbase-genai-slm-ollama-phi3-mini-memory.py +# Interacts with the Commbase platform by generating AI-driven responses using +# the Ollama API with the 'commbase-phi3-mini' model. The chat memory +# management workflow ensures that the history of prompts and responses is +# preserved across interactions. This approach allows the AI model to generate +# more relevant and context-aware responses. + +# Imports +from datetime import datetime +import ollama +import os +import subprocess +import sys +from file_paths import ( + get_assistant_discourse_from_language_model_file, + get_chat_log_file, + get_chat_memory_file, + get_run_voice_recorder_in_pane_path, + get_stt_engine_path +) +from functions import ( + get_audible_assistant_logging_on, + get_chat_participant_names, + get_commbase_recorder_transmitter_quit_char, + get_commbase_stt_whisper_reactive_p_auto_open_recorder_after_command, + get_log_severity_level_1 +) + +# Set the values returned by get_chat_memory_file() +memory_file = get_chat_memory_file() + + +def read_memory(file_path): + """ + Reads chat memory from a file. + + This function checks if the specified file exists. If the file exists, + it opens the file in read mode and reads its contents, which it then + returns as a string. If the file does not exist, the function returns + an empty string. + + Args: + file_path (str): The path to the file from which to read the chat + memory. + + Returns: + str: The contents of the file as a string if the file exists, otherwise + an empty string. + """ + if os.path.exists(file_path): + with open(file_path, 'r') as file: + return file.read() + # If the file does not exist, return an empty string + return "" + + +def append_memory(file_path, memory): + """ + Appends chat memory to a file. + + This function opens the specified file in append mode and appends the + provided memory string followed by a newline character to it. If the file + does not exist, it will be created. + + Args: + file_path (str): The path to the file where the chat memory will be + appended. + memory (str): The chat memory to be appended to the file. + + Returns: + None + """ + with open(file_path, 'a') as file: + file.write(memory + "\n") + + +def check_arguments(): + """ + Checks if the script received the correct number of arguments. + + This function ensures that exactly one argument (the prompt) is passed to + the script. 
+ If the number of arguments is incorrect, it prints the usage instructions + and exits the program. + + Raises: + SystemExit: If the number of arguments is not equal to 2. + """ + if len(sys.argv) != 2: + print("Usage: python commbase-genai-slm-ollama-phi3-mini-memory.py \"Your prompt here\"") + sys.exit(1) + + +def get_new_prompt(): + """ + Retrieves the user-provided prompt from the command line arguments. + + This function extracts the prompt passed as the first argument to the + script. + It assumes that the arguments have already been checked for correctness + by another function. + + Returns: + str: The user-provided prompt. + """ + new_prompt = sys.argv[1] + return new_prompt + + +def read_existing_chat_memory(memory_file): + """ + Reads the existing chat memory from a file. + + This function reads the contents of the specified chat memory file + using the `read_memory` function and returns it as a string. + + Args: + memory_file (str): The path to the file from which to read the existing + chat memory. + + Returns: + str: The contents of the chat memory file as a string. + """ + memory = read_memory(memory_file) + return memory + + +def append_new_prompt_to_chat_memory(memory, new_prompt): + """ + Appends the new prompt to the existing chat memory. + + This function takes the current chat memory and a new prompt, and appends + the new prompt to the memory. If the memory is empty, it initializes it + with the new prompt. + + Args: + memory (str): The existing chat memory as a string. + new_prompt (str): The new prompt to append to the chat memory. + + Returns: + str: The updated chat memory with the new prompt appended. + """ + if memory: + memory += "\n" + new_prompt + else: + memory = new_prompt + + return memory + + +def generate_response(memory): + """ + Generates a response using the provided prompt and the commbase-phi3-mini + model. + + This function sends the user-provided prompt to the Ollama API, using the + 'commbase-phi3-mini' model to generate a response. The function then + extracts and returns the text response from the API's response dictionary. + + Args: + prompt (str): The user-provided prompt to generate a response for. + + Returns: + str: The generated text response from the model. + """ + response = ollama.generate(model='commbase-phi3-mini', prompt=memory) + return response['response'] + + +def append_response_to_chat_memory(response, memory): + """ + Appends the generated response to the existing chat memory. + + This function takes the generated response and appends it to the current + chat memory. The response is added as a new line at the end of the existing + memory. + + Args: + response (str): The generated response from the AI model to be + appended. + memory (str): The existing chat memory to which the response will be + appended. + + Returns: + str: The updated chat memory with the new response appended at the end. + """ + # Extract the response + text_response = response + + # Append the response to the chat memory + memory += "\n" + text_response + return memory + + +def print_response(response): + """ + Logs a response from an assistant or system to a specified log file. + + Args: + - response (str): The response message to log. + + Returns: + None + + Description: + This function logs the given response message along with current timestamp, + severity level, and system identifier to a designated log file. The + severity level is obtained from `get_log_severity_level_1()` function. 
The logged format is:
+    "[YYYY-MM-DD HH:MM:SS] phi3-mini-memory: <severity_level>: <assistant_name><response>\n".
+    The logged message is appended to the file specified by `temp_file_path`.
+
+    Notes:
+    - Ensure `get_log_severity_level_1()` returns an appropriate severity level
+      string.
+    - `temp_file_path` should point to the file where the logs are to be
+      appended.
+    """
+    current_timestamp = datetime.now().strftime('%Y-%m-%d %H:%M:%S')
+
+    # Set the value returned by get_log_severity_level_1()
+    log_severity_level_1 = get_log_severity_level_1()
+
+    # Set the values returned by get_chat_participant_names()
+    end_user_name, assistant_name = get_chat_participant_names()
+
+    # The chat log file path
+    temp_file_path = get_chat_log_file()
+
+    severity_level = log_severity_level_1
+    assistant_text = "[" + current_timestamp + "]" + " phi3-mini-memory: " + severity_level + ": " + assistant_name + response + "\n"
+    with open(temp_file_path, 'a') as temp_file:
+        temp_file.write(assistant_text)
+
+
+def text_to_speech(window_number, pane_number, time):
+    """
+    Sends a command to tmux to run the language model discourse in a specified
+    tmux window and pane, enabling text-to-speech synthesis.
+
+    Args:
+    - window_number (int): The number of the tmux window to select.
+    - pane_number (int): The number of the tmux pane to select.
+    - time (int or float): Time in seconds to wait after sending the command.
+
+    Returns:
+    None
+
+    Notes:
+    - Requires tmux to be installed and accessible from the command line.
+    - Uses subprocess module to execute tmux commands.
+
+    Example:
+    >>> text_to_speech(1, 0, 2)
+    This will select window 1, pane 0 in tmux, send a clear and bash command
+    to run the language model discourse, and wait for 2 seconds before
+    returning.
+    """
+    # Set the value returned by get_stt_engine_path()
+    stt_engine_path = get_stt_engine_path()
+
+    # Set the value returned by get_commbase_recorder_transmitter_quit_char()
+    quit_char = get_commbase_recorder_transmitter_quit_char()
+
+    # Set the value returned by get_assistant_discourse_from_language_model_file()
+    assistant_discourse_from_language_model = get_assistant_discourse_from_language_model_file()
+
+    # Set the value returned by get_commbase_stt_whisper_reactive_p_auto_open_recorder_after_command()
+    auto_open_recorder_after_command = get_commbase_stt_whisper_reactive_p_auto_open_recorder_after_command()
+
+    # Set the value returned by get_run_voice_recorder_in_pane_path()
+    run_voice_recorder_in_pane_path = get_run_voice_recorder_in_pane_path()
+
+    # run_voice_recorder_in_pane_routine_pane = 7
+    run_voice_recorder_in_pane_routine_pane = pane_number
+
+    # If the STT engine is Whisper reactive, attempt to close the
+    # recorder-transmitter app.
+ if stt_engine_path == "$COMMBASE_APP_DIR/bundles/commbase-stt-whisper-reactive-p/commbase_stt_whisper_reactive_p.py": + # Construct the tmux command + command = ( + f"tmux select-window -t {window_number} && " + f"tmux select-pane -t {pane_number} && " + f"tmux send-keys \'{quit_char}; clear\' C-m && " + f"sleep {time}" + ) + + # Execute the command + subprocess.run(command, shell=True, check=True) # Do not manage errors here or the script will fail + + # Make the assistant speak out loud the language model response + # Construct the tmux command + command = ( + f"tmux select-window -t {window_number} && " + f"tmux select-pane -t {pane_number} && " + f"tmux send-keys \'clear; bash {assistant_discourse_from_language_model}\' C-m && " + f"sleep {time}" + ) + + # Execute the command + subprocess.run(command, shell=True, check=True) # Do not manage errors here or the script will fail + + # If the Whisper reactive STT Engine is set up to reopen the recorder after + # any command, just do it. + if auto_open_recorder_after_command == "True": + # If the STT engine is Whisper reactive, reopen the + # recorder-transmitter app. + if stt_engine_path == "$COMMBASE_APP_DIR/bundles/commbase-stt-whisper-reactive-p/commbase_stt_whisper_reactive_p.py": + # Construct the tmux command + command = ( + f"tmux select-window -t {window_number} && " + f"tmux select-pane -t {pane_number} && " + f"tmux send-keys \'clear; bash {run_voice_recorder_in_pane_path} {run_voice_recorder_in_pane_routine_pane}\' C-m && " + f"sleep {time}" + ) + + # Execute the command + subprocess.run(command, shell=True, check=True) # Do not manage errors here or the script will fail + + +def main(): + """ + Executes the main functionality of the script: + - Checks command line arguments to ensure correct usage. + - Retrieves the user-provided prompt. + - Reads existing chat memory from a file. + - Appends the new prompt to the chat memory. + - Generates a response using the commbase-phi3-mini model via the Ollama + API. + - Appends the current prompt and response to the chat memory file. + - Logs the generated response to a designated log file with timestamp and + severity level. + - Initiates text-to-speech synthesis of the response in a specific tmux + window and pane. + + Args: + None + + Returns: + None + + Notes: + - Requires the Ollama API, tmux, and other dependencies as specified. + - Uses functions from 'file_paths' and 'functions' modules for file paths + and helper functions. + - Assumes correct configuration and availability of the tmux session. + - Each function is called only once, and their return values are stored in + variables to be used in subsequent steps. + + Example: + >>> main() + This will execute the main functionality of the script, including response + generation, logging, and text-to-speech synthesis in the specified tmux + window and pane. 
+ """ + # Set the value returned by get_audible_assistant_logging_on() + audible_assistant_logging_on = get_audible_assistant_logging_on() + + # Ensure the script has the correct number of arguments + check_arguments() + + # Retrieve the new prompt + new_prompt = get_new_prompt() + + # Read the existing chat memory + memory = read_existing_chat_memory(memory_file) + + # Append the new prompt to the chat memory + updated_memory = append_new_prompt_to_chat_memory(memory, new_prompt) + + # Generate a response based on the updated memory + response = generate_response(updated_memory) + + # Append the current prompt and response to the chat memory + current_memory = f"{new_prompt}\n{response}" + append_memory(memory_file, current_memory) + + # Print the response + print_response(response) + + # If audible assistant logging is enabled, use text-to-speech + if audible_assistant_logging_on == "True": + text_to_speech(1, 7, 0.1) + + +if __name__ == "__main__": + main() diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory/config.py b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/config.py new file mode 100644 index 00000000..eb7f20fc --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/config.py @@ -0,0 +1,44 @@ +#!/usr/bin/env python +################################################################################ +# commbase-genai-slm-ollama-phi3-mini-memory # +# # +# A sophisticated AI assistant's Small Language Model (Phi3), enhanced by # +# Retrieval-Augmented Generation (RAG) for improved response accuracy, and # +# supported by a Picone semantic vector database. # +# # +# Change History # +# 06/25/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. # +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# config.py +# This files makes a variable available throughout the entire Python +# application, including all modules, submodules, and functions. 
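+#
+# Usage sketch (illustrative only): sibling modules in this bundle, such as
+# file_paths.py and functions.py, import these constants instead of
+# rebuilding the path themselves, for example:
+#
+#   from config import CONFIG_FILE_PATH
+#
+#   with open(CONFIG_FILE_PATH, "r") as f:
+#       for line in f:
+#           variable_name, value = line.strip().split("=")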
+ +# Requirements +import os + +# The path to the .env configuration file +CONFIG_FILE_DIR = os.environ["COMMBASE_APP_DIR"] +CONFIG_FILE_PATH = os.path.join(CONFIG_FILE_DIR, "config/commbase.conf") diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory/examples/README.md b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/examples/README.md new file mode 100644 index 00000000..b8927a5c --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/examples/README.md @@ -0,0 +1,12 @@ +## Examples + +### Example 1: commbase-genai-slm-ollama-phi3-mini-history implementation + +This example demonstrates the implementation of our software. You can find the code in the [Commbase](https://github.com/mydroidandi/commbase) repository. + +To run the example, follow these steps: +1. Clone the repository. +2. Install the necessary dependencies. +3. Execute the program. + +Make sure to check out the repository for more details and instructions. diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory/file_paths.py b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/file_paths.py new file mode 100644 index 00000000..66f5c47c --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/file_paths.py @@ -0,0 +1,190 @@ +#!/usr/bin/env python +################################################################################ +# commbase-genai-slm-ollama-phi3-mini-memory # +# # +# A sophisticated AI assistant's Small Language Model (Phi3), enhanced by # +# Retrieval-Augmented Generation (RAG) for improved response accuracy, and # +# supported by a Picone semantic vector database. # +# # +# Change History # +# 06/25/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. # +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# file_paths.py +# This file stores functions related to loading and managing file paths +# Requires os.path already imported + +# Requirements +from config import CONFIG_FILE_DIR, CONFIG_FILE_PATH + + +def get_assistant_discourse_from_language_model_file(): + """ + Retrieves the value of the ASSISTANT_DISCOURSE_FROM_LANGUAGE_MODEL_FILE + variable from the configuration file. + + Returns: + str or None: The value of the variable if found, or None if not found. 
+ """ + # Initialize variable + model_discourse_file = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "ASSISTANT_DISCOURSE_FROM_LANGUAGE_MODEL_FILE": + # Remove the quotes from the value of the variable + model_discourse_file = CONFIG_FILE_DIR + "/" + value.strip()[1:-1] + + # Check if the variable was found + if model_discourse_file is not None: + return model_discourse_file + + # If the variable was not found, return None + return None + + +def get_chat_log_file(): + """ + Retrieves the value of the CHAT_LOG_FILE variable from the configuration + file. + + Returns: + str or None: The value of the variable if found, or None if not found. + """ + # Initialize variable + chat_log_file = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "CHAT_LOG_FILE": + # Remove the quotes from the value of the variable + chat_log_file = CONFIG_FILE_DIR + "/" + value.strip()[1:-1] + + # Check if the variable was found + if chat_log_file is not None: + return chat_log_file + + # If the variable was not found, return None + return None + + +def get_chat_memory_file(): + """ + Retrieves the value of the CHAT_MEMORY_FILE variable from the configuration + file. + + Returns: + str or None: The value of the variable if found, or None if not found. + """ + # Initialize variable + chat_memory_file = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "CHAT_MEMORY_FILE": + # Remove the quotes from the value of the variable + chat_memory_file = CONFIG_FILE_DIR + "/" + value.strip()[1:-1] + + # Check if the variable was found + if chat_memory_file is not None: + return chat_memory_file + + # If the variable was not found, return None + return None + + +def get_run_voice_recorder_in_pane_path(): + """ + Retrieves the value of the RUN_VOICE_RECORDER_IN_PANE_PATH variable from + the configuration file. + + Returns: + str or None: The value of the variable if found, or None if not found. + """ + # Initialize variable + run_recorder_path = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "RUN_VOICE_RECORDER_IN_PANE_PATH": + # Remove the quotes from the value of the variable + run_recorder_path = CONFIG_FILE_DIR + "/" + value.strip()[1:-1] + + # Check if the variable was found + if run_recorder_path is not None: + return run_recorder_path + + # If the variable was not found, return None + return None + + +def get_stt_engine_path(): + """ + Retrieves the value of the STT_ENGINE_PATH variable from the configuration + file. + + Returns: + str or None: The value of the variable if found, or None if not found. 
+ """ + # Initialize variable + stt_engine_path = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "STT_ENGINE_PATH": + # Remove the quotes from the value of the variable + stt_engine_path = value.strip()[1:-1] + + # Check if the variable was found + if stt_engine_path is not None: + return stt_engine_path + + # If the variable was not found, return None + return None diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-memory/functions.py b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/functions.py new file mode 100644 index 00000000..0056526a --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-memory/functions.py @@ -0,0 +1,212 @@ +#!/usr/bin/env python +################################################################################ +# commbase-genai-slm-ollama-phi3-mini-memory # +# # +# A sophisticated AI assistant's Small Language Model (Phi3), enhanced by # +# Retrieval-Augmented Generation (RAG) for improved response accuracy, and # +# supported by a Picone semantic vector database. # +# # +# Change History # +# 06/25/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. # +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# functions.py +# This file contains a collection of utility functions for various tasks. It +# provides a set of commonly used functions related to data manipulation, file +# handling, and mathematical operations. + +# Imports +from config import CONFIG_FILE_PATH + + +def get_audible_assistant_logging_on(): + """ + Reads the 'AUDIBLE_ASSISTANT_LOGGING_ON' variable from the environment + configuration file. Returns the string value of the variables if found, or + None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. 
+ """ + # Initialize variable + audible_assistant_logging_on = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "AUDIBLE_ASSISTANT_LOGGING_ON": + # Remove the quotes from the value of the variable + audible_assistant_logging_on = value.strip()[1:-1] + + # Check if the variable was found + if audible_assistant_logging_on is not None: + return audible_assistant_logging_on + + # If the variable was not found, return None + return None + + +def get_chat_participant_names(): + """ + Gets the chat participant names from the config file. + + Reads the 'ASSISTANT_NAME_IN_CHAT_PANE' and 'END_USER_NAME_IN_CHAT_PANE' + variables from the environment configuration file. Returns a tuple + containing the string values of the variables if found, or None if any of + the variables are not present. + + Returns: + tuple or None: A tuple containing the assistant, system, and end user + names in the chat pane, or None, if any of the variables are not found. + """ + # Initialize variables for the chat names + assistant_name = None + end_user_name = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "END_USER_NAME_IN_CHAT_PANE": + # Remove the quotes from the value of the variable + end_user_name = value.strip()[1:-1] + + elif variable_name == "ASSISTANT_NAME_IN_CHAT_PANE": + # Remove the quotes from the value of the variable + assistant_name = value.strip()[1:-1] + + # Check if all two variables were found + if ( + assistant_name is not None + and end_user_name is not None + ): + return end_user_name, assistant_name + + # If any of the variables are not found, return None + return None + + +def get_commbase_recorder_transmitter_quit_char(): + """ + Reads the 'COMMBASE_RECORDER_TRANSMITTER_QUIT_CHAR' variable from the + environment configuration file. Returns the string value of the variables + if found, or None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. + """ + # Initialize variable + quit_char = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "COMMBASE_RECORDER_TRANSMITTER_QUIT_CHAR": + # Remove the quotes from the value of the variable + quit_char = value.strip()[1:-1] + + # Check if the variable was found + if quit_char is not None: + return quit_char + + # If the variable was not found, return None + return None + + +def get_commbase_stt_whisper_reactive_p_auto_open_recorder_after_command(): + """ + Reads the 'COMMBASE_STT_WHISPER_REACTIVE_P_AUTO_OPEN_RECORDER_AFTER_COMMAND' + variable from the environment configuration file. Returns the string value + of the variables if found, or None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. 
+ """ + # Initialize variable + auto_open_recoder = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "COMMBASE_STT_WHISPER_REACTIVE_P_AUTO_OPEN_RECORDER_AFTER_COMMAND": + # Remove the quotes from the value of the variable + auto_open_recoder = value.strip()[1:-1] + + # Check if the variable was found + if auto_open_recoder is not None: + return auto_open_recoder + + # If the variable was not found, return None + return None + + +def get_log_severity_level_1(): + """ + Reads the 'LOG_SEVERITY_LEVEL_1' variable from the environment + configuration file. Returns the string value of the variables if found, or + None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. + """ + # Initialize variable + log_severity_level_1 = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "LOG_SEVERITY_LEVEL_1": + # Remove the quotes from the value of the variable + log_severity_level_1 = value.strip()[1:-1] + + # Check if the variable was found + if log_severity_level_1 is not None: + return log_severity_level_1 + + # If the variable was not found, return None + return None diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/COPYING b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/COPYING new file mode 100644 index 00000000..94a9ed02 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/COPYING @@ -0,0 +1,674 @@ + GNU GENERAL PUBLIC LICENSE + Version 3, 29 June 2007 + + Copyright (C) 2007 Free Software Foundation, Inc. + Everyone is permitted to copy and distribute verbatim copies + of this license document, but changing it is not allowed. + + Preamble + + The GNU General Public License is a free, copyleft license for +software and other kinds of works. + + The licenses for most software and other practical works are designed +to take away your freedom to share and change the works. By contrast, +the GNU General Public License is intended to guarantee your freedom to +share and change all versions of a program--to make sure it remains free +software for all its users. We, the Free Software Foundation, use the +GNU General Public License for most of our software; it applies also to +any other work released this way by its authors. You can apply it to +your programs, too. + + When we speak of free software, we are referring to freedom, not +price. Our General Public Licenses are designed to make sure that you +have the freedom to distribute copies of free software (and charge for +them if you wish), that you receive source code or can get it if you +want it, that you can change the software or use pieces of it in new +free programs, and that you know you can do these things. + + To protect your rights, we need to prevent others from denying you +these rights or asking you to surrender the rights. Therefore, you have +certain responsibilities if you distribute copies of the software, or if +you modify it: responsibilities to respect the freedom of others. 
+ + For example, if you distribute copies of such a program, whether +gratis or for a fee, you must pass on to the recipients the same +freedoms that you received. You must make sure that they, too, receive +or can get the source code. And you must show them these terms so they +know their rights. + + Developers that use the GNU GPL protect your rights with two steps: +(1) assert copyright on the software, and (2) offer you this License +giving you legal permission to copy, distribute and/or modify it. + + For the developers' and authors' protection, the GPL clearly explains +that there is no warranty for this free software. For both users' and +authors' sake, the GPL requires that modified versions be marked as +changed, so that their problems will not be attributed erroneously to +authors of previous versions. + + Some devices are designed to deny users access to install or run +modified versions of the software inside them, although the manufacturer +can do so. This is fundamentally incompatible with the aim of +protecting users' freedom to change the software. The systematic +pattern of such abuse occurs in the area of products for individuals to +use, which is precisely where it is most unacceptable. Therefore, we +have designed this version of the GPL to prohibit the practice for those +products. If such problems arise substantially in other domains, we +stand ready to extend this provision to those domains in future versions +of the GPL, as needed to protect the freedom of users. + + Finally, every program is threatened constantly by software patents. +States should not allow patents to restrict development and use of +software on general-purpose computers, but in those that do, we wish to +avoid the special danger that patents applied to a free program could +make it effectively proprietary. To prevent this, the GPL assures that +patents cannot be used to render the program non-free. + + The precise terms and conditions for copying, distribution and +modification follow. + + TERMS AND CONDITIONS + + 0. Definitions. + + "This License" refers to version 3 of the GNU General Public License. + + "Copyright" also means copyright-like laws that apply to other kinds of +works, such as semiconductor masks. + + "The Program" refers to any copyrightable work licensed under this +License. Each licensee is addressed as "you". "Licensees" and +"recipients" may be individuals or organizations. + + To "modify" a work means to copy from or adapt all or part of the work +in a fashion requiring copyright permission, other than the making of an +exact copy. The resulting work is called a "modified version" of the +earlier work or a work "based on" the earlier work. + + A "covered work" means either the unmodified Program or a work based +on the Program. + + To "propagate" a work means to do anything with it that, without +permission, would make you directly or secondarily liable for +infringement under applicable copyright law, except executing it on a +computer or modifying a private copy. Propagation includes copying, +distribution (with or without modification), making available to the +public, and in some countries other activities as well. + + To "convey" a work means any kind of propagation that enables other +parties to make or receive copies. Mere interaction with a user through +a computer network, with no transfer of a copy, is not conveying. 
+ + An interactive user interface displays "Appropriate Legal Notices" +to the extent that it includes a convenient and prominently visible +feature that (1) displays an appropriate copyright notice, and (2) +tells the user that there is no warranty for the work (except to the +extent that warranties are provided), that licensees may convey the +work under this License, and how to view a copy of this License. If +the interface presents a list of user commands or options, such as a +menu, a prominent item in the list meets this criterion. + + 1. Source Code. + + The "source code" for a work means the preferred form of the work +for making modifications to it. "Object code" means any non-source +form of a work. + + A "Standard Interface" means an interface that either is an official +standard defined by a recognized standards body, or, in the case of +interfaces specified for a particular programming language, one that +is widely used among developers working in that language. + + The "System Libraries" of an executable work include anything, other +than the work as a whole, that (a) is included in the normal form of +packaging a Major Component, but which is not part of that Major +Component, and (b) serves only to enable use of the work with that +Major Component, or to implement a Standard Interface for which an +implementation is available to the public in source code form. A +"Major Component", in this context, means a major essential component +(kernel, window system, and so on) of the specific operating system +(if any) on which the executable work runs, or a compiler used to +produce the work, or an object code interpreter used to run it. + + The "Corresponding Source" for a work in object code form means all +the source code needed to generate, install, and (for an executable +work) run the object code and to modify the work, including scripts to +control those activities. However, it does not include the work's +System Libraries, or general-purpose tools or generally available free +programs which are used unmodified in performing those activities but +which are not part of the work. For example, Corresponding Source +includes interface definition files associated with source files for +the work, and the source code for shared libraries and dynamically +linked subprograms that the work is specifically designed to require, +such as by intimate data communication or control flow between those +subprograms and other parts of the work. + + The Corresponding Source need not include anything that users +can regenerate automatically from other parts of the Corresponding +Source. + + The Corresponding Source for a work in source code form is that +same work. + + 2. Basic Permissions. + + All rights granted under this License are granted for the term of +copyright on the Program, and are irrevocable provided the stated +conditions are met. This License explicitly affirms your unlimited +permission to run the unmodified Program. The output from running a +covered work is covered by this License only if the output, given its +content, constitutes a covered work. This License acknowledges your +rights of fair use or other equivalent, as provided by copyright law. + + You may make, run and propagate covered works that you do not +convey, without conditions so long as your license otherwise remains +in force. 
You may convey covered works to others for the sole purpose +of having them make modifications exclusively for you, or provide you +with facilities for running those works, provided that you comply with +the terms of this License in conveying all material for which you do +not control copyright. Those thus making or running the covered works +for you must do so exclusively on your behalf, under your direction +and control, on terms that prohibit them from making any copies of +your copyrighted material outside their relationship with you. + + Conveying under any other circumstances is permitted solely under +the conditions stated below. Sublicensing is not allowed; section 10 +makes it unnecessary. + + 3. Protecting Users' Legal Rights From Anti-Circumvention Law. + + No covered work shall be deemed part of an effective technological +measure under any applicable law fulfilling obligations under article +11 of the WIPO copyright treaty adopted on 20 December 1996, or +similar laws prohibiting or restricting circumvention of such +measures. + + When you convey a covered work, you waive any legal power to forbid +circumvention of technological measures to the extent such circumvention +is effected by exercising rights under this License with respect to +the covered work, and you disclaim any intention to limit operation or +modification of the work as a means of enforcing, against the work's +users, your or third parties' legal rights to forbid circumvention of +technological measures. + + 4. Conveying Verbatim Copies. + + You may convey verbatim copies of the Program's source code as you +receive it, in any medium, provided that you conspicuously and +appropriately publish on each copy an appropriate copyright notice; +keep intact all notices stating that this License and any +non-permissive terms added in accord with section 7 apply to the code; +keep intact all notices of the absence of any warranty; and give all +recipients a copy of this License along with the Program. + + You may charge any price or no price for each copy that you convey, +and you may offer support or warranty protection for a fee. + + 5. Conveying Modified Source Versions. + + You may convey a work based on the Program, or the modifications to +produce it from the Program, in the form of source code under the +terms of section 4, provided that you also meet all of these conditions: + + a) The work must carry prominent notices stating that you modified + it, and giving a relevant date. + + b) The work must carry prominent notices stating that it is + released under this License and any conditions added under section + 7. This requirement modifies the requirement in section 4 to + "keep intact all notices". + + c) You must license the entire work, as a whole, under this + License to anyone who comes into possession of a copy. This + License will therefore apply, along with any applicable section 7 + additional terms, to the whole of the work, and all its parts, + regardless of how they are packaged. This License gives no + permission to license the work in any other way, but it does not + invalidate such permission if you have separately received it. + + d) If the work has interactive user interfaces, each must display + Appropriate Legal Notices; however, if the Program has interactive + interfaces that do not display Appropriate Legal Notices, your + work need not make them do so. 
+ + A compilation of a covered work with other separate and independent +works, which are not by their nature extensions of the covered work, +and which are not combined with it such as to form a larger program, +in or on a volume of a storage or distribution medium, is called an +"aggregate" if the compilation and its resulting copyright are not +used to limit the access or legal rights of the compilation's users +beyond what the individual works permit. Inclusion of a covered work +in an aggregate does not cause this License to apply to the other +parts of the aggregate. + + 6. Conveying Non-Source Forms. + + You may convey a covered work in object code form under the terms +of sections 4 and 5, provided that you also convey the +machine-readable Corresponding Source under the terms of this License, +in one of these ways: + + a) Convey the object code in, or embodied in, a physical product + (including a physical distribution medium), accompanied by the + Corresponding Source fixed on a durable physical medium + customarily used for software interchange. + + b) Convey the object code in, or embodied in, a physical product + (including a physical distribution medium), accompanied by a + written offer, valid for at least three years and valid for as + long as you offer spare parts or customer support for that product + model, to give anyone who possesses the object code either (1) a + copy of the Corresponding Source for all the software in the + product that is covered by this License, on a durable physical + medium customarily used for software interchange, for a price no + more than your reasonable cost of physically performing this + conveying of source, or (2) access to copy the + Corresponding Source from a network server at no charge. + + c) Convey individual copies of the object code with a copy of the + written offer to provide the Corresponding Source. This + alternative is allowed only occasionally and noncommercially, and + only if you received the object code with such an offer, in accord + with subsection 6b. + + d) Convey the object code by offering access from a designated + place (gratis or for a charge), and offer equivalent access to the + Corresponding Source in the same way through the same place at no + further charge. You need not require recipients to copy the + Corresponding Source along with the object code. If the place to + copy the object code is a network server, the Corresponding Source + may be on a different server (operated by you or a third party) + that supports equivalent copying facilities, provided you maintain + clear directions next to the object code saying where to find the + Corresponding Source. Regardless of what server hosts the + Corresponding Source, you remain obligated to ensure that it is + available for as long as needed to satisfy these requirements. + + e) Convey the object code using peer-to-peer transmission, provided + you inform other peers where the object code and Corresponding + Source of the work are being offered to the general public at no + charge under subsection 6d. + + A separable portion of the object code, whose source code is excluded +from the Corresponding Source as a System Library, need not be +included in conveying the object code work. + + A "User Product" is either (1) a "consumer product", which means any +tangible personal property which is normally used for personal, family, +or household purposes, or (2) anything designed or sold for incorporation +into a dwelling. 
In determining whether a product is a consumer product, +doubtful cases shall be resolved in favor of coverage. For a particular +product received by a particular user, "normally used" refers to a +typical or common use of that class of product, regardless of the status +of the particular user or of the way in which the particular user +actually uses, or expects or is expected to use, the product. A product +is a consumer product regardless of whether the product has substantial +commercial, industrial or non-consumer uses, unless such uses represent +the only significant mode of use of the product. + + "Installation Information" for a User Product means any methods, +procedures, authorization keys, or other information required to install +and execute modified versions of a covered work in that User Product from +a modified version of its Corresponding Source. The information must +suffice to ensure that the continued functioning of the modified object +code is in no case prevented or interfered with solely because +modification has been made. + + If you convey an object code work under this section in, or with, or +specifically for use in, a User Product, and the conveying occurs as +part of a transaction in which the right of possession and use of the +User Product is transferred to the recipient in perpetuity or for a +fixed term (regardless of how the transaction is characterized), the +Corresponding Source conveyed under this section must be accompanied +by the Installation Information. But this requirement does not apply +if neither you nor any third party retains the ability to install +modified object code on the User Product (for example, the work has +been installed in ROM). + + The requirement to provide Installation Information does not include a +requirement to continue to provide support service, warranty, or updates +for a work that has been modified or installed by the recipient, or for +the User Product in which it has been modified or installed. Access to a +network may be denied when the modification itself materially and +adversely affects the operation of the network or violates the rules and +protocols for communication across the network. + + Corresponding Source conveyed, and Installation Information provided, +in accord with this section must be in a format that is publicly +documented (and with an implementation available to the public in +source code form), and must require no special password or key for +unpacking, reading or copying. + + 7. Additional Terms. + + "Additional permissions" are terms that supplement the terms of this +License by making exceptions from one or more of its conditions. +Additional permissions that are applicable to the entire Program shall +be treated as though they were included in this License, to the extent +that they are valid under applicable law. If additional permissions +apply only to part of the Program, that part may be used separately +under those permissions, but the entire Program remains governed by +this License without regard to the additional permissions. + + When you convey a copy of a covered work, you may at your option +remove any additional permissions from that copy, or from any part of +it. (Additional permissions may be written to require their own +removal in certain cases when you modify the work.) You may place +additional permissions on material, added by you to a covered work, +for which you have or can give appropriate copyright permission. 
+ + Notwithstanding any other provision of this License, for material you +add to a covered work, you may (if authorized by the copyright holders of +that material) supplement the terms of this License with terms: + + a) Disclaiming warranty or limiting liability differently from the + terms of sections 15 and 16 of this License; or + + b) Requiring preservation of specified reasonable legal notices or + author attributions in that material or in the Appropriate Legal + Notices displayed by works containing it; or + + c) Prohibiting misrepresentation of the origin of that material, or + requiring that modified versions of such material be marked in + reasonable ways as different from the original version; or + + d) Limiting the use for publicity purposes of names of licensors or + authors of the material; or + + e) Declining to grant rights under trademark law for use of some + trade names, trademarks, or service marks; or + + f) Requiring indemnification of licensors and authors of that + material by anyone who conveys the material (or modified versions of + it) with contractual assumptions of liability to the recipient, for + any liability that these contractual assumptions directly impose on + those licensors and authors. + + All other non-permissive additional terms are considered "further +restrictions" within the meaning of section 10. If the Program as you +received it, or any part of it, contains a notice stating that it is +governed by this License along with a term that is a further +restriction, you may remove that term. If a license document contains +a further restriction but permits relicensing or conveying under this +License, you may add to a covered work material governed by the terms +of that license document, provided that the further restriction does +not survive such relicensing or conveying. + + If you add terms to a covered work in accord with this section, you +must place, in the relevant source files, a statement of the +additional terms that apply to those files, or a notice indicating +where to find the applicable terms. + + Additional terms, permissive or non-permissive, may be stated in the +form of a separately written license, or stated as exceptions; +the above requirements apply either way. + + 8. Termination. + + You may not propagate or modify a covered work except as expressly +provided under this License. Any attempt otherwise to propagate or +modify it is void, and will automatically terminate your rights under +this License (including any patent licenses granted under the third +paragraph of section 11). + + However, if you cease all violation of this License, then your +license from a particular copyright holder is reinstated (a) +provisionally, unless and until the copyright holder explicitly and +finally terminates your license, and (b) permanently, if the copyright +holder fails to notify you of the violation by some reasonable means +prior to 60 days after the cessation. + + Moreover, your license from a particular copyright holder is +reinstated permanently if the copyright holder notifies you of the +violation by some reasonable means, this is the first time you have +received notice of violation of this License (for any work) from that +copyright holder, and you cure the violation prior to 30 days after +your receipt of the notice. + + Termination of your rights under this section does not terminate the +licenses of parties who have received copies or rights from you under +this License. 
If your rights have been terminated and not permanently +reinstated, you do not qualify to receive new licenses for the same +material under section 10. + + 9. Acceptance Not Required for Having Copies. + + You are not required to accept this License in order to receive or +run a copy of the Program. Ancillary propagation of a covered work +occurring solely as a consequence of using peer-to-peer transmission +to receive a copy likewise does not require acceptance. However, +nothing other than this License grants you permission to propagate or +modify any covered work. These actions infringe copyright if you do +not accept this License. Therefore, by modifying or propagating a +covered work, you indicate your acceptance of this License to do so. + + 10. Automatic Licensing of Downstream Recipients. + + Each time you convey a covered work, the recipient automatically +receives a license from the original licensors, to run, modify and +propagate that work, subject to this License. You are not responsible +for enforcing compliance by third parties with this License. + + An "entity transaction" is a transaction transferring control of an +organization, or substantially all assets of one, or subdividing an +organization, or merging organizations. If propagation of a covered +work results from an entity transaction, each party to that +transaction who receives a copy of the work also receives whatever +licenses to the work the party's predecessor in interest had or could +give under the previous paragraph, plus a right to possession of the +Corresponding Source of the work from the predecessor in interest, if +the predecessor has it or can get it with reasonable efforts. + + You may not impose any further restrictions on the exercise of the +rights granted or affirmed under this License. For example, you may +not impose a license fee, royalty, or other charge for exercise of +rights granted under this License, and you may not initiate litigation +(including a cross-claim or counterclaim in a lawsuit) alleging that +any patent claim is infringed by making, using, selling, offering for +sale, or importing the Program or any portion of it. + + 11. Patents. + + A "contributor" is a copyright holder who authorizes use under this +License of the Program or a work on which the Program is based. The +work thus licensed is called the contributor's "contributor version". + + A contributor's "essential patent claims" are all patent claims +owned or controlled by the contributor, whether already acquired or +hereafter acquired, that would be infringed by some manner, permitted +by this License, of making, using, or selling its contributor version, +but do not include claims that would be infringed only as a +consequence of further modification of the contributor version. For +purposes of this definition, "control" includes the right to grant +patent sublicenses in a manner consistent with the requirements of +this License. + + Each contributor grants you a non-exclusive, worldwide, royalty-free +patent license under the contributor's essential patent claims, to +make, use, sell, offer for sale, import and otherwise run, modify and +propagate the contents of its contributor version. + + In the following three paragraphs, a "patent license" is any express +agreement or commitment, however denominated, not to enforce a patent +(such as an express permission to practice a patent or covenant not to +sue for patent infringement). 
To "grant" such a patent license to a +party means to make such an agreement or commitment not to enforce a +patent against the party. + + If you convey a covered work, knowingly relying on a patent license, +and the Corresponding Source of the work is not available for anyone +to copy, free of charge and under the terms of this License, through a +publicly available network server or other readily accessible means, +then you must either (1) cause the Corresponding Source to be so +available, or (2) arrange to deprive yourself of the benefit of the +patent license for this particular work, or (3) arrange, in a manner +consistent with the requirements of this License, to extend the patent +license to downstream recipients. "Knowingly relying" means you have +actual knowledge that, but for the patent license, your conveying the +covered work in a country, or your recipient's use of the covered work +in a country, would infringe one or more identifiable patents in that +country that you have reason to believe are valid. + + If, pursuant to or in connection with a single transaction or +arrangement, you convey, or propagate by procuring conveyance of, a +covered work, and grant a patent license to some of the parties +receiving the covered work authorizing them to use, propagate, modify +or convey a specific copy of the covered work, then the patent license +you grant is automatically extended to all recipients of the covered +work and works based on it. + + A patent license is "discriminatory" if it does not include within +the scope of its coverage, prohibits the exercise of, or is +conditioned on the non-exercise of one or more of the rights that are +specifically granted under this License. You may not convey a covered +work if you are a party to an arrangement with a third party that is +in the business of distributing software, under which you make payment +to the third party based on the extent of your activity of conveying +the work, and under which the third party grants, to any of the +parties who would receive the covered work from you, a discriminatory +patent license (a) in connection with copies of the covered work +conveyed by you (or copies made from those copies), or (b) primarily +for and in connection with specific products or compilations that +contain the covered work, unless you entered into that arrangement, +or that patent license was granted, prior to 28 March 2007. + + Nothing in this License shall be construed as excluding or limiting +any implied license or other defenses to infringement that may +otherwise be available to you under applicable patent law. + + 12. No Surrender of Others' Freedom. + + If conditions are imposed on you (whether by court order, agreement or +otherwise) that contradict the conditions of this License, they do not +excuse you from the conditions of this License. If you cannot convey a +covered work so as to satisfy simultaneously your obligations under this +License and any other pertinent obligations, then as a consequence you may +not convey it at all. For example, if you agree to terms that obligate you +to collect a royalty for further conveying from those to whom you convey +the Program, the only way you could satisfy both those terms and this +License would be to refrain entirely from conveying the Program. + + 13. Use with the GNU Affero General Public License. 
+ + Notwithstanding any other provision of this License, you have +permission to link or combine any covered work with a work licensed +under version 3 of the GNU Affero General Public License into a single +combined work, and to convey the resulting work. The terms of this +License will continue to apply to the part which is the covered work, +but the special requirements of the GNU Affero General Public License, +section 13, concerning interaction through a network will apply to the +combination as such. + + 14. Revised Versions of this License. + + The Free Software Foundation may publish revised and/or new versions of +the GNU General Public License from time to time. Such new versions will +be similar in spirit to the present version, but may differ in detail to +address new problems or concerns. + + Each version is given a distinguishing version number. If the +Program specifies that a certain numbered version of the GNU General +Public License "or any later version" applies to it, you have the +option of following the terms and conditions either of that numbered +version or of any later version published by the Free Software +Foundation. If the Program does not specify a version number of the +GNU General Public License, you may choose any version ever published +by the Free Software Foundation. + + If the Program specifies that a proxy can decide which future +versions of the GNU General Public License can be used, that proxy's +public statement of acceptance of a version permanently authorizes you +to choose that version for the Program. + + Later license versions may give you additional or different +permissions. However, no additional obligations are imposed on any +author or copyright holder as a result of your choosing to follow a +later version. + + 15. Disclaimer of Warranty. + + THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY +APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT +HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY +OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, +THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR +PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM +IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF +ALL NECESSARY SERVICING, REPAIR OR CORRECTION. + + 16. Limitation of Liability. + + IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING +WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS +THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY +GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE +USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF +DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD +PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), +EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF +SUCH DAMAGES. + + 17. Interpretation of Sections 15 and 16. + + If the disclaimer of warranty and limitation of liability provided +above cannot be given local legal effect according to their terms, +reviewing courts shall apply local law that most closely approximates +an absolute waiver of all civil liability in connection with the +Program, unless a warranty or assumption of liability accompanies a +copy of the Program in return for a fee. 
+ + END OF TERMS AND CONDITIONS + + How to Apply These Terms to Your New Programs + + If you develop a new program, and you want it to be of the greatest +possible use to the public, the best way to achieve this is to make it +free software which everyone can redistribute and change under these terms. + + To do so, attach the following notices to the program. It is safest +to attach them to the start of each source file to most effectively +state the exclusion of warranty; and each file should have at least +the "copyright" line and a pointer to where the full notice is found. + + + Copyright (C) + + This program is free software: you can redistribute it and/or modify + it under the terms of the GNU General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + This program is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU General Public License for more details. + + You should have received a copy of the GNU General Public License + along with this program. If not, see . + +Also add information on how to contact you by electronic and paper mail. + + If the program does terminal interaction, make it output a short +notice like this when it starts in an interactive mode: + + Copyright (C) + This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'. + This is free software, and you are welcome to redistribute it + under certain conditions; type `show c' for details. + +The hypothetical commands `show w' and `show c' should show the appropriate +parts of the General Public License. Of course, your program's commands +might be different; for a GUI interface, you would use an "about box". + + You should also get your employer (if you work as a programmer) or school, +if any, to sign a "copyright disclaimer" for the program, if necessary. +For more information on this, and how to apply and follow the GNU GPL, see +. + + The GNU General Public License does not permit incorporating your program +into proprietary programs. If your program is a subroutine library, you +may consider it more useful to permit linking proprietary applications with +the library. If this is what you want to do, use the GNU Lesser General +Public License instead of this License. But first, please read +. diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/README.md b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/README.md new file mode 100644 index 00000000..6202709b --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/README.md @@ -0,0 +1,44 @@ +[![Python Package using Conda](https://github.com/mydroidandi/commbase/actions/workflows/python-package-conda.yml/badge.svg)](https://github.com/mydroidandi/commbase/actions/workflows/python-package-conda.yml) +[![Python Version](https://img.shields.io/badge/Python-3.10%20%7C%203.11%20%7C%203.12-blue)](https://img.shields.io/badge/python-3.10%20%7C%203.11%20%7C%203.12-blue) + +# commbase-genai-llm-openai-gpt-p + +commbase-genai-llm-openai-gpt-p + +A terminal version of ChatGPT designed to facilitate seamless interaction with the powerful Open AI's GPT language model. This app empowers users to engage in natural language conversations with the model, unlocking a wide range of possibilities for creative, informative, and dynamic interactions. 
+ +ChatGPT is a chatbot built using the GPT (Generative Pre-trained Transformer) architecture developed by OpenAI. To use ChatGPT, simply visit the website at . This is the online platform where you can access and interact with the ChatGPT service. + +## Unleash the Power of Creativity with commbase-genai-llm-openai-gpt-p + +commbase-genai-llm-openai-gpt-p harnesses the capabilities of OpenAI's GPT models. Explore the world of storytelling, code exploration, and language manipulation through intuitive interaction with GPT's advanced features. + +**Key strengths:** + +* **Creative text formats:** Generate poems, code, scripts, musical pieces, emails, letters, and more, finding inspiration, exploring different communication styles, or aiding in content creation. +* **Storytelling and world-building:** Craft captivating narratives and build intricate fictional worlds, empowering writers and game designers to bring their imaginations to life. +* **Humor and wit:** Engage in stimulating and enjoyable interactions as GPT incorporates humor and wit into its responses, fostering a sense of connection and amusement. +* **Multilingual capabilities:** Break down language barriers and access information across cultures with GPT's ability to understand and respond in multiple languages. +* **Code generation:** Get a head start on coding projects or explore different programming languages with GPT's basic code generation capabilities. +* **Accessibility and user-friendliness:** Interact with GPT easily through various APIs and interfaces, making the technology readily available to a diverse audience. +* **Constant evolution and improvement:** Stay at the forefront of language processing advancements as OpenAI continuously refines and improves GPT models. + +**With commbase-genai-llm-openai-gpt-p, unlock the potential of GPT and embark on a journey of creative exploration, communication, and discovery.** + +## Examples + +Detailed information about examples can be found in the corresponding [`examples`](./examples) directory. + +## Contributors + +Thanks to the following people who have contributed to this project: + +* [@estebanways](https://github.com/estebanways) + +## Contact + +If you want to contact us you can reach us at . + +## License + +This project uses the following license: [`License`](./COPYING). diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/commbase-genai-slm-ollama-phi3-mini-nomemory.jpg b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/commbase-genai-slm-ollama-phi3-mini-nomemory.jpg new file mode 100644 index 00000000..7ab73dd5 Binary files /dev/null and b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/commbase-genai-slm-ollama-phi3-mini-nomemory.jpg differ diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/commbase-genai-slm-ollama-phi3-mini-nomemory.py b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/commbase-genai-slm-ollama-phi3-mini-nomemory.py new file mode 100644 index 00000000..cc770bdd --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/commbase-genai-slm-ollama-phi3-mini-nomemory.py @@ -0,0 +1,473 @@ +#!/usr/bin/env python +################################################################################ +# commbase-slm-ollama-phi3-mini-nomemory # +# # +# A simple generative AI assistant using the Phi3 Small Language Model (SLM). # +# # +# Change History # +# 06/18/2024 Esteban Herrera Original code. # +# Add new history entries as needed. 
# +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. # +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# commbase-genai-slm-ollama-phi3-mini-nomemory.py +# Interacts with the Commbase platform by generating AI-driven responses using +# the Ollama API with the 'commbase-phi3-mini' model. + +# Imports +from datetime import datetime +import ollama +import os +import subprocess +import sys +from file_paths import ( + get_assistant_discourse_from_language_model_file, + get_chat_log_file, + get_commbase_hardware_command_random_negative_file, + get_commbase_hardware_command_random_neutral_file, + get_commbase_hardware_command_random_positive_file, + get_commbase_hardware_device_0, + get_run_voice_recorder_in_pane_path, + get_stt_engine_path +) +from functions import ( + get_assistant_response_sentiment_analysis_on, + get_audible_assistant_logging_on, + get_chat_participant_names, + get_commbase_hardware_notifications_on, + get_commbase_recorder_transmitter_quit_char, + get_commbase_stt_whisper_reactive_p_auto_open_recorder_after_command, + get_log_severity_level_1 +) + + +def notify_hardware_about_negative_sentiment_action(): + """ + Notifies the hardware about the start of the negative sentiment action. + + This function sends a signal to the hardware to indicate that the negative + sentiment action has started. It constructs a command to be executed, which + involves reading a command from a file and redirecting its output to a + hardware device. If the hardware is connected and accessible, the command + is executed successfully. + + Returns: + None + + Raises: + subprocess.CalledProcessError: If the subprocess command fails. + Exception: If any other error occurs during the execution. + + Note: + Ensure that the paths retrieved by + `get_commbase_hardware_command_processing_start_file()` and + `get_commbase_hardware_device_0()` functions are correctly configured. 
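+
+    Example (illustrative): with COMMBASE_HARDWARE_DEVICE_0="/dev/ttyACM0" as
+    in config/commbase.conf, the intended effect is roughly equivalent to the
+    shell command `cat <random negative .dat file> > /dev/ttyACM0`.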
+ """ + # Define the paths + hardware_command = get_commbase_hardware_command_random_negative_file() + hardware_device = get_commbase_hardware_device_0() + + # If any hardware is connected to the device + if os.path.exists(hardware_device): + + try: + # Construct the command and its arguments + command = ['cat', hardware_command] + redirect_to_device = ['>', hardware_device] + + # Run the command + subprocess.run(command + redirect_to_device, check=True) + for item in command: + print(item, end=' ') + + except subprocess.CalledProcessError as e: + print(f"Error executing subprocess command: {e}") + except Exception as e: + print(f"Other error occurred: {e}") + # discourse_data_exchange_client_error() + else: + print(f"The device file {hardware_device} does not exist, or it is not connected.") + + +def notify_hardware_about_neutral_sentiment_action(): + """ + Notifies the hardware about the start of the neutral sentiment action. + + This function sends a signal to the hardware to indicate that the neutral + sentiment action has started. It constructs a command to be executed, which + involves reading a command from a file and redirecting its output to a + hardware device. If the hardware is connected and accessible, the command + is executed successfully. + + Returns: + None + + Raises: + subprocess.CalledProcessError: If the subprocess command fails. + Exception: If any other error occurs during the execution. + + Note: + Ensure that the paths retrieved by + `get_commbase_hardware_command_processing_start_file()` and + `get_commbase_hardware_device_0()` functions are correctly configured. + """ + # Define the paths + hardware_command = get_commbase_hardware_command_random_neutral_file() + hardware_device = get_commbase_hardware_device_0() + + # If any hardware is connected to the device + if os.path.exists(hardware_device): + + try: + # Construct the command and its arguments + command = ['cat', hardware_command] + redirect_to_device = ['>', hardware_device] + + # Run the command + subprocess.run(command + redirect_to_device, check=True) + for item in command: + print(item, end=' ') + + except subprocess.CalledProcessError as e: + print(f"Error executing subprocess command: {e}") + except Exception as e: + print(f"Other error occurred: {e}") + # discourse_data_exchange_client_error() + else: + print(f"The device file {hardware_device} does not exist, or it is not connected.") + + +def notify_hardware_about_positive_sentiment_action(): + """ + Notifies the hardware about the start of the positive sentiment action. + + This function sends a signal to the hardware to indicate that the positive + sentiment action has started. It constructs a command to be executed, which + involves reading a command from a file and redirecting its output to a + hardware device. If the hardware is connected and accessible, the command + is executed successfully. + + Returns: + None + + Raises: + subprocess.CalledProcessError: If the subprocess command fails. + Exception: If any other error occurs during the execution. + + Note: + Ensure that the paths retrieved by + `get_commbase_hardware_command_processing_start_file()` and + `get_commbase_hardware_device_0()` functions are correctly configured. 
+ """ + # Define the paths + hardware_command = get_commbase_hardware_command_random_positive_file() + hardware_device = get_commbase_hardware_device_0() + + # If any hardware is connected to the device + if os.path.exists(hardware_device): + + try: + # Construct the command and its arguments + command = ['cat', hardware_command] + redirect_to_device = ['>', hardware_device] + + # Run the command + subprocess.run(command + redirect_to_device, check=True) + for item in command: + print(item, end=' ') + + except subprocess.CalledProcessError as e: + print(f"Error executing subprocess command: {e}") + except Exception as e: + print(f"Other error occurred: {e}") + # discourse_data_exchange_client_error() + else: + print(f"The device file {hardware_device} does not exist, or it is not connected.") + + +def check_arguments(): + """ + Checks if the script received the correct number of arguments. + + This function ensures that exactly one argument (the prompt) is passed to + the script. + If the number of arguments is incorrect, it prints the usage instructions + and exits the program. + + Raises: + SystemExit: If the number of arguments is not equal to 2. + """ + if len(sys.argv) != 2: + print("Usage: python commbase-genai-slm-ollama-phi3-mini-nomemory.py \"Your prompt here\"") + sys.exit(1) + + +def get_prompt(): + """ + Retrieves the user-provided prompt from the command line arguments. + + This function extracts the prompt passed as the first argument to the + script. + It assumes that the arguments have already been checked for correctness + by another function. + + Returns: + str: The user-provided prompt. + """ + return sys.argv[1] + + +def generate_response(prompt): + """ + Generates a response using the provided prompt and the commbase-phi3-mini + model. + + This function sends the user-provided prompt to the Ollama API, using the + 'commbase-phi3-mini' model to generate a response. The function then + extracts and returns the text response from the API's response dictionary. + + Args: + prompt (str): The user-provided prompt to generate a response for. + + Returns: + str: The generated text response from the model. + """ + response = ollama.generate(model='commbase-phi3-mini', prompt=prompt) + return response['response'] + + +def print_response(response): + """ + Logs a response from an assistant or system to a specified log file. + + Args: + - response (str): The response message to log. + + Returns: + None + + Description: + This function logs the given response message along with current timestamp, + severity level, and system identifier to a designated log file. The + severity level is obtained from `get_log_severity_level_1()` function. The + logged format is: "[YYYY-MM-DD HH:MM:SS] phi-mini-nomemory: + : \n". + The logged message is appended to the file specified by `temp_file_path`. + + Notes: + - Ensure `get_log_severity_level_1()` returns an appropriate severity level + string. + - `temp_file_path` should point to the file where the logs are to be + appended. 
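+
+    Example of a resulting log line (illustrative values; the severity string
+    and assistant name come from the configuration file):
+        [2024-06-18 10:15:00] phi3-mini-nomemory: INFO: ASSISTANT: Hello, how can I help?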
+ """ + current_timestamp = datetime.now().strftime('%Y-%m-%d %H:%M:%S') + + # Set the value returned by get_log_severity_level_1() + log_severity_level_1 = get_log_severity_level_1() + + severity_level = log_severity_level_1 + + # Set the values returned by get_chat_participant_names() + end_user_name, assistant_name = get_chat_participant_names() + + # A temporary file path + temp_file_path = get_chat_log_file() + + assistant_text = "[" + current_timestamp + "]" + " phi3-mini-nomemory: " + severity_level + ": " + assistant_name + response + "\n" + with open(temp_file_path, 'a') as temp_file: + temp_file.write(assistant_text) + + +def response_sentiment_action(response): + """ + Analyzes the sentiment of a given response and triggers appropriate + hardware actions based on the sentiment. + + Parameters: + response (str): The input text whose sentiment needs to be analyzed. + + The function uses a sentiment analysis pipeline from the transformers + library to classify the sentiment of the input text as either 'POSITIVE', + 'NEGATIVE', or 'NEUTRAL'. Depending on the classification, it triggers one + of three hardware notification functions: + - notify_hardware_about_positive_sentiment_action() for positive sentiments + - notify_hardware_about_negative_sentiment_action() for negative sentiments + - notify_hardware_about_neutral_sentiment_action() for neutral sentiments + + Returns: + None + """ + from transformers import pipeline + + classifier = pipeline("sentiment-analysis") + + json_response = classifier(response) # [{'label': 'POSITIVE', 'score': 0.9998795986175537}] + + # Extract values + label = json_response[0]['label'] + + if label == "POSITIVE": + notify_hardware_about_positive_sentiment_action() + elif label == "NEGATIVE": + notify_hardware_about_negative_sentiment_action() + else: + notify_hardware_about_neutral_sentiment_action() + + +def text_to_speech(window_number, pane_number, time): + """ + Sends a command to tmux to open a language model discourse in a specified + tmux window and pane, allowing text-to-speech synthesis. + + Args: + - window_number (int): The number of the tmux window to select. + - pane_number (int): The number of the tmux pane to select. + - time (int or float): Time in seconds to wait after sending the command. + + Returns: + None + + Notes: + - Requires tmux to be installed and accessible from the command line. + - Uses subprocess module to execute tmux commands. + + Example: + >>> text_to_speech(1, 0, 2) + This will select window 1, pane 0 in tmux, send a clear and bash command + to run the language model discourse, and wait for 2 seconds before + returning. 
+ """ + # Set the value returned by get_stt_engine_path() + stt_engine_path = get_stt_engine_path() + + # Set the value returned by get_commbase_recorder_transmitter_quit_char() + quit_char = get_commbase_recorder_transmitter_quit_char() + + # Set the value returned by get_assistant_discourse_from_language_model_file() + assistant_discourse_from_language_model = get_assistant_discourse_from_language_model_file() + + # Set the value returned by get_commbase_stt_whisper_reactive_p_auto_open_recorder_after_command() + auto_open_recorder_after_command = get_commbase_stt_whisper_reactive_p_auto_open_recorder_after_command() + + # Set the value returned by get_run_voice_recorder_in_pane() + run_voice_recorder_in_pane_path = get_run_voice_recorder_in_pane_path() + + # run_voice_recorder_in_pane_routine_pane = 7 + run_voice_recorder_in_pane_routine_pane = pane_number + + # If the STT engine is Whisper reactive, attempt to close the + # recorder-transmitter app. + if stt_engine_path == "$COMMBASE_APP_DIR/bundles/commbase-stt-whisper-reactive-p/commbase_stt_whisper_reactive_p.py": + # Construct the tmux command + command = ( + f"tmux select-window -t {window_number} && " + f"tmux select-pane -t {pane_number} && " + f"tmux send-keys \'{quit_char}; clear\' C-m && " + f"sleep {time}" + ) + + # Execute the command + subprocess.run(command, shell=True, check=True) # Do not manage errors here or the script will fail + + # Make the assistant speak out loud the language model response + # Construct the tmux command + command = ( + f"tmux select-window -t {window_number} && " + f"tmux select-pane -t {pane_number} && " + f"tmux send-keys \'clear; bash {assistant_discourse_from_language_model}\' C-m && " + f"sleep {time}" + ) + + # Execute the command + subprocess.run(command, shell=True, check=True) # Do not manage errors here or the script will fail + + # If the Whisper reactive STT Engine is set up to reopen the recorder after + # any command, just do it. + if auto_open_recorder_after_command == "True": + # If the STT engine is Whisper reactive, reopen the + # recorder-transmitter app. + if stt_engine_path == "$COMMBASE_APP_DIR/bundles/commbase-stt-whisper-reactive-p/commbase_stt_whisper_reactive_p.py": + # Construct the tmux command + command = ( + f"tmux select-window -t {window_number} && " + f"tmux select-pane -t {pane_number} && " + f"tmux send-keys \'clear; bash {run_voice_recorder_in_pane_path} {run_voice_recorder_in_pane_routine_pane}\' C-m && " + f"sleep {time}" + ) + + # Execute the command + subprocess.run(command, shell=True, check=True) # Do not manage errors here or the script will fail + + +def main(): + """ + Executes the main functionality of the script: + - Checks command line arguments to ensure correct usage. + - Retrieves the user-provided prompt. + - Generates a response using the commbase-phi3-mini model via the Ollama + API. + - Logs the generated response to a designated log file with timestamp and + severity level. + - Initiates text-to-speech synthesis of the response in a specific tmux + window and pane. + + Args: + None + + Returns: + None + + Notes: + - Requires the Ollama API, tmux, and other dependencies as specified. + - Uses functions from 'file_paths' and 'functions' modules for file paths + and helper functions. + - Assumes correct configuration and availability of the tmux session. 
+ + Example: + >>> main() + This will execute the main functionality of the script, including response + generation, logging, and text-to-speech synthesis in the specified tmux + window and pane. + """ + # Set the value returned by et_assistant_response_sentiment_analysis_on() + assistant_response_sentiment_analysis_on = get_assistant_response_sentiment_analysis_on() + + # Set the value returned by get_audible_assistant_logging_on() + audible_assistant_logging_on = get_audible_assistant_logging_on() + + # Set the value returned by get_commbase_hardware_notifications_on() + commbase_hardware_notifications_on = get_commbase_hardware_notifications_on() + + check_arguments() + prompt = get_prompt() + response = generate_response(prompt) + print_response(response) + if commbase_hardware_notifications_on == "True": + if assistant_response_sentiment_analysis_on == "True": + response_sentiment_action(response) + if audible_assistant_logging_on == "True": + text_to_speech(1, 7, 0.1) + + +if __name__ == "__main__": + main() diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/config.py b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/config.py new file mode 100644 index 00000000..a5020a87 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/config.py @@ -0,0 +1,42 @@ +#!/usr/bin/env python +################################################################################ +# commbase-slm-ollama-phi3-mini-nomemory # +# # +# A simple generative AI assistant using the Phi3 Small Language Model (SLM). # +# # +# Change History # +# 06/18/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. # +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# config.py +# This files makes a variable available throughout the entire Python +# application, including all modules, submodules, and functions. 
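+#
+# Minimal usage sketch (illustrative): any module in the bundle can read the
+# configuration through this path, e.g.:
+#
+#   from config import CONFIG_FILE_PATH
+#   with open(CONFIG_FILE_PATH, "r") as f:
+#       for line in f:
+#           ...
+#
+# This requires the COMMBASE_APP_DIR environment variable to be exported
+# before the application starts.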
+ +# Requirements +import os + +# The path to the .env configuration file +CONFIG_FILE_DIR = os.environ["COMMBASE_APP_DIR"] +CONFIG_FILE_PATH = os.path.join(CONFIG_FILE_DIR, "config/commbase.conf") diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/examples/README.md b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/examples/README.md new file mode 100644 index 00000000..ef0f689c --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/examples/README.md @@ -0,0 +1,12 @@ +## Examples + +### Example 1: commbase-genai-slm-ollama-phi3-mini-nomemory implementation + +This example demonstrates the implementation of our software. You can find the code in the [Commbase](https://github.com/mydroidandi/commbase) repository. + +To run the example, follow these steps: +1. Clone the repository. +2. Install the necessary dependencies. +3. Execute the program. + +Make sure to check out the repository for more details and instructions. diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/file_paths.py b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/file_paths.py new file mode 100644 index 00000000..1c74b250 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/file_paths.py @@ -0,0 +1,278 @@ +#!/usr/bin/env python +################################################################################ +# commbase-slm-ollama-phi3-mini-nomemory # +# # +# A simple generative AI assistant using the Phi3 Small Language Model (SLM). # +# # +# Change History # +# 06/21/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. # +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# file_paths.py +# This file stores functions related to loading and managing file paths +# Requires os.path already imported + +# Requirements +from config import CONFIG_FILE_DIR, CONFIG_FILE_PATH + + +def get_assistant_discourse_from_language_model_file(): + """ + Retrieves the value of the ASSISTANT_DISCOURSE_FROM_LANGUAGE_MODEL_FILE + variable from the configuration file. + + Returns: + str or None: The value of the variable if found, or None if not found. 
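+
+    Example (with the value set in config/commbase.conf):
+        ASSISTANT_DISCOURSE_FROM_LANGUAGE_MODEL_FILE="bundles/libcommbase/libcommbase/routines/assistant_discourse_from_language_model.sh"
+        The function returns that path joined onto the COMMBASE_APP_DIR
+        directory.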
+ """ + # Initialize variable + model_discourse_file = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "ASSISTANT_DISCOURSE_FROM_LANGUAGE_MODEL_FILE": + # Remove the quotes from the value of the variable + model_discourse_file = CONFIG_FILE_DIR + "/" + value.strip()[1:-1] + + # Check if the variable was found + if model_discourse_file is not None: + return model_discourse_file + + # If the variable was not found, return None + return None + + +def get_chat_log_file(): + """ + Retrieves the value of the CHAT_LOG_FILE variable from the configuration + file. + + Returns: + str or None: The value of the variable if found, or None if not found. + """ + # Initialize variable + chat_log_file = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "CHAT_LOG_FILE": + # Remove the quotes from the value of the variable + chat_log_file = CONFIG_FILE_DIR + "/" + value.strip()[1:-1] + + # Check if the variable was found + if chat_log_file is not None: + return chat_log_file + + # If the variable was not found, return None + return None + + +def get_commbase_hardware_command_random_negative_file(): + """ + Retrieves the value of the COMMBASE_HARDWARE_COMMAND_RANDOM_NEGATIVE_FILE + variable from the configuration file. + + Returns: + str or None: The value of the variable if found, or None if not found. + """ + # Initialize variable + random_negative_file = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "COMMBASE_HARDWARE_COMMAND_RANDOM_NEGATIVE_FILE": + # Remove the quotes from the value of the variable + random_negative_file = CONFIG_FILE_DIR + value.strip()[1:-1] + + # Check if the variable was found + if random_negative_file is not None: + return random_negative_file + + # If the variable was not found, return None + return None + + +def get_commbase_hardware_command_random_neutral_file(): + """ + Retrieves the value of the COMMBASE_HARDWARE_COMMAND_RANDOM_NEUTRAL_FILE + variable from the configuration file. + + Returns: + str or None: The value of the variable if found, or None if not found. 
+ """ + # Initialize variable + random_neutral_file = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "COMMBASE_HARDWARE_COMMAND_RANDOM_NEUTRAL_FILE": + # Remove the quotes from the value of the variable + random_neutral_file = CONFIG_FILE_DIR + value.strip()[1:-1] + + # Check if the variable was found + if random_neutral_file is not None: + return random_neutral_file + + # If the variable was not found, return None + return None + + +def get_commbase_hardware_command_random_positive_file(): + """ + Retrieves the value of the COMMBASE_HARDWARE_COMMAND_RANDOM_POSITIVE_FILE + variable from the configuration file. + + Returns: + str or None: The value of the variable if found, or None if not found. + """ + # Initialize variable + random_positive_file = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "COMMBASE_HARDWARE_COMMAND_RANDOM_POSITIVE_FILE": + # Remove the quotes from the value of the variable + random_positive_file = CONFIG_FILE_DIR + value.strip()[1:-1] + + # Check if the variable was found + if random_positive_file is not None: + return random_positive_file + + # If the variable was not found, return None + return None + + +def get_commbase_hardware_device_0(): + """ + Retrieves the value of the COMMBASE_HARDWARE_DEVICE_0 variable from the + configuration file. + + Returns: + str or None: The value of the variable if found, or None if not found. + """ + # Initialize variable + hardware_device_0 = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "COMMBASE_HARDWARE_DEVICE_0": + # Remove the quotes from the value of the variable + hardware_device_0 = value.strip()[1:-1] + + # Check if the variable was found + if hardware_device_0 is not None: + return hardware_device_0 + + # If the variable was not found, return None + return None + + +def get_run_voice_recorder_in_pane_path(): + """ + Retrieves the value of the RUN_VOICE_RECORDER_IN_PANE_PATH variable from + the configuration file. + + Returns: + str or None: The value of the variable if found, or None if not found. + """ + # Initialize variable + run_recorder_path = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "RUN_VOICE_RECORDER_IN_PANE_PATH": + # Remove the quotes from the value of the variable + run_recorder_path = CONFIG_FILE_DIR + "/" + value.strip()[1:-1] + + # Check if the variable was found + if run_recorder_path is not None: + return run_recorder_path + + # If the variable was not found, return None + return None + + +def get_stt_engine_path(): + """ + Retrieves the value of the STT_ENGINE_PATH variable from the configuration + file. 
+ + Returns: + str or None: The value of the variable if found, or None if not found. + """ + # Initialize variable + stt_engine_path = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "STT_ENGINE_PATH": + # Remove the quotes from the value of the variable + stt_engine_path = value.strip()[1:-1] + + # Check if the variable was found + if stt_engine_path is not None: + return stt_engine_path + + # If the variable was not found, return None + return None diff --git a/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/functions.py b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/functions.py new file mode 100644 index 00000000..807f8675 --- /dev/null +++ b/bundles/commbase-genai-slm-ollama-phi3-mini-nomemory/functions.py @@ -0,0 +1,274 @@ +#!/usr/bin/env python +################################################################################ +# commbase-slm-ollama-phi3-mini-nohistory # +# # +# A simple generative AI assistant using the Phi3 Small Language Model (SLM). # +# # +# Change History # +# 06/19/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. # +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# functions.py +# This file contains a collection of utility functions for various tasks. It +# provides a set of commonly used functions related to data manipulation, file +# handling, and mathematical operations. + +# Imports +from config import CONFIG_FILE_PATH + + +def get_assistant_response_sentiment_analysis_on(): + """ + Reads the 'ASSISTANT_RESPONSE_SENTIMENT_ANALYSIS_ON' variable from the + environment configuration file. Returns the string value of the variables + if found, or None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. 
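+
+    Example: with ASSISTANT_RESPONSE_SENTIMENT_ANALYSIS_ON="True" in
+    config/commbase.conf, this returns the string "True" (not a boolean),
+    which is why callers compare it against the literal string "True".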
+ """ + # Initialize variable + assistant_response_sentiment = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "ASSISTANT_RESPONSE_SENTIMENT_ANALYSIS_ON": + # Remove the quotes from the value of the variable + assistant_response_sentiment = value.strip()[1:-1] + + # Check if the variable was found + if assistant_response_sentiment is not None: + return assistant_response_sentiment + + # If the variable was not found, return None + return None + + +def get_audible_assistant_logging_on(): + """ + Reads the 'AUDIBLE_ASSISTANT_LOGGING_ON' variable from the environment + configuration file. Returns the string value of the variables if found, or + None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. + """ + # Initialize variable + audible_assistant_logging_on = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "AUDIBLE_ASSISTANT_LOGGING_ON": + # Remove the quotes from the value of the variable + audible_assistant_logging_on = value.strip()[1:-1] + + # Check if the variable was found + if audible_assistant_logging_on is not None: + return audible_assistant_logging_on + + # If the variable was not found, return None + return None + + +def get_chat_participant_names(): + """ + Gets the chat participant names from the config file. + + Reads the 'ASSISTANT_NAME_IN_CHAT_PANE' and 'END_USER_NAME_IN_CHAT_PANE' + variables from the environment configuration file. Returns a tuple + containing the string values of the variables if found, or None if any of + the variables are not present. + + Returns: + tuple or None: A tuple containing the assistant, system, and end user + names in the chat pane, or None, if any of the variables are not found. + """ + # Initialize variables for the chat names + assistant_name = None + end_user_name = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "END_USER_NAME_IN_CHAT_PANE": + # Remove the quotes from the value of the variable + end_user_name = value.strip()[1:-1] + + elif variable_name == "ASSISTANT_NAME_IN_CHAT_PANE": + # Remove the quotes from the value of the variable + assistant_name = value.strip()[1:-1] + + # Check if all two variables were found + if ( + assistant_name is not None + and end_user_name is not None + ): + return end_user_name, assistant_name + + # If any of the variables are not found, return None + return None + + +def get_commbase_hardware_notifications_on(): + """ + Reads the 'COMMBASE_HARDWARE_NOTIFICATIONS_ON' variable from the + environment configuration file. Returns the string value of the variables + if found, or None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. 
+ """ + # Initialize variable + commbase_hardware_notifications = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "COMMBASE_HARDWARE_NOTIFICATIONS_ON": + # Remove the quotes from the value of the variable + commbase_hardware_notifications = value.strip()[1:-1] + + # Check if the variable was found + if commbase_hardware_notifications is not None: + return commbase_hardware_notifications + + # If the variable was not found, return None + return None + + +def get_commbase_recorder_transmitter_quit_char(): + """ + Reads the 'COMMBASE_RECORDER_TRANSMITTER_QUIT_CHAR' variable from the + environment configuration file. Returns the string value of the variables + if found, or None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. + """ + # Initialize variable + quit_char = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "COMMBASE_RECORDER_TRANSMITTER_QUIT_CHAR": + # Remove the quotes from the value of the variable + quit_char = value.strip()[1:-1] + + # Check if the variable was found + if quit_char is not None: + return quit_char + + # If the variable was not found, return None + return None + + +def get_commbase_stt_whisper_reactive_p_auto_open_recorder_after_command(): + """ + Reads the 'COMMBASE_STT_WHISPER_REACTIVE_P_AUTO_OPEN_RECORDER_AFTER_COMMAND' + variable from the environment configuration file. Returns the string value + of the variables if found, or None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. + """ + # Initialize variable + auto_open_recoder = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "COMMBASE_STT_WHISPER_REACTIVE_P_AUTO_OPEN_RECORDER_AFTER_COMMAND": + # Remove the quotes from the value of the variable + auto_open_recoder = value.strip()[1:-1] + + # Check if the variable was found + if auto_open_recoder is not None: + return auto_open_recoder + + # If the variable was not found, return None + return None + + +def get_log_severity_level_1(): + """ + Reads the 'LOG_SEVERITY_LEVEL_1' variable from the environment + configuration file. Returns the string value of the variables if found, or + None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. 
+ """ + # Initialize variable + log_severity_level_1 = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "LOG_SEVERITY_LEVEL_1": + # Remove the quotes from the value of the variable + log_severity_level_1 = value.strip()[1:-1] + + # Check if the variable was found + if log_severity_level_1 is not None: + return log_severity_level_1 + + # If the variable was not found, return None + return None diff --git a/bundles/commbase-stt-whisper-proactive-p/commbase_stt_whisper_proactive_p.py b/bundles/commbase-stt-whisper-proactive-p/commbase_stt_whisper_proactive_p.py index 6d6024f4..02ab18f0 100755 --- a/bundles/commbase-stt-whisper-proactive-p/commbase_stt_whisper_proactive_p.py +++ b/bundles/commbase-stt-whisper-proactive-p/commbase_stt_whisper_proactive_p.py @@ -64,6 +64,7 @@ get_commbase_hardware_notification_listening_stop_on, get_commbase_hardware_notification_processing_start_on, get_commbase_hardware_notification_processing_stop_on, + get_commbase_hardware_speech_to_text_engine_component_on, get_log_severity_level_1, get_stt_engine_language, get_stt_whisper_proactive_timeout, @@ -306,6 +307,7 @@ def listen(): commbase_hardware_notification_listening_stop_on = get_commbase_hardware_notification_listening_stop_on() commbase_hardware_notification_processing_start_on = get_commbase_hardware_notification_processing_start_on() commbase_hardware_notification_processing_stop_on = get_commbase_hardware_notification_processing_stop_on() + commbase_hardware_speech_to_text_engine_component_on = get_commbase_hardware_speech_to_text_engine_component_on() # Set the values returned by get_stt_whisper_proactive_timeout(). stt_engine_timeout = get_stt_whisper_proactive_timeout() listener = sr.Recognizer() # Create an instance of Recognizer @@ -316,20 +318,22 @@ def listen(): # print("Microphone with name \"{1}\" found for `Microphone(device_index={0})`".format(index, name)) # listener.adjust_for_ambient_noise(source) if commbase_hardware_notifications_on == "True": - if commbase_hardware_notification_processing_stop_on == "True": - notify_hardware_about_processing_stop() - if commbase_hardware_notification_listening_start_on == "True": - notify_hardware_about_listening_start() + if commbase_hardware_speech_to_text_engine_component_on == "True": + if commbase_hardware_notification_processing_stop_on == "True": + notify_hardware_about_processing_stop() + if commbase_hardware_notification_listening_start_on == "True": + notify_hardware_about_listening_start() try: audio = listener.listen(source, timeout=int(stt_engine_timeout)) # Set a timeout=15 (in seconds) discourse = "Processing..." 
print(f"{assistant_name} {discourse}") if commbase_hardware_notifications_on == "True": - if commbase_hardware_notification_listening_stop_on == "True": - notify_hardware_about_listening_stop() - if commbase_hardware_notification_processing_start_on == "True": - notify_hardware_about_processing_start() + if commbase_hardware_speech_to_text_engine_component_on == "True": + if commbase_hardware_notification_listening_stop_on == "True": + notify_hardware_about_listening_stop() + if commbase_hardware_notification_processing_start_on == "True": + notify_hardware_about_processing_start() data = io.BytesIO(audio.get_wav_data()) audio_clip = AudioSegment.from_file(data) audio_clip.export(save_path, format='wav') @@ -337,10 +341,11 @@ def listen(): discourse = "Speech stopped. Recognizing..." print(f"{assistant_name} {discourse}") if commbase_hardware_notifications_on == "True": - if commbase_hardware_notification_processing_stop_on == "True": - notify_hardware_about_processing_stop() - if commbase_hardware_notification_listening_stop_on == "True": - notify_hardware_about_listening_stop() + if commbase_hardware_speech_to_text_engine_component_on == "True": + if commbase_hardware_notification_processing_stop_on == "True": + notify_hardware_about_processing_stop() + if commbase_hardware_notification_listening_stop_on == "True": + notify_hardware_about_listening_stop() return save_path @@ -392,6 +397,7 @@ def write_to_temp_file(text): current_timestamp = datetime.now().strftime('%Y-%m-%d %H:%M:%S') severity_level = log_severity_level_1 + end_user_text = "[" + current_timestamp + "]" + " stt-whisper-proactive: " + severity_level + ": " + end_user_name + text + "\n" with open(temp_file_path, 'a') as temp_file: temp_file.write(end_user_text) diff --git a/bundles/commbase-stt-whisper-proactive-p/file_paths.py b/bundles/commbase-stt-whisper-proactive-p/file_paths.py index 3838e5ba..4ca1ae47 100644 --- a/bundles/commbase-stt-whisper-proactive-p/file_paths.py +++ b/bundles/commbase-stt-whisper-proactive-p/file_paths.py @@ -206,7 +206,7 @@ def get_commbase_hardware_device_0(): variable_name, value = line.strip().split("=") # Check if the variable we are looking for exists in the line - if variable_name == "COMBASE_HARDWARE_DEVICE_0": + if variable_name == "COMMBASE_HARDWARE_DEVICE_0": # Remove the quotes from the value of the variable hardware_device_0 = value.strip()[1:-1] diff --git a/bundles/commbase-stt-whisper-proactive-p/functions.py b/bundles/commbase-stt-whisper-proactive-p/functions.py index f569f048..09813cdf 100644 --- a/bundles/commbase-stt-whisper-proactive-p/functions.py +++ b/bundles/commbase-stt-whisper-proactive-p/functions.py @@ -338,6 +338,38 @@ def get_commbase_hardware_notification_processing_stop_on(): return None +def get_commbase_hardware_speech_to_text_engine_component_on(): + """ + Reads the 'COMMBASE_HARDWARE_SPEECH_TO_TEXT_ENGINE_COMPONEN_ON' variable + from the environment configuration file. Returns the string value of the + variables if found, or None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. 
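+
+    Example: with COMMBASE_HARDWARE_SPEECH_TO_TEXT_ENGINE_COMPONENT_ON="True"
+    in config/commbase.conf, this returns the string "True", which enables
+    the per-component hardware notification checks in listen().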
+ """ + # Initialize variable + commbase_hardware_notification_component = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "COMMBASE_HARDWARE_SPEECH_TO_TEXT_ENGINE_COMPONENT_ON": + # Remove the quotes from the value of the variable + commbase_hardware_notification_component = value.strip()[1:-1] + + # Check if the variable was found + if commbase_hardware_notification_component is not None: + return commbase_hardware_notification_component + + # If the variable was not found, return None + return None + + def stt_engine_processing_time_visible_on(): """ Reads the 'STT_ENGINE_PROCESSING_TIME_VISIBLE_ON' variable from diff --git a/bundles/commbase-stt-whisper-reactive-p/commbase_stt_whisper_reactive_p.py b/bundles/commbase-stt-whisper-reactive-p/commbase_stt_whisper_reactive_p.py index f12d91f6..a6c5af69 100755 --- a/bundles/commbase-stt-whisper-reactive-p/commbase_stt_whisper_reactive_p.py +++ b/bundles/commbase-stt-whisper-reactive-p/commbase_stt_whisper_reactive_p.py @@ -44,6 +44,7 @@ import wave import whisper from config import CONFIG_FILE_PATH +from datetime import datetime from file_paths import ( get_chat_log_file, get_commbase_hardware_command_processing_start_file, @@ -56,6 +57,8 @@ get_commbase_hardware_notifications_on, get_commbase_hardware_notification_processing_start_on, get_commbase_hardware_notification_processing_stop_on, + get_commbase_hardware_speech_to_text_engine_component_on, + get_log_severity_level_1, get_stt_engine_language, stt_engine_processing_time_visible_on ) @@ -181,7 +184,15 @@ def write_to_temp_file(text): """ # Set the values returned by get_chat_participant_names() end_user_name, assistant_name = get_chat_participant_names() - end_user_text = end_user_name + text + "\n" + + # Set the value returned by get_log_severity_level_1() + log_severity_level_1 = get_log_severity_level_1() + + current_timestamp = datetime.now().strftime('%Y-%m-%d %H:%M:%S') + + severity_level = log_severity_level_1 + + end_user_text = "[" + current_timestamp + "]" + " stt-whisper-reactive: " + severity_level + ": " + end_user_name + text + "\n" with open(temp_file_path, 'a') as temp_file: temp_file.write(end_user_text) @@ -219,6 +230,7 @@ def main(): commbase_hardware_notifications_on = get_commbase_hardware_notifications_on() commbase_hardware_notification_processing_start_on = get_commbase_hardware_notification_processing_start_on() commbase_hardware_notification_processing_stop_on = get_commbase_hardware_notification_processing_stop_on() + commbase_hardware_speech_to_text_engine_component_on = get_commbase_hardware_speech_to_text_engine_component_on() processing_time_visible_on = stt_engine_processing_time_visible_on() @@ -237,10 +249,11 @@ def main(): last_modified_time = current_modified_time if commbase_hardware_notifications_on == "True": - if commbase_hardware_notification_processing_stop_on == "True": - notify_hardware_about_processing_stop() - if commbase_hardware_notification_processing_start_on == "True": - notify_hardware_about_processing_start() + if commbase_hardware_speech_to_text_engine_component_on == "True": + if commbase_hardware_notification_processing_stop_on == "True": + notify_hardware_about_processing_stop() + if commbase_hardware_notification_processing_start_on == "True": + notify_hardware_about_processing_start() if 
processing_time_visible_on == "True": start_time = time.time() # Record the start time @@ -254,8 +267,9 @@ def main(): print(f"Elapsed processing time: {elapsed_time} seconds") if commbase_hardware_notifications_on == "True": - if commbase_hardware_notification_processing_stop_on == "True": - notify_hardware_about_processing_stop() + if commbase_hardware_speech_to_text_engine_component_on == "True": + if commbase_hardware_notification_processing_stop_on == "True": + notify_hardware_about_processing_stop() # Write the transcribed text to a temporary file write_to_temp_file(response) diff --git a/bundles/commbase-stt-whisper-reactive-p/file_paths.py b/bundles/commbase-stt-whisper-reactive-p/file_paths.py index 0633736d..5038e636 100644 --- a/bundles/commbase-stt-whisper-reactive-p/file_paths.py +++ b/bundles/commbase-stt-whisper-reactive-p/file_paths.py @@ -130,7 +130,7 @@ def get_commbase_hardware_command_processing_stop_file(): def get_commbase_hardware_device_0(): """ - Retrieves the value of the COMBASE_HARDWARE_DEVICE_0 variable from the + Retrieves the value of the COMMBASE_HARDWARE_DEVICE_0 variable from the configuration file. Returns: @@ -146,7 +146,7 @@ def get_commbase_hardware_device_0(): variable_name, value = line.strip().split("=") # Check if the variable we are looking for exists in the line - if variable_name == "COMBASE_HARDWARE_DEVICE_0": + if variable_name == "COMMBASE_HARDWARE_DEVICE_0": # Remove the quotes from the value of the variable hardware_device_0 = value.strip()[1:-1] diff --git a/bundles/commbase-stt-whisper-reactive-p/functions.py b/bundles/commbase-stt-whisper-reactive-p/functions.py index 2f5911a4..c753e8c8 100644 --- a/bundles/commbase-stt-whisper-reactive-p/functions.py +++ b/bundles/commbase-stt-whisper-reactive-p/functions.py @@ -178,6 +178,70 @@ def get_commbase_hardware_notification_processing_stop_on(): return None +def get_commbase_hardware_speech_to_text_engine_component_on(): + """ + Reads the 'COMMBASE_HARDWARE_SPEECH_TO_TEXT_ENGINE_COMPONEN_ON' variable + from the environment configuration file. Returns the string value of the + variables if found, or None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. + """ + # Initialize variable + commbase_hardware_notification_component = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "COMMBASE_HARDWARE_SPEECH_TO_TEXT_ENGINE_COMPONENT_ON": + # Remove the quotes from the value of the variable + commbase_hardware_notification_component = value.strip()[1:-1] + + # Check if the variable was found + if commbase_hardware_notification_component is not None: + return commbase_hardware_notification_component + + # If the variable was not found, return None + return None + + +def get_log_severity_level_1(): + """ + Reads the 'LOG_SEVERITY_LEVEL_1' variable from the environment + configuration file. Returns the string value of the variables if found, or + None if the variable is not present. + + Returns: + str or None: The string if found in the configuration file, otherwise + None. 
+ """ + # Initialize variable + log_severity_level_1 = None + + # Open the file and read its contents + with open(CONFIG_FILE_PATH, "r") as f: + for line in f: + # Split the line into variable name and value + variable_name, value = line.strip().split("=") + + # Check if the variable we are looking for exists in the line + if variable_name == "LOG_SEVERITY_LEVEL_1": + # Remove the quotes from the value of the variable + log_severity_level_1 = value.strip()[1:-1] + + # Check if the variable was found + if log_severity_level_1 is not None: + return log_severity_level_1 + + # If the variable was not found, return None + return None + + def get_stt_engine_language(): """ Reads the 'STT_ENGINE_LANGUAGE' variable from the environment configuration diff --git a/bundles/commbase-tts-gTTS/README.md b/bundles/commbase-tts-gTTS/README.md index bdbeb5a7..38283a2e 100644 --- a/bundles/commbase-tts-gTTS/README.md +++ b/bundles/commbase-tts-gTTS/README.md @@ -5,7 +5,7 @@ commbase-tts-gTTS -A Commbase-ready TTS executable using gTTS (Google Text-to-Speech), a Python library and CLI tool to interface with Google Translate text-to-speech API +A Commbase-ready TTS executable using gTTS (Google Text-to-Speech), a Python library and CLI tool to interface with Google Translate text-to-speech API. ## Examples diff --git a/bundles/libcommbase/libcommbase/interactive/terminal_select_stt_engine.sh b/bundles/libcommbase/libcommbase/interactive/terminal_select_stt_engine.sh index 9566632c..4742b712 100755 --- a/bundles/libcommbase/libcommbase/interactive/terminal_select_stt_engine.sh +++ b/bundles/libcommbase/libcommbase/interactive/terminal_select_stt_engine.sh @@ -31,9 +31,9 @@ # along with this program; if not, write to the Free Software # # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # -# terminal_only_select_stt_engine.sh +# terminal_select_stt_engine.sh # Sets up the STT engine based on the value of the variable STT_ENGINE_PATH -terminal_only_select_stt_engine() { +terminal_select_stt_engine() { # The configuration file source "$COMMBASE_APP_DIR"/config/commbase.conf @@ -134,9 +134,9 @@ terminal_only_select_stt_engine() { exit 99 } -# Call terminal_only_select_stt_engine if the script is run directly (not sourced) +# Call terminal_select_stt_engine if the script is run directly (not sourced) if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then - (terminal_only_select_stt_engine) + (terminal_select_stt_engine) fi exit 99 diff --git a/bundles/libcommbase/libcommbase/interactive/terminal_set_language.sh b/bundles/libcommbase/libcommbase/interactive/terminal_set_language.sh index 83b22ba8..caaba676 100755 --- a/bundles/libcommbase/libcommbase/interactive/terminal_set_language.sh +++ b/bundles/libcommbase/libcommbase/interactive/terminal_set_language.sh @@ -31,9 +31,9 @@ # along with this program; if not, write to the Free Software # # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # -# terminal_only_set_language.sh +# terminal_set_language.sh # Sets up the Commbase language and the Commbase app language -terminal_only_set_language() { +terminal_set_language() { # The configuration file source "$COMMBASE_APP_DIR"/config/commbase.conf @@ -194,9 +194,9 @@ terminal_only_set_language() { exit 99 } -# Call terminal_only_set_language if the script is run directly (not sourced) +# Call terminal_set_language if the script is run directly (not sourced) if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then - (terminal_only_set_language) + (terminal_set_language) fi exit 99 diff --git 
a/bundles/libcommbase/libcommbase/routines/assistant_discourse_from_language_model.sh b/bundles/libcommbase/libcommbase/routines/assistant_discourse_from_language_model.sh new file mode 100644 index 00000000..897c4b4e --- /dev/null +++ b/bundles/libcommbase/libcommbase/routines/assistant_discourse_from_language_model.sh @@ -0,0 +1,76 @@ +#!/usr/bin/env bash +################################################################################ +# libcommbase # +# # +# A collection of libraries to centralize common functions that can be shared # +# across multiple conversational AI assistant projects # +# # +# Change History # +# 06/22/2024 Esteban Herrera Original code. # +# Add new history entries as needed. # +# # +# # +################################################################################ +################################################################################ +################################################################################ +# # +# Copyright (c) 2022-present Esteban Herrera C. # +# stv.herrera@gmail.com # +# # +# This program is free software; you can redistribute it and/or modify # +# it under the terms of the GNU General Public License as published by # +# the Free Software Foundation; either version 3 of the License, or # +# (at your option) any later version. # +# # +# This program is distributed in the hope that it will be useful, # +# but WITHOUT ANY WARRANTY; without even the implied warranty of # +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # +# GNU General Public License for more details. # +# # +# You should have received a copy of the GNU General Public License # +# along with this program; if not, write to the Free Software # +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # + +# assistant_discourse_from_language_model.sh +# Manages how discourse messages from an assistant language model are logged +# audibly. +assistant_discourse_from_language_model() { + # Imports + source "$COMMBASE_APP_DIR"/config/commbase.conf + + # Imports from libcommbase + mute_capture=$COMMBASE_APP_DIR/bundles/libcommbase/libcommbase/routines/capture_mute.sh + unmute_capture=$COMMBASE_APP_DIR/bundles/libcommbase/libcommbase/routines/capture_unmute.sh + + # Define the path to the log file + log_file="$COMMBASE_APP_DIR"/"$CHAT_LOG_FILE" + + # Check if ASSISTANT_NAME_IN_CHAT_PANE is set + if [ -z "$ASSISTANT_NAME_IN_CHAT_PANE" ]; then + echo "ASSISTANT_NAME_IN_CHAT_PANE is not set in the .env file." + exit 1 + fi + + # Check if the log file exists + if [ -f "$log_file" ]; then + # Extract the last log entry by the assistant + last_log=$(grep "$ASSISTANT_NAME_IN_CHAT_PANE" "$log_file" | tail -n 1) + (bash "$mute_capture") + # Extract only the text after the assistant name and pass it to the TTS_ENGINE_STRING command + echo "$last_log" | awk -v name="$ASSISTANT_NAME_IN_CHAT_PANE" '{n=split($0, a, name); for (i=2; i<=n; i++) print a[i]}' | $TTS_ENGINE_STRING + (bash "$unmute_capture") + else + echo "Log file not found." 
+ exit 1 + fi + + exit 99 + +} + +# Call assistant_discourse_from_language_model if the script is run directly (not sourced) +if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then + (assistant_discourse_from_language_model) +fi + +exit 99 diff --git a/config/commbase.conf b/config/commbase.conf index 8de5bb00..b4a18c68 100644 --- a/config/commbase.conf +++ b/config/commbase.conf @@ -31,11 +31,14 @@ VIDEO_CAPTURE_DEVICE_01_INDEX="0" VIDEO_CAPTURE_DEVICE_02_INDEX="1" RECORDER_TRANSMITTER_FILE="bundles/commbase-recorder-transmitter-b/reccomm.sh" CUSTOM_RECORDER_TRANSMITTER_FILE="bundles/commbase-recorder-transmitter-x/reccomm.sh" +RUN_VOICE_RECORDER_IN_PANE_PATH="bundles/libcommbase/libcommbase/routines/run_voice_recorder_in_pane.sh" +COMMBASE_RECORDER_TRANSMITTER_QUIT_CHAR="q" STT_ENGINE_MODEL_DIRECTORY="openai_whisper_models" STT_ENGINE_MODEL_SUBDIRECTORY="base" STT_ENGINE_PATH="$COMMBASE_APP_DIR/bundles/commbase-stt-whisper-reactive-p/commbase_stt_whisper_reactive_p.py" STT_ENGINE_STRING="$PYTHON_ENV_VERSION $STT_ENGINE_PATH 2> /dev/null" COMMBASE_STT_WHISPER_REACTIVE_P_CLIENT_DATA_FILE="/bundles/commbase-stt-whisper-reactive-p/client_data/recording.wav" +COMMBASE_STT_WHISPER_REACTIVE_P_AUTO_OPEN_RECORDER_AFTER_COMMAND="True" STT_ENGINE_PROCESSING_TIME_VISIBLE_ON="False" CHAT_LOG_FILE="data/.chat_log.txt" CONVERSATION_LOGS_PATH="/user/conversation_logs/" @@ -87,6 +90,8 @@ MESSAGING_FILE="/data/.messages.json" MESSAGING_HISTORY_FILE="/history/.messaging_history" MAX_CONVERSATION_FILES_COUNT="30" MAX_CONVERSATION_FILE_DAYS_STORED="30" +CHAT_MEMORY_FILE="data/.chat_memory.txt" +ASSISTANT_DISCOURSE_FROM_LANGUAGE_MODEL_FILE="bundles/libcommbase/libcommbase/routines/assistant_discourse_from_language_model.sh" ANSWER_UNKNOWN_COMMANDS_USING_AI_CHATBOT_ON="False" AI_CHATBOT_STRING="$COMMBASE_APP_DIR/bundles/commbase-llm-google-gemini-p/commbase_llm_google_gemini_p.py" UNKNOWN_COMMAND_AI_CHATBOT_WORDS_LIMIT="35" @@ -100,13 +105,19 @@ SOUND_SAY_NO_WITHOUT_SPEAKING="$COMMBASE_APP_DIR/bundles/libcommbase/resources/b SOUND_A_PENDING_TASK_AWAITS_ATTENTION="$COMMBASE_APP_DIR/bundles/libcommbase/resources/bundles/sounds/mixkit-unlock-game-notification-253.wav" SOUND_PENDING_TASKS_REMINDER="$COMMBASE_APP_DIR/bundles/libcommbase/resources/bundles/sounds/mixkit-uplifting-flute-notification-2317.wav" PENDING_TASKS_VERIFICATION_FREQUENCY_IN_SECS="3600" +ASSISTANT_RESPONSE_SENTIMENT_ANALYSIS_ON="True" COMMBASE_HARDWARE_NOTIFICATIONS_ON="False" +COMMBASE_HARDWARE_DEVICE_0="/dev/ttyACM0" +COMMBASE_HARDWARE_SPEECH_TO_TEXT_ENGINE_COMPONENT_ON="True" COMMBASE_HARDWARE_NOTIFICATION_LISTENING_START_ON="True" COMMBASE_HARDWARE_NOTIFICATION_LISTENING_STOP_ON="True" COMMBASE_HARDWARE_NOTIFICATION_PROCESSING_START_ON="True" COMMBASE_HARDWARE_NOTIFICATION_PROCESSING_STOP_ON="True" -COMBASE_HARDWARE_DEVICE_0="/dev/ttyACM0" COMMBASE_HARDWARE_COMMAND_LISTENING_START_FILE="/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/listening_start.dat" COMMBASE_HARDWARE_COMMAND_LISTENING_STOP_FILE="/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/listening_stop.dat" COMMBASE_HARDWARE_COMMAND_PROCESSING_START_FILE="/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/processing_start.dat" 
COMMBASE_HARDWARE_COMMAND_PROCESSING_STOP_FILE="/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/processing_stop.dat" +COMMBASE_HARDWARE_SENTIMENT_COMPONENT_ON="True" +COMMBASE_HARDWARE_COMMAND_RANDOM_NEGATIVE_FILE="/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/sentiment_component/random_negative.dat" +COMMBASE_HARDWARE_COMMAND_RANDOM_NEUTRAL_FILE="/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/sentiment_component/random_neutral.dat" +COMMBASE_HARDWARE_COMMAND_RANDOM_POSITIVE_FILE="/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/sentiment_component/random_positive.dat" diff --git a/data/.chat_memory.txt b/data/.chat_memory.txt new file mode 100644 index 00000000..e69de29b diff --git a/screenshots/commbase.png b/screenshots/commbase.png index 5a9585ba..f888fa20 100644 Binary files a/screenshots/commbase.png and b/screenshots/commbase.png differ diff --git a/scripts/configuration/install_commbase_executable/commbase.default b/scripts/configuration/install_commbase_executable/commbase.default index 6f7daac9..167b02cb 100755 --- a/scripts/configuration/install_commbase_executable/commbase.default +++ b/scripts/configuration/install_commbase_executable/commbase.default @@ -309,8 +309,13 @@ route_option() { # Write the message to the log file (bash "$log_message_with_timestamp" "commbase" "$LOG_SEVERITY_LEVEL_2" "2" "cleaning_up_directories_and_files" "True" "$COMMBASE_APP_DIR/$LOG_MESSAGES_FILE") + # Clean up the conversation files (bash "$cleanup_directory_by_file_count" "$MAX_CONVERSATION_FILES_COUNT" "$COMMBASE_APP_DIR"/user/conversation_logs/) (bash "$cleanup_directory_by_age" "$MAX_CONVERSATION_FILE_DAYS_STORED" "$COMMBASE_APP_DIR"/user/conversation_logs/) + + # Clean up the current session memory file + (truncate -s 0 "$COMMBASE_APP_DIR"/"$CHAT_MEMORY_FILE") + (sleep 2) # Time in seconds to read messages on pane before the app shutdown # Kill the processes of the scripts that do not respond to the diff --git a/scripts/configuration/reset_commbase.conf/commbase.conf.default b/scripts/configuration/reset_commbase.conf/commbase.conf.default index 8de5bb00..b4a18c68 100644 --- a/scripts/configuration/reset_commbase.conf/commbase.conf.default +++ b/scripts/configuration/reset_commbase.conf/commbase.conf.default @@ -31,11 +31,14 @@ VIDEO_CAPTURE_DEVICE_01_INDEX="0" VIDEO_CAPTURE_DEVICE_02_INDEX="1" RECORDER_TRANSMITTER_FILE="bundles/commbase-recorder-transmitter-b/reccomm.sh" CUSTOM_RECORDER_TRANSMITTER_FILE="bundles/commbase-recorder-transmitter-x/reccomm.sh" +RUN_VOICE_RECORDER_IN_PANE_PATH="bundles/libcommbase/libcommbase/routines/run_voice_recorder_in_pane.sh" +COMMBASE_RECORDER_TRANSMITTER_QUIT_CHAR="q" STT_ENGINE_MODEL_DIRECTORY="openai_whisper_models" STT_ENGINE_MODEL_SUBDIRECTORY="base" STT_ENGINE_PATH="$COMMBASE_APP_DIR/bundles/commbase-stt-whisper-reactive-p/commbase_stt_whisper_reactive_p.py" STT_ENGINE_STRING="$PYTHON_ENV_VERSION $STT_ENGINE_PATH 2> /dev/null" COMMBASE_STT_WHISPER_REACTIVE_P_CLIENT_DATA_FILE="/bundles/commbase-stt-whisper-reactive-p/client_data/recording.wav" +COMMBASE_STT_WHISPER_REACTIVE_P_AUTO_OPEN_RECORDER_AFTER_COMMAND="True" STT_ENGINE_PROCESSING_TIME_VISIBLE_ON="False" CHAT_LOG_FILE="data/.chat_log.txt" CONVERSATION_LOGS_PATH="/user/conversation_logs/" @@ -87,6 +90,8 @@ MESSAGING_FILE="/data/.messages.json" MESSAGING_HISTORY_FILE="/history/.messaging_history" 
MAX_CONVERSATION_FILES_COUNT="30" MAX_CONVERSATION_FILE_DAYS_STORED="30" +CHAT_MEMORY_FILE="data/.chat_memory.txt" +ASSISTANT_DISCOURSE_FROM_LANGUAGE_MODEL_FILE="bundles/libcommbase/libcommbase/routines/assistant_discourse_from_language_model.sh" ANSWER_UNKNOWN_COMMANDS_USING_AI_CHATBOT_ON="False" AI_CHATBOT_STRING="$COMMBASE_APP_DIR/bundles/commbase-llm-google-gemini-p/commbase_llm_google_gemini_p.py" UNKNOWN_COMMAND_AI_CHATBOT_WORDS_LIMIT="35" @@ -100,13 +105,19 @@ SOUND_SAY_NO_WITHOUT_SPEAKING="$COMMBASE_APP_DIR/bundles/libcommbase/resources/b SOUND_A_PENDING_TASK_AWAITS_ATTENTION="$COMMBASE_APP_DIR/bundles/libcommbase/resources/bundles/sounds/mixkit-unlock-game-notification-253.wav" SOUND_PENDING_TASKS_REMINDER="$COMMBASE_APP_DIR/bundles/libcommbase/resources/bundles/sounds/mixkit-uplifting-flute-notification-2317.wav" PENDING_TASKS_VERIFICATION_FREQUENCY_IN_SECS="3600" +ASSISTANT_RESPONSE_SENTIMENT_ANALYSIS_ON="True" COMMBASE_HARDWARE_NOTIFICATIONS_ON="False" +COMMBASE_HARDWARE_DEVICE_0="/dev/ttyACM0" +COMMBASE_HARDWARE_SPEECH_TO_TEXT_ENGINE_COMPONENT_ON="True" COMMBASE_HARDWARE_NOTIFICATION_LISTENING_START_ON="True" COMMBASE_HARDWARE_NOTIFICATION_LISTENING_STOP_ON="True" COMMBASE_HARDWARE_NOTIFICATION_PROCESSING_START_ON="True" COMMBASE_HARDWARE_NOTIFICATION_PROCESSING_STOP_ON="True" -COMBASE_HARDWARE_DEVICE_0="/dev/ttyACM0" COMMBASE_HARDWARE_COMMAND_LISTENING_START_FILE="/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/listening_start.dat" COMMBASE_HARDWARE_COMMAND_LISTENING_STOP_FILE="/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/listening_stop.dat" COMMBASE_HARDWARE_COMMAND_PROCESSING_START_FILE="/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/processing_start.dat" COMMBASE_HARDWARE_COMMAND_PROCESSING_STOP_FILE="/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/speech_to_text_engine_component/processing_stop.dat" +COMMBASE_HARDWARE_SENTIMENT_COMPONENT_ON="True" +COMMBASE_HARDWARE_COMMAND_RANDOM_NEGATIVE_FILE="/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/sentiment_component/random_negative.dat" +COMMBASE_HARDWARE_COMMAND_RANDOM_NEUTRAL_FILE="/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/sentiment_component/random_neutral.dat" +COMMBASE_HARDWARE_COMMAND_RANDOM_POSITIVE_FILE="/src/client/serial_communication/arduino/arduino_0/commbase_hardware_notifications/sentiment_component/random_positive.dat" diff --git a/bundles/commbase-llm-openai-gpt-p/commbase_llm_openai_gpt_p.py b/src/client/bundles/terminal-chat-gpt/terminal_chat_gpt.py similarity index 98% rename from bundles/commbase-llm-openai-gpt-p/commbase_llm_openai_gpt_p.py rename to src/client/bundles/terminal-chat-gpt/terminal_chat_gpt.py index aef67e56..aa2987b6 100644 --- a/bundles/commbase-llm-openai-gpt-p/commbase_llm_openai_gpt_p.py +++ b/src/client/bundles/terminal-chat-gpt/terminal_chat_gpt.py @@ -1,6 +1,6 @@ #!/usr/bin/env python ################################################################################ -# commbase_llm_openai_gpt_p.py # +# terminal-chat-gpt # # # # A terminal version of OpenAI's ChatGPT # # # @@ -30,7 +30,7 @@ # along with this program; if not, write to the Free Software # # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA # -# commbase_llm_openai_gpt_p.py +# 
terminal_chat_gpt.py # A terminal version of ChatGPT # ChatGPT is a chatbot built using the GPT (Generative Pre-trained Transformer) # architecture developed by OpenAI. diff --git a/src/server/skills_db/bash/f/firefox/open_the_web_browser.json b/src/server/skills_db/bash/f/firefox/open_the_web_browser.json new file mode 100644 index 00000000..a0c931d1 --- /dev/null +++ b/src/server/skills_db/bash/f/firefox/open_the_web_browser.json @@ -0,0 +1 @@ +{"messages":[{"control":""},{"current_request":"Open the web browser"},{"previous_request":"Open the web browser"},{"current_response":"Opening the web browser."},{"runtime":"bash"},{"source_code_display":"session_pane_six"},{"source_code":"#!/usr/bin/env bash\n(echo \"Launching firefox ...\")\n(firefox)"}]} diff --git a/src/server/skills_db/bash/o/openai/open_terminal_chat_gpt.json b/src/server/skills_db/bash/o/openai/open_terminal_chat_gpt.json new file mode 100644 index 00000000..8d1c8b69 --- /dev/null +++ b/src/server/skills_db/bash/o/openai/open_terminal_chat_gpt.json @@ -0,0 +1 @@ + diff --git a/src/server/skills_db/bash/o/openai/open_terminal_chat_gpt.sh b/src/server/skills_db/bash/o/openai/open_terminal_chat_gpt.sh new file mode 100644 index 00000000..9fddd906 --- /dev/null +++ b/src/server/skills_db/bash/o/openai/open_terminal_chat_gpt.sh @@ -0,0 +1,5 @@ +# TODO: + +# open_terminal_chat_gpt.sh +# Runs src/client/bundles/terminal-chat-gpt/terminal_chat_gpt.py as a terminal +# app in the terminal apps pane, pane 1 of the Commbase tmux session. diff --git a/user/memory_store/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/src/chat_log_20240609203647_4314.txt b/user/memory_store/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/src/chat_log_20240609203647_4314.txt new file mode 100644 index 00000000..462adfe1 --- /dev/null +++ b/user/memory_store/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/src/chat_log_20240609203647_4314.txt @@ -0,0 +1,51 @@ +[2024-06-09 19:42:10] app: DEBUG: ✨ ASSISTANT: Mute the microphone to pause recording. +[2024-06-09 19:42:39] app: DEBUG: ✨ ASSISTANT: 🌙✨ Under the moon's silver glow, fairy dust whispers secrets to those who dare to listen. 🧚‍♀. A sprinkle of pixie magic can make your dreams dance like fireflies on a midsummer night! 🌹✨ But remember, every spell needs a pinch of responsibility. So use this mystical knowledge wisely and keep the balance in our enchanting world. 🌍🔮 Now go forth with wonder in your heart and let magic guide you gently through life's twinkling journey! ✨🚀💫 +[2024-06-09 19:42:41] stt-whisper-proactive: DEBUG: 🧑 END USER: +[2024-06-09 19:42:49] stt-whisper-proactive: DEBUG: 🧑 END USER: Hey how you doing today? +[2024-06-09 19:43:07] stt-whisper-proactive: DEBUG: 🧑 END USER: Tell me about yourself. +[2024-06-09 19:43:27] app: DEBUG: ✨ ASSISTANT: I am eva ai. +[2024-06-09 19:43:51] stt-whisper-proactive: DEBUG: 🧑 END USER: Nice to meet you, Eva. What can you do? +[2024-06-09 19:44:15] app: DEBUG: ✨ ASSISTANT: I can help with various tasks such as answering questions, managing your schedule, setting reminders, and even controlling smart home devices. How can I assist you today? +[2024-06-09 19:44:20] stt-whisper-proactive: DEBUG: 🧑 END USER: +[2024-06-09 19:44:35] stt-whisper-proactive: DEBUG: 🧑 END USER: This is the Zero Day. +[2024-06-09 19:44:51] app: DEBUG: ✨ ASSISTANT: Could you please provide more details about what you mean by "Zero Day"? Are you referring to a specific event or topic?
+[2024-06-09 19:45:12] stt-whisper-proactive: DEBUG: 🧑 END USER: It's the day we launch our new project. We've been preparing for this for months. +[2024-06-09 19:45:35] app: DEBUG: ✨ ASSISTANT: That's exciting! Do you need any assistance with the launch? Perhaps help with scheduling, reminders, or any last-minute checks? +[2024-06-09 19:45:52] stt-whisper-proactive: DEBUG: 🧑 END USER: Yes, can you set a reminder for our team meeting at 10 AM? +[2024-06-09 19:46:09] app: DEBUG: ✨ ASSISTANT: Sure, I've set a reminder for your team meeting at 10 AM. Anything else I can help with for the launch day? +[2024-06-09 19:46:27] stt-whisper-proactive: DEBUG: 🧑 END USER: Not right now, but I'll let you know if something comes up. Thanks, Eva. +[2024-06-09 19:46:41] app: DEBUG: ✨ ASSISTANT: You're welcome! I'm here if you need anything. Good luck with the launch! + +[2024-06-09 20:02:30] stt-whisper-proactive: DEBUG: 🧑 END USER: Hey Eva, can you check the weather for today? +[2024-06-09 20:02:47] app: DEBUG: ✨ ASSISTANT: The weather today is sunny with a high of 75°F and a low of 55°F. Perfect day for a launch! +[2024-06-09 20:03:11] stt-whisper-proactive: DEBUG: 🧑 END USER: Great! Can you also remind me to call our PR team at 3 PM? +[2024-06-09 20:03:29] app: DEBUG: ✨ ASSISTANT: Reminder set to call the PR team at 3 PM. Anything else on your mind? +[2024-06-09 20:03:50] stt-whisper-proactive: DEBUG: 🧑 END USER: Can you send an email to the team confirming the meeting at 10 AM? +[2024-06-09 20:04:07] app: DEBUG: ✨ ASSISTANT: Sure, I'll draft the email for you. Here it is: "Hello Team, This is a reminder that we have a meeting at 10 AM to discuss the project launch. Please be on time. Best, [Your Name]." Would you like me to send it now? +[2024-06-09 20:04:27] stt-whisper-proactive: DEBUG: 🧑 END USER: Yes, please send it. +[2024-06-09 20:04:41] app: DEBUG: ✨ ASSISTANT: Email sent. Anything else I can do for you? +[2024-06-09 20:05:01] stt-whisper-proactive: DEBUG: 🧑 END USER: Can you play some relaxing music? +[2024-06-09 20:05:18] app: DEBUG: ✨ ASSISTANT: Sure, playing a relaxing music playlist now. +[2024-06-09 20:06:30] stt-whisper-proactive: DEBUG: 🧑 END USER: Thanks, Eva. That helps a lot. +[2024-06-09 20:06:45] app: DEBUG: ✨ ASSISTANT: You're welcome! I'm glad to help. If you need anything else, just let me know. + +[2024-06-09 20:30:50] stt-whisper-proactive: DEBUG: 🧑 END USER: Eva, can you update me on my calendar for the rest of the week? +[2024-06-09 20:31:05] app: DEBUG: ✨ ASSISTANT: Here's your calendar for the rest of the week: Tomorrow you have a team debrief at 9 AM, a client call at 1 PM, and a project wrap-up meeting at 4 PM. On Thursday, you have a strategy meeting at 10 AM and a marketing review at 3 PM. Friday is mostly open except for a lunch meeting at 12 PM. Would you like to make any changes? +[2024-06-09 20:31:30] stt-whisper-proactive: DEBUG: 🧑 END USER: No changes for now. Can you set a reminder for the client call tomorrow at 12:45 PM? +[2024-06-09 20:31:48] app: DEBUG: ✨ ASSISTANT: Reminder set for the client call at 12:45 PM tomorrow. Anything else? +[2024-06-09 20:32:05] stt-whisper-proactive: DEBUG: 🧑 END USER: Can you check if there are any new emails from the client? +[2024-06-09 20:32:22] app: DEBUG: ✨ ASSISTANT: Checking your emails now... You have one new email from the client confirming the details for tomorrow's call. Would you like me to read it out or do anything else with it?
+[2024-06-09 20:32:42] stt-whisper-proactive: DEBUG: 🧑 END USER: Just mark it as read. I'll go through it later. +[2024-06-09 20:32:57] app: DEBUG: ✨ ASSISTANT: Email marked as read. Anything else I can assist you with? +[2024-06-09 20:33:15] stt-whisper-proactive: DEBUG: 🧑 END USER: Can you pull up the latest sales report? +[2024-06-09 20:33:32] app: DEBUG: ✨ ASSISTANT: Here is the latest sales report: Total sales for the month are $50,000, a 10% increase from last month. Would you like a more detailed breakdown? +[2024-06-09 20:33:50] stt-whisper-proactive: DEBUG: 🧑 END USER: Yes, can you provide a breakdown by region? +[2024-06-09 20:34:05] app: DEBUG: ✨ ASSISTANT: Certainly. Sales by region are as follows: North America: $20,000, Europe: $15,000, Asia: $10,000, Rest of the World: $5,000. Anything specific you'd like to analyze? +[2024-06-09 20:34:30] stt-whisper-proactive: DEBUG: 🧑 END USER: Can you compare these figures to the previous month? +[2024-06-09 20:34:45] app: DEBUG: ✨ ASSISTANT: Comparing to the previous month: North America had $18,000, Europe $14,000, Asia $9,000, and Rest of the World $4,000. Overall, each region saw an increase in sales. Would you like to see this in a chart? +[2024-06-09 20:35:08] stt-whisper-proactive: DEBUG: 🧑 END USER: Yes, a chart would be helpful. +[2024-06-09 20:35:25] app: DEBUG: ✨ ASSISTANT: Here is the chart comparing sales by region for the past two months. [Chart displayed] Would you like to save or share this chart? +[2024-06-09 20:35:50] stt-whisper-proactive: DEBUG: 🧑 END USER: Save it to the project folder and email it to the team. +[2024-06-09 20:36:07] app: DEBUG: ✨ ASSISTANT: The chart has been saved to the project folder and emailed to the team. Anything else I can do for you? +[2024-06-09 20:36:27] stt-whisper-proactive: DEBUG: 🧑 END USER: That's all for now. Thanks, Eva. +[2024-06-09 20:36:42] app: DEBUG: ✨ ASSISTANT: You're welcome! Have a great day, and good luck with the launch! diff --git a/user/memory_store/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/src/chat_log_20240609220435_18072.txt b/user/memory_store/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/src/chat_log_20240609220435_18072.txt new file mode 100644 index 00000000..22ca6a52 --- /dev/null +++ b/user/memory_store/commbase-genai-slm-ollama-phi3-mini-memory-remote-rag-picone/src/chat_log_20240609220435_18072.txt @@ -0,0 +1,52 @@ +[2024-06-09 21:15:39] app: DEBUG: ✨ ASSISTANT: Mute the microphone to pause recording. +[2024-06-09 21:15:49] stt-whisper-proactive: DEBUG: 🧑 END USER: Hey there, Eva. +[2024-06-09 21:16:07] stt-whisper-proactive: DEBUG: 🧑 END USER: Tell me about yourself. +[2024-06-09 21:16:27] app: DEBUG: ✨ ASSISTANT: I am eva ai. +[2024-06-09 21:16:51] stt-whisper-proactive: DEBUG: 🧑 END USER: Nice to meet you, Eva. What are some of your capabilities? +[2024-06-09 21:17:15] app: DEBUG: ✨ ASSISTANT: I can assist with a variety of tasks including answering questions, managing your schedule, setting reminders, sending emails, and controlling smart home devices. How can I assist you today? +[2024-06-09 21:17:35] stt-whisper-proactive: DEBUG: 🧑 END USER: Today is a big day. We're launching our new app. +[2024-06-09 21:17:51] app: DEBUG: ✨ ASSISTANT: That's wonderful! Congratulations! How can I support you with the launch? +[2024-06-09 21:18:12] stt-whisper-proactive: DEBUG: 🧑 END USER: Can you set a reminder for our team to check the servers at 9 AM?
+[2024-06-09 21:18:29] app: DEBUG: ✨ ASSISTANT: I've set a reminder for the team to check the servers at 9 AM. Anything else I can do? +[2024-06-09 21:18:47] stt-whisper-proactive: DEBUG: 🧑 END USER: Yes, can you also send a message to the marketing team to confirm the social media posts are scheduled? +[2024-06-09 21:19:05] app: DEBUG: ✨ ASSISTANT: I've drafted the message for the marketing team: "Hi Team, Please confirm that the social media posts for the app launch are scheduled. Thanks!" Would you like me to send it now? +[2024-06-09 21:19:25] stt-whisper-proactive: DEBUG: 🧑 END USER: Yes, please send it. +[2024-06-09 21:19:41] app: DEBUG: ✨ ASSISTANT: The message has been sent. Anything else? +[2024-06-09 21:20:00] stt-whisper-proactive: DEBUG: 🧑 END USER: Can you check my calendar and let me know if I have any meetings today? +[2024-06-09 21:20:15] app: DEBUG: ✨ ASSISTANT: You have two meetings scheduled today: A project kickoff meeting at 11 AM and a client feedback session at 2 PM. Would you like to make any changes or set reminders? +[2024-06-09 21:20:35] stt-whisper-proactive: DEBUG: 🧑 END USER: Set a reminder for the client feedback session 15 minutes before it starts. +[2024-06-09 21:20:50] app: DEBUG: ✨ ASSISTANT: Reminder set for the client feedback session at 1:45 PM. Anything else I can do for you? +[2024-06-09 21:21:07] stt-whisper-proactive: DEBUG: 🧑 END USER: Can you play some upbeat music to keep the energy high? +[2024-06-09 21:21:22] app: DEBUG: ✨ ASSISTANT: Sure, playing an upbeat music playlist now. +[2024-06-09 21:21:50] stt-whisper-proactive: DEBUG: 🧑 END USER: Thanks, Eva. This is perfect. +[2024-06-09 21:22:05] app: DEBUG: ✨ ASSISTANT: You're welcome! Glad I could help. Let me know if there's anything else you need. + +[2024-06-09 21:35:30] stt-whisper-proactive: DEBUG: 🧑 END USER: Hey Eva, can you provide an update on our app downloads so far? +[2024-06-09 21:35:47] app: DEBUG: ✨ ASSISTANT: Checking the latest data... As of now, there have been 2,500 downloads of your app. Would you like a detailed breakdown by region or platform? +[2024-06-09 21:36:11] stt-whisper-proactive: DEBUG: 🧑 END USER: Yes, a breakdown by platform would be great. +[2024-06-09 21:36:28] app: DEBUG: ✨ ASSISTANT: Sure, here it is: iOS: 1,500 downloads, Android: 1,000 downloads. Anything else you need? +[2024-06-09 21:36:45] stt-whisper-proactive: DEBUG: 🧑 END USER: Can you prepare a report summarizing this data for the team? +[2024-06-09 21:37:01] app: DEBUG: ✨ ASSISTANT: I've prepared the report summarizing the download data. Would you like me to email it to the team? +[2024-06-09 21:37:18] stt-whisper-proactive: DEBUG: 🧑 END USER: Yes, email it to the team and also save a copy in the project folder. +[2024-06-09 21:37:35] app: DEBUG: ✨ ASSISTANT: The report has been emailed to the team and saved in the project folder. Anything else? +[2024-06-09 21:37:55] stt-whisper-proactive: DEBUG: 🧑 END USER: Can you check if there are any new emails from our investors? +[2024-06-09 21:38:10] app: DEBUG: ✨ ASSISTANT: Checking your emails now... You have two new emails from investors. One is congratulating the team on the launch, and the other is requesting a follow-up meeting next week. Would you like to respond to them? +[2024-06-09 21:38:35] stt-whisper-proactive: DEBUG: 🧑 END USER: Yes, reply to the congratulatory email with a thank you note and schedule the follow-up meeting for next Wednesday at 3 PM.
+[2024-06-09 21:38:55] app: DEBUG: ✨ ASSISTANT: I've sent a thank you note and scheduled the follow-up meeting for next Wednesday at 3 PM. Anything else I can do for you? +[2024-06-09 21:39:15] stt-whisper-proactive: DEBUG: 🧑 END USER: That's all for now. Thanks, Eva. +[2024-06-09 21:39:30] app: DEBUG: ✨ ASSISTANT: You're welcome! I'm here if you need anything else. Best of luck with the rest of the launch day! + +[2024-06-09 22:00:45] stt-whisper-proactive: DEBUG: 🧑 END USER: Eva, can you update me on the social media engagement for our launch posts? +[2024-06-09 22:01:00] app: DEBUG: ✨ ASSISTANT: Sure, checking the latest social media data... Your launch posts have received 1,000 likes, 200 shares, and 150 comments across all platforms. Would you like a detailed analysis? +[2024-06-09 22:01:27] stt-whisper-proactive: DEBUG: 🧑 END USER: Yes, please provide a detailed analysis including the most engaged platform. +[2024-06-09 22:01:45] app: DEBUG: ✨ ASSISTANT: Here is the detailed analysis: Most engaged platform: Instagram with 600 likes, 120 shares, and 100 comments. Facebook follows with 300 likes, 50 shares, and 30 comments. Twitter has 100 likes, 30 shares, and 20 comments. Would you like this data in a report? +[2024-06-09 22:02:10] stt-whisper-proactive: DEBUG: 🧑 END USER: Yes, create a report and email it to the marketing team. +[2024-06-09 22:02:28] app: DEBUG: ✨ ASSISTANT: The report has been created and emailed to the marketing team. Is there anything else I can do for you? +[2024-06-09 22:02:50] stt-whisper-proactive: DEBUG: 🧑 END USER: Can you summarize the comments for me? +[2024-06-09 22:03:10] app: DEBUG: ✨ ASSISTANT: The comments are mostly positive, praising the app's features and user interface. Some users have requested additional features, and a few have reported minor bugs. Would you like to address these issues now? +[2024-06-09 22:03:35] stt-whisper-proactive: DEBUG: 🧑 END USER: Forward the bug reports to the development team and schedule a meeting to discuss new feature requests. +[2024-06-09 22:03:55] app: DEBUG: ✨ ASSISTANT: Bug reports have been forwarded to the development team, and a meeting to discuss new feature requests is scheduled for tomorrow at 10 AM. Anything else? +[2024-06-09 22:04:15] stt-whisper-proactive: DEBUG: 🧑 END USER: That's all for now. Thanks, Eva. +[2024-06-09 22:04:30] app: DEBUG: ✨ ASSISTANT: You're welcome! Have a great day and congratulations again on the launch!