This Streamlit application lets users chat with locally available language models through a chat interface. The selected model processes user prompts and returns responses.
- Model Selection: Choose from locally available language models.
- Chat Interface: Interact with the selected model using a chat-like interface.
- Offline: Runs entirely on localhost, no internet connection required.
- Streamlit
- Ollama Python library
- Download Ollama: https://ollama.com/download
- Download any LLM from the library (e.g. `ollama pull llama3`): https://ollama.com/library
```bash
git clone https://github.com/itsmeuttu/ChatGPT-clone.git
cd ChatGPT-clone
python -m venv venv
source venv/bin/activate  # On Windows use `venv\Scripts\activate`
pip install -r requirements.txt
streamlit run main.py
```
- Select a model from the sidebar.
- Enter a prompt in the chat input box.
- View the model's response.
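The flow above — pick a model, send a prompt, read the reply — follows the usual Streamlit chat pattern: keep the conversation in a message history and append each user prompt and model reply in turn. Below is a minimal sketch of that history-handling logic. The function names are illustrative, not the app's actual code, and a stub stands in for the real `ollama.chat` call so the example is self-contained:

```python
# Sketch of the chat-history pattern a Streamlit + Ollama chat app follows.
# In the real app, st.session_state holds the message list and ollama.chat()
# produces the reply; a stub stands in here so the logic runs on its own.

def fake_model_reply(model, messages):
    # Stub for ollama.chat(model=model, messages=messages); echoes the prompt.
    return f"[{model}] You said: {messages[-1]['content']}"

def send_prompt(history, model, prompt):
    # Append the user turn, query the model, append the assistant turn.
    history.append({"role": "user", "content": prompt})
    reply = fake_model_reply(model, history)
    history.append({"role": "assistant", "content": reply})
    return reply

history = []  # corresponds to st.session_state.messages in a real app
reply = send_prompt(history, "llama3", "Hello!")
```

Each call to `send_prompt` grows the history by two messages (user, then assistant), which is what lets the model see the full conversation on every turn.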