This is a simple example of retrieval-augmented generation (RAG) with Ollama embeddings, built with Node.js, TypeScript, Docker and ChromaDB. It is a Node.js port of the RAG example provided by Ollama.
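At its core, the retrieval step of RAG is a nearest-neighbor search over embedding vectors. The sketch below illustrates that idea with cosine similarity over tiny hand-written vectors; in the actual application the vectors come from an Ollama embedding model and the search is handled by ChromaDB, so the numbers and document texts here are purely illustrative.

```typescript
// Toy embedded documents; in the real app these embeddings come from an
// Ollama embedding model and are stored in a ChromaDB collection.
const docs: { text: string; embedding: number[] }[] = [
  { text: "Llamas are members of the camelid family.", embedding: [0.9, 0.1, 0.0] },
  { text: "Docker packages applications into containers.", embedding: [0.0, 0.2, 0.9] },
];

// Cosine similarity between two vectors of equal length.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the document text most similar to the query embedding.
function retrieve(queryEmbedding: number[]): string {
  let best = docs[0];
  let bestScore = -Infinity;
  for (const d of docs) {
    const score = cosine(d.embedding, queryEmbedding);
    if (score > bestScore) {
      bestScore = score;
      best = d;
    }
  }
  return best.text;
}

// A query embedding close to the llama document retrieves that document.
console.log(retrieve([0.8, 0.2, 0.1]));
```

ChromaDB performs the same kind of similarity search internally, so the application never has to compute distances by hand.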
To run the application, you need Docker installed on your machine. If you don't have it yet, you can download it from the official Docker website.
Once Docker is installed, run the following commands to install the required packages and pull the models:

```shell
make setup        # Set up the application
make ollama_pull  # Pull the models
```
After the models are pulled and the packages are installed, run the application with:

```shell
make run  # Execute the application
```
Output:

```
Prompt: What animals are llamas related to?
Response: {
  model: 'llama3.2',
  created_at: '2024-10-06T13:30:47.825643583Z',
  message: {
    role: 'assistant',
    content: 'Based on the information provided, it is stated that llamas are part of the camelid family. This means they share a close genetic relationship with other animals within this family. Specifically, llamas are closely related to:\n' +
      '\n' +
      '1. Vicuñas\n' +
      '2. Camels (although not as closely related to domesticated camels)'
  },
  done_reason: 'stop',
  done: true,
  total_duration: 7936475467,
  load_duration: 36586649,
  prompt_eval_count: 66,
  prompt_eval_duration: 136286000,
  eval_count: 67,
  eval_duration: 7720279000
}
```
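The response above is produced by feeding the retrieved passage together with the user's question into the chat model. A minimal sketch of that prompt assembly is shown below; the function name and template wording are illustrative, not taken from the example's source code.

```typescript
// Combine retrieved context with the user's question into a single
// prompt string for the chat model. The template is illustrative.
function buildPrompt(context: string, question: string): string {
  return `Using this data: ${context}. Respond to this prompt: ${question}`;
}

const prompt = buildPrompt(
  "Llamas are members of the camelid family.",
  "What animals are llamas related to?"
);
console.log(prompt);
```

The resulting string is what gets sent as the user message to `llama3.2`, which is why the model's answer in the output above references the camelid family from the retrieved data rather than relying on its parametric knowledge alone.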