Simple Mule application making use of the MAC (MuleSoft AI Chain) project to connect to Ollama (llama3 model).
Full tutorial: How to connect your local Ollama AI using the MAC project and ACB
- Go to ollama.com and follow the prompts to install it on your local computer
- Make a note of which model version you started running (like `llama3` or `llama3.2`)
- To verify Ollama is running, you should either be able to interact with it in the terminal or be able to run `ollama list` or `ollama ps`, as shown in the sketch below
- We will verify the local installation in MuleSoft before deploying to CloudHub
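A quick way to make the model available and confirm Ollama is serving it, assuming you installed `llama3` (swap in the model you actually pulled):

```shell
# start (and pull, if needed) the model; this opens an interactive session, type /bye to exit
ollama run llama3

# list the models installed locally
ollama list

# show which models are currently loaded in memory
ollama ps
```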
- In ACB > Design an API
- Name it `MAC-Ollama-API`
- REST API / RAML 1.0
- Create project
- Use the `mac-ollama-api.raml` file
- Publish to Exchange
- Select Yes to implement this API in ACB
- Name it `mac-ollama-proj`, select the folder where you want to keep it, and select Mule Runtime 4.8 and Java 17
- Once the project finishes loading, open the `mac-ollama-proj.xml` file under `src/main/mule`
- Take a look at the contents of `mac-ollama-proj.xml` to configure your own
- Create a new file called `llm-config.json` under `src/main/resources` with the following:

```json
{
  "OLLAMA": {
    "OLLAMA_BASE_URL": "http://localhost:11434"
  }
}
```

- This will let us test Ollama locally first
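Before running the Mule app, you can sanity-check that the base URL in `llm-config.json` is reachable by calling Ollama's own REST API directly; a minimal sketch, assuming the `llama3` model from earlier:

```shell
# Ollama responds on its root path when the server is up
curl http://localhost:11434

# optional: ask the model a question straight through Ollama's generate endpoint
curl http://localhost:11434/api/generate -d '{"model": "llama3", "prompt": "hello", "stream": false}'
```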
- Run the app locally
- Send a POST request to `localhost:8081/api/chat` with a JSON body including your question:

```shell
curl -H 'Content-Type: application/json' -X POST http://localhost:8081/api/chat -d '"hello"'
```
- If everything was successful, continue to the next step. Otherwise, please troubleshoot before continuing
- Stop the app
- Download and install ngrok to make your Ollama endpoint publicly available from the internet. This way, CloudHub will be able to access the URL, which otherwise only exists on your local machine (localhost:11434)
- Run the following from your Terminal/cmd:

```shell
ngrok http 11434 --host-header="localhost:11434"
```
- Copy the address from the Forwarding field
- Paste it in your `llm-config.json` file under `src/main/resources`
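For example, if ngrok shows a Forwarding address such as `https://a1b2c3d4.ngrok-free.app` (a made-up value; yours will differ), the updated file would look like this:

```json
{
  "OLLAMA": {
    "OLLAMA_BASE_URL": "https://a1b2c3d4.ngrok-free.app"
  }
}
```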
- Save the file and run the app again to verify everything still works correctly
- Stop the app once you verify it works
- Go to your `pom.xml` file and change the version to `1.0.0` (remove the `-SNAPSHOT`); see the snippet below
- Save the file
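A minimal sketch of the relevant `pom.xml` element after the change (the rest of the file stays as ACB generated it):

```xml
<!-- pom.xml: use a release version (no -SNAPSHOT suffix) before deploying -->
<version>1.0.0</version>
```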
- Head to your `mac-ollama-proj.xml` file and click on the Deploy to CloudHub button
- Select the options you want (you can leave the defaults) and deploy
- Once the deployment is done, get the Public Endpoint from your application and call it to verify the app works:

```shell
curl -H 'Content-Type: application/json' -X POST https://mac-ollama-proj-ny7z.5sy6-1.usa-e2.cloudhub.io/api/chat -d '"hello"'
```
**Note:** If you experience a lot of issues with the deployment from ACB, you can also get the JAR from the `target` folder and deploy it manually in Runtime Manager
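If there is no JAR in `target` yet (ACB builds one as part of its deploy), a standard Mule Maven build will produce it; a minimal sketch, assuming Maven is set up for the project:

```shell
# package the application; the deployable JAR is written to target/
mvn clean package -DskipTests
```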