The sample project is an AI application that uses Azure Container Apps and Azure OpenAI. The application provides AI assistant functionality in a Spring PetClinic application.
You can get help by chatting with the AI assistant in natural language. The AI assistant can help you with the following tasks:
- Querying the registered pet owners
- Adding a new pet owner
- Updating the pet owner's information
- Adding a new pet
- Querying the vets' information
To run this lab you will need:
- A GitHub account
- An Azure Subscription
- Navigate to the GitHub repository of this lab and select Fork.

  {: .note }
  If you are using a GitHub EMU account, you might not be able to fork a public repository. In that case, create a new repository with the same name, clone the original repository, add your new repository as a remote, and push to this new remote.
- Make sure your own username is indicated as the fork Owner.
- Select Create fork. This will create a copy, or fork, of this project in your own account.
- Navigate to the newly forked GitHub project.
- Select Code and then Codespaces.
- Select Create a codespace.
Your codespace will now be created in your browser window. Once creation is done, execute the next steps in the terminal:
- Run `azd auth login` and follow the prompts to log in to Azure.
- Run `azd up`, input the environment name, then select your subscription and location.
The project will be ready in about 20 minutes.
You can talk with the agent: it can recommend a vet based on the pet's symptoms, and it can also help you book an appointment with the vet. Go to the owner page and you will see that you are registered as an owner with the help of the agent.
An LLM application normally contains five parts: prompt, model, memory, native functions (tools), and RAG (content retriever).
The prompt is defined in the agent.
The model is defined in the model configuration; it can be used for chat, or as a content augmentor.
Memory store: the demo still uses the local memory defined in the memory store, which means memory cannot be shared between instances. You can enhance this by backing the memory with Redis.
Memory ID: the demo uses the username as the memory ID.
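The two points above can be sketched in plain Java. This is a hypothetical, simplified stand-in for the demo's memory store (the class and method names are illustrative, not from the sample code): a map keyed by username holds each user's sliding message window, which is exactly why two app instances cannot see each other's conversations. Replacing the in-process map with a Redis structure keyed by the same username would make the memory shared.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of per-user chat memory: the username is the memory ID.
// The store lives inside one process, so it is NOT shared between instances.
class UserChatMemory {
    private static final int MAX_MESSAGES = 10; // keep a sliding window per user

    private final Map<String, Deque<String>> store = new ConcurrentHashMap<>();

    // Append a message to this user's window, evicting the oldest if full.
    void add(String username, String message) {
        Deque<String> window = store.computeIfAbsent(username, u -> new ArrayDeque<>());
        synchronized (window) {
            window.addLast(message);
            if (window.size() > MAX_MESSAGES) {
                window.removeFirst();
            }
        }
    }

    // Return the conversation history for this memory ID.
    List<String> history(String username) {
        Deque<String> window = store.getOrDefault(username, new ArrayDeque<>());
        synchronized (window) {
            return new ArrayList<>(window);
        }
    }
}
```

Because the memory ID is just the username, a Redis-backed variant only needs to use the same key (for example, a Redis list named after the user) instead of the local map.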
The demo provides two local tools that the agent can invoke as native functions:
- Vets and their specialties: the agent knows the system can return the list of vets, including their specialties, which can be used to recommend a vet.
- Owners and pets: the agent knows the system can register a new owner and their pets.
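To make the first tool concrete, here is a hypothetical sketch of the vet-recommendation lookup. In the real demo the method would be exposed to the LLM as a native function (for example via a tool annotation in the LLM framework); the plain Java below only shows the lookup logic, and the vet names and method names are illustrative.

```java
import java.util.List;
import java.util.Locale;
import java.util.stream.Collectors;

// Sketch of the "vets and their specialties" tool the agent can call.
class VetTools {
    record Vet(String name, String specialty) {}

    // Illustrative data; the demo would load this from the PetClinic database.
    static final List<Vet> VETS = List.of(
            new Vet("Helen Leary", "radiology"),
            new Vet("Linda Douglas", "surgery"),
            new Vet("Rafael Ortega", "surgery"),
            new Vet("Henry Stevens", "dentistry"));

    // The agent infers a specialty from the pet's symptoms, then asks for matching vets.
    static List<String> recommendVets(String specialty) {
        String wanted = specialty.toLowerCase(Locale.ROOT);
        return VETS.stream()
                .filter(v -> v.specialty().equals(wanted))
                .map(Vet::name)
                .collect(Collectors.toList());
    }
}
```

The LLM never sees the database directly; it only sees the tool's description and the list of names the tool returns, which it then uses to phrase its recommendation.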
The demo still uses a local file as the content retriever. It provides the guidelines for how the agent should work, which are in the Terms of Use; the content is ingested when the store is initialized. If you have a lot of content, try using Azure AI Search to build the RAG pipeline.
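The ingest/retrieve flow can be sketched as follows. This is a deliberately simplified stand-in: the real demo scores chunks with embeddings, while this sketch scores them by keyword overlap just to make the two phases visible. All names are illustrative.

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Sketch of a local-file content retriever: ingest splits the Terms of Use
// into chunks; retrieve returns the chunk most similar to the question.
class LocalContentRetriever {
    private final List<String> chunks;

    // "Ingest" phase: runs once, when the store is initialized.
    LocalContentRetriever(String document) {
        this.chunks = Arrays.asList(document.split("\n\n"));
    }

    // "Retrieve" phase: find the chunk sharing the most words with the query.
    String retrieve(String query) {
        Set<String> queryWords = words(query);
        return chunks.stream()
                .max(Comparator.comparingInt(c -> overlap(words(c), queryWords)))
                .orElse("");
    }

    private static Set<String> words(String text) {
        return new HashSet<>(Arrays.asList(text.toLowerCase().split("\\W+")));
    }

    private static int overlap(Set<String> a, Set<String> b) {
        Set<String> copy = new HashSet<>(a);
        copy.retainAll(b);
        return copy.size();
    }
}
```

With Azure AI Search, the two phases map onto indexing documents up front and issuing a search query at chat time, so the same shape scales to large content sets.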
You can also talk with the agent in your own language, such as Chinese.
The problem is that your Terms of Use are in English. The traditional approach is to provide a localized Terms of Use for each language, but you can use OpenAI to make this easier.
We can define a retrieval augmentor that translates your question into English before retrieving the content.
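The translate-then-retrieve idea can be sketched like this. It is a minimal stand-in, not the demo's implementation: a lookup table fakes the translation step where the real augmentor would call the OpenAI chat model, and both function parameters are illustrative.

```java
import java.util.function.Function;

// Sketch of a retrieval augmentor that rewrites the user's question in
// English before the content retriever runs, so one English Terms of Use
// document can serve questions asked in any language.
class TranslatingRetriever {
    private final Function<String, String> translateToEnglish; // e.g. a chat-model call
    private final Function<String, String> retriever;          // the content retriever

    TranslatingRetriever(Function<String, String> translateToEnglish,
                         Function<String, String> retriever) {
        this.translateToEnglish = translateToEnglish;
        this.retriever = retriever;
    }

    String retrieve(String userQuery) {
        String english = translateToEnglish.apply(userQuery); // transform the query first
        return retriever.apply(english);                      // then retrieve in English
    }
}
```

Because only the query is translated, the retrieved English guideline still reaches the model, which can then answer in the user's original language.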