
Use genai.protos to access the raw protos. #446

Merged · 5 commits · Jun 7, 2024
Changes from 2 commits
Original file line number Diff line number Diff line change
Expand Up @@ -138,7 +138,6 @@
"import pandas as pd\n",
"\n",
"import google.generativeai as genai\n",
"import google.ai.generativelanguage as glm\n",
"\n",
"# Used to securely store your API key\n",
"from google.colab import userdata\n",
Expand Down
9 changes: 4 additions & 5 deletions site/en/docs/search_reranking_using_embeddings.ipynb
Expand Up @@ -184,7 +184,6 @@
"import textwrap\n",
"\n",
"import google.generativeai as genai\n",
"import google.ai.generativelanguage as glm\n",
"\n",
"import wikipedia\n",
"from wikipedia.exceptions import DisambiguationError, PageError\n",
Expand Down Expand Up @@ -821,7 +820,7 @@
"In the chat history you can see all 4 steps:\n",
"\n",
"1. The user sent the query.\n",
"2. The model replied with a `glm.FunctionCall` calling the `wikipedia_search` with a number of relevant searches.\n",
"2. The model replied with a `genai.protos.FunctionCall` calling the `wikipedia_search` with a number of relevant searches.\n",
"3. Because you set `enable_automatic_function_calling=True` when creating the `genai.ChatSession`, it executed the search function and returned the list of article summaries to the model.\n",
"4. Following the instructions in the prompt, the model generated a final answer based on those summaries.\n"
]
Expand Down Expand Up @@ -1044,9 +1043,9 @@
],
"source": [
"response = chat.send_message(\n",
" glm.Content(\n",
" parts=[glm.Part(\n",
" function_response = glm.FunctionResponse(\n",
" genai.protos.Content(\n",
" parts=[genai.protos.Part(\n",
" function_response = genai.protos.FunctionResponse(\n",
" name='wikipedia_search',\n",
" response={'result': summaries}\n",
" )\n",
Expand Down
54 changes: 21 additions & 33 deletions site/en/gemini-api/docs/function-calling/python.ipynb
Expand Up @@ -131,7 +131,6 @@
"import time\n",
"\n",
"import google.generativeai as genai\n",
"import google.ai.generativelanguage as glm\n",
"\n",
"from IPython import display\n",
"from IPython.display import Markdown\n",
Expand Down Expand Up @@ -206,7 +205,7 @@
"\n",
"To use function calling, pass a list of functions to the `tools` parameter when creating a [`GenerativeModel`](https://ai.google.dev/api/python/google/generativeai/GenerativeModel). The model uses the function name, docstring, parameters, and parameter type annotations to decide if it needs the function to best answer a prompt.\n",
"\n",
"> Important: The SDK converts function parameter type annotations to a format the API understands (`glm.FunctionDeclaration`). The API only supports a limited selection of parameter types, and the Python SDK's automatic conversion only supports a subset of that: `AllowedTypes = int | float | bool | str | list['AllowedTypes'] | dict`"
"> Important: The SDK converts function parameter type annotations to a format the API understands (`genai.protos.FunctionDeclaration`). The API only supports a limited selection of parameter types, and the Python SDK's automatic conversion only supports a subset of that: `AllowedTypes = int | float | bool | str | list['AllowedTypes'] | dict`"
]
},
{
Expand Down Expand Up @@ -327,13 +326,13 @@
"source": [
"Examine the chat history to see the flow of the conversation and how function calls are integrated within it.\n",
"\n",
"The `ChatSession.history` property stores a chronological record of the conversation between the user and the Gemini model. Each turn in the conversation is represented by a [`glm.Content`](https://ai.google.dev/api/python/google/ai/generativelanguage/Content) object, which contains the following information:\n",
"The `ChatSession.history` property stores a chronological record of the conversation between the user and the Gemini model. Each turn in the conversation is represented by a [`genai.protos.Content`](https://ai.google.dev/api/python/google/generativeai/protos/Content) object, which contains the following information:\n",
"\n",
"* **Role**: Identifies whether the content originated from the \"user\" or the \"model\".\n",
"* **Parts**: A list of [`glm.Part`](https://ai.google.dev/api/python/google/ai/generativelanguage/Part) objects that represent individual components of the message. With a text-only model, these parts can be:\n",
"* **Parts**: A list of [`genai.protos.Part`](https://ai.google.dev/api/python/google/generativeai/protos/Part) objects that represent individual components of the message. With a text-only model, these parts can be:\n",
" * **Text**: Plain text messages.\n",
" * **Function Call** ([`glm.FunctionCall`](https://ai.google.dev/api/python/google/ai/generativelanguage/FunctionCall)): A request from the model to execute a specific function with provided arguments.\n",
" * **Function Response** ([`glm.FunctionResponse`](https://ai.google.dev/api/python/google/ai/generativelanguage/FunctionResponse)): The result returned by the user after executing the requested function.\n",
" * **Function Call** ([`genai.protos.FunctionCall`](https://ai.google.dev/api/python/google/generativeai/protos/FunctionCall)): A request from the model to execute a specific function with provided arguments.\n",
" * **Function Response** ([`genai.protos.FunctionResponse`](https://ai.google.dev/api/python/google/generativeai/protos/FunctionResponse)): The result returned by the user after executing the requested function.\n",
"\n",
" In the previous example with the mittens calculation, the history shows the following sequence:\n",
"\n",
Expand Down Expand Up @@ -400,7 +399,7 @@
"source": [
"While this was all handled automatically, if you need more control, you can:\n",
"\n",
"- Leave the default `enable_automatic_function_calling=False` and process the `glm.FunctionCall` responses yourself.\n",
"- Leave the default `enable_automatic_function_calling=False` and process the `genai.protos.FunctionCall` responses yourself.\n",
"- Or use `GenerativeModel.generate_content`, where you also need to manage the chat history."
]
},
Expand Down Expand Up @@ -541,7 +540,7 @@
"\n",
"# Build the response parts.\n",
"response_parts = [\n",
" glm.Part(function_response=glm.FunctionResponse(name=fn, response={\"result\": val}))\n",
" genai.protos.Part(function_response=genai.protos.FunctionResponse(name=fn, response={\"result\": val}))\n",
" for fn, val in responses.items()\n",
"]\n",
"\n",
Expand Down Expand Up @@ -573,17 +572,6 @@
"The `genai.protos` module provides access to the low-level types, giving you full control."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "S53E0EE8TBUF"
},
"outputs": [],
"source": [
"import google.ai.generativelanguage as glm"
]
},
{
"cell_type": "markdown",
"metadata": {
Expand Down Expand Up @@ -648,7 +636,7 @@
"id": "qFD4U7ym04F5"
},
"source": [
"This returns the list of `glm.Tool` objects that would be sent to the API. If the printed format is not familiar, it's because these are Google protobuf classes. Each `glm.Tool` (1 in this case) contains a list of `glm.FunctionDeclarations`, which describe a function and its arguments."
"This returns the list of `genai.protos.Tool` objects that would be sent to the API. If the printed format is not familiar, it's because these are Google protobuf classes. Each `genai.protos.Tool` (1 in this case) contains a list of `genai.protos.FunctionDeclarations`, which describe a function and its arguments."
]
},
{
Expand All @@ -657,7 +645,7 @@
"id": "eY6RmFQ76FVu"
},
"source": [
"Here is a declaration for the same multiply function written using the `glm` classes.\n",
"Here is a declaration for the same multiply function written using the `genai.protos` classes.\n",
"\n",
"Note that these classes just describe the function for the API; they don't include an implementation of it. As a result, this approach doesn't work with automatic function calling, but functions don't always need an implementation."
]
Expand All @@ -670,16 +658,16 @@
},
"outputs": [],
"source": [
"calculator = glm.Tool(\n",
"calculator = genai.protos.Tool(\n",
" function_declarations=[\n",
" glm.FunctionDeclaration(\n",
" genai.protos.FunctionDeclaration(\n",
" name='multiply',\n",
" description=\"Returns the product of two numbers.\",\n",
" parameters=glm.Schema(\n",
" type=glm.Type.OBJECT,\n",
" parameters=genai.protos.Schema(\n",
" type=genai.protos.Type.OBJECT,\n",
" properties={\n",
" 'a':glm.Schema(type=glm.Type.NUMBER),\n",
" 'b':glm.Schema(type=glm.Type.NUMBER)\n",
" 'a':genai.protos.Schema(type=genai.protos.Type.NUMBER),\n",
" 'b':genai.protos.Schema(type=genai.protos.Type.NUMBER)\n",
" },\n",
" required=['a','b']\n",
" )\n",
Expand Down Expand Up @@ -753,7 +741,7 @@
}
],
"source": [
"glm.Tool(calculator)"
"genai.protos.Tool(calculator)"
]
},
{
Expand All @@ -762,7 +750,7 @@
"id": "jS6ruiTp6VBf"
},
"source": [
"Either way, you pass a representation of a `glm.Tool` or list of tools to"
"Either way, you pass a representation of a `genai.protos.Tool` or list of tools to"
]
},
{
Expand All @@ -787,7 +775,7 @@
"id": "517ca06297bb"
},
"source": [
"Like before the model returns a `glm.FunctionCall` invoking the calculator's `multiply` function:"
"Like before the model returns a `genai.protos.FunctionCall` invoking the calculator's `multiply` function:"
]
},
{
Expand Down Expand Up @@ -889,9 +877,9 @@
"outputs": [],
"source": [
"response = chat.send_message(\n",
" glm.Content(\n",
" parts=[glm.Part(\n",
" function_response = glm.FunctionResponse(\n",
" genai.protos.Content(\n",
" parts=[genai.protos.Part(\n",
" function_response = genai.protos.FunctionResponse(\n",
" name='multiply',\n",
" response={'result': result}))]))"
]
Expand Down
4 changes: 2 additions & 2 deletions site/en/gemini-api/docs/get-started/python.ipynb
Expand Up @@ -419,7 +419,7 @@
"source": [
"Gemini can generate multiple possible responses for a single prompt. These possible responses are called `candidates`, and you can review them to select the most suitable one as the response.\n",
"\n",
"View the response candidates with <a href=\"https://ai.google.dev/api/python/google/ai/generativelanguage/GenerateContentResponse#candidates\"><code>GenerateContentResponse.candidates</code></a>:"
"View the response candidates with <a href=\"https://ai.google.dev/api/python/google/generativeai/protos/GenerateContentResponse#candidates\"><code>GenerateContentResponse.candidates</code></a>:"
]
},
{
Expand Down Expand Up @@ -1488,7 +1488,7 @@
"id": "-fthdIItnqki"
},
"source": [
"Underlying the Python SDK is the <a href=\"https://ai.google.dev/api/python/google/ai/generativelanguage\"><code>google.ai.generativelanguage</code></a> client library:"
"Underlying the Python SDK is the <a href=\"https://ai.google.dev/api/python/google/generativeai/protos\"><code>google.ai.generativelanguage</code></a> client library:"
]
},
{
Expand Down
10 changes: 5 additions & 5 deletions site/en/gemini-api/docs/semantic_retrieval.ipynb
Expand Up @@ -70,7 +70,7 @@
"\n",
"A common approach used to overcome these constraints is called Retrieval Augmented Generation (RAG), which augments the prompt sent to an LLM with relevant data retrieved from an external knowledge base through an Information Retrieval (IR) mechanism. The knowledge base can be your own corpora of documents, databases, or APIs.\n",
"\n",
"This notebook walks you through a workflow to improve an LLM's response by augmenting its knowledge with external text corpora and performing semantic information retrieval to answer questions using the Semantic Retriever and the Attributed Question & Answering (AQA) APIs of the [Generative Language API](https://ai.google.dev/api/python/google/ai/generativelanguage).\n",
"This notebook walks you through a workflow to improve an LLM's response by augmenting its knowledge with external text corpora and performing semantic information retrieval to answer questions using the Semantic Retriever and the Attributed Question & Answering (AQA) APIs of the Generative Language API.\n",
"\n",
"Note: This API is currently in [beta](https://ai.google.dev/gemini-api/docs/api-versions) and is [only available in certain regions](https://ai.google.dev/gemini-api/docs/available-regions).\n"
]
Expand Down Expand Up @@ -719,7 +719,7 @@
"source": [
"## Attributed Question-Answering\n",
"\n",
"Use the [`GenerateAnswer`](https://ai.google.dev/api/python/google/ai/generativelanguage/GenerateAnswerRequest) method to perform Attributed Question-Answering over your document, corpus, or a set of passages.\n",
"Use the [`GenerateAnswer`](https://ai.google.dev/api/python/google/generativeai/protos/GenerateAnswerRequest) method to perform Attributed Question-Answering over your document, corpus, or a set of passages.\n",
"\n",
"Attributed Question-Answering (AQA) refers to answering questions grounded to a given context and providing attribution(s) while minimizing hallucinations.\n",
"\n",
Expand Down Expand Up @@ -769,7 +769,7 @@
"source": [
"### AQA Helpful Tips\n",
"\n",
"For full API specifications, refer to the [`GenerateAnswerRequest` API Reference](https://ai.google.dev/api/python/google/ai/generativelanguage/GenerateAnswerRequest).\n",
"For full API specifications, refer to the [`GenerateAnswerRequest` API Reference](https://ai.google.dev/api/python/google/generativeai/protos/GenerateAnswerRequest).\n",
"\n",
"* *Passage length*: Up to 300 tokens per passage are recommended.\n",
"* *Passage sorting*:\n",
Expand Down Expand Up @@ -870,7 +870,7 @@
"source": [
"## Share the corpus\n",
"\n",
"You can choose to share the corpus with others using the [`CreatePermissionRequest`](https://ai.google.dev/api/python/google/ai/generativelanguage/CreatePermissionRequest) API.\n",
"You can choose to share the corpus with others using the [`CreatePermissionRequest`](https://ai.google.dev/api/python/google/generativeai/protos/CreatePermissionRequest) API.\n",
"\n",
"Constraints:\n",
"\n",
Expand Down Expand Up @@ -912,7 +912,7 @@
"source": [
"## Delete the corpus\n",
"\n",
"Use [`DeleteCorpusRequest`](https://ai.google.dev/api/python/google/ai/generativelanguage/DeleteCorpusRequest) to delete a user corpus and all associated `Document`s & `Chunk`s.\n",
"Use [`DeleteCorpusRequest`](https://ai.google.dev/api/python/google/generativeai/protos/DeleteCorpusRequest) to delete a user corpus and all associated `Document`s & `Chunk`s.\n",
"\n",
"Note that non-empty corpora will throw an error without specifying a `force=True` flag. If you set `force=True`, any `Chunk`s and objects related to this `Document` will also be deleted.\n",
"\n",
Expand Down
1 change: 0 additions & 1 deletion site/en/gemini-api/tutorials/anomaly_detection.ipynb
Expand Up @@ -111,7 +111,6 @@
"import seaborn as sns\n",
"\n",
"import google.generativeai as genai\n",
"import google.ai.generativelanguage as glm\n",
"\n",
"# Used to securely store your API key\n",
"from google.colab import userdata\n",
Expand Down
Expand Up @@ -112,7 +112,6 @@
"import seaborn as sns\n",
"\n",
"import google.generativeai as genai\n",
"import google.ai.generativelanguage as glm\n",
"\n",
"# Used to securely store your API key\n",
"from google.colab import userdata\n",
Expand Down
1 change: 0 additions & 1 deletion site/en/gemini-api/tutorials/document_search.ipynb
Original file line number Diff line number Diff line change
Expand Up @@ -109,7 +109,6 @@
"import pandas as pd\n",
"\n",
"import google.generativeai as genai\n",
"import google.ai.generativelanguage as glm\n",
"\n",
"# Used to securely store your API key\n",
"from google.colab import userdata\n",
Expand Down
5 changes: 2 additions & 3 deletions site/en/gemini-api/tutorials/extract_structured_data.ipynb
Expand Up @@ -101,7 +101,6 @@
"import textwrap\n",
"\n",
"import google.generativeai as genai\n",
"import google.ai.generativelanguage as glm\n",
"\n",
"\n",
"from IPython.display import display\n",
Expand Down Expand Up @@ -455,7 +454,7 @@
"source": [
"If you haven't gone through the [Function calling basics](https://ai.google.dev/tutorials/function_calling_python_quickstart) tutorial yet, make sure you do that first.\n",
"\n",
"With function calling your function and its parameters are described to the API as a `glm.FunctionDeclaration`. In basic cases the SDK can build the `FunctionDeclaration` from the function and its annotations. The SDK doesn't currently handle the description of nested `OBJECT` (`dict`) parameters. So you'll need to define them explicitly, for now."
"With function calling your function and its parameters are described to the API as a `genai.protos.FunctionDeclaration`. In basic cases the SDK can build the `FunctionDeclaration` from the function and its annotations. The SDK doesn't currently handle the description of nested `OBJECT` (`dict`) parameters. So you'll need to define them explicitly, for now."
]
},
{
Expand Down Expand Up @@ -701,7 +700,7 @@
"id": "kILNHmG2IED3"
},
"source": [
"The `glm.FunctionCall` class is based on Google Protocol Buffers, convert it to a more familiar JSON compatible object:"
"The `genai.protos.FunctionCall` class is based on Google Protocol Buffers; convert it to a more familiar JSON-compatible object:"
]
},
{
Expand Down
Expand Up @@ -111,7 +111,6 @@
"import pandas as pd\n",
"\n",
"import google.generativeai as genai\n",
"import google.ai.generativelanguage as glm\n",
"\n",
"# Used to securely store your API key\n",
"from google.colab import userdata\n",
Expand Down