
Commit c59e663

community[patch]: Fix docstring for ollama parameter "keep_alive" (langchain-ai#23973)
Fix doc-string for ollama integration
1 parent 0c1889c commit c59e663

File tree

1 file changed (+5, -3 lines)

  • libs/community/langchain_community/llms/ollama.py

libs/community/langchain_community/llms/ollama.py

+5 -3
@@ -112,18 +112,20 @@ class _OllamaCommon(BaseLanguageModel):
     """Timeout for the request stream"""
 
     keep_alive: Optional[Union[int, str]] = None
-    """How long the model will stay loaded into memory."""
+    """How long the model will stay loaded into memory.
 
-    raw: Optional[bool] = None
-    """raw or not.""
     The parameter (Default: 5 minutes) can be set to:
     1. a duration string in Golang (such as "10m" or "24h");
     2. a number in seconds (such as 3600);
     3. any negative number which will keep the model loaded \
     in memory (e.g. -1 or "-1m");
     4. 0 which will unload the model immediately after generating a response;
+
     See the [Ollama documents](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-keep-a-model-loaded-in-memory-or-make-it-unload-immediately)"""
 
+    raw: Optional[bool] = None
+    """raw or not."""
+
     headers: Optional[dict] = None
     """Additional headers to pass to endpoint (e.g. Authorization, Referer).
     This is useful when Ollama is hosted on cloud services that require
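
For context, a minimal usage sketch of the parameter this docstring describes. The model name, the keep_alive value, and the prompt below are illustrative assumptions, not part of the commit; a local Ollama server with the model pulled is assumed to be running.

# Sketch: passing keep_alive to the community Ollama LLM.
from langchain_community.llms import Ollama

llm = Ollama(
    model="llama3",     # assumed model, e.g. pulled via `ollama pull llama3`
    keep_alive="10m",   # Golang duration string; 3600 (seconds), -1 (keep loaded), or 0 (unload now) also work
)

print(llm.invoke("Why is the sky blue?"))

Here "10m" keeps the model resident for ten minutes after each request, which is the behavior the corrected docstring documents.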
