libs/community/langchain_community/llms/ollama.py
+5 -3
@@ -112,18 +112,20 @@ class _OllamaCommon(BaseLanguageModel):
     """Timeout for the request stream"""
 
     keep_alive: Optional[Union[int, str]] = None
-    """How long the model will stay loaded into memory."""
+    """How long the model will stay loaded into memory.
 
-    raw: Optional[bool] = None
-    """raw or not.""
     The parameter (Default: 5 minutes) can be set to:
     1. a duration string in Golang (such as "10m" or "24h");
     2. a number in seconds (such as 3600);
     3. any negative number which will keep the model loaded \
     in memory (e.g. -1 or "-1m");
     4. 0 which will unload the model immediately after generating a response;
+
     See the [Ollama documents](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-keep-a-model-loaded-in-memory-or-make-it-unload-immediately)"""
 
+    raw: Optional[bool] = None
+    """raw or not."""
+
     headers: Optional[dict] = None
     """Additional headers to pass to endpoint (e.g. Authorization, Referer).
     This is useful when Ollama is hosted on cloud services that require
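For context, here is a minimal sketch of how the keep_alive values documented in this docstring might be passed to the community Ollama LLM. It assumes langchain-community is installed, an Ollama server is running at its default address, and uses "llama2" purely as an illustrative model name; it is not part of this diff.

# A minimal sketch, assuming a locally running Ollama server and the
# langchain_community.llms.Ollama class that this file defines.
from langchain_community.llms import Ollama

# Golang duration string: keep the model loaded for 24 hours after each call.
llm_day = Ollama(model="llama2", keep_alive="24h")

# Plain integer: keep the model loaded for 3600 seconds.
llm_hour = Ollama(model="llama2", keep_alive=3600)

# Any negative number: keep the model loaded in memory indefinitely.
llm_pinned = Ollama(model="llama2", keep_alive=-1)

# Zero: unload the model immediately after generating a response.
llm_unload = Ollama(model="llama2", keep_alive=0)

print(llm_day.invoke("Why is the sky blue?"))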