Any way to use batch API? #188
Hi, first of all I need to make sure I fully understand what "batch API" means here. Do you mean that you have a list of different API URLs and you want to send the same single-question prompt to each of them, one by one? If so, make sure all the APIs are compatible with the OpenAI format; then you can create agent instances in a closure with different settings for each one.

Code example:

```python
import Agently

def request_model(model_info, prompt):
    # Create a fresh agent configured for this model's endpoint and credentials
    agent = (
        Agently.create_agent()
            .set_settings("current_model", "OAIClient")
            .set_settings("model.OAIClient.url", model_info["url"])
            .set_settings("model.OAIClient.auth", { "api_key": model_info["api_key"] })
            .set_settings("model.OAIClient.options", { "model": model_info["model"] })
    )
    return agent.input(prompt).start()

model_info_list = [...]
prompt = "xxxx"
for model_info in model_info_list:
    result = request_model(model_info, prompt)
    print(f"{ model_info['name'] }:", result)
```
Actually it's https://platform.openai.com/docs/guides/batch for cheaper inference
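For context, the batch API linked above works by uploading a JSONL file in which each line is an independent chat-completion request. Here is a minimal sketch of building such a file; the `custom_id` scheme, model name, and prompts are placeholders, and actually uploading and submitting the file would use the official `openai` SDK (`client.files.create` and `client.batches.create`), which is omitted here since it needs an API key:

```python
import json

def build_batch_lines(prompts, model="gpt-4o-mini"):
    """Build JSONL lines for OpenAI's batch endpoint.

    Each line is a standalone chat-completion request; `custom_id`
    lets you match results back to inputs when the batch finishes.
    """
    lines = []
    for i, prompt in enumerate(prompts):
        request = {
            "custom_id": f"request-{i}",          # placeholder ID scheme
            "method": "POST",
            "url": "/v1/chat/completions",
            "body": {
                "model": model,
                "messages": [{"role": "user", "content": prompt}],
            },
        }
        lines.append(json.dumps(request))
    return "\n".join(lines)

# Write the input file; uploading it with purpose="batch" and then
# calling client.batches.create(...) are the remaining steps (not shown).
jsonl = build_batch_lines(["What is 2+2?", "Name a prime number."])
with open("batch_input.jsonl", "w") as f:
    f.write(jsonl)
```

The framework would then need a way to poll the batch job and map completed results back to the original prompts via `custom_id`.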
OK, I see. We don't support the batch API yet, but I think it is a good todo. Could you clarify the usage story?
Showing some imaginary example code would help too. Thanks!
Hi! First of all, this is an outstanding project. Easy to use and elegant.
However, I have a use case where I need to run a lot of single-question prompts, so I would like to use the batch API if possible. Does the framework have any support for the batch API?
Thank you!