
Any way to use batch API? #188

Open · Curtis-Jiang-2020 opened this issue Jan 10, 2025 · 3 comments

Comments

@Curtis-Jiang-2020

Hi! First of all this is an outstanding project. Easy to use and elegant.
However, I have a use case where I need to run a lot of single-question prompts, so I would like to use the batch API if possible. Does the framework have any support for the batch API?

Thank you!

@Maplemx
Collaborator

Maplemx commented Jan 11, 2025

Hi, first of all I need to make sure I fully understand what "batch API" means here.

Do you mean that you have a list of different API URLs and you want to request each of them with the same single-question prompt, one by one?

If so, make sure all the APIs are OpenAI-compatible; then you can create agent instances in closure environments with different settings for each one.

Code Example:

import Agently

def request_model(model_info, prompt):
    # Build a fresh agent instance inside the closure, configured
    # for this specific OpenAI-compatible endpoint
    agent = (
        Agently.create_agent()
            .set_settings("current_model", "OAIClient")
            .set_settings("model.OAIClient.url", model_info["url"])
            .set_settings("model.OAIClient.auth", { "api_key": model_info["api_key"] })
            .set_settings("model.OAIClient.options", { "model": model_info["model"] })
    )
    return agent.input(prompt).start()

model_info_list = [...]
prompt = "xxxx"
for model_info in model_info_list:
    result = request_model(model_info, prompt)
    print(f"{ model_info['name'] }:", result)

@cnbeining
Contributor

Actually it's https://platform.openai.com/docs/guides/batch for cheaper inference
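For reference, a batch input file is a JSONL file where each line is one self-contained request. A minimal example line (the model name, custom_id, and prompt here are placeholders):

{"custom_id": "request-1", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hello!"}]}}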

@Maplemx
Collaborator

Maplemx commented Jan 23, 2025

> Actually it's https://platform.openai.com/docs/guides/batch for cheaper inference

OK, I see. We don't support the batch API yet, but I think it is a good to-do.

So just to clarify the usage story:

  • You create a batch file with all settings and prompts in a list.
  • [?] Do you want Agently to rewrite your prompts (to make your generation orders easier to express, or to ensure the generated results can be parsed) and use the rewritten prompts to replace the original ones in the batch file? Or are your original prompts fine as they are?
  • [?] Do you want Agently to help you upload the batch file to the OpenAI Batch API, or will you do that yourself?
  • [?] Do you want to retrieve the results manually with the OpenAI batch client, or use Agently to retrieve them (and parse them into formatted data)? Or you can describe how you want to handle the retrieved results.

Showing some imaginary example code would help too. Thanks!
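For context, the raw flow against the Batch API with the official openai Python SDK looks roughly like this. This is a minimal sketch, not Agently API; the file name and polling interval are placeholders, and OPENAI_API_KEY is assumed to be set in the environment:

import time
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1. Upload the JSONL input file (one request per line, as above)
batch_file = client.files.create(
    file=open("batch_input.jsonl", "rb"),
    purpose="batch",
)

# 2. Create the batch job against the chat completions endpoint
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
)

# 3. Poll until the job reaches a terminal status, then download results
while True:
    batch = client.batches.retrieve(batch.id)
    if batch.status in ("completed", "failed", "expired", "cancelled"):
        break
    time.sleep(60)

if batch.status == "completed":
    output = client.files.content(batch.output_file_id)
    print(output.text)  # JSONL: one response object per line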
