Currently our benchmarks live in a Jupyter notebook under ucc/benchmarks, so the results are only as current as the last time someone ran it, and they reflect whatever OS and package versions that developer happened to be using.
The idea with this issue is to systematize the benchmarking so it runs automatically, say, every time someone merges to main, and to prominently display the results in our README, most likely via a GitHub Action. A rough sketch of such a workflow follows.
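A minimal sketch of what that workflow could look like, assuming a hypothetical benchmarks/run_benchmarks.py entry point and a README update step (the actual install commands, script name, and publishing mechanism would need to match how the repo is set up):

```yaml
name: benchmarks

on:
  push:
    branches: [main]   # run whenever changes land on main

jobs:
  run-benchmarks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - name: Install ucc and benchmark dependencies
        run: pip install -e .   # assumption: editable install pulls in what the benchmarks need
      - name: Run benchmarks
        run: python benchmarks/run_benchmarks.py   # hypothetical script that writes results to a file
      - name: Publish results to README
        run: |
          # assumption: the benchmark script rewrites a results section/badge in README.md,
          # which we then commit back to main (an artifact or gh-pages upload would also work)
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add README.md
          git commit -m "Update benchmark results" || echo "No changes to commit"
          git push
```

Committing back to main keeps the numbers visible directly in the README; alternatively the workflow could upload the results as an artifact or push them to a gh-pages branch if we'd rather not have bot commits on main.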