Results of SISAP 2024 Indexing challenge #8
-
Hello @maumueller, thanks for listing the results of the indexing challenge. I have three comments/questions. First, the query time of our solution (… Second, the size of the private query set seems incorrect: https://sisap-challenges.github.io/2024/tasks/ indicates that there should be 100K queries, not 10K. Finally, based on the provided plot for Task 3, the … Thanks for your answers. David
-
Dear @Neiko2002 @cole-foster @elgerpus @danielbenedi6 @eth42 @ProchazkaDavid, you can find the private query set and its gold standard on the SISAP Indexing Challenge site. Please note that you may need to download using the "Save as" option and accept a security warning from the Chrome browser, since these links are served over plain HTTP. Best,
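(As an alternative to the browser's "Save as", the plain-HTTP links can also be fetched with a short script. Below is a minimal sketch using Python's standard library; the URL and file name are placeholders, since the actual links are not reproduced here.)

```python
import urllib.request

# Placeholder URL and file name -- substitute the actual links
# from the SISAP Indexing Challenge site.
url = "http://example.org/private-queries.h5"
out = "private-queries.h5"

# urlretrieve downloads plain-HTTP links without browser security prompts.
urllib.request.urlretrieve(url, out)
print(f"saved {out}")
```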
-
Dear @Neiko2002 @cole-foster @elgerpus @danielbenedi6 @eth42 @ProchazkaDavid
We are glad to share a summary of the SISAP 2024 indexing challenge results, in particular the ranking per subset per track; more detailed results will be listed in the overview and discussed during the challenge track of the SISAP 2024 conference. Please recall the task descriptions from https://sisap-challenges.github.io/tasks/.
We prepared Docker Linux images from the submitted solutions and ran them on a 28-core (56-thread) Intel(R) Xeon(R) CPU E5-2690 v4 workstation with 512 GiB of RAM, enforcing the CPU/memory limits through Docker. The evaluation was run on a private query set taken from an unseen batch of the LAION2B dataset. For the public query set, we removed near-duplicate objects/queries, considering as a near-duplicate any object within a radius of 0.15 of an already selected query point; for the private query set, we removed near-duplicates within a radius of 0.2, which made the queries more diverse. You will see this reflected in your results. We will make this private query set available soon so that you can reproduce the results.
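(For illustration only, the near-duplicate filtering described above can be sketched as a greedy selection: a candidate query is kept only if no already selected query lies within the given radius. This is a minimal sketch assuming L2-normalized embedding vectors and cosine distance; the organizers' actual selection procedure may differ.)

```python
import numpy as np

def select_queries(candidates: np.ndarray, radius: float) -> list[int]:
    """Greedily pick query indices so that no two selected queries
    lie within `radius` (cosine distance) of each other.

    Assumes `candidates` holds L2-normalized vectors, so the cosine
    distance between two vectors is 1 - their dot product.
    """
    selected = []   # indices of accepted queries
    kept = []       # their vectors, used for the distance checks
    for i, v in enumerate(candidates):
        if all(1.0 - float(np.dot(v, u)) > radius for u in kept):
            selected.append(i)
            kept.append(v)
    return selected

# Toy usage: radius 0.2 was used for the private set, 0.15 for the public one.
rng = np.random.default_rng(0)
vecs = rng.normal(size=(1000, 16)).astype(np.float32)
vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)
private_like = select_queries(vecs, radius=0.2)
print(len(private_like), "queries kept")
```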
Thank you for your participation,
Eric (@sadit), Martin (@maumueller), and Vladimir (@VladimirMic)
Overall ranking
Individual tasks
The following plots show the performance/quality trade-off achieved by the proposed implementations:
You can find the detailed result files that these plots are based on in the following GitHub repository: https://github.com/sisap-challenges/challenge2024/